# [Official] AMD R9 280X / 280 & 270X / 270 Owners Club



## Durvelle27

It's the beginning of a new era. Radeon™ is gaming.
*Source Link*

*Ultra Resolution Gaming*

GPUs built for the demands of ultra-high resolution gaming in single and multi-display configurations: get the most out of every square inch of screen.

*Mantle*

There's optimization, and then there's Mantle. Games enabled with Mantle speak the language of the Graphics Core Next architecture to unlock revolutionary performance and image quality. It's a game-changing innovation developed by AMD.

*GCN Architecture*
Designed from the ground up for extraordinary performance with the most demanding games, this is the pinnacle of desktop graphics technology.

*AMD Gaming Facebook*
*AMD Radeon Twitter*
*AMD YouTube*


*AMD Gaming Evolved App, powered by Raptr Link*

*Keep your games optimized*

Auto-optimization provides the best quality and performance settings for your rig based on data collected from the AMD community

*Supported Games Link*

*Real rewards just for playing games*

Earn real rewards just for using the Gaming Evolved app while you play. The more you play, the more rewards you'll unlock!

*Broadcast, Watch, and Chat While You Play*

Broadcast live video via Twitch, watch streams, take screenshots, and share them on Raptr, Facebook, and Twitter without ever leaving your game!

*Gamers Come First*

Join AMD's Gaming Evolved program that strives to create the best possible PC gaming experience by delivering innovative technologies, nurturing open industry standards, and helping the gaming industry maintain the PC platform as the world's premier environment.


----------



## Durvelle27

**Will post Reviews when card releases**

*R9 280X Reviews*

http://www.hardocp.com/article/2013/10/08/asus_r9_280x_directcu_ii_top_video_card_review#.UlOUvVAcRMw

http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-280x-vapor-x-review/

http://www.bit-tech.net/hardware/graphics/2013/10/08/amd-280x-270x-260x-reviews/4

http://techreport.com/review/25466/amd-radeon-r9-280x-and-270x-graphics-cards

http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx

http://www.guru3d.com/articles_pages/radeon_r7_260x_r9_270x_280x_review_benchmarks,1.html

http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-280X-R9-270X-and-R7-260X-Review

http://hothardware.com/Reviews/AMD-Radeon-R7-260X-R9-270X-and-R9-280X-Tested/?page=4

http://www.expertreviews.co.uk/graphics-cards/1302859/amd-r9-280x

http://hexus.net/tech/reviews/graphics/60961-amd-radeon-r9-280x-r9-270x-r7-260x/

http://www.overclock3d.net/reviews/gpu_displays/asus_radeon_r9_280x_review/1

http://www.hardwareheaven.com/reviews/1851/pg1/asus-radeon-r9-280x-directcu-ii-top-graphics-card-review-introduction.html

http://www.tomshardware.com/reviews/radeon-r9-280x-r9-270x-r7-260x,3635.html

http://hothardware.com/Reviews/AMD-Radeon-R7-260X-R9-270X-and-R9-280X-Tested/

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/63522-amd-radeon-r9-280x-3gb-review.html

http://www.guru3d.com/articles_pages/asus_radeon_r9_280x_top_review,1.html

http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/

http://www.pcworld.com/article/2052312/msi-radeon-r9-280x-video-card-review-amd-teaches-an-old-gpu-some-new-tricks.html

http://us.hardware.info/reviews/4889/amd-radeon-r7-260x-r9-270x-and-r9-280x-review-new-name-better-performance?utm_source=rss-mixed&utm_medium=rssfeed&utm_campaign=hardwareinfo

*R9 280 Reviews*

*R9 270X Reviews*

http://www.bit-tech.net/hardware/graphics/2013/10/08/amd-280x-270x-260x-reviews/4

http://techreport.com/review/25466/amd-radeon-r9-280x-and-270x-graphics-cards

http://www.guru3d.com/articles_pages/radeon_r7_260x_r9_270x_280x_review_benchmarks,1.html

http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-280X-R9-270X-and-R7-260X-Review

http://hothardware.com/Reviews/AMD-Radeon-R7-260X-R9-270X-and-R9-280X-Tested/?page=4

http://hexus.net/tech/reviews/graphics/60961-amd-radeon-r9-280x-r9-270x-r7-260x/

http://hothardware.com/Reviews/AMD-Radeon-R7-260X-R9-270X-and-R9-280X-Tested/

http://www.techspot.com/review/722-radeon-r9-270x-r7-260x/

http://www.techpowerup.com/reviews/AMD/R9_270X/

http://us.hardware.info/reviews/4889/amd-radeon-r7-260x-r9-270x-and-r9-280x-review-new-name-better-performance?utm_source=rss-mixed&utm_medium=rssfeed&utm_campaign=hardwareinfo


----------



## raghu78

OP, I think the R7 cards deserve a separate thread. Anyway, the decision is yours.


----------



## Durvelle27

Quote:


> Originally Posted by *raghu78*
> 
> OP, I think the R7 cards deserve a separate thread. Anyway, the decision is yours.


I think it's fine


----------



## Slomo4shO

This thread covers the entire lineup bar the Hawaii GPUs. You should consider moving the 260X to a different thread with the 290 series since it is the only card in this thread that actually has TrueAudio.


----------



## f0rteOC

Can't wait till AMD releases these cards!


----------



## raghu78

Quote:


> Originally Posted by *Durvelle27*
> 
> I think it's fine


In the OP, change "causal" to "casual".


----------



## Durvelle27

Quote:


> Originally Posted by *raghu78*
> 
> In the OP, change "causal" to "casual".


Lol thx


----------



## Durvelle27

Quote:


> Originally Posted by *Slomo4shO*
> 
> This thread covers the entire lineup bar the Hawaii GPUs. You should consider moving the 260X to a different thread with the 290 series since it is the only card in this thread that actually has TrueAudio.


Not a bad idea Thx.


----------



## Slomo4shO

The specs of the Sapphire lineup seem to be available:

http://wccftech.com/sapphire-radeon-r9-radeon-r7-graphic-card-lineup-leaked-includes-radeon-r9-280x-toxic-vaporx-models/




----------



## Durvelle27

Quote:


> Originally Posted by *Slomo4shO*
> 
> The specs of the Sapphire lineup seem to be available:
> 
> http://wccftech.com/sapphire-radeon-r9-radeon-r7-graphic-card-lineup-leaked-includes-radeon-r9-280x-toxic-vaporx-models/


Added to OP


----------



## Durvelle27

From this point on this will be only for the 280X, 280, & 270X GPUs


----------



## Arizonian

Quote:


> Originally Posted by *Durvelle27*
> 
> From this point on this will be only for the 280X, 280, & 270X GPUs


Great idea keeping this thread specific to the R9 rebrands. It will keep the information more focused and easier to follow for owners of this series. Another member will be allowed to create a thread specifically for the R7 series.

Quote:


> Originally Posted by *Slomo4shO*
> 
> This thread covers the entire lineup bar the Hawaii GPUs. You should consider moving the 260X to a different thread with the 290 series since it is the only card in this thread that actually has TrueAudio.


Good suggestion, but even though the 260X will also feature TrueAudio, it's in an entirely different league from the 290X and 290 GPUs. Its difference from the rest of the R7 line would be better outlined by whoever decides to start up the R7 series owners thread.


----------



## motherpuncher

Man I'm getting impatient waiting for the Asus Matrix 280x, especially since the egg just put up an open box Matrix 7970 for $328. It's really hard not to pull the trigger on that, but for some reason, even though I know there isn't a difference, I want to hold out for the 280x.


----------



## skitz9417

I can't wait for them to hit Australia. I hope the price is OK; I could get an HD 7950 for 225 dollars.


----------



## cloudzeng

How do you already have a R9 280X in your Sig Rig?


----------



## jinxjx

Reviews of the R9 280X and the 270X are popping up at... Overclockers...


----------



## skitz9417

http://www.youtube.com/watch?v=ydhfeAvcRBU

http://www.youtube.com/watch?v=CljK1bUT_oU&feature=em-uploademail


----------



## candy_van

Up on AnandTech


----------



## skitz9417

Overclock3D: http://www.overclock3d.net/reviews/gpu_displays/asus_radeon_r9_280x_review/1


----------



## candy_van

G3D's up too:

R7-260X / R9270X / 280X (overview)

ASUS DCII TOP 280X & Gigabyte 280X Windforce


----------



## squad

When will the R9 280x be released, yo?


----------



## skitz9417

today


----------



## squad

Quote:


> Originally Posted by *skitz9417*
> 
> today


Please explain where to buy it right now, this moment. Necesito, sí.


----------



## Rustynails

proof of your 280x with waterblock?


----------



## Slomo4shO

They are available for preorder for a 10/18/13 release on some UK sites:

http://www.overclockers.co.uk/productlist.php?groupid=701&catid=56&subid=1842

However, Asus has suggested that "ASUS R9 280X, R9 270X and R7 260X DirectCU II graphics cards will be available worldwide from 11 October." I am sure the same is true for the other manufacturers.


----------



## Justinbaileyman

I'm in, my peeps. Will make the purchase as soon as Newegg or Amazon list them for preorder. Can't wait... all this excitement is killing me.
I have been eyeing up the stores like a hawk every day to snatch one up, but it's the same no-show day after day.
When's the expected release date for these beasts?


----------



## candy_van

IIRC ASUS said their cards (so I'm assuming all/most other brands) will be available for purchase 10/11

I got my sights set on the DCII Top (finally back to 2x slots)


----------



## Durvelle27

Sorry guys, I know I said I would post my own results for the R9 280X, but I'm having some terrible driver issues right now. I will have them up tomorrow.


----------



## candy_van

Um how did you get one already?


----------



## Thoth420

Is the price point on the reference R9 270X really $199? Where can I get a *reference* preorder? Preferably Sapphire or ASUS... MSI would also do.
I want to surprise a very poor friend with a really nice gift.


----------



## MicroAMD

Planning on upgrading my 5850 to a 280X. Will I gain 100% performance playing BF3 or CS:GO?


----------



## rdr09

Quote:


> Originally Posted by *MicroAMD*
> 
> Planning on upgrading my 5850 to a 280X. Will I gain 100% performance playing BF3 or CS:GO?


Maybe 120%, no less. You gotta OC the i5 if your sig is current.
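For anyone weighing a similar jump, the arithmetic is simple: a 100% gain means the frame rate doubles. A minimal sketch, with hypothetical FPS numbers (neither figure is a measured benchmark):

```python
# Percent performance gain going from an old card to a new one.
def percent_gain(old_fps: float, new_fps: float) -> float:
    """Return the uplift as a percentage (100.0 means twice as fast)."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical averages for illustration only, not benchmark results.
hd5850_fps = 40.0
r9_280x_fps = 88.0
print(f"{percent_gain(hd5850_fps, r9_280x_fps):.0f}% faster")  # 120% faster
```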


----------



## FeelKun

Quote:


> Originally Posted by *candy_van*
> 
> IIRC ASUS said their cards (so I'm assuming all/most other brands) will be available for purchase 10/11
> 
> I got my sights set on the DCII Top (finally back to 2x slots)


Link to preorder if you don't mind kind-sir.


----------



## Slomo4shO

Kitguru has a review up for the Sapphire R9 280X Toxic:


----------



## ironhide138

Part of me wants to get the ASUS 280X and Crossfire them later on... but I want to see more from the 290 first.


----------



## DizzlePro

OK, I'm stuck between three R9 280Xs:

Asus DCUII
MSI TF4
Sapphire Toxic

Prices don't matter.


----------



## saelz8

Since price doesn't matter, Toxic is getting the best Benches. If you want as much power as possible, I'd look at that one.


----------



## crun

What is the shortest R9 280X?
The SAPPHIRE 100363L Radeon R9 280X is about 26.1 cm and should fit my case. Any reviews of it? Any shorter or similar-sized cards?
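For fit questions like this, the usual check is the manufacturer's stated card length against the case's advertised GPU clearance, with a little margin for power connectors. A quick sketch; the clearance figures below are made-up examples, so consult your actual case spec sheet:

```python
# Check whether a graphics card fits a case's GPU clearance.
def card_fits(card_len_cm: float, clearance_cm: float, margin_cm: float = 0.5) -> bool:
    """True if the card plus a small cable/connector margin fits the clearance."""
    return card_len_cm + margin_cm <= clearance_cm

# The 26.1 cm figure is from the post; the clearances are hypothetical.
print(card_fits(26.1, 27.0))  # True  - short card in a compact case
print(card_fits(30.0, 27.0))  # False - a typical ~30 cm 280X would not fit
```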


----------



## candy_van

Quote:


> Originally Posted by *Resme*
> 
> Link to preorder if you don't mind kind-sir.


280Xs

Can't pre-order on Newegg (can auto notify), but they should be available tomorrow









Wondering how late I might stay up in the evening to order one if that's the case lol.
Quote:


> Originally Posted by *saelz8*
> 
> Since price doesn't matter, Toxic is getting the best Benches. If you want as much power as possible, I'd look at that one.


Actually the ASUS Matrix card would most likely take that crown.
Gotta deal with a triple-wide card though.
Quote:


> Originally Posted by *Thoth420*
> 
> Is the price point on the reference R9 270X really $199? Where can I get a *reference* preorder? Preferably Sapphire or ASUS... MSI would also do.
> I want to surprise a very poor friend with a really nice gift.


Yep should be $199.

Honestly though I'd consider getting your friend a deal on a 7950 now though before launch.
A lot of them are actually under $199 after rebate, and will be a better card overall (3GB VRAM, 384-bit, more shaders etc).


----------



## Roaches

Subbed....I might find myself a future owner of the 280X/270X in CFX or 290X/290 once I sell my XFX 7850 to my brother....








I'll likely wait for more aftermarket designs rather than make an early purchase....


----------



## Arizonian

The R9 280X are on sale now on Newegg for $299.99 for those interested.









*CLICK HERE* for complete 280X List.

EDIT: Seems the ASUS Matrix is the only one not up as of yet.


----------



## Theroty

Quote:


> Originally Posted by *Arizonian*
> 
> The R9 280X are on sale now on Newegg for $299.99 for those interested.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *CLICK HERE* for complete 280X List.
> 
> EDIT: Seems the ASUS Matrix is the only one not up as of yet.


The ASUS DCII TOP is not up yet either. I think I will send my two 7850s back for 1 of the tops(or even matrix if the price is right). I was going to order it tonight and then sighed when they were not listed.... LOL!


----------



## Jesse D

For those buying a 280x from newegg, use

Code:

SAVE25OCT11R

for 25 bucks off your purchase of 250 or more.

GOOD TODAY ONLY.


----------



## candy_van

Rrrrrgh, why is the DCII Top not on there yet :/

Quote:


> Originally Posted by *Jesse D*
> 
> For those buying a 280x from newegg, use
> 
> Code:
> 
> SAVE25OCT11R
> 
> for 25 bucks off your purchase of 250 or more.
> 
> GOOD TODAY ONLY.


Pretty sure that's not going to work if the card comes with free games.
People tried using codes like that before and the code is void with any kind of bundle (which the free games count as).


----------



## Jesse D

Quote:


> Originally Posted by *candy_van*
> 
> Rrrrrgh, why is the DCII Top not on there yet :/
> Pretty sure that's not going to work if the card comes with free games.
> People tried using codes like that before and the code is void with any kind of bundle (which the free games count as).


I think I stated quite clearly 280x, and AFAIK none of the 280x come with a free game and they are all over $250.

That and the fact I already added one to my cart and made sure it would show reduced prices before I bothered posting...


----------



## candy_van

They just had a code for the exact amount off the other day that wasn't specifically for these cards, so you stating to use it on a 280X didn't necessarily mean it was valid for one.

Regardless it's good to see that it does then


----------



## xSneak

Which one of these cards would have the better cooler on it? I want to do Crossfire, and my other card would be a Sapphire 7970 OC w/ Boost. I need something that is two slots only, to fit my sound card.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814161440&ignorebbr=1 or http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803&ignorebbr=1

When are we going to see a game bundle with these cards also? Do we need to wait till Oct 15 for it to be unveiled with the 290 series?


----------



## Jesse D

Probably the asus... Not only will you have cooler temps, but they also tend to be less noisy and have a better warranty IMHO.


----------



## Offender_Mullet

Jesse D, thanks for the coupon code! +rep









I don't overclock and I was growing impatient for the Asus models to show up, so I ordered the Gigabyte R9 280x GV-R928XOC-3GD instead.

Came out to $283 shipped, with their 3-day shipping option added.


----------



## ArbyWan

Quote:


> Originally Posted by *Jesse D*
> 
> For those buying a 280x from newegg, use
> 
> Code:
> 
> SAVE25OCT11R
> 
> for 25 bucks off your purchase of 250 or more.
> 
> GOOD TODAY ONLY.


Code no worky for me







only for US site


----------



## black7hought

I'm stuck between ordering the XFX DD 280X or the Sapphire Vapor-X 280X. Is the lifetime warranty and general quality of the XFX DD 280X worth it? I know it must be registered within 30 days. Sapphire seems to have a better reputation and the Vapor-X is a decent cooler. I'm just curious about the general opinion on OCN between XFX and Sapphire.


----------



## Theroty

Me too. I am looking at the sapphire as well but I really really want the asus model.

Edit: I want to get one and send my 2 7850s back.


----------



## FeelKun

both asus up for order

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121805


----------



## Chalupa

I'm debating on the Sapphire Toxic and the Asus Matrix.

Which one is better?


----------



## ArbyWan

Quote:


> Originally Posted by *Chalupa*
> 
> I'm debating on the Sapphire Toxic and the Asus Matrix.
> 
> Which one is better?


I really like the look of the Sapphire Toxic, but I'm not sure about that one or MSI's 280X.









ASUS's will probably be good too, but I wonder how much of the price is because of the ASUS name.


----------



## Chalupa

Quote:


> Originally Posted by *ArbyWan*
> 
> I really like the look of the Sapphire Toxic, but I'm not sure about that one or MSI's 280X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ASUS's will probably be good too, but I wonder how much of the price is because of the ASUS name.


The Asus Matrix is 349.99 on Newegg.

I really wanted to know if there's a way to compare the Matrix and Toxic performance wise or do I just go in blind? I've heard the Toxic is almost on the same level as a 780, which is very impressive for $300 less.


----------



## Theroty

Just pulled the trigger on the Matrix 280x. Time to send the 7850s back and get out of this madness of crossfire.


----------



## ironhide138

Quote:


> Originally Posted by *black7hought*
> 
> I'm stuck between ordering the XFX DD 280X or the Sapphire Vapor-X 280X. Is the lifetime warranty and general quality of the XFX DD 280X worth it? I know it must be registered within 30 days. Sapphire seems to have a better reputation and the Vapor-X is a decent cooler. I'm just curious about the general opinion on OCN between XFX and Sapphire.


I've heard pretty mediocre things about the DD coolers


----------



## Chalupa

I think I've made up my mind on the Sapphire Toxic as it's only a 2 slot card unlike the Matrix (I thought they were both triple slot cards).

Now I just need to wait for it to be available.


----------



## black7hought

Quote:


> Originally Posted by *ironhide138*
> 
> I've heard pretty mediocre things about the DD coolers


Thanks. What about the ASUS DCII 280X? The description says it has a 256-bit bus.


----------



## FeelKun

Quote:


> Originally Posted by *black7hought*
> 
> Thanks. What about the ASUS DCII 280X? The description says it has a 256-bit bus.


Pretty sure newegg has the description wrong.

https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5/#specifications

https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5/#overview

I ordered the DCII myself.


----------



## candy_van

DCII ordered as well.
Newegg's description is wrong, it's 384-bit.

EDIT:

Can also confirm that promo code works (+REP)
I got mine for $293 after bumping it to 3-day shipping (egg saver, shmeg saver).


----------



## ironhide138

Damn 290 reviews! Why can't you come out now so I can decide?


----------



## TriplePlay

Dang, wish I saw that there was a promo code before! Oh well, I also bought the Asus 280x DCII. Newegg processed my order super fast, I wasn't able to cancel and apply the promo code. I spoke with support, and they would only offer me a $15 gift card. Oh well, better than nothing. I need to buy a headset for my friend anyway, so I'll use that.

Still excited for my new card! This 6850 has been getting by, but not quite really cutting it for some games at higher graphical quality/framerate.


----------



## EXVAS3221

I'm looking forward to these cards!








Thanks


----------



## eldukay20

ordered an Asus TOP card myself


----------



## FeelKun

Quote:


> Originally Posted by *candy_van*
> 
> DCII ordered as well.
> Newegg's description is wrong, it's 384-bit.
> 
> EDIT:
> 
> Can also confirm that promo code works (+REP)
> I got mine for $293 after bumping it to 3-day shipping (egg saver, shmeg saver).


Agreed.







3-day shipping myself, $293 also.


----------



## Theroty

I have to pay sales tax. My Matrix 280X was $364 shipped after the promo.

Already have one of my 7850s boxed up for return.


----------



## codepink

What are people's thoughts on the ASUS DC II card vs the MSI Twin Frozr IV? I've always loved the build quality of MSI...


----------



## dimwit13

Well, I have a week or so before I order mine.
Looking for the best 280X with a black PCB (Sapphire's blue one won't cut it).
I don't really care about the cooler; it's going under water.
So I figure I will lurk around and see which ones will be best.

-dimwit-

I wish we would hear more on the 290!?!?!


----------



## MooseHead

So is the Sapphire Toxic just not available to order yet, or is it sold out?


----------



## motherpuncher

I believe I saw it available earlier today so I guess just sold out.


----------



## saelz8

Quote:


> Originally Posted by *codepink*
> 
> What are people's thoughts on the ASUS DC II card vs the MSI Twin Frozr IV? I've always loved the build quality of MSI...


The MSI card's cooler isn't that great according to TechPowerUp's review; the VRM was overheating and temps were high. Link.


----------



## MooseHead

Quote:


> Originally Posted by *motherpuncher*
> 
> I believe I saw it available earlier today so I guess just sold out.


Saddening... just missed it then. The question now is: do I get the Asus DCIIT or the Matrix Platinum, or wait for the Toxic to come back in stock?


----------



## Nevk

nice article


----------



## Siezureboy

I'm a little hesitant about getting the Toxic, as I've read that it's voltage locked and you can't surpass the limit with any sort of OC software.


----------



## Phixit

ASUS R9280X-DC2T-3GD5 Radeon R9 280X 3GB $319.99 CAD

or

MSI R9 280X GAMING 3G Radeon R9 280X 3GB $329.99 CAD

or

GIGABYTE GV-R928XOC-3GD Radeon R9 280X 3GB $329.99 CAD










I'll order mine next week.


----------



## xSneak

So if i buy a card now, and they come out with a new never settle bundle next week, am i going to qualify for it?


----------



## AgentHydra

Damn, just bought a 280X DC2 a couple of minutes ago. Wish I had seen the coupon earlier; couldn't cancel in time, lol.

Still a great deal, and coming from a 6870 it should be a pretty epic upgrade. I am excite.


----------



## raghu78

Quote:


> Originally Posted by *Phixit*
> 
> *ASUS R9280X-DC2T-3GD5 Radeon R9 280X 3GB $319.99 CAD*


go for the dcii







fantastic cooler. very low noise

http://www.techpowerup.com/forums/showthread.php?t=191904&page=4
http://hexus.net/tech/reviews/graphics/61013-asus-radeon-r9-280x-directcu-ii-top/?page=10
http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx/20


----------



## Chalupa

Quote:


> Originally Posted by *motherpuncher*
> 
> I believe I saw it available earlier today so I guess just sold out.


Unless Newegg's Auto-Notify feature doesn't work, it hasn't come out yet.


----------



## motherpuncher

Quote:


> Originally Posted by *Chalupa*
> 
> Unless Newegg's Auto-Notify feature doesn't work, it hasn't come out yet.


Nah, it's probably working. I was in and out a lot today, looking at the page a little infrequently, and easily could have mistaken it for something else. I was trying to keep up with when the ASUS cards came out, saw the Toxic price pop up, and probably just assumed it was available since the ASUS cards still said coming soon.


----------



## Shurtugal

I've ordered my 280X Matrix ($470 AUD). I won't get it until next weekend though, because I'm going away for a while. I'll post some pics when it arrives!


----------



## MooseHead

Quote:


> Originally Posted by *Chalupa*
> 
> Unless Newegg's Auto-Notify feature doesn't work, it hasn't come out yet.


I hope that's the case, and I hope it gets released soon so I can apply the Newegg coupon.


----------



## Scorpion87

If anyone gets an MSI R9 280X Gaming 3G, could he or she upload the BIOS? The card seems to be using an AMD reference PCB. Thanks in advance.


----------



## Genma

Sapphire R9 280x Toxic on the way! Ordered mine from amazon. It will be a wait though but well worth it!


----------



## ironhide138

I really wish the Toxic wasn't orange.

If there's one thing that bugs me more than coloured PCBs (blue, red, etc.), it's when companies put weird colours on things to stand out.

(I'm looking at you, Noctua. This isn't the 1970s.)


----------



## MooseHead

Quote:


> Originally Posted by *ironhide138*
> 
> I really wish the Toxic wasn't orange.
> 
> If there's one thing that bugs me more than coloured PCBs (blue, red, etc.), it's when companies put weird colours on things to stand out.
> 
> (I'm looking at you, Noctua. This isn't the 1970s.)


I wish Gigabyte used the matte black PCBs they use for their motherboards on their GPUs.


----------



## AgentHydra

Looks like the Asus 280X DC2 is already OOS on Newegg; glad I grabbed one.

For me it was between the MSI and the Asus. I eliminated HIS, Sapphire, PowerColor, and Gigabyte just because of the PCB color.

The XFX card looks great, but I've heard too many bad things about their shoddy custom PCBs.


----------



## ironhide138

Quote:


> Originally Posted by *MooseHead*
> 
> I wish Gigabyte used the matte black PCBs they use for their motherboards on their GPUs.


That three-fan cooler looks great, but the blue PCB... yuck.


----------



## Roaches

Has anyone got their hands on a Toxic 280X yet?....Newegg is still out of stock.....


----------



## MooseHead

Quote:


> Originally Posted by *Roaches*
> 
> Has anyone got their hands on a Toxic 280X yet?....Newegg is still out of stock.....


You can order from Amazon... it doesn't ship for a while though, so I'm assuming that's when Newegg will get theirs?


----------



## Roaches

Quote:


> Originally Posted by *MooseHead*
> 
> You can order from Amazon... it doesn't ship for a while though, so I'm assuming that's when Newegg will get theirs?


>ships within 2 to 4 weeks

No thanks, even though I have an Amazon account. Though it's $10 cheaper than Newegg's offering....


----------



## Arizonian

Congrats on getting the [Official] tag on the club from Staryoshi, our Graphics Cards Editor.









To the members - Enjoy your cards when you get'em guys. Subbed to see.


----------



## lidunchaa

a different thread with the 290 series since it is the only card in this thread that actually has TrueAudio.


----------



## Durvelle27

There is no R9 290 in this thread. This thread is specifically for the R9 280X, R9 280 & R9 270X


----------



## Theroty

AH! I'm so impatient! I was hoping it would be here on Monday, but apparently the Memphis warehouse is not stocking them yet. My card is shipping out of CA and is estimated to be here Wednesday.


----------



## Durvelle27

Quote:


> Originally Posted by *Theroty*
> 
> AH! I'm so impatient! I was hoping it would be here on Monday, but apparently the Memphis warehouse is not stocking them yet. My card is shipping out of CA and is estimated to be here Wednesday.


Nice to see another Tennessee resident on here


----------



## Theroty

Quote:


> Originally Posted by *Durvelle27*
> 
> Nice to see another Tennessee resident on here


That is the truth! I grew up in Cookeville and now I live in Sparta.


----------



## Durvelle27

Quote:


> Originally Posted by *Theroty*
> 
> That is the truth! I grew up in Cookeville and now I live in Sparta.


Memphis guy here


----------



## D3TH.GRUNT

I bought an Asus R9 280X DCII the other day; it should hopefully be here Monday or Tuesday. I will post pics when it arrives.


----------



## Jesse D

For any that missed the 25 off 250 code the other day there's a new one...

Code:

MBLOCT12

It's good up to midnight tonight and will work on the 280X again, or a 270/260 if you add more to your cart.

I really wish the 290X was out, as I'm having a hell of a time deciding whether to pick one up or shoot for a 280X (if the 290X comes in at over $600, as MSI EU prices have shown, then I won't be touching it). With the ASUS 280X already being OOS, though, I'm willing to wait.


----------



## Blinky7

Is there any news on an R9 280 non-X, basically the 7950 equivalent?

I sold my GTX 760 about two weeks ago and I am growing impatient being on the Intel HD 3000 for so long.

I want to get a 7950 ideally, but the prices are not right. I mean, the prices of the 7950 and 7970 have not gone down enough considering the 280X can be had for as low as 250 euros in Germany...

If I take the plunge and buy a 280X, is there any news on which cards are voltage unlocked?


----------



## candy_van

Quote:


> Originally Posted by *Blinky7*
> 
> Is there any news on a R9-280 non-X ? Basically the 7950 equivalent?
> 
> I sold my GTX760 about 2 weeks ago and I am growing impatient being with the intel HD3000 for so long
> 
> 
> 
> 
> 
> 
> 
> 
> I want to get a 7950 ideally but the prices are not right. I mean, the prices of 7950 and 7970 have not gone down enough considering the 280X can be had for as low as 250 euros in Germany...
> 
> If I take the plunge and buy a 280X , is there any news on which cards are voltage unlocked?


The 280 is probably going to be about $250 I'd guess since the 270X and 280X are $199 and $299 respectively (US prices at least, not 100% on EU pricing, but sure it's fairly close).
IMO if the price difference is that small and you can't get a good deal on a 7950 over there (there are still some stateside) may as well go for the gold.

Check out the reviews posted in the thread for voltage unlocked ones, the ASUS definitely is and I'd assume MSI too, but would have to double check on some other ones.
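One rough way to frame that $250 guess: assume the non-X SKU lands at the midpoint of the two confirmed launch prices. A trivial sketch (the midpoint rule is just a guess, not anything AMD has said):

```python
# Midpoint estimate for the unannounced R9 280's price.
r9_270x_msrp = 199  # confirmed US launch price
r9_280x_msrp = 299  # confirmed US launch price

r9_280_estimate = (r9_270x_msrp + r9_280x_msrp) / 2
print(f"R9 280 estimate: ${r9_280_estimate:.0f}")  # R9 280 estimate: $249
```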


----------



## Blinky7

The thing is, will there be a 280? I haven't seen any official confirmation, and it's already weird that it wasn't announced together with the 280X...
And I guess eventually there will probably be one, to fill the $100 gap between the 270X and 280X, but if that's going to be at Xmas, it's too late for me to wait...


----------



## nastytime

Long time lurker on this site...decided to make a post.

I just picked up an MSI Twin Frozr R9 280X from Fry's yesterday for $314.00. I finally upgraded from my Crossfire 5850s and this card is awesome. I recently purchased a Dell U3011, so gaming at 2560x1600 was putting the hurt on my older cards. I plan to add one more 280X this week. Running the latest driver release (13.11 beta 1), the BF4 beta on high settings plays pretty freaking well for one card. Can't wait to see what official drivers might improve.

http://s239.photobucket.com/user/na...r Stuff/2013-10-13152018_zpsc57d804a.jpg.html
http://s239.photobucket.com/user/na...r Stuff/2013-10-12183356_zps9b22dad0.jpg.html

My old cards
http://s239.photobucket.com/user/na...r Stuff/2013-10-12121012_zps10640959.jpg.html

Sorry for the crappy pic was in a hurry to play.
http://s239.photobucket.com/user/na...r Stuff/2013-10-13154758_zps142645a0.jpg.html
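For context on why 2560x1600 was hurting the older cards: it pushes nearly twice the pixels of 1080p, so per-frame GPU work scales accordingly. A quick check of the raw pixel counts:

```python
# Compare pixel counts between the Dell U3011's native resolution and 1080p.
u3011_pixels = 2560 * 1600
full_hd_pixels = 1920 * 1080

print(u3011_pixels)                             # 4096000
print(round(u3011_pixels / full_hd_pixels, 2))  # 1.98
```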


----------



## Arizonian

Quote:


> Originally Posted by *nastytime*
> 
> Long time lurker on this site...decided to make a post.
> 
> I just picked up an MSI Twin Frozr R9 280X from Fry's yesterday for $314.00. I finally upgraded from my Crossfire 5850s and this card is awesome. I recently purchased a Dell U3011, so gaming at 2560x1600 was putting the hurt on my older cards. I plan to add one more 280X this week. Running the latest driver release (13.11 beta 1), the BF4 beta on high settings plays pretty freaking well for one card. Can't wait to see what official drivers might improve.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s239.photobucket.com/user/na...r Stuff/2013-10-13152018_zpsc57d804a.jpg.html
> http://s239.photobucket.com/user/na...r Stuff/2013-10-12183356_zps9b22dad0.jpg.html
> 
> My old cards
> http://s239.photobucket.com/user/na...r Stuff/2013-10-12121012_zps10640959.jpg.html
> 
> Sorry for the crappy pic was in a hurry to play.
> http://s239.photobucket.com/user/na...r Stuff/2013-10-13151616_zpsb1ef120e.jpg.html


Welcome to OCN, and nice first post with a pic of the new MSI Twin Frozr 280X.









Post a pic of your cards with your OCN name in the shot, etc., and Durvelle27 will add you to the OP members list. It'd also be nice if you posted back with what kind of FPS you're getting on your 1600p monitor with a single card playing the BF4 beta on high settings, if you happen to be monitoring that.


----------



## Slomo4shO

Quote:


> Originally Posted by *nastytime*
> 
> Long time lurker on this site...decided to make a post.
> 
> I just picked up an MSI Twin Frozr R9 280X from Fry's yesterday for $314.00. I finally upgraded from my crossfired 5850s and this card is awesome. I recently purchased a Dell U3011, so gaming at 2560x1600 was putting the hurt on my older cards. I plan to add one more 280X this week. Running the latest driver release (13.11 beta 1), the BF4 beta on high settings seems to play pretty freaking well for one card. Can't wait to see what official drivers might improve.


You may have wanted to wait until Tuesday to see how the 290 and 290X does if you were planning on spending $600-650 on your GPU upgrade (280X crossfire).


----------



## Cid

Rumor has it the 290(X) launch and review embargo has been pushed back a bit. Anyway, those are just rumors.

So, is it just the Benelux that doesn't get any 280X stock, or are there other people who also just keep seeing empty digital shelves? I want my DCII Top, dagnabbit.


----------



## Durvelle27

Quote:


> Originally Posted by *nastytime*
> 
> Long time lurker on this site...decided to make a post.
> 
> I just picked up an MSI Twin Frozr R9 280X from Fry's yesterday for $314.00. I finally upgraded from my crossfired 5850s and this card is awesome. I recently purchased a Dell U3011, so gaming at 2560x1600 was putting the hurt on my older cards. I plan to add one more 280X this week. Running the latest driver release (13.11 beta 1), the BF4 beta on high settings seems to play pretty freaking well for one card. Can't wait to see what official drivers might improve.
> 
> http://s239.photobucket.com/user/na...r Stuff/2013-10-13152018_zpsc57d804a.jpg.html
> http://s239.photobucket.com/user/na...r Stuff/2013-10-12183356_zps9b22dad0.jpg.html
> 
> My old cards
> http://s239.photobucket.com/user/na...r Stuff/2013-10-12121012_zps10640959.jpg.html
> 
> Sorry for the crappy pic was in a hurry to play.
> http://s239.photobucket.com/user/na...r Stuff/2013-10-13154758_zps142645a0.jpg.html


Upload a copy of the BIOs


----------



## nastytime

Here ya go... yeah, I was gonna wait for the 290X, but the date has been pushed back due to shipping reasons, I believe.

Running the BF4 beta (64 players) on high settings at 2560x1600, I'm averaging around 46 FPS and higher depending on the area, etc. If I knock that down to medium it jumps to 58+. It's still very playable on either setting, and this is with beta drivers.

http://s239.photobucket.com/user/na...r Stuff/2013-10-13155511_zpsc0d2f64c.jpg.html

http://s239.photobucket.com/user/nastytyme/media/Computer Stuff/Capture_zps288f9d8b.jpg.html

http://s239.photobucket.com/user/na...r Stuff/2013-10-13165351_zpsc61944d8.jpg.html

http://s239.photobucket.com/user/na...r Stuff/2013-10-13164903_zpsbb3dc27f.jpg.html


----------



## nastytime

https://db.tt/DnMQB2a0

Here's a copy of the stock MSI Twin Frozr R9 280X BIOS.

I also uploaded it to techpowerup's database as well. Should be up there shortly after they verify it.


----------



## Slomo4shO

Quote:


> Originally Posted by *nastytime*
> 
> http://s239.photobucket.com/user/nastytyme/media/Computer Stuff/Capture_zps288f9d8b.jpg.html


Odd, I thought the stock boost speed of the MSI 280X Gaming was 1050MHz...


----------



## nastytime

Quote:


> Originally Posted by *Slomo4shO*
> 
> Odd, I thought the stock boost speed of the MSI 280X Gaming was 1050MHz...


Well, I found out why... you have to enable the MSI Gaming App to get the "boost":

http://s239.photobucket.com/user/nastytyme/media/Computer Stuff/Capture_zpsa0c21370.jpg.html


----------



## Shurtugal

Well, if it works that well at 2560x1600, it should work like a charm on my 1600x900








Still need to upgrade my monitor, but that can wait; getting sick of playing on my GT 430


----------



## Chalupa

I wanted to buy a Sapphire Toxic but with Newegg's $25 off promo code ending tonight I decided to get a Matrix because it was in stock. My only concern is not having the ability to run crossfire in the future.


----------



## FeelKun

Quote:


> Originally Posted by *nastytime*
> 
> here ya go... yeah i was gonna wait for the 290x but the date has been pushed back due to shipping reasons I believe.
> 
> Running BF4 beta / 64players on high settings at 2560x1600, I'm getting around on average 46fps and higher depending on areas etc... If i knock that down to medium it jumps to 58+. Its still very playable on either settings and this is with beta drivers.


Great news... Gonna be running 1440p myself. Planning on getting a second one later after i see how this one goes.


----------



## Siezureboy

Gnugghhhh, you guys make me want to get one of these so bad. At least the ASUS and Sapphire are outta stock, so I'll have an excuse to wait.


----------



## Theroty

AH! I can't stand how UPS handles packages sometimes. My video card has gone from CA->TX->GA-> and now it's in SC! It could have made it here twice already! LOL!


----------



## nastytime

Well, I pulled the trigger and purchased a second Twin Frozr 280X from the Fry's here in Houston on my lunch break. I will post screenshots and FPS benchmarks when I get home this evening.


----------



## FeelKun

Quote:


> Originally Posted by *nastytime*
> 
> Well I pulled the trigger and purchased a second Twin Frozr 280x from the frys here in Houston on my lunch break. I will post screen shots and fps benchmarks when I get home this evening.


great dude!!! Make sure to post your resolution


----------



## Theroty

Quote:


> Originally Posted by *Chalupa*
> 
> I wanted to buy a Sapphire Toxic but with Newegg's $25 off promo code ending tonight I decided to get a Matrix because it was in stock. My only concern is not having the ability to run crossfire in the future.


Judging by the appearance of your mobo in the pics, I'm not sure crossfire will be possible with your slot spacing.


----------



## Wappo

Quote:


> Originally Posted by *nastytime*
> 
> Well I pulled the trigger and purchased a second Twin Frozr 280x from the frys here in Houston on my lunch break. I will post screen shots and fps benchmarks when I get home this evening.


I was wondering if you tested the MSI for heat during load yet. The techpowerup review for this card said it was running really hot and that the VRM circuitry overheats.

Link: http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/31.html

Edit: Just saw that this was mentioned earlier, but it would be nice to hear from someone who has it.


----------



## nastytime

Quote:


> Originally Posted by *Wappo*
> 
> I was wondering if you tested the MSI for heat during load yet. The techpowerup review for this card said it was running really hot and that the VRM circuitry overheats.
> 
> Link: http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/31.html
> 
> Edit: Just saw that this was mentioned earlier, but it would be nice to hear from someone who has it.


I will post some screenshots of the temps when I get home this evening. I know last night the GPU temp was around 75ish while running FurMark under load. Playing the BF4 beta it was lower, if I recall correctly; I did not check on the VRMs.


----------



## ducknukem86

Someone on the Cooling forum mentioned the Gigabyte 280X being voltage locked.


----------



## nastytime

Okay, so I have run into a snag with the beta drivers and crossfire. I just got home and installed the second card. Catalyst Control Center does not install correctly. I have fully cleaned the drivers off this machine and installed the beta again. GPU-Z 0.7.3 shows ATI CrossFire enabled, which it is, from testing the Heaven benchmark.

Tested BF4 with ultra settings at 2560x1600 and I was averaging right around 60 FPS and above. Now with that said... I believe a lot will hinge on driver updates for these cards in CF.

Here's the benchmark at 2560x1600:
Tessellation Normal
http://s239.photobucket.com/user/nastytyme/media/Computer Stuff/heavenbench_zpsee1faae1.jpg.html
Tessellation Off
http://s239.photobucket.com/user/nastytyme/media/Computer Stuff/tessloff_zpsfdfbed33.jpg.html

Here is also a temp shot of just a single card running under load, the VRMs are getting pretty hot:
http://s239.photobucket.com/user/na...burnintestonecardoldbios_zpsdd5d81e8.jpg.html

And just for fun both the cards:
http://s239.photobucket.com/user/na...r Stuff/2013-10-14172358_zps7278b605.jpg.html

***On another note, GPU-Z shows my second card stuck at 1020/1500 at idle, while the first card is back at 300/150.
http://s239.photobucket.com/user/nastytyme/media/Computer Stuff/2ndcard_zps35877309.jpg.html

I am going to test the drivers that came on the disc and see what happens. I am not sure about all these weird issues at the moment.


----------



## ironhide138

For us poor bastards, would you mind giving us benchmarks for a single card at 1080p? Mostly wanna know how BF4 runs.


----------



## navit

http://www.asus.com/Graphics_Cards/R9280XDC2T3GD5/
I am looking at this card and can't find the boost clock; even the ASUS site just lists it at 1070 with no boost.
Does this sound right? I would love it to be... I want to crossfire it with my Lightning.


----------



## VegetarianEater

Unfortunate that the MSI 280X has such high temps, because it's the best-looking card IMO. Oh well, if I get one of these it'll be ASUS then...


----------



## ducknukem86

Quote:


> Originally Posted by *VegetarianEater*
> 
> unfortunate that the msi 280x has such high temps, because it's the best looking card IMO. oh well if i get one of these it'll be asus then...


As far as I know, it was only the TPU review that showed this situation. From other sites, I haven't seen any complaints about its cooling. I'm still trying to decide if the MSI would be a good purchase, as it would match my MSI G45 motherboard.

ASUS DCU II TOP seems to be everyone's favorite


----------



## nastytime

Well, no matter what drivers I try, 13.10 or 13.11, I'm still having issues with crossfire and the second card showing max clocks at idle. Temps are still okay, staying around 50ish on the second card, but driver support needs to be updated soon.

This is during a round of the BF4 beta, ultra settings, 2560x1600:

Second GPU is on the left,
first one on the right.

This is how the temps look even with the confused/jacked-up 13.11 beta drivers.
http://s239.photobucket.com/user/nastytyme/media/2013-10-14210517_zps0d8f03c0.jpg.html


----------



## VegetarianEater

Quote:


> Originally Posted by *ducknukem86*
> 
> As far as i know it was only the TPU review that showed this situation. From other sites, i haven't seen any complaints about its cooling. I'm still trying to decide if the MSI would be a good purchase, as it would match my MSI G45 motherboard.
> 
> ASUS DCU II TOP seems to be everyone's favorite


Yeah, but the guy in this thread showed his temps; they weren't that great... and seeing as I'm putting my cards in a case with little airflow, I need cooler cards... the DC2T just seems to be a better choice currently. I had a Twin Frozr 7850, it was great, temps never went above 60°C while overclocked to 1050MHz, and all the reviews for the GTX 700 series Twin Frozrs had them being the coolest, but this 280X is at best average, at worst the worst-cooled 280X...

And yeah, I was also thinking about getting MSI graphics cards and a G45 or GD65 Gaming mobo. A few months ago I was set on SLI 760s; now crossfire 280Xs are looking like a much better deal, unless they drop the 4GB 760s to $250-260. Or maybe I'll just get a 290X *shrugs*


----------



## AgentHydra

Loving this card, blows my 6870 out of the water. Max temp with the DC2 after 3DMark/11/Vantage and some Far Cry 3 was 74°C with the fan on auto (~35% or thereabouts, still very quiet).


----------



## nastytime

Sorry, guys, for all the updates, but I am returning both cards in the morning. My first card just blew a cap while I was playing BF4. It was loud and I immediately shut the PC down to figure out what had happened. Took the first card out and heard something rolling around... yep, found it.

Used to look like this:
http://s239.photobucket.com/user/na...r Stuff/2013-10-14213742_zps6fa0fc7d.jpg.html

Then like this:
http://s239.photobucket.com/user/na...r Stuff/2013-10-14213306_zps868fa342.jpg.html
and
http://s239.photobucket.com/user/na...r Stuff/2013-10-14213231_zpsb42fd366.jpg.html

Well, looks like I will just wait for the 290X. If anyone sees a reduced-price sticker on this model at Fry's in Houston, beware.


----------



## Wappo

Quote:


> Originally Posted by *ducknukem86*
> 
> As far as i know it was only the TPU review that showed this situation. From other sites, i haven't seen any complaints about its cooling. I'm still trying to decide if the MSI would be a good purchase, as it would match my MSI G45 motherboard.
> 
> ASUS DCU II TOP seems to be everyone's favorite


That same Techpowerup review also had an update at the bottom of his review saying that MSI sent him a bios update that fixed the issues (somewhat). Not sure if there's an official link for the bios update though.


----------



## ducknukem86

Quote:


> Originally Posted by *nastytime*
> 
> Sorry guys for all the updates. But I am returning both cards in the morning. I just had my first card blow a cap while i was playing BF4. It was loud and I immediately shut the pc to figure out what had happened. Well took the first card out and I heard something rolling around.....yep found it.
> 
> Used to look like this:
> http://s239.photobucket.com/user/na...r Stuff/2013-10-14213742_zps6fa0fc7d.jpg.html
> 
> Then like this:
> http://s239.photobucket.com/user/na...r Stuff/2013-10-14213306_zps868fa342.jpg.html
> and
> http://s239.photobucket.com/user/na...r Stuff/2013-10-14213231_zpsb42fd366.jpg.html
> 
> Well looks like I will just wait for the 290x. If anyone sees a reduced price sticker on this model at frys in Houston beware.


Wow, that's a shame!


----------



## Theroty

GAH! My Matrix has sat in SC since 4 this morning! AAAHHH!!! LOL!

Sorry about your luck with the cap! That is a real shame.

And earlier in the other pic... 106°C on the VRM... that is crazy!


----------



## FeelKun

Quote:


> Originally Posted by *Theroty*
> 
> GAH! My Matrix has sat in SC since 4 this morning! AAAHHH!!! LOL!
> 
> Sorry about your luck with the CAP! That is a real shame.
> 
> And earlier in the other pic... 106 C on the VRM.. that is crazy!


10/14/2013 17:04:00 ARRIVAL SCAN - LOUISVILLE, KY, US
10/14/2013 16:37:00 DEPARTURE SCAN - LOUISVILLE, KY, US
10/14/2013 11:25:00 ARRIVAL SCAN - LOUISVILLE, KY, US
10/12/2013 22:17:00 DEPARTURE SCAN - ONTARIO, CA, US
10/11/2013 21:54:00 ARRIVAL SCAN - ONTARIO, CA, US
10/11/2013 21:21:00 DEPARTURE SCAN - BALDWIN PARK, CA, US
10/11/2013 14:05:00 ORIGIN SCAN - BALDWIN PARK, CA, US

I live in Virginia... Says expected delivery the 16th.


----------



## MooseHead

Toxic R9 280X, or fork out the extra $$ for the new but reference 290/290X?


----------



## ironhide138

Quote:


> Originally Posted by *MooseHead*
> 
> Toxic r9 280x or fork out the extra $$ for the new but reference 290/290x


Wait for a non-reference 290. Honestly, I want to jump on one too... but I've learned from the past not to be an early adopter.


----------



## Crowe98

Hey guys, I've made a thread already, but I'm debating on whether I should get the ASUS R9 280X DCUII


Spoiler: Link:



http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25324


or the MSI R9 280x Gaming.


Spoiler: Link:



http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25332



I know these cards are both red, and my build is mainly blue/black themed


Spoiler: Photos of rig:







but the red colouring will be very hard to see once the card is installed. So I need your opinions: which would be better?


----------



## FeelKun

Quote:


> Originally Posted by *Crowe98*
> 
> Hey guys, i've made a thread already but im debating on whether i should get the ASUS R9 280x DCUII
> 
> 
> Spoiler: Link:
> 
> 
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25324
> 
> 
> or the MSI R9 280x Gaming.
> 
> 
> Spoiler: Link:
> 
> 
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25332
> 
> 
> 
> I know these cards are both red, and my build is mainly blue/black themed
> 
> 
> Spoiler: Photos of rig:
> 
> 
> 
> 
> 
> 
> 
> but, the red colourings will be very hard to see when the card is installed. So, i need your opinions, which would be better?


Once I get my Asus dc2 I'll post pics here and give a personal review.


----------



## Crowe98

Quote:


> Originally Posted by *Resme*
> 
> Once I get my Asus dc2 I'll post pics here and give a personal review.


What date are you getting your 280?


----------



## ducknukem86

This sucks; while I was waiting for the money to buy the 280X, Newegg has sold out of almost all of them! MSI, ASUS, ASUS Matrix, Sapphire Vapor-X!


----------



## candy_van

They'll have plenty more in stock soon enough; it's not like these are "new" cards with limited production issues or anything.
By the time you've got your scratch together, someone will have the card you want


----------



## ducknukem86

Quote:


> Originally Posted by *candy_van*
> 
> They'll have plenty more in stock soon enough, it's not like these are "new" cards with limited production issues or anything.
> By the time you've got your scratch together someone will have the card you want


I can even dream of the 290 if the price is right


----------



## candy_van

Yeah, I don't think it is though (at least not for me); word on the street is 290X = $699, 290 = $599.


----------



## eAT5

280X compared to two 7970s? Almost the same price...


----------



## ironhide138

Quote:


> Originally Posted by *eAT5*
> 
> 280x compared to 2 7970's? almost same price...


The 280X and 7970 are the same price, more or less.


----------



## ducknukem86

Quote:


> Originally Posted by *candy_van*
> 
> Yea don't think it is though (at least not for me), word on the street is 290X = $699, 290 = $599


$599 would be too much; I was thinking more like $450 or something.


----------



## eAT5

Quote:


> Originally Posted by *ironhide138*
> 
> The 280x and 7970 are the same price more or less.


My two 7970s were $618.


----------



## ironhide138

Quote:


> Originally Posted by *eAT5*
> 
> my 2 7970's were 618$


Ah, you said 280X; you meant 290X.


----------



## eAT5

Quote:


> Originally Posted by *ironhide138*
> 
> ah, you said 280x, you meant 290x


I'm not familiar with the new cards. Won't be getting one for at least 2 years; my 7970s are only 2 months old...


----------



## ironhide138

Quote:


> Originally Posted by *eAT5*
> 
> im not fimiliar with the new Cards. wont be getting one for at least 2 years, my 7970's are only 2 month old...


The 280X is just a rebranded 7970 GHz Edition. The 290X is the new gem, a $600ish card.


----------



## Offender_Mullet

Finally arrived.


----------



## ducknukem86

Please tell me if it's voltage locked


----------



## Offender_Mullet

Quote:


> Originally Posted by *ducknukem86*
> 
> Please tell me if it's voltage locked


I don't oc my hardware, otherwise I'd check for ya.


----------



## ironhide138

Dumb question, but if the 280X is a rebranded 7970, will 7970 backplates fit the card? Like, would a 7970 DCII backplate fit a 280X DCII?


----------



## candy_van

That's actually a pretty good question.
I can't throw up pics right now, but TPU usually has nice PCB layout shots for hard mods you could use to compare them.


----------



## Phixit

I just ordered a Sapphire Vapor-X R9 280X; it was the limit of my budget (I still need to pay for my 4670K upgrade).


----------



## ironhide138

Quote:


> Originally Posted by *candy_van*
> 
> That's actually a pretty good question.
> I can't throw up picks right now, but TPU usually had nice PCB layout shots for hard mods you could use to compare them.


I guess only time will tell. I really hate non-backplated GPUs







Even worse is when they use coloured PCBs...... blue, red.... yuck


----------



## Skylark71

Quote:


> Originally Posted by *ducknukem86*
> 
> Please tell me if it's voltage locked


I have the Gigabyte R9 280X OC, and sadly I can tell you that it's voltage locked...


----------



## candy_van

Quote:


> Originally Posted by *ironhide138*
> 
> I guess only time will tell. I really hate non backplated gpus
> 
> 
> 
> 
> 
> 
> 
> even worse is when they use coloured PCBs...... blue, red.... yuck


Well, good news there: the DCII has a black PCB








Also, you could probably get one custom made if Dwood still does them here.


----------



## MLJS54

I'm about to pull the trigger on an ASUS DCII 280X. I game at 1920x1080, and I think it makes the most sense relative to my GTX 570 versus what is available out there right now around the $300 mark.

Any final words?


----------



## candy_van

Game on?


----------



## MLJS54

Quote:


> Originally Posted by *candy_van*
> 
> Game on?












Should be here tomorrow. As I have not used an AMD card in a very, very long time, could someone kindly point me to the optimal drivers to use for the 280X? Should I just download the latest from AMD's website?

Thanks


----------



## ducknukem86

Quote:


> Originally Posted by *MLJS54*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Should be here tomorrow. As I have not used a AMD card in a very, very long time, could someone kindly point me to the optimal drivers to use for the 280X - should I just DL the latest from AMD website or?
> 
> Thanks


Where did you get it from? the Vapor X is in stock now on newegg


----------



## MLJS54

Quote:


> Originally Posted by *ducknukem86*
> 
> Where did you get it from? the Vapor X is in stock now on newegg


Newegg


----------



## candy_van

Should have mine tomorrow too









As for drivers, yeah, just get them directly from AMD's site.
Right now the 13.11 betas are the first they list that officially support the 280X (though I'd have figured the WHQL ones would've worked).


----------



## D3TH.GRUNT

It came in today!


----------



## Arizonian

Congrats.







That DCUII TOP looks damn nice.


----------



## navit

Quote:


> Originally Posted by *D3TH.GRUNT*
> 
> It came in today!


Can you tell me if the 1070 clock is the default or a boost clock?


----------



## D3TH.GRUNT

Quote:


> Originally Posted by *navit*
> 
> Can you tell me if the 1070 clock is the default or a boost clock?


I can confirm that it is the default clock speed


----------



## ducknukem86

Thanks for the pictures! The card looks awesome! Newegg already ran out of the ASUS DCU TOP again, in like 2 hours! Either they are selling them like crazy or they don't get enough product.


----------



## candy_van

I love how it's the standard clock too; boost clocks are for the birds.


----------



## navit

Quote:


> Originally Posted by *ducknukem86*
> 
> Thanks for the pictures! the card looks awesome! Newegg already ran out of Asus DCU TOP again, in like 2 hours! either they are selling them like crazy or they don't get enough product


They are still there; I just got one. The page said out of stock, but when I clicked the link it was add to cart.









Now to see if they really play nice with a 7970. Although since they are really just 7970s with a new name, I can't see why not.


----------



## candy_van

Should be fine; it's already been confirmed for CF by a few sources.


----------



## navit

Quote:


> Originally Posted by *candy_van*
> 
> Should be fine, already been confirmed for CF from a few sources


Yes I have read that as well


----------



## Durvelle27

For all members who own an R9 280X or R9 270X: please resubmit the required information so you can be added to the list. Thanks.


----------



## cyph3rz

Hey everybody, joining the club! I bought a Gigabyte Radeon R9 280X 3GB (GV-R928XOC-3GD) from Newegg and got it just today. I haven't tried any games with it yet. I was looking at the MSI Twin Frozr 280X (OUT OF STOCK ON NEWEGG!!), but I'm a fan of Gigabyte so I bought this instead. The design is identical to the Gigabyte 7970.


----------



## This calling

Ugh, that blue PCB =/ Seriously, the exact equivalent on the green team, the Gigabyte GTX 770 WindForce 3X, looks so much better. Why do the board partners always cheap out on the AMD models?


----------



## rdr09

Quote:


> Originally Posted by *This calling*
> 
> Ugh that blue pcb =/ Seriously, the exact same equivalent on the green team the 770 gigabyte windforce x3 looks so much better. Why do the custom parties always cheap out on the amd models


$100 - $150 cheaper. lol


----------



## This calling

Indeed it is $100 cheaper, with a lack of PhysX, higher temperatures/much higher power draw, and a driver team about a tenth of the size of NVIDIA's. I still sent the 770 back to get the 280X and save the $100, but I am kind of regretting it, as all the stuff I listed is worth the $100 premium they ask for. Mantle better be worth it.

No wait..........

http://www.amazon.com/Gigabyte-GDDR5-2GB-WINDFORCE-GV-N760OC-2GD-REV2-0/dp/B00DGM8B6O/ref=sr_1_2?ie=UTF8&qid=1381889283&sr=8-2&keywords=gtx+760

The GTX 760 at $250: still a better-looking black PCB that is better constructed than that one. Why do the third parties cheap out on making AMD cards? =s

So there goes your "$100 cheaper," and that is why I made the "they cheaped out on the looks" argument.


----------



## rdr09

Quote:


> Originally Posted by *This calling*
> 
> Indeed it is $100 cheaper, with lack of physx, higher temperatures/much higher power draw and a driver team about the 10th of the size of nvidia's. I still sent the 770 back to get the 280x and save the $100, but I am kind of regretting it as all the stuff I said is worth the $100 premium they ask for. Mantle better be worth it.
> 
> No wait ..........
> 
> http://www.amazon.com/Gigabyte-GDDR5-2GB-WINDFORCE-GV-N760OC-2GD-REV2-0/dp/B00DGM8B6O/ref=sr_1_2?ie=UTF8&qid=1381889283&sr=8-2&keywords=gtx+760
> 
> the gtx 760 at $250, still a better looking black pcb that is better constructed than that one. Why do the third parties cheap out on making amd cards? =s
> 
> So there goes you $100 cheaper, and that is why they cheaped out on the looks argument.


$200 tops for a 2GB card.

And here is your driver team...

http://www.overclock.net/t/1399104/mc-nvidia-320-18-whql-display-driver-is-damaging-gpus

Go back to the NVIDIA forum.


----------



## This calling

How does that change what I said about a $300 card looking really cheap and having an awful PCB, while the NVIDIA card with the same model/cooler at $50 cheaper is better built and has the better PCB?

Or are you basically trying misdirection?

P.S. At least the ASUS DirectCU II TOP 280X looks amazing; thanks for the picture, that will be the card I go for.


----------



## rdr09

Quote:


> Originally Posted by *This calling*
> 
> How does that change what you said about the price of a $300 card looking really cheap and having an awful pcb while the nvidia card of the same model/cooler at $50 cheaper is better built and has the better pcb?
> 
> Or are you basically trying misdirection.
> 
> P.S atleast the asus direct cuII top 280x looks amazing, ty for the picture will be the card I go for.


Maybe. But your claim of having a bigger driver team does not translate to better drivers.

"...Anything 306.97 works well for me. It's just that for many, 306.97 is known as the official last good driver for BF3.

I do play some DayZ and Deus Ex and get lower FPS. If I decide to give BF3 a two-day break, I just uninstall 306.97 and install 314.07. I have probably installed and reinstalled at least 20 times this week alone lol."

That's from an owner.

So, you've got a better-looking card inside the case but can't play games. lol


----------



## Clukos

Quote:


> Originally Posted by *This calling*
> 
> How does that change what you said about the price of a $300 card looking really cheap and having an awful pcb while the nvidia card of the same model/cooler at $50 cheaper is better built and has the better pcb?
> 
> Or are you basically trying misdirection.
> 
> P.S atleast the asus direct cuII top 280x looks amazing, ty for the picture will be the card I go for.


If you don't like the Gigabyte get the Asus DC2, jeez.


----------



## This calling

Being an MMO hopper who has played a lot of MMOs in early alphas/betas throughout the years, I can say NVIDIA is a much better experience. They get beta drivers out the day before a beta goes live, sometimes even on the same day, while sometimes AMD can't even get a driver out until the full game ships; then you see about 100 forum posts popping up with AMD-related issues in the MMO beta forums.

I had a Sapphire 6970 for over a year, and it wasn't a pleasant experience in betas.


----------



## This calling

Quote:


> Originally Posted by *Clukos*
> 
> If you don't like the Gigabyte get the Asus DC2, jeez.


I am not saying me personally; I never wanted to get the Gigabyte version anyway. It's just: why cheap out like that, when they are making the exact same WindForce 3X models for the other camp, but better?


----------



## Durvelle27

Lets stay on track guys and not derail this thread


----------



## This calling

Back on track then: I personally do not like the look of the Gigabyte one at all; it's horrid. I will be ordering the ASUS TOP one, though, unless they get the Toxic in stock first.


----------



## ironhide138

Why is the Toxic so ugly?


----------



## cyph3rz

I'm playing Metro: Last Light and my Gigabyte is handling the game more smoothly than the XFX 7950 I used to have. The temp at idle is 41°C and under load 60°C. It's also quieter than my XFX 7950.

Yeah, Gigabyte could do better with the design, by the way, but I don't mind the blue PCB and I like the three fans. Like I said, it's quiet under load.


----------



## ironhide138

Quote:


> Originally Posted by *cyph3rz*
> 
> I'm playing Metro Last Light and my Gigabyte is handling the game smoother than my XFX 7950 I used to have. The temp at idle is 41c and under load 60c. It's also quieter than my XFX 7950.
> 
> Yeah Gigabyte could do better with the design by the way but I don't mind the blue pcb and I like the three fans. Like I said it's quiet under load.


what Res/settings/fps?


----------



## cyph3rz

Quote:


> Originally Posted by *ironhide138*
> 
> what Res/settings/fps?


My settings are:

1680X1050
Very High
SSAA 4X
AF 16X
FPS: 30 average (I'm playing the Sniper Team in the tower pack right now)


----------



## ironhide138

Quote:


> Originally Posted by *cyph3rz*
> 
> My settings are:
> 
> 1680X1050
> Very High
> SSAA 4X
> AF 16X
> FPS: 30 average (I'm playing the Sniper Team in the tower pack right now)


30 seems kinda lackluster... but then again, Metro LL is a beast.


----------



## cyph3rz

Quote:


> Originally Posted by *ironhide138*
> 
> 30 seems kinda lackluster... but then again metro LL is a beast.


Yeah, Metro Last Light is a graphics-intensive game in my opinion. I was playing Battlefield Bad Company 2 with advanced settings @1280 and the FPS averages 140.


----------



## This calling

Quote:


> Originally Posted by *cyph3rz*
> 
> My settings are:
> 
> 1680X1050
> Very High
> SSAA 4X
> AF 16X
> FPS: 30 average (I'm playing the Sniper Team in the tower pack right now)


Holy crap, seriously?! That is terrible =s The 770 I sent back would average A LOT higher than that at 1920x1200.

Going to hopefully chalk that up to immature drivers, I guess (even though the card is closing in on 2 years old now).

Ah, OK, an FX-6300. I am kinda hoping that's what's holding you back, as 30 FPS is really bad for the entry enthusiast GPU market, i.e. the 770/280X.


----------



## VegetarianEater

Quote:


> Originally Posted by *This calling*
> 
> Holy crap seriously?! That is terrible =s The 770 I sent back would average ALOT higher than that at 1920x1200.
> 
> Going to hopefully chalk that up to immature drivers I guess(even though the card is closing in on 2 years old now).
> 
> Ah ok an fx 6300, I am kinda hoping that's whats holding you back, as 30 fps is really bad for the entry enthusiast gpu market i.e the 770/280x.


Guess nobody noticed the 4x SSAA he's using... he'd probably have around 60 without it.


----------



## Jorginto

Does the Gigabyte 280X also have locked voltage like the 7970 rev 2.0 and 2.1 models?


----------



## kpo6969

Asus vs MSI vs XFX 280X

http://www.legitreviews.com/amd-radeon-r9-280x-video-card-review-asus-xfx-msi_126195


----------



## leyzar

Hey guys,

*Did anybody get their hands on the Sapphire Toxic 280X?*
I want to pick this one up, but I'm worried a bit about the excessive power draw on it.
I contacted Sapphire customer support and told them my rig (in my sig), and they said I would need a 750W-800W PSU to run and OC this card + the proc.
So I am interested to see how people run this card, so I know for sure if I can use it before I buy it.


----------



## ironhide138

Quote:


> Originally Posted by *kpo6969*
> 
> Asus vs MSI vs XFX 280X
> 
> http://www.legitreviews.com/amd-radeon-r9-280x-video-card-review-asus-xfx-msi_126195


hmm, they say the xfx double d runs the coolest, but everything else I've heard about it says the cooler is crap...


----------



## raghu78

Quote:


> Originally Posted by *ironhide138*
> 
> hmm, they say the xfx double d runs the coolest, but everything else I've heard about it says the cooler is crap...


the xfx cooler on the r9 280x is a completely new design. most reviews show this cooler has the lowest temps at around 60c. the fan profile is a bit too aggressive and so a bit loud. xfx can actually go for a less aggressive fan profile or the user can reduce the fan speed using a custom fan profile.

http://videocardz.com/46600/xfx-launches-radeon-r9-r7-graphics-cards-series

the xfx design is called Ghost 2.0 and the card is very good looking as well as cools very well.

http://www.legitreviews.com/amd-radeon-r9-280x-video-card-review-asus-xfx-msi_126195/11
http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx/20
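A custom fan profile like the one mentioned above is just a set of temperature/speed points that a tuning tool such as Afterburner interpolates between. A minimal sketch of that interpolation follows; the curve points are made-up examples for illustration, not XFX's actual stock profile:

```python
# Linear interpolation over a custom fan curve. The curve points below are
# made-up examples, not XFX's stock profile.
def fan_speed(temp_c, curve=((40, 20), (60, 40), (75, 70), (85, 100))):
    """Return a fan speed (%) for a GPU temperature using a point curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: floor speed
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # interpolate linearly between neighbouring points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # above the curve: max speed

print(fan_speed(50))  # -> 30.0, halfway between (40, 20) and (60, 40)
```

A "less aggressive" profile is just this curve with a shallower slope, which trades a few degrees of temperature for noticeably lower noise.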


----------



## lappy

Hi all

I have not read all comments but need help with a problem with my XFX R9 280X
When I try to OC the screen starts to flicker 

Any suggestions?

Setup:

Asus X79-Deluxe
• Intel Core I7 4960X @ stock
• Corsair Vengeance Pro Series 32GB CL11, 11-11-11-27
• Cooler Master HAF XB
• Enermax Revolution 85+ 1050W 80+ Silver
• Kingston SSDNow 60GB


----------



## Gulbis

Quote:


> Originally Posted by *lappy*
> 
> Hi all
> 
> I have not read all comments but need help with a problem with my XFX R9 280X
> When I try to OC the screen starts to flicker
> 
> Any suggestions?
> 
> Setup:
> 
> Asus X79-Deluxe
> • Intel Core I7 4960X @ stock
> • Corsair Vengeance Pro Series 32GB CL11, 11-11-11-27
> • Cooler Master HAF XB
> • Enermax Revolution 85+ 1050W 80+ Silver
> • Kingston SSDNow 60GB


Hi,
have the same problem, but with my hd6950.
As I understand it, it's something to do with the OC tool (MSI Afterburner and so on).

Short-term solution: change your resolution to a lower one, then change it back to normal; that should fix the flickering (for me).
Long-term solution: no idea.


----------



## cyph3rz

OK, in Metro Last Light with these settings I get anywhere from 10 to 75 FPS:

1680X1050
Very High
SSAA OFF
AF 4X


----------



## smartdroid

What is the VDDC voltage on this GPU?

From searching around, a lot of people refer to it as the GPU voltage, but that is another setting.


----------



## lappy

Lowering the screen res did not help much. It still flickers when I use the mouse.


----------



## cyph3rz

It fluctuates between 0.850 and 0.950 VDDC.

I'm wondering how my Gigabyte will do with the Battlefield 4 release AND HOPING that AMD will release new drivers by that time. At least I'm happier with this card than the XFX7950 I used to have.


----------



## rdr09

Quote:


> Originally Posted by *cyph3rz*
> 
> It fluctuates between 0.850 and 0.950 VDDC.
> 
> I'm wondering how my Gigabyte will do with the Battlefield 4 release AND HOPING that AMD will release new drivers by that time. At least I'm happier with this card than the XFX7950 I used to have.


stock 6300? my i7 stock cannot keep up with the 7950. have to oc it to 4.5 to play nice.

edit: the thing is . . . a stock i7 SB is faster than a 6300 at 5GHz.


----------



## lappy

Anybody who can help?


----------



## rdr09

Quote:


> Originally Posted by *lappy*
> 
> Anybody who can help?


not sure if this will help but worth a try using the guide, especially when oc'ing the gpu . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread


----------



## Cid

Quote:


> Originally Posted by *This calling*
> 
> I am not saying me personally, I never wanted to get the gygabite version anyway. It's just why cheap out like that, when they are making the exact same windforce x3 models on the other camp but better.


I think it's because Gigabyte is just selling their old stock of 7970s as 280Xs, and those still have the old design. The lower-end Gigabyte 270X gets the new design with black PCB, so that's the only explanation I can come up with.


----------



## lappy

It's for the 7970?


----------



## rdr09

Quote:


> Originally Posted by *lappy*
> 
> It's for the 7970?


yes, and it should work for your card. they are pretty much the same.


----------



## lappy

Not the same. The memory settings etc are not the same


----------



## rdr09

Quote:


> Originally Posted by *lappy*
> 
> Not the same. The memory settings etc are not the same


i use trixx or afterburner to oc if needed. here is how it looks for both my 7950 and 7970. of course the clocks will be different.





----------



## lappy

I'm already using this tool, but already at a 1MHz OC the screen flickers. With a lower res, the screen only flickers when using the mouse.


----------



## rdr09

Quote:


> Originally Posted by *lappy*
> 
> I'm already using this tool but already at 1MHz OC the screen flickers. By lowering res. the screen only flickers when using the mouse


i would return that darn thing. sorry,


----------



## lappy

It works fine without OC.


----------



## rdr09

Quote:


> Originally Posted by *lappy*
> 
> It works fine without *OC


wow. do you mind filling out the rig builder . . .

http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/220

i know oc'ing is a silicon lottery, but not even being able to do a 1MHz oc is unacceptable in my book. although i keep my cards stock during games, the thought of not being able to oc really sucks.

make sure AMD Overdrive is disabled. don't even let CCC be in one of the startup programs. what resolution are you using?

edit: my last suggestion is to start your own thread about this matter.


----------



## lappy

_wow. do you mind filling out the rig builder . . .
_

what do you mean?

2560x1440.
AMD overdrive is disabled


----------



## rdr09

Quote:


> Originally Posted by *lappy*
> 
> _wow. do you mind filling out the rig builder . . .
> _
> 
> what do you mean?
> 
> 2560x1440.
> AMD overdrive is disabled


man, you really need a card that can oc. if your components are in your sig, it helps others understand your situation better. see, i didn't know your rez and i was wondering why you need to oc your gpu. now i know.

starting your own thread on this issue will increase visibility, and you never know: another member or guest might have a similar setup and have found a solution.


----------



## DoooX

Still waiting to see someone CF-ing 280X and how does it compare to 770 SLI...


----------



## candy_van

It's just sitting here by my desk...mocking me till 5


----------



## navit

Quote:


> Originally Posted by *candy_van*
> 
> It's just sitting here by my desk...mocking me till 5


Oh, how that sucks.


----------



## Reloaded83

Just got my Asus 280X last night and installed it as soon as I got home. I need to take a pic of it with my SN here tonight.

One of my 560 Tis died, so this is replacing 2 in SLI. And wow, it's awesome! Started with one 560 Ti, got a 2nd, and now this, and it is great to crank the settings up to max in all of the games I play. My main problem with the 560s was the lack of VRAM; I was constantly hovering near or over 1000 MB of VRAM. I went AMD this time around because of the extra VRAM, and the price couldn't be beat.









Now to set up Eyefinity. This is the most I've spent on a single card yet, and so far, totally worth it. Now to figure out exactly how to get the 560 Ti that still works set up as a PhysX-only card.


----------



## Arizonian

Quote:


> Originally Posted by *lappy*
> 
> I'm already using this tool but already at 1MHz OC the screen flickers. By lowering res. the screen only flickers when using the mouse


Quote:


> Originally Posted by *rdr09*
> 
> i would return that darn thing. sorry,


Isn't flickering caused when OC'ing with ULPS on? Doesn't disabling ULPS take care of that? Or is that just with CrossFire?

You'll have to excuse me if I'm wrong; the last card for me on the red side was just after AMD bought ATI.


----------



## -Droid-

Guys, 280X Vapor-X or 7970 Lightning Boost? The price is the same.

Looks like a no-brainer to me (MSI); which one is louder?

Edit: Oh god, the price for the Vapor-X went down from 280 euros to 260. Now what?


----------



## raghu78

Quote:


> Originally Posted by *-Droid-*
> 
> Guys, 280X Vapor X or 7970 Lightning Boost ? Price is the same.
> 
> Looks like a no brainer to me (MSI), which one is louder ?
> 
> Edit: Oh god, price for the vapor x went down from 280 euros to 260, now what ?


which country are you from? 260 euros for the msi hd 7970 lightning boost is very good. 1150 mhz out of the box, very high quality custom PCB, a good cooler, and unlocked voltage. the sapphire vapor-x is no match, and its cooler is loud past 50% fan speed, which is required when overclocking.

the asus r9 280x is the pick of the r9 280x cards. easily the quietest r9 280x card. fantastic cooler and high quality custom PCB. go for it.


----------



## MLJS54

Quote:


> Originally Posted by *D3TH.GRUNT*
> 
> It came in today!
> ...snip...


That DCII looks awesome. Glad I went with that model from ASUS. Can't wait till mine gets here.


----------



## Theroty

Got my Matrix 280x installed. I will have some pics up shortly.









Ran a few tests with it at out of the box clocks but the screenshots are at home.


----------



## jason387

Anyone here with a R7-260X?


----------



## -Droid-

Quote:


> Originally Posted by *raghu78*
> 
> which country are you from ? 260 euros for msi hd 7970 lightning boost is very good. 1150 mhz out of the box. very high quality custom PCB. good cooler and unlocked voltage. the sapphire vapor-x is no match and the cooler is loud after 50% fan speed which is required when overclocking.
> 
> the asus r9 280x is the pick of the r9 280x cards. easily the quietest r9 280x card. fantastic cooler and high quality custom PCB. go for it.


Italy. Sadly the 7970 went out of stock (I can still buy it from Germany, but it's risky). I knew it. It was 282 euros, btw.

I guess I will wait a week until all the online shops get their cards, and pick an Asus, or a Toxic if it's priced reasonably.


----------



## Theroty

Here are some of the pics I have so far. I put it side by side with the Asus 7850 for size comparison. I ran 3DMark11 and 3DMark (Fire Strike, etc.). At 99% load with the fan on auto, the fan got to 31% and the GPU temp was 70C. I've got lots of work to do with it, and it will be so much fun.


----------



## ironhide138

man the matrix is nice, but i can't ever buy a 3 slot cooler again ahah, takes up too much room.


----------



## Theroty

I got the slots for it. The Sabertooth can fit two of them if someone wanted to because of the slot spacing.


----------



## FeelKun

Just received mine from UPS. I'll post more details after testing the driver and playing video games.












----------



## EnToxication

Can anyone comment on which card overclocks the best? They only got the MSI Twin Frozr at Microcenter, but they price match, so I'm thinking of picking up two of them today maybe. I'm gonna put it under water anyway.


----------



## FeelKun

Any tips on overclocking and testing the overclock? And I'll test it out


----------



## EnToxication

Quote:


> Originally Posted by *Resme*
> 
> Any tips on overclocking and testing the overclock? And I'll test it out


Usually I try to get it stable at high voltage on the highest clock possible, then bring the clock down until it doesn't artifact or crash on a load test. Then I bring the voltage down as low as possible. That's how I do it, but I'm guessing the voltage/clock ratio should be similar to the 7970?

Also watch for load temps on the VRM.
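The trial-and-error loop described above is basically a two-step search: max out the voltage, walk the clock down until a stress test passes, then walk the voltage back down at that clock. A sketch of the idea follows; `stress_test` here is a made-up stand-in stability model, since in practice you'd set clocks in Afterburner/Trixx and run a stress tool while watching for artifacts and VRM temps:

```python
# Sketch of the find-max-clock-then-undervolt loop. stress_test() is a
# stand-in model (assumption: higher voltage supports higher clocks);
# real testing is done with an OC tool plus a stress/artifact tester.
def stress_test(clock_mhz, voltage_v):
    """Stand-in stability check: 1.0 V holds 1000 MHz, +1 MHz per mV."""
    max_stable = 1000 + (voltage_v - 1.0) * 1000
    return clock_mhz <= max_stable

def find_overclock(max_voltage=1.2, clock_step=25, volt_step=0.025):
    # Step 1: at the highest safe voltage, lower the clock until stable.
    clock = 1400
    while not stress_test(clock, max_voltage):
        clock -= clock_step
    # Step 2: at that clock, lower the voltage while it stays stable.
    voltage = max_voltage
    while voltage - volt_step >= 0.9 and stress_test(clock, voltage - volt_step):
        voltage -= volt_step
    return clock, round(voltage, 3)

print(find_overclock())  # -> (1200, 1.2) under this toy model
```

The same step-down/step-up discipline applies regardless of tool: change one variable at a time and re-test, so you know which change broke stability.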


----------



## FeelKun

Quote:


> Originally Posted by *EnToxication*
> 
> Usually try to get it stable at high voltage on highest clock possible, bring it down until it doesn't artifact or crash on load test. Then I bring down voltage as low as possible. That's how I do it but I'm guessing voltage clock ratio should be similar to 7970?
> 
> Also watch for load temps on VRM.


What programs do I use for overclocking and then testing?


----------



## smartdroid

Quote:


> Originally Posted by *EnToxication*
> 
> Anyone can comment on which card overclocks the best? They only got MSI Twin Frozr at microcenter, but they price match so thinking of picking up two of them today maybe. I'm gonna put it underwater anyways.


my asus r9 280x matrix platinum is getting decent overclocks









http://www.3dmark.com/3dm/1410854?


----------



## inzajt

Just got my Asus 280X DCU2T today; they are decent overclockers, at 1250/1700 @ 1.3v. But weren't these cards supposed to be able to do 1.4v? I've tried everything but nothing seems to work. =/


----------



## bencher

Quote:


> Originally Posted by *Resme*
> 
> Just recieved mine from UPS. I'll post more details after testing the driver and playing video games
> 
> ...snip...


I thought all the DCII came with back covers.


----------



## Theroty

Quote:


> Originally Posted by *bencher*
> 
> I thought all the DCII came with back covers.


Only the Matrix edition has the cover on the back.


----------



## inzajt

Quote:


> Originally Posted by *Theroty*
> 
> Only, the Matrix edition has the cover on the back.


Are you able to do 1.4v on your Matrix? Can't go past 1.3v on my DCU2T.


----------



## Theroty

Quote:


> Originally Posted by *inzajt*
> 
> Are you able to do 1.4v on your matrix? Cant go past 1.3v on my DCU2T


I have not had the chance to try it honestly.


----------



## ChrisAfric

Hi guys! I am currently using an EVGA GTX 680 SC2; is it worth selling this card and upgrading to Asus R9 280X Matrix Platinums in CrossFire?

The reason is that I find the 280x really cheap compared to any gtx 770/780.

Hoping for your response.


----------



## Fniz92

Quote:


> Originally Posted by *ChrisAfric*
> 
> Hi Guys! I am currently using an EVGA GTX 680 SC2, is it worth selling this card then upgrade to a Asus R9 280X matrix platinum in crossfire?
> 
> The reason is that I find the 280x really cheap compared to any gtx 770/780.
> 
> Hoping for your response.


I don't see the point when you could buy a used 680 for less and run SLI.








You might wait for the R9-290X though.


----------



## Durvelle27

Lot of ASUS owners here I see

Will update list in the morning


----------



## ironhide138

Quote:


> Originally Posted by *ChrisAfric*
> 
> Hi Guys! I am currently using an EVGA GTX 680 SC2, is it worth selling this card then upgrade to a Asus R9 280X matrix platinum in crossfire?
> 
> The reason is that I find the 280x really cheap compared to any gtx 770/780.
> 
> Hoping for your response.


The Matrix cards are 3-slot; they're a pain in the ass to SLI/CF :\. You may as well wait for the 290X if you want to spend that much money, or buy a 2nd 680.


----------



## Roaches

IDS HABBIDING guys, wait for it...wait for it...









Source: http://wccftech.com/amd-never-settle-budle-coming-r200-series/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Wccftechcom+%28WCCFtech.com%29

It's a shame I own most games in that bundle already :/


----------



## Roaches

Has anyone CF'd 7970 Matrix before? I'm curious about temps and power-draw...


----------



## NinjaToast

Quote:


> Originally Posted by *Roaches*
> 
> 
> 
> IDS HABBIDING guys, wait for it...wait for it...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Source: http://wccftech.com/amd-never-settle-budle-coming-r200-series/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Wccftechcom+%28WCCFtech.com%29
> 
> Its a shame I own most games in that bundle already :/


I'm not gonna lie, it's disappointing to see the same games they've been bundling.


----------



## AgentHydra

Quote:


> Originally Posted by *NinjaToast*
> 
> I'm not gonna lie, it's disappointing to see the same games they've been bundling.


That's not the actual lineup; in the article they say BF4 is a distinct possibility, but the lineup isn't confirmed yet.

I probably should have waited for the next bundle, this card is sick but I don't have any good games to push it to the limit with.


----------



## This calling

Waiting on Amazon getting the Asus TOP in stock for $310; so tempted though to pull the trigger on the MSI Gaming edition.
http://www.amazon.com/MSI-R9-280X-GAMING-3G/dp/B00FR6XPL8/ref=pd_sim_sbs_pc_4/190-5228932-9594738

Only $299 >_< I really did love that cooler on the 760 Gaming edition I had before I sent it back for the 770. Wonder if it translates over to this camp.

I mean, I could crank that fan up to 70 and not hear a thing; it would also keep the card at about 60-64 under max load......... in Florida.


----------



## NinjaToast

Quote:


> Originally Posted by *AgentHydra*
> 
> That's not the actual lineup, in the article they're saying BF4 is a distinct possibility but the lineup isn't confirmed yet.
> 
> I probably should have waited for the next bundle, this card is sick but I don't have any good games to push it to the limit with.


Thought BF4 was already going to be bundled with the 290X? Either way, the lineup pictured would be decent for these cards right now, as long as they left DiRT off entirely. That's probably the biggest issue I would have if they kept the current lineup; crap titles don't make me jump for joy. xD


----------



## NinjaToast

Quote:


> Originally Posted by *This calling*
> 
> Waiting on amazon getting the asus top in stock for $310, so tempted though to pull the trigger on the msi gaming edition.
> http://www.amazon.com/MSI-R9-280X-GAMING-3G/dp/B00FR6XPL8/ref=pd_sim_sbs_pc_4/190-5228932-9594738
> 
> Only $299 >_< I really did love that cooler on the 760 gaming edition I had before I sent it back for the 770. Wonder if it translates over to this camp.
> 
> I mean I could crank that fan up to 70 and not hear a thing, would also keep the card at about 60-64 under max load......... in florida.


Someone in this thread got that MSI card (2, actually) and one of them had a cap blow off it; dunno that it would be a reliable card.









Edit: should also note that TPU found it to run hot even on the circuitry.


----------



## This calling

Haha, that is the reason I have not jumped all over it, and the fact that one of the reviewers got one that would hit 87C under load. Though it is really tempting, sitting here with no card to play games on. I have been playing Warcraft 3 on the iGPU of the 4670K. Takes me back 10 years =p


----------



## FeelKun

Quote:


> Originally Posted by *This calling*
> 
> Waiting on amazon getting the asus top in stock for $310, so tempted though to pull the trigger on the msi gaming edition.
> http://www.amazon.com/MSI-R9-280X-GAMING-3G/dp/B00FR6XPL8/ref=pd_sim_sbs_pc_4/190-5228932-9594738
> 
> Only $299 >_< I really did love that cooler on the 760 gaming edition I had before I sent it back for the 770. Wonder if it translates over to this camp.
> 
> I mean I could crank that fan up to 70 and not hear a thing, would also keep the card at about 60-64 under max load......... in florida.


Well, the Asus DC2 is way quieter than my Sapphire 6950... Geeze, that thing ran hot and had the most obnoxious fan past 50% (whine + vroom vroom). As of right now, the max I've seen it hit is 78C (OCCT test); the fan doesn't start cranking up till around 70-72C (40%-50%), barely noticeable.


----------



## This calling

I would be all over that Matrix edition; I just feel 3 slots is a bit much, especially since the DCII TOP is getting better cooling results in reviews.


----------



## candy_van

Finally, after staring at that damn box all day at work... I have my card installed.


----------



## Arizonian

Quote:


> Originally Posted by *candy_van*
> 
> FInally, after staring at that damn box all day at work....I have my card installed


Grats







Now to take it for a test spin and do some gaming.


----------



## This calling

Grats







would love to see some benches/temps.


----------



## candy_van

Aww yiss, time to try and finish Bioshock, then start Tomb Raider








I'll see how far I can push the card too w/o touching voltage first; curious how much further taking the mem will go since it already comes @ 1600.

Will post temps / results too









I guess Unigine Heaven is the standard now?
I haven't benched in a long time, but can do a few quick runs if it will help people out.


----------



## VegetarianEater

Quote:


> Originally Posted by *candy_van*
> 
> Aww yiss, time to try and finish Bioshock, then start Tomb Raider
> 
> 
> 
> 
> 
> 
> 
> 
> I'll see how far I can push the card too w/o touching voltage first, curious how much further taking them mem will go since it already comes @ 1600
> 
> Will post temps / results too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess Unengine Heaven is the standard now?
> I haven't benched in a long time, but can do a few quick runs if it will help people out.


3dmark firestrike!


----------



## zachll88

About to pull the trigger on a Gigabyte R9 280X oc, any thoughts or feedback on this particular card?


----------



## cyph3rz

Quote:


> Originally Posted by *zachll88*
> 
> About to pull the trigger on a Gigabyte R9 280X oc, any thoughts or feedback on this particular card?


I have it. Many people don't like it because of the way it looks; it looks identical to the Gigabyte 7970. But like I said before, I bought it because I'm a fan of Gigabyte and it's quieter than the XFX 7950 I used to have. I love the triple fans. I tested this card with Skyrim a little while ago with ultra settings at 1680x1050 res and it was doing 60 FPS average.


----------



## raghu78

Quote:


> Originally Posted by *zachll88*
> 
> About to pull the trigger on a Gigabyte R9 280X oc, any thoughts or feedback on this particular card?


if you want the quietest r9 280x, go with the asus r9 280x dcii top. it's the best-reviewed r9 280x card out there. put yourself on auto-notify and grab one when it's back in stock.

http://www.tomshardware.de/amd-radeon-r9-280x-roundup-test,testberichte-241401-6.html

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803


----------



## kpo6969

Quote:


> Originally Posted by *zachll88*
> 
> About to pull the trigger on a Gigabyte R9 280X oc, any thoughts or feedback on this particular card?


http://hexus.net/tech/reviews/graphics/61249-gigabyte-radeon-r9-280x-overclocked-edition/


----------



## Roaches

If the DCII TOP had a decent backplate, I could've shot for it....


----------



## kpo6969

Vapor X has one


----------



## Roaches

Oh cool, I didn't know the Vapor X had one







....I'm still waiting for the Toxic to get in stock :3


----------



## zachll88

Quote:


> Originally Posted by *kpo6969*
> 
> http://hexus.net/tech/reviews/graphics/61249-gigabyte-radeon-r9-280x-overclocked-edition/


Ended up ordering the gigabyte r9 280x!

My 6870 decided to die last night, and this was just a very good price for me.

Ty for the link; it was a good read/review.


----------



## EnToxication

ordered 2 for giggles. ordered water blocks also.


----------



## olliiee

Hey guys, have any of you any experience with the MSI 280X Gaming edition? Is it one of the weaker or better models? (I prefer 2 DVI ports.)

And is it worth my while to get a 280X over a 7970? Some sites have the GHz edition 7970s beating the 280X, but I don't see how that's possible :S


----------



## r4yne

http://www.techpowerup.com/gpuz/gc3a4/


----------



## This calling

Quote:


> Originally Posted by *kpo6969*
> 
> Vapor X has one


That is sexy; didn't know that. The Vapor-X cooler has been prone to coil whine though, right?
Hmmm, $10 more than the Asus version and $20 more than the MSI/Gigabyte editions, with a lower memory clock than the Asus. I still really like that backplate though.

Nah, nvm, will hold out for the Asus that is never in stock.


----------



## Cid

Man, I think our local e-tailers have been reading reviews, the ASUS DC2T is consistently the most expensive 280X card they have. So much for following MSRP. Comes out to $350 before tax (which is 21% here, so yeah). I was hoping prices would drop once retailers got their hands on decent supplies, but so far it's been bupkis. If anything, the prices are going up.

But, for some reason the Toxic is hella cheap here. Now, I know it's voltage locked, but does that also mean I can't lower the voltage? Because I'd have no qualms turning that OC down a bit to get better temps and noise.


----------



## NinjaToast

Quote:


> Originally Posted by *olliiee*
> 
> Hey guys have any of you any experience with the MSI 280x Gaming edition? Is it one of the weaker better models? (Prefer 2 DVI ports)
> 
> and is it worth my while to get a 280x over a 7970? some sites have the ghz edition 7970s beating the 280x but I dont see how thats possible :S


The MSI 280X has heating issues; heck, someone in this thread got 2 of them and one of the cards had a cap blow off. Unless you plan to water cool, I don't know that it's a recommended buy. The Asus DCII card is the best-reviewed card of the 280X lineup; you might be better off getting that.

Considering that the 280X is cheaper than the 7970 GE (excluding the Matrix and Toxic edition 280X cards), it would be worth it over the 7970 GE. It trades blows with it because it's not an exact re-badge of the 7970 GE; they are slightly tweaked versions, though not enough that you couldn't flash a 7970 BIOS onto it (pretty sure they tested this with the MSI and Asus DCII cards), and you can CrossFire the 280X with a 7970, though you may already know that.


----------



## -Droid-

What ? Is the toxic voltage locked ? Can anyone confirm ?


----------



## Sumatra

Quote:


> Originally Posted by *olliiee*
> 
> Hey guys have any of you any experience with the MSI 280x Gaming edition? Is it one of the weaker better models? (Prefer 2 DVI ports)
> 
> and is it worth my while to get a 280x over a 7970? some sites have the ghz edition 7970s beating the 280x but I dont see how thats possible :S


It's a rebranded 7970; that's why they trade blows so easily.


----------



## FeelKun

Quote:


> Originally Posted by *EnToxication*
> 
> 
> ordered 2 for giggles. ordered water blocks also.


Woah is that newegg? That's some serious tax... Anyways.. Gratz on your purchase!!


----------



## Roaches

Quote:


> Originally Posted by *Resme*
> 
> Woah is that newegg? That's some serious tax... Anyways.. Gratz on your purchase!!


Taxes are pretty harsh in California; it's nearly 10% of the purchase. The 280X Matrix nearly hits $400 with shipping combined.


----------



## EnToxication

Quote:


> Originally Posted by *Resme*
> 
> Woah is that newegg? That's some serious tax... Anyways.. Gratz on your purchase!!


Tax does suck on big purchases; I paid like 3 grand in tax for my car... Good thing it ships from like 10 miles away though, so it usually only takes one day, unless it's slow super-egg-destroy-my-box shipping.


----------



## FeelKun

Quote:


> Originally Posted by *Roaches*
> 
> Taxes are pretty harsh in California, its nearly 10% of the purchases, The 280X matrix nearly hits $400 with shipping combined.


Ya, no sales tax on Internet purchases in Virginia (except for Amazon)... I'll be ordering a second Asus DC2 before they start taxing in Virginia. Taxes suck!
Quote:


> Originally Posted by *EnToxication*
> 
> Tax does suck on big purchases, paid like 3 grand on tax for my car... Good thing it ships from like 10 miles away though, so usually only takes one day unless if it's slow super egg destroy my box shipping.


Ouch dude.


----------



## Cid

Quote:


> Originally Posted by *-Droid-*
> 
> What ? Is the toxic voltage locked ? Can anyone confirm ?


Quote:


> Meanwhile overclockers will want to pay note that overclocking options on the 280X Toxic will be limited. The card has solid power delivery, however there isn't any voltage control on the card - Sapphire Trixx, MSI Afterburner, and Asus GPU Tweak are unable to set a different voltage - so you can only overclock it as far as the default voltage will take you. In that respect Sapphire has clearly designed the card for stock operation as opposed to satisfying that end-user overclocking itch.


Source

And re-reading that, guess you also can't lower the voltage. Balls.


----------



## Theroty

In Tennessee we have to pay 9.75% tax because of the Newegg warehouse in Memphis. It blows!


----------



## candy_van

Ouch, NJ sales tax is 7%.
There's a warehouse in Edison, but fortunately for me I work in NY and can ship there tax free


----------



## Thorteris

So far, what's the best R9 270X currently (cooling-wise)? I will probably CrossFire them in a month or two after I buy the first. Also, are they worth overclocking?


----------



## FeelKun

Quote:


> Originally Posted by *candy_van*
> 
> Ouch, NJ sales tax is 7%.
> There's a warehouse in Edison, but fortunately for me I work in NY and can ship there tax free


Any updates on your overclock adventure?


----------



## candy_van

Quote:


> Originally Posted by *Resme*
> 
> Any updates on your overclock adventure
> 
> 
> 
> 
> 
> 
> 
> ?


None thus far; just gamed a little last night, but I'll see what she does in FurMark etc. when I get a good night / don't stay too late @ work.


----------



## DannyT

Is an antec vp550p enough for a 280x?


----------



## Theroty

Quote:


> Originally Posted by *DannyT*
> 
> Is an antec vp550p enough for a 280x?


Hmm, that would probably be pushing it. I'm sure there are others who could give you more insight on the matter, though. I saw full-system load power draws for some of the 280X cards in the 500-watt range. There may be some that go even higher than that.
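As a rough sanity check, you can budget the major components against the PSU rating. The wattages below are ballpark assumptions (board-power-class figures, not measured draws), so treat this as a sketch only:

```python
# Rough power budget for a single R9 280X system.
# All wattages are assumed ballpark figures, not measurements.
GPU_TDP_W = 250   # R9 280X board-power class
CPU_TDP_W = 95    # assumed typical quad-core CPU
OTHER_W   = 75    # motherboard, RAM, drives, fans (allowance)

load_w = GPU_TDP_W + CPU_TDP_W + OTHER_W   # estimated full load

def headroom_pct(psu_w, load_w):
    """Remaining PSU capacity as a percentage of its rating."""
    return 100.0 * (psu_w - load_w) / psu_w

print(f"{load_w} W load, {headroom_pct(550, load_w):.0f}% headroom on a 550 W unit")
```

On paper that's roughly a quarter of the rating left over, which is exactly why opinions split on a 550W unit: rail quality and any overclocking eat into that margin quickly.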


----------



## Durvelle27

List Updated


----------



## D3TH.GRUNT

Quote:


> Originally Posted by *DannyT*
> 
> Is an antec vp550p enough for a 280x?


I would say so; I'm running a 450W PSU (Enermax) and I can run everything fine, and I haven't noticed any power throttling issues. Although I am probably going to get a 600W PSU, as I would love to actually overclock my card, haha.


----------



## Theroty

Quote:


> Originally Posted by *Durvelle27*
> 
> List Updated


Sweet! Now we need something for our sig!

If I knew how to come up with something nice, I would do it.


----------



## navit

Quote:


> Originally Posted by *Theroty*
> 
> In Tennessee we have to pay 9.75% tax because of the Newegg warehouse in Memphis. It blows!


But you get next day service for a 3 day price.


----------



## Conissah

Does anyone have an XFX R9 280X they could measure for me? I'm going to build a gaming rig with that card soon, but I'm limited on space. Newegg says the card is 9.9", but in reviews it's around 11". So, anyone want to measure one for me?


----------



## cyph3rz

Quote:


> Originally Posted by *Conissah*
> 
> Does anyone have an XFX R9 280x they could measure for me? Im going to soon build me a gaming rig with that card, but Im limited on space. On Newegg it says the card is 9.9", but on legit reviews its around 11". So anyone want to measure one for me?


Which case do you have/want to get?


----------



## Durvelle27

Quote:


> Originally Posted by *Conissah*
> 
> Does anyone have an XFX R9 280x they could measure for me? Im going to soon build me a gaming rig with that card, but Im limited on space. On Newegg it says the card is 9.9", but on legit reviews its around 11". So anyone want to measure one for me?


My XFX card measures a little over 11"


----------



## hoevito

Count me in

It's a pretty big card to me... but then again the biggest I've ever had before this was a GTX 570 lol...


----------



## Roaches

Oooh, pretty! Does it sag under its own weight?

I know both of my GTX 680 SOCs did, a lot, before I switched to the FT02...

The 280X Matrix has been in my crosshairs lately, even though I'm waiting for the 290X...


----------



## hoevito

Quote:


> Originally Posted by *Roaches*
> 
> Oooh pretty!, Does it sag on its own weight?
> 
> I know both of my GTX 680 SOCs does alot before I switched to the FT02...
> 
> 280X Matrix has been in my crosshairs lately even though I'm waiting for the 290X...


It does sag ever so slightly...but I would bet it's because of that big ass cooler hanging off of it...still it's hardly noticeable for the most part!!!


----------



## Roaches

Good to hear! If the 290/290X pricing disappoints, I'm definitely grabbing the Matrix then....

I used to hear that their older 3-slot DCUII coolers had sagging issues during the GTX 580 era... though looking at the Matrix PCB, the thick VRM heatsink also forms a stiff sandwich with the backplate.


----------



## Stay Puft

Anyone grab a 270X Hawk? How are overclocks?


----------



## smaudioz

What is the cooler shroud made from on the Asus DirectCU II 280x? Is it plastic or metal?


----------



## Conissah

Quote:


> Originally Posted by *cyph3rz*
> 
> Which case do you have/want to get?


EVGA Hadron Air.


----------



## Phixit

I should have my Sapphire Vapor-X R9 280X by Monday, I guess... it is stuck in Toronto and the flight is being delayed because of bad weather.


----------



## navit

So we have all heard you can CF a 7970 with a 280, right? Well, how do you think one would go about loading drivers for the two different cards? Or would the 13.11 beta work for both?


----------



## Mombasa69

I'm new to Radeon cards; I've always used Nvidia. I've recently ordered 2 Sapphire R9 280X Toxic Edition GPUs, so I'll keep a check on this area. It will be a change from what I've been used to for the last 10 years.


----------



## Neo_Morpheus

I can't find much on CrossFire scaling for the 280X; here's a review: http://www.vmodtech.com/th/article/amd-radeon-r9-280x-crossfire-performance-review/page/4 . For comparison, a moderate OC on my cards got me P17666 in 3DMark 11.


----------



## miklkit

Installed and ran my new MSI R9 280X last night. MSI Afterburner calls it a 7900 series.

Anyway, it runs cooler and quieter than the 6970 it replaced. It is also nice and fast, as it appears to be slightly faster than two 6970s were. No OC yet.


----------



## Reloaded83

Quote:


> Originally Posted by *smaudioz*
> 
> What is the cooler shroud made from on the Asus DirectCU II 280x? Is it plastic or metal?


Plastic.


----------



## This calling

Quote:


> Originally Posted by *Stay Puft*
> 
> Anyone grab a 270X Hawk? How are overclocks?


I doubt it's going to outperform an overclocked 7950, which is the same price as one atm. That kinda makes the card pointless while they still have 7950s in stock.


----------



## This calling

http://www.amazon.com/HIS-iPower-DLDVI-I-Graphics-H280XQMT3G2M/dp/B00FNHVUTA/ref=pd_sim_sbs_pc_1

Amazon finally has the IceQ Turbo in stock; isn't that the best OC'er of them all? Most reviewers are doing very well with them. A bit strange that they priced it at $320-$325, though, considering all the others are priced at $310.


----------



## FernTeixe

Hi guys, I'm looking for the Sapphire Vapor-X R9 280X OC BIOS. If anyone has it, please share it! I want to try it... I have a 7970.


----------



## EnToxication

Got mine today. Good thing OnTrac over here in OC is pretty okay. Still, I don't get why Newegg uses OnTrac, which is highly unreliable.

MSI 280X x2, waterblocks on the way

Goodbye GTX 580 ;( well lived...


----------



## Stay Puft

Quote:


> Originally Posted by *This calling*
> 
> I doubt its going to out perform an overclocked 7950, which is the same price as one atm. Which kinda makes that card pointless, while they still have 7950's in stock.


I don't care if it outperforms the 7950. I just want some OC results.


----------



## ducknukem86

Newegg had the Sapphire Toxic in stock for a very brief moment, and when I say brief, I mean like 1 hour, and now they're sold out again! That's such a shame; I just received money in my bank account and I can't order it!

The only one available is the Sapphire Vapor-X, but it's more expensive than the Asus DCU TOP, and I think the ASUS is better. At least that's what I get from the reviews.

So, my choice is either the Asus DCU TOP or the Toxic to go all the way!


----------



## NinjaToast

Anyone have good experience with the XFX R9 280X? (lookin at you OP) I'm really interested in how it performs OC-wise and whether it's voltage locked (haven't seen a confirmation of this), plus it just looks sexy imo.


----------



## Stay Puft

Quote:


> Originally Posted by *NinjaToast*
> 
> Anyone have good experience with the XFX R9 280X? (lookin at you OP) I'm really interested in how it performs OC wise and if it's voltage locked (haven't seen a confirmation of this,) plus it just looks sexy imo.


Avoid xfx


----------



## NinjaToast

Quote:


> Originally Posted by *Stay Puft*
> 
> Avoid xfx


I'm acutely aware of the XFX customer complaints; I own an XFX 7850 and I've had a good experience with the card. I'm just not fond of the voltage lock.


----------



## Neo_Morpheus

Can't wait to finally see some results; waterblocks on a 280X will probably work out to better price/performance than my expensive 770s.


----------



## hoevito

Here are my results in the Valley benchmark with my R9 280X Matrix Platinum OC'd to 1200/6700MHz...


----------



## Kraanipea

Sapphire Vapor-X R9 280X is like a damn leafblower under load. I have no idea how KitGuru got the 36dB reading on it with FurMark. It sounds like 50dB+ in a closed Fractal R4 case, and I'm about 1.5m away from the case.
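Side note on that dB gap: mic distance alone can't explain 36 dB measured versus 50 dB+ perceived. Under the idealized free-field point-source model, level only falls by 20·log10(d/d_ref) dB with distance. A small sketch (the 1 m reviewer mic distance is an assumption, and real rooms and cases reflect sound, so this is a best case):

```python
import math

def spl_at_distance(spl_ref_db, d_ref_m, d_m):
    """Free-field point-source falloff: SPL drops 20*log10(d/d_ref) dB."""
    return spl_ref_db - 20.0 * math.log10(d_m / d_ref_m)

# A 36 dB reading at an assumed 1 m mic distance would still be ~32.5 dB
# at 1.5 m -- nowhere near 50 dB, so distance isn't the explanation.
print(round(spl_at_distance(36.0, 1.0, 1.5), 1))
```

So if the card really sounds like 50 dB+, the fan is simply spinning much faster than it did on the reviewer's bench (different fan profile, hotter case air, or a defective cooler).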


----------



## FeelKun

Quote:


> Originally Posted by *EnToxication*
> 
> Got mine today. Good thing ontrac over here in oc is pretty okay. Still i don't get why newegg is using ontrac which is highly unreliable.
> 
> MSI 280X x 2, waterblocks on the way
> 
> good bye 580gtx ;( well lived...


Post results! Gonna get me a second 280x here soon... I'd like to see fps results since we both use 2560x1440p


----------



## ironhide138

Quote:


> Originally Posted by *Kraanipea*
> 
> Sapphire VaporX R9 280X is like a damn leafblower under load. I have no idea how kitguru got the 36dB reading on it with furmark. Sounds like 50dB+ in closed Fractal R4 case and i'm about 1,5m away from the case.


Did you set a custom fan curve? Maybe KitGuru used stock?


----------



## Stay Puft

Quote:


> Originally Posted by *hoevito*
> 
> Here's results in Valley Benchmark with my R9 280x Matrix platinum oc'ed to 1200/6700mhz...


Kinda weak. Need to push that matrix over 1300 core


----------



## Cid

Quote:


> Originally Posted by *Kraanipea*
> 
> Sapphire VaporX R9 280X is like a damn leafblower under load. I have no idea how kitguru got the 36dB reading on it with furmark. Sounds like 50dB+ in closed Fractal R4 case and i'm about 1,5m away from the case.


How are the temps? If they're nice 'n cool it may just be the fan profile being too aggressive?


----------



## Kraanipea

GPU temp shows 74C Max and VRM 72C Max while playing BF3 for about an hour. Fan was at 2900-3100 rpm. At about 50% (2400rpm) it starts being really loud. I'll try a custom fan profile and see how it goes.
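For anyone trying the same thing: a custom fan profile in Trixx/Afterburner boils down to a handful of (temperature, fan %) points with linear interpolation between them. A sketch of how such a curve maps temperature to duty cycle (the points here are made-up examples, not recommended values):

```python
# Hypothetical fan curve: (temp_C, fan_percent) points, linearly interpolated,
# mirroring what a user-drawn curve in Afterburner/Trixx does.
CURVE = [(30, 20), (50, 30), (70, 55), (80, 80), (90, 100)]

def fan_percent(temp_c):
    """Map a GPU temperature to a fan duty cycle along the curve."""
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])
    if temp_c >= CURVE[-1][0]:
        return float(CURVE[-1][1])
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(60))  # 42.5 -- halfway between the 50C and 70C points
```

The trick is flattening the curve in the 60-75C range so the fan doesn't ramp to leafblower speeds the moment the card warms up, while still hitting 100% before VRM temps get scary.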


----------



## Cid

Hmm, the VRM temp seems pretty good but the GPU temp is indeed way higher than the reviews (and so the fan noise is as well, from trying to keep up). It could be a thermal paste issue? Does Sapphire allow removing the heatsink to re-apply the paste?


----------



## Kraanipea

88C GPU while running FurMark for like 1 minute; the VRM temp went up to 99C, then I just stopped the test manually. The fan went to 3500rpm. HWiNFO64 shows me two VRM temps, though; one was 85C, the other 99C. Not sure if Sapphire really screwed up this Vapor-X model or if it's bad contact between the heatsink and GPU.


----------



## -Droid-

I think it's your GPU. It would have come up in at least one review if it affected the whole line. Plus, we're talking about 2+ year old cards here; they just put a new sticker on the 7970 GHz, with no changes to the cooler.


----------



## This calling

Quote:


> Originally Posted by *Stay Puft*
> 
> I dont care if it out performs the 7950. I just want some OC results


That's what I was saying: the OC results will come in under a 7950's, so take that as your results.


----------



## This calling

Quote:


> Originally Posted by *Kraanipea*
> 
> 88C GPU while doing Furmark for like 1 minute, VRM temp went up to 99C, then i just stopped the test manually. Fan went on 3500rpm. HWiNFO64 shows me two VRM temps though, one was 85C, the other one 99C. Not sure if Sapphire really screwed up this VaporX model or bad contact between heatsink and GPU.


Holy terrible temps.


----------



## Kraanipea

I put the power limit at -20%, which lowered the GPU voltage to 1.2V (stock 1.256V); the VRM was around 87-90C and the core at 80C with the fan only at 2500rpm, which was actually quite good. But then again, the core clocks went down dramatically. The stock voltage is too high imo, and I can't find any other way to change it without affecting the core clocks.


----------



## Roaches

Quote:


> Originally Posted by *Kraanipea*
> 
> GPU temp shows 74C Max and VRM 72C Max while playing BF3 for about an hour. Fan was at 2900-3100 rpm. At about 50% (2400rpm) it starts being really loud. I'll try a custom fan profile and see how it goes.


Quote:


> Originally Posted by *Kraanipea*
> 
> 88C GPU while doing Furmark for like 1 minute, VRM temp went up to 99C, then i just stopped the test manually. Fan went on 3500rpm. HWiNFO64 shows me two VRM temps though, one was 85C, the other one 99C. Not sure if Sapphire really screwed up this VaporX model or bad contact between heatsink and GPU.


Try tightening the screws on the backplate... sometimes quick assembly in the manufacturing process causes QA misses when workers don't, or forget to, tighten the screws... At least I know the Devil 13 had this issue... If all else fails, try reapplying TIM and check the VRM heatsink...


----------



## Cid

Quote:


> Originally Posted by *Kraanipea*
> 
> I put power limit onto -20%, which lowered the GPU volt to 1.2V (stock 1.256V) and the VRM was around 87-90C and core being at 80C while fan only at 2500 rpm, which was actually quite good. But then again, core clocks went down dramatically. The stock voltage is too high imo, and i can't find any other way to change it without affecting the core clocks.


I don't recall the Vapor-X having a non-stock voltage. I thought only the Toxic was supposed to have a higher voltage? Like 1.25V but running at 1100 to 1150 MHz.


----------



## Opentoilet

Sigh, my Matrix can only go up to 1240 core. No matter how much voltage I slam into it... this is insulting..


----------



## Stay Puft

Quote:


> Originally Posted by *Opentoilet*
> 
> Sigh, my Matrix can only go up to 1240 core. No matter how much voltage I slam into it... this is insulting..


Ouch


----------



## Roaches

Unfortunately, that's because the 7970/280X Matrix cards are known for not being cherry-picked when they should have been, given all those enthusiast PCB features.


----------



## 50shadesofray

The second graph is GPU usage while playing BF3. As you can see, it's all over the friggen place. How can I fix this? Or is this due to crappy drivers?
Specs:
i7 950 @ 4.2GHz
12GB Crucial Ballistix Elite 1600MHz
MSI R9 280X GAMING (no, mine has no cooling issues; it has never gone above 60C with 30% fan speed)
NZXT HALE V2 700W Bronze PSU
Asus P6T


----------



## Jesse D

That Matrix looks good as hell above... I've pretty much shut off all my case lights unless I have peeps over to show off to, but it looks nice.

Newegg has a new code, btw...

Code:

save10oct18q

Only good through midnight, for up to 10% off (up to 20 bucks).
I'll keep posting codes as I see them until I've decided on my next GPU (290X or 780 Ti?), then I'm probably gone again for a bit... either way, what is up with this site? It keeps f'ing up my mobile browsers as of late.


----------



## Roaches

Where are you getting these promo codes from? I looked at all my recent Newegg email deals and I don't see that code in them... I'd love to make a huge purchase with at least some savings.

Damn you, CA taxes


----------



## ducknukem86

i tried the code in the cart, but it doesn't seem to work


----------



## xSneak

Quote:


> Originally Posted by *Stay Puft*
> 
> Ouch


I don't see what's so bad about that when the majority of 7970s can't get past 1150.


----------



## Opentoilet

Quote:


> Originally Posted by *xSneak*
> 
> I don't see what's so bad about that when the majority of 7970s cant get past 1150.


I got suckered in by those 7970 Matrix reviews; knowing they hit 1300+ made me so antsy.


----------



## DeviousAddict

Hello people,

I have come to join your club. I've been holding out on upgrading my GTX 470s for some time now, then I saw the price and performance of these R9 280Xs. Let's just say I was blown away by the performance at such a low price!
I have just placed an order for my first XFX Radeon R9 280X; it should arrive at home on Monday. I will then order another on Monday morning once the funds have appeared. Going for my first Xfire with these cards.
Does anyone know where I can find an Xfire review for these, as I can't find one? Curious what kind of performance to expect.

Anyway, a picture of the card:

http://videocardz.com/images/2013/09/XFX-Radeon-R9-280X.jpg

I know it's not the fastest version, but I prefer the look of the card; it's very sleek and tidy, which is the style I like.


----------



## EnToxication

MSI 280X.

I just did a light OC today because I'm on air; it does pretty okay. I got it to 1220 core, but it gets way too hot. My target is 1200 core/1700 mem, so I'll wait for my waterblocks that come in on Monday before I go higher. Installing BF3 and Crysis again because I had to format due to a corrupted registry from the Nvidia drivers.

It runs extremely hot, same as my GTX 580.

For voltage, anything lower crashes my Kombustor to desktop. Mem volt @ 1.55 also, if you guys are wondering.
Quote:


> Originally Posted by *DeviousAddict*
> 
> Hello people,
> 
> I have come to join your club. I've been holding out on upgrading my GTX470's for some time now, then i saw the price and performance of these R9 280X's. Lets just say i was blown away with the performance at such a low price!
> I have just placed an order for my 1st XFX Radeon R9 280X, it should arrive at home on monday, i will then be ordering another on monday morning once the funds have appeared. Going for my 1st Xfire with these cards.
> Does anyone know where i can find a review for Xfire on these as i can't find one? curious what kind of performance to expect
> 
> anyway a picture of the card,
> 
> http://videocardz.com/images/2013/09/XFX-Radeon-R9-280X.jpg
> 
> I know its not the fastest version but i prefer the look of the card, its very sleek and tidy, which is the style i like.


Should be similar to 7970 GHz CrossFire, I figure; that's why a lot of reviewers don't bother with it.


----------



## Fniz92

Quote:


> Originally Posted by *DeviousAddict*
> 
> Does anyone know where i can find a review for Xfire on these as i can't find one? curious what kind of performance to expect


Here is the performance you should expect : http://www.hardwareheaven.com/reviews/1727/pg1/amd-radeon-hd-7990-graphics-card-review-introduction.html

Now remember, the 7990 in that test is not using the 13.8/13.10 drivers that fixed frame pacing and considerably boosted performance, but it's still doing very well against the 690 in both scaling and frame latency.


----------



## DeviousAddict

Cheers for the link, Fniz92

I'm expecting some pretty good performance then

Looking at a 3-screen setup by Xmas, and by the looks of it they won't struggle at all with maxed-out gaming


----------



## candy_van

Anyone w/ a DCII TOP OC'd yet?
Haven't been home since I got mine, so I'm curious (hehe).

Curious how far the mem clocks go since it's already @ 1600.


----------



## Kraanipea

Quote:


> Originally Posted by *Roaches*
> 
> Try tightening the screws on the back-plate....sometimes quick assembly in the manufacturing process can cause some QA misses when workers aren't or forget to tighten the screws...At least I know the Devil 13 had this issue...If all else fails try reapply TIM and check the VRM heatsink....


I did tighten the core screws about half a turn, which was the max, and added a pair of washers to the VRM heatsink. The VRM temps seemed to stop at around 108C after 7 mins of FurMark. GPU was at 87C, fan still at max. Going to have to get some decent washers and, I guess, take apart the cooler. Is the TIM that comes with the Noctua NH-D14 good for it? I also have a tube of MX-4.

EDIT: I think the core starts to throttle as well. I see it fluctuate between 950MHz and 1070MHz. Not sure if that is what you would call throttling.

EDIT2: I must have added the washers to the other VRM's cooler, not the one at 108C. But from what I see on the internet, Sapphire's VRM1 cooling is unacceptable, with its little aluminum slab and a tiny strip of thermal tape.


----------



## Shurtugal

Hey, finally got back today and have my 280x sitting here!
It is an Asus Matrix Platinum 280x.


----------



## navit

Here is a shot of my new Asus R9 280 DCII in CF with my 7970 Lightning



Sorry the pic is pure crap


----------



## Amhro

Quote:


> Originally Posted by *Shurtugal*
> 
> Hey, finally got back today and have my 280x sitting here!
> It is an Asus Matrix Platinum 280x.


Looks great! I like that logo on the side, does any other 280X have it?


----------



## navit

3DMark score with CF, CPU stock

Before CF



CF with CPU @ 4.5


----------



## MooseHead

Quote:


> Originally Posted by *Amhro*
> 
> Looks great! I like that logo on the side, does any other 280X have it?


If you're referring to the illuminated logo... I believe the asus matrix, the sapphire toxic, and sapphire vapor x have an illuminated logo on the top of the card. If you're referring to the backplate, those same models are the only ones with them as well.


----------



## dade_kash_xD

These new XFX DD 280x look simply irresistible. I'm so tired of waiting for the R9-290x. I'm so tempted to hit checkout on newegg for 2 of these bad boys.


----------



## Stay Puft

Quote:


> Originally Posted by *dade_kash_xD*
> 
> These new XFX DD 280x look simply irresistible. I'm so tired of waiting for the R9-290x. I'm so tempted to hit checkout on newegg for 2 of these bad boys.


A pair of 280X's are going to put a beating on a single 290


----------



## Cool Mike

I've been watching for over a week for a Sapphire 280X Toxic to become available on Newegg, or anywhere for that matter. Lucked out and grabbed one yesterday at around 2PM EST at Newegg; they sold out in less than 30 minutes. They must be trickling in a few at a time. I will receive the Toxic Monday.


----------



## Waleh

Sorry if this has already been posted, but did anyone try the bf4 beta with the 280x and a 4670k?


----------



## Reloaded83

A few crummy pics I took a couple of days ago when I got the card. VERY happy with it thus far.

Size difference when compared to my old 560 Ti that died:



BARELY fitting in the Lanboy case. (The shroud just touches the support for the case.):



And just the card:



I had to remove the side case fans on the door, because as you may notice in the first pic, where the bracket ends on the 560 Ti is basically where the card ends. On the 280, the cooler goes *far* wider than the old card. Luckily, plenty of room with the fans removed.


----------



## Jesse D

Quote:


> Originally Posted by *Roaches*
> 
> Where are you getting these promo codes from? I look at all my recent Newegg email deals and I don't see that code in them....I'd love to make a huge purchase though with a saving at least
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Damn you CA taxes


Through the enormous amounts of emails they send and from the site itself

Quote:


> Originally Posted by *ducknukem86*
> 
> i tried the code in the cart, but it doesn't seem to work


It might have already met its spending limit by the time I posted it... I had tried earlier in the day but the post would never send... This site's been acting a bit crazy as of late in the desktop browser on my phone, letting me type whole paragraphs before anything shows and making me click submit a bunch of times before it goes through.

As for today... $25 off $250 or more again. Will work on a 280X, or a 270/260X if you add some more crap to the cart.

Code:

MBLOCT19

Good through 10-20


----------



## dade_kash_xD

Put me down for 2x XFX R9-280x DD in crossfire. Ordered them earlier today, will take pics and post up with my new build asap!


----------



## cyph3rz

I was playing the Battlefield 3 campaign again and my card was doing 90 FPS average at 1680x1050 ultra settings, with a 61C maximum temp.


----------



## inzajt

Has anyone managed over 1.3v on their Asus 280x dcu yet? I have tried almost everything but it seems that 1.3v is max?


----------



## ducknukem86

Thanks for the Code!

I think i'll go with the Sapphire Vapor X since it's the only one in stock that has decent clock speeds


----------



## Amhro

Quote:


> Originally Posted by *MooseHead*
> 
> If you're referring to the illuminated logo... I believe the asus matrix, the sapphire toxic, and sapphire vapor x have an illuminated logo on the top of the card. If you're referring to the backplate, those same models are the only ones with them as well.


I was referring to this


Damn, I'd like that Matrix; too bad it is 330€, while the Dual-X is just 260€ and the Gigabyte 270€


----------



## MooseHead

Quote:


> Originally Posted by *Amhro*
> 
> I was referring to this
> 
> 
> Damn, i'd like that matrix, too bad it is 330€, while dual-x is just 260€ and gigabyte 270€


If that's what you're referring to, then yes, only the Asus Matrix, Sapphire Toxic, and Sapphire Vapor-X have illuminated logos, off the top of my head. The logo on the Matrix and Toxic changes colors, but I believe the Vapor-X version only has a blue LED. As you can tell from my avatar, I like lights in my case... LOTS OF EM! So yeah, I kinda did my homework on illuminated logos. Hope this helps


----------



## battleaxe

I'm considering the ASUS 280X, or just adding a GTX 660 to the one I have for SLI GTX 660s...

I know the 660 SLI will bench better, but I'm not sure what would be more satisfying in game. I'm playing BF3 and Metro LL, and plan to play BF4 when the price comes down a bit on Premium.

What do you think I should do?


----------



## 50shadesofray

Go for the 280X, since Nvidia's ego gets in the way of using Mantle. I just made the switch from the green team and I regret nothing.


----------



## roudabout6

I am getting crossfire 280x in the next coming week but I am torn between the Sapphire Toxic or the Vapor-X. Does anyone own either and have an opinion on them?


----------



## battleaxe

I just think $300 for this level of performance is pretty nice. I realize it's a re-badged 7970, but even that is awesome for $300. This is a great buy. Very hard to decide between going SLI 660 or just the 280X.

What are the expected prices on the 290?


----------



## 50shadesofray

Quote:


> Originally Posted by *battleaxe*
> 
> I just think 300 for this level of performance is pretty nice. I realize its a re-badged 7970, but still even that is awesome for 300. This is a great buy. Very hard to decide if going SLI 660 or just go 280x.
> 
> What are the expected prices on the 290?


There are no prices on the 290X yet. Also, on the beta drivers the 280X outperforms the 7970 GHz, just saying.


----------



## miklkit

I have an MSI R9 280X and have run into a problem. I play games both new and old, even some DX7 and DX8 games. This thing flies with the new stuff but has a problem with DX8 games: the core clocks drop to 300-500MHz and frame rates plummet.
Is there any way to keep the clocks up all the time?


----------



## 50shadesofray

I have the same card. I'm already assuming you have the proper drivers. I was having problems with my GPU being 100% utilized, so I flashed a BIOS that MSI provided on the TechPowerUp forum; no problems now besides the beta drivers.
http://www.techpowerup.com/vgabios/index.php?did=1002-6798-1462-2777


----------



## Shurtugal

With my 280X Matrix, my 3D Mark 11 score is: P9456



http://www.3dmark.com/3dm11/7342053?

Intel i5 3570k @ 3.4 GHz
Asus 280X Matrix @ 1100/1600


----------



## Stay Puft

Quote:


> Originally Posted by *Shurtugal*
> 
> With my 280X Matrix, my 3D Mark 11 score is: P9456
> 
> 
> 
> http://www.3dmark.com/3dm11/7342053?
> 
> Intel i5 3570k @ 3.4 GHz
> Asus 280X Matrix @ 1100/1600


Do us all a favor

Set fan to 100%
Set gpu voltage to 1300mv
Set core clock to 1300 core
Click apply

Run 3dmark11 again


----------



## Shurtugal

Quote:


> Originally Posted by *Stay Puft*
> 
> Do us all a favor
> 
> Set fan to 100%
> Set gpu voltage to 1300mv
> Set core clock to 1300 core
> Click apply
> 
> Run 3dmark11 again


Haha, getting there! I've just overclocked my CPU to 4.2 GHz; the graphics score is obviously similar, but the physics score is significantly higher (+1414). After I get my CPU to 4.5 GHz stable, I'll try overclocking the Matrix a tad more!


----------



## jason387

Quote:


> Originally Posted by *Shurtugal*
> 
> Haha, getting there! I've just overclocked my cpu to 4.2 GHz, the graphics score is obviously similar, but the physics score is significantly higher (+ 1414), after I get my CPU to 4.5 GHz stable i'll try overclocking the matrix a tad more!


According to reviews, the new Radeon cards don't overclock too well. You're really getting lucky, I must say. I'll be impressed if you reach 1300MHz stable on the core.

Your physics score is quite low; that's the score I get with my FX-6300 at 4.8GHz.


----------



## Shurtugal

Quote:


> Originally Posted by *jason387*
> 
> According to reviews the new Radeon cards don't overclock too well. You're really getting lucky I must say. I'll be impressed if you reach 1300Mhz stable on the core.
> 
> Your Physics score is quite low. That's the score I get with my FX 6300 at 4.8Ghz


That was at stock, at 4.2 GHz I got 8193, should be able to go up to 4.5 GHz though.
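As a back-of-the-envelope check, the 3DMark 11 physics score tends to scale close to linearly with CPU clock (in practice slightly below linear, since memory speed doesn't rise with the multiplier). A quick estimate of what the 4.5 GHz run should land near:

```python
def scaled_physics(score, clk_from_ghz, clk_to_ghz):
    """Naive linear estimate of a CPU-bound physics score at a new clock."""
    return score * clk_to_ghz / clk_from_ghz

# 8193 at 4.2 GHz extrapolates to roughly 8778 at 4.5 GHz (an upper bound).
print(round(scaled_physics(8193, 4.2, 4.5)))
```

If the real score comes in well below that estimate, something other than the CPU clock (throttling, background load) is holding the run back.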


----------



## tsm106

For a point of reference the top gold 7970s are into the 13.5K range with 13K+ gpu scores. I'm not sure if you are getting the benefit of the Ivy gpu score bias, but I'd assume you are with 3dmark11. It boosts Ivy gpu scores around 400pts, so you should have that bonus in effect.


----------



## Shurtugal

Quote:


> Originally Posted by *tsm106*
> 
> For a point of reference the top gold 7970s are into the 13.5K range with 13K+ gpu scores. I'm not sure if you are getting the benefit of the Ivy gpu score bias, but I'd assume you are with 3dmark11. It boosts Ivy gpu scores around 400pts, so you should have that bonus in effect.


What about with 3D Mark (Non 11)? I have both, and after I fix my screens issues, i'll run some more benchmarks.


----------



## tsm106

The two fastest in the HoF FS are over 9700 with almost 11K gpu scores. That puts them around 2K pts under high clocked titans.


----------



## Shurtugal

Well, at least I've finally beaten 10k (10,007).
That's with the Matrix at 1100/1600 scoring 10634
and the i5 3570K @ 4.5 GHz scoring 8510.
Time to try and overclock the Matrix; as I'm reasonably new to this, does anyone have any important tips?

http://www.3dmark.com/3dm11/7342706?

Edit: The maximum GPU temp was 67 degrees Celsius during that last run.


----------



## Stay Puft

Quote:


> Originally Posted by *Shurtugal*
> 
> Well, at least i've finally beaten 10k (10,007)
> Thats with the Matrix at 1100/1600 scoring = 10634
> i5 3570k @ 4.5 GHz scoring = 8510
> Time to try and overclock the Matrix, as i'm reasonably new to this, anyone have any important tips?
> 
> http://www.3dmark.com/3dm11/7342706?
> 
> Edit: The maximum GPU temp was 67 degrees celcius during that last run.


67C is super cool. Start to worry at 90C


----------



## black7hought

I placed my order for one XFX R9-280X DD. I was trying to hold out with my unlocked 6950 but my wife telling me to buy it, the code for $25 off and the rumor that AMD won't release new GPUs until 2015 made the decision easier. Now to wait at the door for the mail like Scott Pilgrim.


----------



## jason387

Quote:


> Originally Posted by *Shurtugal*
> 
> Well, at least i've finally beaten 10k (10,007)
> Thats with the Matrix at 1100/1600 scoring = 10634
> i5 3570k @ 4.5 GHz scoring = 8510
> Time to try and overclock the Matrix, as i'm reasonably new to this, anyone have any important tips?
> 
> http://www.3dmark.com/3dm11/7342706?
> 
> Edit: The maximum GPU temp was 67 degrees celcius during that last run.


Doesn't an overclocked 7970 get a higher gpu score?


----------



## Shurtugal

Quote:


> Originally Posted by *jason387*
> 
> Doesn't an overclocked 7970 get a higher gpu score?


I'm not entirely sure, but a 280X is just a rebadged 7970 GHz Edition, so if the 7970 is a GHz Edition and overclocked, it could easily have a higher score. I'm still in the process of overclocking mine; it gets to 1200 MHz without BIOS adjustments (I haven't done stability tests, but it gets through benchmarks). But the screen starts flashing and showing funny lines everywhere when I change the voltage?


----------



## Amhro

Quote:


> Originally Posted by *MooseHead*
> 
> If that's what you're referring to then yes, only the Asus Matrix, Sapphire Toxic, and Sapphire Vapor-X have illuminated logo's off the top of my head. The logo on Matrix and Toxic changes colors but I believe the Vapor-X version only has a blue LED. As you can tell from my avatar, I like lights in my case....LOTS OF EM! So yea, I kinda did my homework on illuminated logos. Hopes this helps


Thanks








Damn, I should really buy a new case so the Toxic would fit in.
I still like the Matrix the most, but I'm not sure it's worth the extra money. This is the actual offer; which would you pick? (Prices are in €)


----------



## Shurtugal

Quote:


> Originally Posted by *Amhro*
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> Damn, I should really buy a new case, so toxic would fit in.
> I still like matrix the most, but not sure if it is worth the extra money. This is actual offer, which would you pick? (Prices are in €)


If you're trying to spend as little as possible, I would go with the Asus for 238. If you want the Matrix, keep in mind it takes up three slots. If the Toxic fits, it could also be a good option.


----------



## Shurtugal

Quote:


> Originally Posted by *smartdroid*
> 
> my asus r9 280x matrix platinum is getting decent overclocks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/1410854?


Hey, I've got the same card; what voltage is this running at?

Edit: Is 1.4 V safe? I'm currently on 1.35 V and hitting a max of 73 degrees Celsius. I should just keep it below 80, right? That's with the fans on auto too, not 100%; the max they went to was 40%.


----------



## jason387

I think AMD's new cards don't have much OC headroom. Remember what they did with the Radeon 6970? They pushed it so far that it could hardly be tweaked.


----------



## Shurtugal

Quote:


> Originally Posted by *jason387*
> 
> I think AMD's new cards don't have much OC headroom. Remember what they did with the Radeon 6970? They pushed it so far that it could hardly be tweaked.


After playing around with mine for a while, I can't get it above 1240 MHz without it flashing during benchmarks (3DMark 11 and Fire Strike).
Asus 280X Matrix Platinum
GPU Clock: 1240 MHz
GPU Voltage: 1388 mV
Memory Clock: 7200 MHz
Memory Voltage: 1720 mV

Firestrike:
http://www.3dmark.com/fs/1017254
3DMark 11
http://www.3dmark.com/3dm11/7343931


----------



## flopper

Quote:


> Originally Posted by *jason387*
> 
> I think AMD's new cards don't have much OC headroom. Remember what they did with the Radeon 6970? They pushed it so far that it could hardly be tweaked.


Hmm, without any mass of 290 cards out, that's a big leap of faith.
My reference 7970 still does 1200 MHz without fuss.

An 1100 MHz 290X is likely touching 7990 performance in games.
Don't call that bad.


----------



## dade_kash_xD

Quote:


> Originally Posted by *black7hought*
> 
> I placed my order for one XFX R9-280X DD. I was trying to hold out with my unlocked 6950 but my wife telling me to buy it, the code for $25 off and the rumor that AMD won't release new GPUs until 2015 made the decision easier. Now to wait at the door for the mail like Scott Pilgrim.


Bro, get out of my head! Lol! I did the same thing. I didn't want BF4 to launch and not play it how I want, and I've waited too long for the 290X, so I ordered 2x XFX DD 280X. Didn't take me long to decide once I saw them, because they're sooo pretty!


----------



## Purejoke

Will the SeaSonic S12II 620 Bronze 620W PSU be good enough to run the ASUS MATRIX R9 280X?


----------



## hotrod717

Quote:


> Originally Posted by *Shurtugal*
> 
> After playing around with mine for a while, I can't get it above 1240 MHz without it flashing during benchmarks (3D Mark 11 and Firestrike).
> Asus 280X Matrix Platinum
> GPU Clock: 1240 MHz
> GPU Voltage: 1388 mV
> Memory Clock: 7200 MHz
> Memory Voltage: 1720 mV
> 
> Firestrike:
> http://www.3dmark.com/fs/1017254
> 3DMark 11
> http://www.3dmark.com/3dm11/7343931


Wow, your voltage is extremely high! What is the stock voltage? Also, your memory-to-core ratio is way out; I suspect memory is part of the issue. I would drop the memory to stock and find your top core clock before working on your memory OC. Where is your power limit set?

One question for club members: how do you tell whether someone has a 280X or not? Does flashing a 7970 to a 280X BIOS count?
@Purejoke - Absolutely. I've run two Matrixes in CF with an X-750.


----------



## Amhro

Quote:


> Originally Posted by *Shurtugal*
> 
> If your trying to spend as little as possible, then I would go with the Asus for 238, if you want to get a matrix, keep in mind it takes up 3 slots. If the toxic will fit it could also be a good idea.


I don't mind spending more for the Matrix, but only if it's worth it.
And yeah, those three slots... well, my mobo shouldn't be in its way, I guess.


----------



## navit

3DMark 11 score:


----------



## leyzar

Has anybody got their hands on the Toxic 280X yet?
I'm really interested to see how much power the card actually draws, as I'm interested in running one but unsure my 700W Seasonic can run it efficiently.
Also, what tool should I buy to measure how much power my system actually uses?


----------



## Clockster

Quote:


> Originally Posted by *leyzar*
> 
> Anybody got their hands on the Toxic 280x yet ?
> Im really interested to see how much power dose the card actually suck , as i am interested in running one but i am unsure my 700W seasonic can run it efficiently.
> Also what tool must i buy to measure how much power my system actually uses ?


Your 700W Seasonic PSU is more than enough to run a 280X, bud; you could most likely run two of them and still be OK.

I got my hands on a Gigabyte R9 280X while awaiting the arrival of my 290X cards. Gotta say I love the card, but it has very little OC headroom.
Then again, it does come out of the box with a pretty decent 1100 MHz core clock; I managed to get up to 1200, but it would not go further than that.

http://www.techpowerup.com/gpuz/c9xzk/
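For anyone sizing a PSU for one of these, here is a rough back-of-envelope sketch of the arithmetic behind that recommendation. The TDP figures are approximate assumptions (a 280X is around 250 W board power, the CPU and the rest of the system are guesses), and actual gaming draw is usually lower than TDP, which is why a quality 700 W unit handles one card with room to spare.

```python
# Rough PSU budget using approximate board-power figures.
# gpu_tdp/cpu_tdp/rest are assumptions, not measured values;
# real draw while gaming is usually lower than TDP.

def psu_headroom(psu_watts, gpu_tdp=250, cpu_tdp=130, rest=75,
                 target_load=0.8):
    """Watts left over against a target sustained load.

    target_load keeps the PSU at ~80% of its rating, where most
    units run quiet and efficient.
    """
    budget = psu_watts * target_load
    return budget - (gpu_tdp + cpu_tdp + rest)

# One 280X on a 700 W unit leaves a comfortable margin (~105 W):
print(round(psu_headroom(700)))
```

Two 280Xs in CrossFire roughly double `gpu_tdp` and blow past an 80% budget on paper, which is why CF on a 700 W unit works in practice only because gaming draw sits well under TDP.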


----------



## Cid

What's the noise like on that Gigabyte? Is it as bad as it seems from these videos?


----------



## Clockster

Quote:


> Originally Posted by *Cid*
> 
> What's the noise like on that Gigabyte? Is it as bad as it seems from these videos?


Nah, it's really not bad; I'd say it's about on par with my 7970 Matrix.
That said, the fans haven't once gone above 45% while playing Blops II and Dota 2, and at that speed it's dead quiet and temps are very good.

Overall it's a damn good card (I don't even like Gigabyte, but this is good, lol)


----------



## Stay Puft

Quote:


> Originally Posted by *Shurtugal*
> 
> After playing around with mine for a while, I can't get it above 1240 MHz without it flashing during benchmarks (3D Mark 11 and Firestrike).
> Asus 280X Matrix Platinum
> GPU Clock: 1240 MHz
> GPU Voltage: 1388 mV
> Memory Clock: 7200 MHz
> Memory Voltage: 1720 mV
> 
> Firestrike:
> http://www.3dmark.com/fs/1017254
> 3DMark 11
> http://www.3dmark.com/3dm11/7343931


Ouch that card is a dud


----------



## jason387

Quote:


> Originally Posted by *Stay Puft*
> 
> Ouch that card is a dud


Most 280Xs are. I've seen the reviews. If I were buying a card I'd stick to the 7000-series cards; they overclock way better and will also benefit from Mantle.


----------



## battleaxe

Quote:


> Originally Posted by *jason387*
> 
> Most 280x's are. I saw the reviews. If I were to buy a card I'd stick to the 7*** series cards. They overclock way better and will also benefit from Mantle.


If they are essentially the same card, why do the 7970s overclock better? That doesn't make sense to me. They're either the same card or they're not. Is it caused by the memory being clocked higher? If so, then if you want a higher core clock, downclock the mem and then overclock the core... to get the same results? I don't see how a 7970 is going to do any better unless they changed their die-making process in a way that reduced OC headroom. Anyone have any ideas?


----------



## 50shadesofray

Can we just take note that these brand-new cards are running poorly optimized beta drivers? Most likely, when the drivers have time to mature (and maybe some new BIOS updates from the manufacturers arrive:thumb: ), these cards will run at their full potential. Even now, a 280X will outperform a 7970 GHz (and not marginally), WITH crappy beta drivers and an immature BIOS.


----------



## jason387

Quote:


> Originally Posted by *50shadesofray*
> 
> Can we just take note that these brand new cards are running crappily optimized beta drivers... Most likely when the drivers have time to mature (and maybe some new BIOS updates from manufacturers:thumb: ) these cards will be running at their full potential. Currently a 280X will out perform (not marginally) a 7970ghz, WITH their crappy beta drivers and immature BIOS.


I don't think they will. Just like the other new AMD cards won't be able to beat the 7000-series cards when overclocked.


----------



## 50shadesofray

I still think it's due to prepubescent drivers...


----------



## inzajt

How well does the 7970 GHz overclock? Seems like they aren't much better than the 280X, tbh. Plus, most 7970s have their voltage locked.


----------



## Stay Puft

Quote:


> Originally Posted by *50shadesofray*
> 
> I still think its due to Prepubescent drivers...


It's a rebranded 7970. AMD has been putting out these drivers for almost two years.


----------



## 50shadesofray

Quote:


> Originally Posted by *Stay Puft*
> 
> Its a rebranded 7970. Amd has been putting out these drivers for almost 2 years


Exactly, rebranded, not exactly the same GPU.


----------



## jason387

Quote:


> Originally Posted by *Stay Puft*
> 
> Its a rebranded 7970. Amd has been putting out these drivers for almost 2 years


I totally agree.


----------



## battleaxe

Quote:


> Originally Posted by *jason387*
> 
> Don't think they will. Just like the other new AMD cards won't be able to beat their 7*** series cards when overclocked.


If they're using the same die, neither will overclock better than the other (assuming the other components are identical or of the same quality), so this doesn't make sense. How is a 7970 going to score better overall when they share the same die? Changing the numbers on the plastic doesn't change the card. The memory is running slightly higher, which can affect other performance such as the core clock, so how is it really any different? It's understood that it's a rebadged 7970, so how is the 280X going to get beaten out by a 7970 achieving a higher OC? Better yet, how would the 7970 achieve a higher OC at all? And why would the drivers need to be different in the first place if both are essentially the same card, 7970/280X, just with a higher memory clock?

This doesn't add up.


----------



## 50shadesofray




----------



## jason387

How about comparing the R7 260X with the Radeon 7790? At stock the R7 260X might do better, but once the Radeon 7790 is overclocked, it will surpass an overclocked R7 260X.


----------



## battleaxe

Quote:


> Originally Posted by *50shadesofray*


LOL...


----------



## Stay Puft

Quote:


> Originally Posted by *50shadesofray*
> 
> Exactly, rebranded, not exactly the same GPU.










A rebrand means it's the exact same card


----------



## jason387

Quote:


> Originally Posted by *Stay Puft*
> 
> 
> 
> 
> 
> 
> 
> 
> A rebrand means its the same exact card


Like the 5770/6770, right? Like your profile pic, dude


----------



## 50shadesofray

Then we shall call it an almost-rebrand. The memory clocks are different.


----------



## Stay Puft

Quote:


> Originally Posted by *50shadesofray*
> 
> then we shall call it an almost rebrand. The memory clocks are different.


What do memory clocks have to do with drivers?


----------



## 50shadesofray

Listen, I'm just trying to come up with a logical explanation for why these 280Xs are running poorly, and you, sir, are not helping.


----------



## battleaxe

So you're saying it's not a rebadged 7970, but that the 280X is actually inferior to the 7970?

Same for the 260X/7790?

I don't really care either way. I'm not an AMD or Nvidia fanboy; I like both brands and own both brands.

I'm just saying something can't be better or worse if it's the same. Otherwise, it's not the same.

The mem on the 280X is clocked higher. Clock it down to 7970 specs, then OC the core and see where that leaves you. That's the only way to know if we're getting an apples-to-apples OC comparison. Saying a 280X won't clock as high on the core isn't proven unless the memory is also clocked the same.


----------



## Cid

Every single 280X has a custom PCB, and they're a lot cheaper than the 7970. So either the manufacturing process was refined quite some time ago and the board partners or AMD were operating with insane profit margins over the last half year or so, or they're skimping hard on components with the 280X.


----------



## Clockster

I don't understand what you guys are on about; I haven't had any complaints so far about the 280X cards,
considering they're a lot cheaper than the 7970s locally and perform just as well.
This Gigabyte I'm using clocked to 1200 and is perfectly stable. Yes, it doesn't clock like my Matrix, but the Matrix is so much more expensive locally.

For the money, the 280X is good value unless you already own a 7970.


----------



## battleaxe

Quote:


> Originally Posted by *Cid*
> 
> Every single 280X has a custom PCB. And they're a lot cheaper than the 7970. So either the manufacturing process was refined quite some time ago and the board partners or AMD were operating with insane profit margins during the last (half) year or so, or they're skimping on stuff hard with the 280X.


That's kind of what I was thinking: something else has to be different, which means they're not the same card.

That being said, I still think the 280X is a great card at a great price. Let's not forget it's basically competitive with a GTX 770 (from what I understand). That's a great deal. I'm considering getting a 280X myself; I just don't understand the premise that they're the same as a 7970 but somehow don't clock as well.


----------



## Cid

Oh, I'm definitely buying a 280X. 7970s still cost as much as ever here (as they do in most places except the US, I think), so even just stock 7970 GHz performance for €100 less is great. I'm just hoping the price on the DC2T drops a bit, because my local shops are trying to rip everyone off: they're charging the same as for the Toxic (but that one is still pre-order only, with a month's wait time).

But yeah, exactly: they're either the same card and people have just been unlucky, or they're not the same but actually worse than the 7970s (due to low-quality components or something else, I don't know). You can't have it both ways.


----------



## eternal7trance

I decided to go with HIS since they had one of the better PWM setups, and it looks like they just reused old 7970 boards for the 280X. Since it's not a redesign like most of the others, I'm hoping I'll get a good old chip for OCing. Voltage is unlocked too.


----------



## Shurtugal

Quote:


> Originally Posted by *hotrod717*
> 
> Wow, your voltage is extremely high! What is voltage stock? Also your memory to core ratio is way out. I suspect memory is part of the issue. I would drop memory to stock and work on finding your top core clocks before working on your memory oc. Where is your power limit set?
> 
> One question for club members - How do you distinguish if someone has a 280X or not. Flashing a 7970 to 280X bios count?
> @ pureloke - Absolutely. I've ran 2 Matrix's in CF with a X 750


I'm new to overclocking in general, so feel free to point out any mistakes I've made!
These are the stock settings:


And these are after the OC:


Also, what is a safe voltage? What should I try to stay under? The max is 1.4 V.
And what is a memory-to-core ratio?
Thanks!


----------



## leyzar

Quote:


> Originally Posted by *Clockster*
> 
> Your 700w Seasonic PSU is more than enough to run a 280X bud, you could most likely run 2 of them and you should be ok.
> 
> I got my hands on a Gigabyte R9 280X while awaiting the arrival of my 290X cards, Gotta say I love the card but it has very little oc headroom.
> Then again it does come out the box with a pretty decent 1100 Core clock, managed to get up to 1200 but it would not go further than that.
> 
> http://www.techpowerup.com/gpuz/c9xzk/


What are your thoughts on its power draw in that test?
I'm trying to figure out which 280X to buy, and it hasn't been easy, as there is no clear winner. The Toxic has the best out-of-the-box clocks with a pretty respectable OC, but does that make it the best choice?
Theoretically I can buy any 280X, but I just don't know which to choose. Which factors are the most important?
With all options on the table, which would be the best choice, and why?


----------



## Cid

This is how I see it.

If you don't want to mess with overclocking yourself but want the fastest card and noise doesn't matter: *Sapphire Toxic*
If you do want to OC yourself and don't mind a blue PCB: *HIS IceQ X² Turbo*
If you'll be doing no or very mild OC'ing, still want some factory OC, and want a crazy-silent card: *ASUS DC2T*. If it can be a tiny bit louder, the HIS above fits the bill as well.
If you want good stock clocks, don't mind a blue PCB and some noise, and want the cheapest card: *Gigabyte WF*. They seem to be consistently the cheapest around these here parts anyway, and they already run a good factory OC.

The others aren't even in the running, in my opinion. Maybe the Matrix Platinum if you've got money to burn, but results here seem to vary wildly, so even that premium price tag offers no assurances.

HOWEVER, remember that only the ASUS and Sapphire Toxic have two DVI ports. The others have only one, plus a combo of HDMI, DisplayPort, and mini-DisplayPort. I need two DVI, so I have to include the cost of a mini-DP to DVI/normal DP cable in my considerations. But maybe you don't.
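Just for fun, the guide above boils down to a few if-statements. This toy function simply restates the post's recommendations; the card names and criteria come straight from the list, nothing more authoritative.

```python
# Toy encoding of the buying guide above. The mapping restates the
# post's recommendations; it is not an independent verdict.

def pick_280x(want_to_oc, want_silent, cheapest, noise_ok=True):
    """Map the preferences from the post to a recommended card."""
    if cheapest:
        return "Gigabyte WF"          # consistently cheapest, good factory OC
    if want_to_oc:
        return "HIS IceQ X2 Turbo"    # best pick for manual overclocking
    if want_silent:
        return "ASUS DC2T"            # crazy-silent cooler
    if noise_ok:
        return "Sapphire Toxic"       # fastest out of the box
    return "ASUS DC2T"

print(pick_280x(want_to_oc=False, want_silent=False, cheapest=False))
# fastest out-of-the-box pick when noise doesn't matter
```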


----------



## eternal7trance

I will report back on how my HIS does when I get it this week


----------



## battleaxe

Quote:


> Originally Posted by *eternal7trance*
> 
> I decided to go with HIS since they had one of the better PWM setups and it looks like they just used old 7970s for the 280Xs. So with the fact that it's not a redesign like most of the other ones, I'm hoping that maybe I'll get a good old chip for OCing. Voltage is unlocked too.


That sounds like a pretty good reason to go with HIS to me. I just love the way the ASUS cards look, and how much difference can there really be? Honestly? I'm not going to gripe about a 25 MHz difference on an overclock; I don't care about that. Now, 100 MHz might bother me.


----------



## hotrod717

Quote:


> Originally Posted by *Shurtugal*
> 
> I'm new to overclocking in general, so feel free to point out any mistakes i've made!
> These are stock settings:
> 
> 
> And these are after the OC:
> 
> 
> Also, what is a safe voltage? What should I try and stay under, it's a max of 1.4v.
> And what is a memory to core ratio?
> Thanks!


First: are you on air or water? If you are on air, that's definitely going to limit you, as you will encounter throttling at some point. Keep your memory at default, find your max stable core clock, then work on your memory. Bumping everything at once isn't going to help. There is also a neat little tool that locks voltage to core clock; use that initially to get an idea of what volts you need at what clock. You'll probably need additional volts at some point, but it's a good place to start. An unstable memory OC will limit you, so find the core max first. WATCH YOUR TEMPS. Stay under 75°.
You have to remember every chip is different; just because someone's Matrix can do 1375/1875 doesn't mean yours will. It's what they call the silicon lottery: sometimes you get a good chip, sometimes you don't.

Bump LLC to 75% and the power target to 120%.
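For anyone new to this, the stepping procedure above can be sketched roughly in code. This is just an illustration of the decision logic (keep memory at stock, raise the core in small steps, back off at the temperature or voltage ceiling); the step sizes and limits are example values from this thread, not guaranteed-safe numbers for any particular card.

```python
# Illustrative sketch of the stepping procedure described above.
# The step sizes and limits are example values from this thread,
# not guaranteed-safe numbers for any particular card.

TEMP_LIMIT_C = 75      # suggested ceiling on air
VOLT_LIMIT_MV = 1400   # soft cap mentioned for the Matrix

def next_step(core_mhz, volt_mv, max_temp_c, passed_benchmark,
              core_step=20, volt_step=13):
    """Decide the next core/voltage attempt while memory stays at stock.

    Returns (core_mhz, volt_mv) for the next run, or None when the
    temperature or voltage ceiling says to stop and back off.
    """
    if max_temp_c >= TEMP_LIMIT_C or volt_mv >= VOLT_LIMIT_MV:
        return None                                # out of safe headroom
    if passed_benchmark:
        return core_mhz + core_step, volt_mv       # try a higher core clock
    return core_mhz, volt_mv + volt_step           # same clock, a bit more volts

# Example: a run at 1200 MHz / 1250 mV passed at 68 C,
# so the next attempt bumps the core by one step.
print(next_step(1200, 1250, 68, True))   # (1220, 1250)
```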


----------



## hotrod717

dp.


----------



## Stay Puft

Quote:


> Originally Posted by *Shurtugal*
> 
> I'm new to overclocking in general, so feel free to point out any mistakes i've made!
> These are stock settings:
> 
> 
> And these are after the OC:
> 
> 
> Also, what is a safe voltage? What should I try and stay under, it's a max of 1.4v.
> And what is a memory to core ratio?
> Thanks!


What sort of temps are you seeing?


----------



## navit

Can someone tell me what is happening, or how to stop it happening? The Asus card is my bottom card and it's running at 99%, while the top card is downclocking as normal.


----------



## sugarhell

Disable ulps. Or try newer drivers.


----------



## navit

Quote:


> Originally Posted by *sugarhell*
> 
> Disable ulps. Or try newer drivers.


I have the latest drivers. It didn't start doing this until last night. Friday and most of Saturday it was great, with low temps on both cards, but now... not so much.


----------



## Shurtugal

Quote:


> Originally Posted by *hotrod717*
> 
> 1st are you on air or water? If you are on air, that's definitely going to limit you as you will encounter throttling at some point. You need to keep your memory at default, find max stable core clock, then work on your memory. Bumping everything at once isn't going to help. There is also a neat little tool that locks voltage to core clock. Use that initially to get an idea of what volts you need at what clock. You'll probably need additional volts at some point, but it's a good place to start. An unstable memory oc will limit you so find core max 1st. WATCH YOUR TEMPS. Stay under 75*
> You have to remember every chip is different and just because someones matrix can do 1375/1875 doesn't mean yours will. It's what they call the silicon lottery. Sometimes you get a good chip, sometimes you don't.
> 
> bump llc to 75% and power target to 120%


OK, thanks for the info; I'll try it tonight.
Also, Stay Puft: while overclocking, I had the fans at 100%, and the max temp was 72 degrees.

Edit:

Also, I'm running it on air; what voltage would you suggest I try to stay under?


----------



## hotrod717

Quote:


> Originally Posted by *Shurtugal*
> 
> Ok, thanks for the info, will try it tonight.
> Also Stay Puff, whilst overclocking I had fans at 100%, and the max temps were 72 degrees.
> 
> Edit:
> 
> Also, I am running it on air, and what voltage would you suggest try to stay under?


Temps will dictate your voltage. Unfortunately I have another card I'm in the middle of testing, but after I get my Matrix back in, I'll give you a hand. My current Matrix is a bit more efficient than yours (1.219 V stock), but my last one was spot on to what you have at stock. It shouldn't really take more than 0.035-0.040 V (1.292-1.297) to get 1200 MHz; that's tops. I don't have a block on this Matrix yet, and I'll be testing it to see if I want to put one on. The last two I've had were solid OCers.
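The arithmetic behind those numbers, as a tiny sketch. The 1.257 V stock figure is inferred from the 1.292-1.297 V range quoted above; it is this particular card's value, not a universal one.

```python
# Quick arithmetic behind the voltage figures above. The stock value
# is an inferred example for this particular card, not a universal one.

def bumped_voltage(stock_v, bump_v):
    """Stock core voltage plus an offset, rounded to millivolts."""
    return round(stock_v + bump_v, 3)

# A +0.035 to +0.040 V bump on ~1.257 V stock lands in the quoted range:
print(bumped_voltage(1.257, 0.035))   # 1.292
print(bumped_voltage(1.257, 0.040))   # 1.297
```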


----------



## Clockster

Quote:


> Originally Posted by *leyzar*
> 
> What are your thoughts on its power drain on this test ?
> Im trying to figure out what 280x to buy... and its not been easy as there is no clear winner... The Toxic has the best out of the box clocks with a pretty respectable OC.. but dose that make it the best Choice ?
> Theoretically i can buy any 280x ... but i just do not know what to chose ... what factors are the most important ?...
> With all options on the table... which would be the best choice... and why ? ...


I'm not bothered with that power test; FurMark is null and void as far as I'm concerned. There's really no need to put a GPU through that much stress, as it'll never get close to that power draw while gaming unless you're able to clock the living hell out of it.
As for which card to buy, I'd say the Toxic, depending on what you're paying for it, though I really can't fault this Gigabyte card either.
1100 MHz core out of the box is pretty respectable. It's going to come down to what you really want.

I'm not a fan of the Toxic color scheme, but there's no denying it's a cracking GPU.


----------



## hotrod717

Quote:


> Originally Posted by *Clockster*
> 
> Not bothered with that power test, furmark is null and void as far as I am concerned, there really is no need to put a GPU through so much stress as it'll never get close to that power draw while gaming unless your able to clock the living hell out of it.
> As far as which card to buy, I would say the Toxic depending on what your paying for it, I really can't fault this Gigabyte card either.
> 1100 core out the box is pretty respectable. Its gonna come down to what you really want.
> 
> I am not a fan of the Toxic color scheme, but there is no denying its a cracking gpu.


So after close to 2 years of Asus making a 1100mhz monster and MSI's Lightning, Gigabyte finally gets off its laurels and comes out with " The Toxic". Where are the records? Oh, that's right, Asus and MSI have them. Lol!


----------



## Clockster

Quote:


> Originally Posted by *hotrod717*
> 
> So after close to 2 years of Asus making a 1100mhz monster and MSI's Lightning, Gigabyte finally gets off its laurels and comes out with " The Toxic". Where are the records? Oh, that's right, Asus and MSI have them. Lol!


Lol, well, the Toxic is actually a Sapphire card.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202044&Tpk=toxic%20r9%20280x

The thing is, locally the Asus Matrix is roughly R1000 ($100) more than the Toxic, and the Toxic is also slightly faster than the Matrix.
As for the Gigabyte: it has the exact same out-of-the-box clocks as the Matrix, yet it costs a heck of a lot less locally.


----------



## black7hought

Quote:


> Originally Posted by *hotrod717*
> 
> So after close to 2 years of Asus making a 1100mhz monster and MSI's Lightning, Gigabyte finally gets off its laurels and comes out with " The Toxic". Where are the records? Oh, that's right, Asus and MSI have them. Lol!


Edit: You beat me to it Clockster.

The Toxic is a Sapphire card.

http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&lid=1&pid=2023&leg=0


----------



## Purejoke

I will go with the Toxic. Does anyone know if the color of the LED logo can be changed?


----------



## Shurtugal

Quote:


> Originally Posted by *hotrod717*
> 
> Temps will dictate your voltage. Unfortunately I have another card I'm in the middle of testing, but after I get my Matrix back in I'll give you hand. My current Matrix is a bit more efficient than what you have. (1.219v stock) But my last one was spot on to what you have stock. It shouldn't really take more than .035-.040(1.292-1.297) to get 1200mhz. That's tops. I don't have a block on this Matrix yet and I'll be testing it to see if I want to put a block on it. The last 2 I've had, were solid oc'ers.


I assume by "block" you mean it's water-cooled?
I can get to 1200 MHz with under 1.292 V. I'm not sure how stable it is, but it doesn't flicker when running benchmarks.
Also, the highest temp I've had is 72 degrees, and that was with Fire Strike at 1.4 V (plus both fans running at 100%).
With the settings you mentioned earlier, I can get it to 1250, though that takes around 1.380 V (approximately; I'll check later, I'm not at home). When I went up to 1260 it flickered once, and as I'm new to this, I presume that means it's not stable?
What is the max clock you can get with your current Matrix?
Cheers, Shurtugal


----------



## KeepWalkingGTX

This week I want to buy an R9 280X or a GTX 760; which would be good for me?
My choice for now is the MSI GTX 760 HAWK, but for the same price I can buy the Gigabyte R9 280X.
What should I do?

Thanks!!!


----------



## leyzar

Quote:


> Originally Posted by *KeepWalkingGTX*
> 
> In this week i want to buy r9 280x or gtx 760, tell me what is good for me?
> My choice for now is msi gtx 760 hawk, but for the same price i can buy gygabite r9 280x.
> What do I do with my choice?
> 
> Thanks!!!


There is no debate between these two, friend.
The 280X just destroys the GTX 760. The GTX 760 is on par with the 270X, while the 280X is on par with the GTX 770.


----------



## eternal7trance

Yeah, the 760 is closer to the 270X. The 280X is meant to compete with the 770.


----------



## -Droid-

Quote:


> Originally Posted by *leyzar*
> 
> There is no debate between these 2 friend ...
> The 280x just destroys the GTX 760. The GTX 760 is on par with the 270x. The 280x is on par with the GTX 770.


The 760 is on par with the 7950 Boost, not the 270x.


----------



## KeepWalkingGTX

But would a well-overclocked GTX 760 at 1250/7000 MHz keep pace with a 280X (7970)?


----------



## sugarhell

Quote:


> Originally Posted by *KeepWalkingGTX*
> 
> But good overclocking 1250/7000mhz GTX 760 pair with 280x(7970)... ?


A well-overclocked 270X can also easily match a stock 280X. But last time I checked, you can OC on AMD too.


----------



## rdr09

Quote:


> Originally Posted by *KeepWalkingGTX*
> 
> But good overclocking 1250/7000mhz GTX 760 pair with 280x(7970)... ?


So does a good-clocking 7870. But the 760 has to OC much higher than that to match the 7950 at only 1100/1575.

http://www.3dmark.com/3dm/1382633


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> so does a good clocking 7870. but the 760 has to oc much higher than that to match the 7950 at only 1100/1575
> 
> http://www.3dmark.com/3dm/1382633


Don't steal my comments


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Dont steal my comments


lol. noted.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> lol. noted.


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*


Even a 770 will have a hard time catching up with Snuckie's 7950...

http://www.3dmark.com/fs/782400


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> even a 770 will have a hard time catching up with Snuckie's 7950 . . .
> 
> http://www.3dmark.com/fs/782400


http://www.3dmark.com/fs/1005312

http://www.3dmark.com/fs/571791


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.3dmark.com/fs/1005312
> 
> http://www.3dmark.com/fs/571791


1600 core, lol.

So, what's the best 280X? The Toxic, even though it's voltage locked?


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> 1600 core. lol
> 
> so, what's the best 280X the toxic even though it is voltage locked?


I wouldn't buy a voltage-locked card.


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> I wouldnt buy a voltage locked card.


I wouldn't either, not that I plan on buying one.


----------



## leyzar

Well, OK then: which of the current 280Xs have unlocked voltage?


----------



## Clockster

Completely stock clocks

Gigabyte 280X Windforce 3 Edition

http://www.3dmark.com/3dm11/7349020


----------



## leyzar

Quote:


> Originally Posted by *Clockster*
> 
> Completely stock clocks
> 
> Gigabyte 280X Windforce 3 Edition
> 
> http://www.3dmark.com/3dm11/7349020


Wow, that ain't bad at all considering the low price.


----------



## Ashuiegi

Just ordered a Toxic to CF with my Matrix HD 7970 Platinum.

My Matrix does 1280 MHz at 1.296 V core and 1850 MHz VRAM with no change to memory voltage, and with a 4670K at 4.5 GHz I did 7997 in Fire Strike (my screen is 1440p native).


----------



## eternal7trance

Quote:


> Originally Posted by *Clockster*
> 
> Completely stock clocks
> 
> Gigabyte 280X Windforce 3 Edition
> 
> http://www.3dmark.com/3dm11/7349020


Too bad it's probably voltage locked like its 7970 counterpart.


----------



## Modus

What would you guys suggest for playing BF4 at 1440p? Can't decide if I should buy a 280x now and a second one later for CF or wait for a 290x.


----------



## leyzar

Quote:


> Originally Posted by *Modus*
> 
> What would you guys suggest for playing BF4 at 1440p? Can't decide if I should buy a 280x now and a second one later for CF or wait for a 290x.


The 290X should be released on the 24th of October (in 3 days). Why not wait and see the pricing? Maybe you can get a 290 for 400 bucks.

who knows...


----------



## Ashuiegi

With all these rebrands of two-year-old GPUs and the small performance improvement of the new ones, I start to think that both AMD and Nvidia are hitting a wall and can only increase the die size to gain some real performance, hence the rumors of next to no new GPUs until 2015 after the 290 and 780 Ti. So a 280X/7970 crossfire would be more than enough for the next few years, and at a far better price/perf than any of the new cards, at least for single-monitor gaming up to 1600p.


----------



## battleaxe

The 280x performs on par or very close to the GTX770 (which is $400) and ATI 7970 from last year. The 7970 was over $400 when introduced. The price for the 280x is $300 at intro. That's $100 less. That alone makes it a stellar deal.

The 290/290x is the new tech for ATI. The 280x is not and it makes sense as to why. They put all their R&D dollars into the flagship GPU's (290x).

It makes total sense to put all your new R&D budget into the flagship and just re-badge the lower cards from last year's flagship and on down the line. This is totally logical and will offer us (the consumers) the most bang for our buck for years to come.

The pace at which these graphics card companies can come out with better and better cards will increase if this is the model that they adopt; and it seems both companies are using this approach now for most of their top cards. The cards will get better and better every year as a result because they are really only working hard on the very top models, the rest are getting trickle down tech from years past. And we will get better cards for our money on the less expensive models.

I think what they are doing makes a ton of sense when we really think about what is going on. The reason ATI had to do this is because the graphics card market is getting really competitive with Nvidia offering the Titan and 780/770.

Titan was all new tech, so the 780, 770, and 760 used trickle-down tech. ATI followed suit and now we have what looks like a competitor for the 780 (R9 290) and Titan (R9 290X) at much lower prices overall. This is great for the market. It pushes Nvidia and ATI to compete and that is a win/win for us. I'm happy to pay $300 for last year's 7970 card. If I had the money I'd be happy to drop $600 on a 290X if they come in around that price. If the 290X does in fact compete with the Titan (or just gets close) then we've got a really good reason to buy it over the Titan. Which will push Nvidia to either up the ante in the form of an even better single GPU, or lower the price on the Titan.

Either way; we win and the process continues.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> The 280x performs on par or very close to the GTX770 (which is $400) and ATI 7970 from last year. The 7970 was over $400 when introduced. The price for the 280x is $300 at intro. That's $100 less. That alone makes it a stellar deal.
> 
> The 290/290x is the new tech for ATI. The 280x is not and it makes sense as to why. They put all their R&D dollars into the flagship GPU's (290x).
> 
> It makes total sense to put all your new R&D budget into the flagship and just re-badge the lower cards from last year's flagship and on down the line. This is totally logical and will offer us (the consumers) the most bang for our buck for years to come.
> 
> The pace at which these graphics card companies can come out with better and better cards will increase if this is the model that they adopt; and it seems both companies are using this approach now for most of their top cards. The cards will get better and better every year as a result because they are really only working hard on the very top models, the rest are getting trickle down tech from years past. And we will get better cards for our money on the less expensive models.
> 
> I think what they are doing makes a ton of sense when we really think about what is going on. The reason ATI had to do this is because the graphics card market is getting really competitive with Nvidia offering the Titan and 780/770.
> 
> Titan was all new tech, so the 780, 770, and 760 used trickle-down tech. ATI followed suit and now we have what looks like a competitor for the 780 (R9 290) and Titan (R9 290X) at much lower prices overall. This is great for the market. It pushes Nvidia and ATI to compete and that is a win/win for us. I'm happy to pay $300 for last year's 7970 card. If I had the money I'd be happy to drop $600 on a 290X if they come in around that price. If the 290X does in fact compete with the Titan (or just gets close) then we've got a really good reason to buy it over the Titan. Which will push Nvidia to either up the ante in the form of an even better single GPU, or lower the price on the Titan.
> 
> Either way; we win and the process continues.


+rep.


----------



## motherpuncher

Quote:


> Originally Posted by *rdr09*
> 
> 
> +rep.


Same here, well said!


----------



## roudabout6

Do the new 280X cards need a DisplayPort adapter to run Eyefinity? I know the new 290X does not.


----------



## hotrod717

Quote:


> Originally Posted by *Shurtugal*
> 
> I assume by block you mean it's watercooled?
> I can get to 1200 MHz with under 1.292, I'm not sure how stable it is, but it doesn't flicker when running benchmarks.
> Also, the highest temps i've had are 72 degrees, and that is with firestrike and 1.4V. (Plus both fans running at 100%)
> With the settings you mentioned earlier, I can get it to 1250, though that is around 1.380V (approx, will check later, not at home). And when I went up to 1260 it flickered once, and as I'm new to this I'm not sure, but I presume that means it's not stable?
> What is the max clock you can get with your current Matrix?
> Cheers, Shurtugal


As I said, I've only had the current one in for a quick check and had to move on to test another card. Should have it back in within a day or so.
That is quite a jump from 1200 @ less than 1.292V to 1250 @ 1.38V. You have the power limit at 120 and LLC @ 75%? You could try bumping vddci to 925, but I doubt that is the issue. You may also have hit the chip's wall. Don't give up. You have to spend some time figuring these things out.

Quote:


> Originally Posted by *Clockster*
> 
> lol well the Toxic is actually a Sapphire card
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202044&Tpk=toxic%20r9%20280x
> 
> The thing is locally the Asus Matrix is roughly R1000 ($100) more than the Toxic. The toxic is also slightly faster than the matrix.
> As for the Gigabyte, the Gigabyte has the exact same clocks out of the box as the Matrix yet it costs a heck of a lot less locally.


My bad, too many beers and a long day. However, the Sapphire and Gigabyte cards are both voltage locked, I believe. I simply won't buy a voltage-locked card. There's no value in it. I want something I can tweak and OC. If you want to buy a card, put it in your system, and be done with it, most any card is fine. Other than saying "I have a Toxic", I don't see the value in it.


----------



## eternal7trance

Quote:


> Originally Posted by *roudabout6*
> 
> Do the new 280X cards need a DisplayPort adapter to run Eyefinity? I know the new 290X does not.


From all the 280X reviews I read, you don't need one. That's one of the tweaks on the 280X vs. the 7970: now you can run Eyefinity without the adapter.


----------



## amd655

Has anyone here got hands-on experience with the Sapphire TOXIC 280X?

My GTX 480 died the other day, rendering my rig useless for gaming... so I am after a cheap but powerful alternative to Nvidia's rather high pricing, and I am not settling for a GTX 670/680.

Also this worries me... it's incredibly loud in comparison to my 480 (custom cooled).


----------



## roudabout6

Quote:


> Originally Posted by *eternal7trance*
> 
> From all the 280X reviews I read, you don't need one. That's one of the tweaks on the 280X vs.
> the 7970: now you can run Eyefinity without the adapter.


Thank you for replying I have been searching for an answer for hours.


----------



## Slomo4shO

Quote:


> Originally Posted by *Cid*
> 
> If you do want to OC yourself and you don't mind a blue PCB: *His IceQ X² Turbo*


Any particular reason why you would recommend the Turbo over the $20-30 cheaper standard His IceQ X² to someone that wants to OC themselves? They are the exact same card except one is factory overclocked...


----------



## eternal7trance

Quote:


> Originally Posted by *Slomo4shO*
> 
> Any particular reason why you would recommend the Turbo over the $20-30 cheaper standard His IceQ X² to someone that wants to OC themselves? They are the exact same card except one is factory overclocked...


No point really. You're right they are both the same.

Also, if you get one of the 280X cards, the $25-off-$250 promo works for them. I'm waiting for mine to be shipped but I got them to apply the promo.


----------



## Cid

Quote:


> Originally Posted by *Slomo4shO*
> 
> Any particular reason why you would recommend the Turbo over the $20-30 cheaper standard His IceQ X² to someone that wants to OC themselves? They are the exact same card except one is factory overclocked...


I'm assuming the ones that couldn't even reach the factory OC were thrown in with the normal ones, so chances of a dud are higher when buying the stock one.

I base this on absolutely no empirical evidence whatsoever.


----------



## Opentoilet

Quote:


> Originally Posted by *eternal7trance*
> 
> No point really. You're right they are both the same.
> 
> Also, if you get one of the 280x cards the $25 off 250 works for them. I'm waiting for mine to be shipped but I got them to apply the promo.


The promo seems to have expired for me.


----------



## Slomo4shO

Quote:


> Originally Posted by *Opentoilet*
> 
> The promo seems to have expired for me.


The latest code *NAFSAVE25COMP* expires 10/22/13


----------



## leyzar

Quote:


> Originally Posted by *Cid*
> 
> I'm assuming the ones that couldn't even reach the factory OC were thrown in with the normal ones, so chances of a dud are higher when buying the stock one.
> 
> I base this on absolutely no empirical evidence whatsoever.


See, this right here is what I am pondering as well...
Why the high factory clocks, for example on the Toxic or even the Gigabyte one (what? 1100MHz is solid if you ask me), are the safer way to go: you can buy a DCU2 or even a Matrix with its absurdly high price tag and not be able to get 10MHz more out of it. That will leave a bad taste in your mouth.


----------



## b3ka

@Ashuiegi

I've finally found stability in CFing my Matrix 280Xs. Could you tell me what needs to be done to play with GPU voltage? I can change anything in GPU Tweak, but it always stays at 1.256V under stress.


----------



## BackwoodsNC

What is the max allowable voltage for the MSI R9 280X Gaming? The ASUS 280X (non-Matrix) is 1.3V.


----------



## SupahSpankeh

Hey.

Picked up a DCU II Top - it's been over 2 years since I OC'd a GPU, so I'd really appreciate some software/technique pointers.

1. What's the preferred OC application please?
2. What's the best testing app? Still Furmark?
3. What's the thing I've been reading about re: getting the CPU and MEM frequency to be in a particular ratio to one another?
4. What's the highest safe temp of the DCU-ii?
5. What's the best way to control the DCU fan? The profile is a little aggressive at stock...

The rep fair will visit those who help


----------



## Theroty

Quote:


> Originally Posted by *leyzar*
> 
> See this right here is what i am pondering aswell...
> Why the high clocks put out by... for example the Toxic or even the gigabyte one (what ? 1100 is solid if you ask me) are the safer way to go, as you can buy a DCU2 or even a matrix with its absurdly high price tag and not be able to get 10 mhz more off it. That will leave a bad taste in your mouth.


I got my Matrix to 1200 with the memory still at stock for the moment. It ran fine with no problems.

I ran 3DMark11 with the CPU at 4.5 and GPU at 1.2 and came up about 50 points shy of 10k. My graphics score was well into the 11k range, so I know my processor is holding back my overall score. I took a screenshot and will post it later. I haven't had time to push things and test properly, so I have not done so.


----------



## Theroty

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Hey.
> 
> Picked up a DCU II Top - it's been over 2 years since I OC'd a GPU, so I'd really appreciate some software/technique pointers.
> 
> 1. What's the preferred OC application please?
> 2. What's the best testing app? Still Furmark?
> 3. What's the thing I've been reading about re: getting the CPU and MEM frequency to be in a particular ratio to one another?
> 4. What's the highest safe temp of the DCU-ii?
> 5. What's the best way to control the DCU fan? The profile is a little aggressive at stock...
> 
> The rep fair will visit those who help


You can use MSI afterburner or Asus gpu tweak. Many use afterburner for sure.


----------



## eternal7trance

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Hey.
> 
> Picked up a DCU II Top - it's been over 2 years since I OC'd a GPU, so I'd really appreciate some software/technique pointers.
> 
> 1. What's the preferred OC application please?
> 2. What's the best testing app? Still Furmark?
> 3. What's the thing I've been reading about re: getting the CPU and MEM frequency to be in a particular ratio to one another?
> 4. What's the highest safe temp of the DCU-ii?
> 5. What's the best way to control the DCU fan? The profile is a little aggressive at stock...
> 
> The rep fair will visit those who help


1. MSI Afterburner for me, it's pretty popular
2. Furmark is overkill; Heaven or Valley is better for making sure you'd be OK in a game. But games vary, and some may cause a crash where others won't.
3. Don't know
4. Around 100°C is still OK, but you'd normally want these cards below 90°C.
5. MSI Afterburner can do that too


----------



## Slomo4shO

Quote:


> Originally Posted by *SupahSpankeh*
> 
> 1. What's the preferred OC application please?
> 2. What's the best testing app? Still Furmark?
> 3. What's the thing I've been reading about re: getting the CPU and MEM frequency to be in a particular ratio to one another?
> 4. What's the highest safe temp of the DCU-ii?
> 5. What's the best way to control the DCU fan? The profile is a little aggressive at stock...


1. The software comes down to preference since they are all pretty much capable of the same results. I prefer MSI Afterburner, but Sapphire TriXX and Asus GPU Tweak are also good alternatives.
2. Furmark is a great stress test but Unigine Valley can also be used to test stability in addition to running any games that you play.
3. Are you referring to the GPU core and Mem frequencies? The ratios are more important for mining, you should focus on the core clock first and then move to memory once you find the maximum stable core clock.
4. I wouldn't suggest higher than 85C for real world scenarios (gaming).
5. OC software does have the capacity to set your own fan curve.
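The "core first, then memory" advice above boils down to a simple hill-climb per clock domain: step one clock up until the stress test fails, keep the last passing value, then repeat for the other domain. A minimal sketch of that search; the `stable` callback is a hypothetical stand-in for whatever stress test you actually run (nothing here talks to a real GPU):

```python
def find_max_stable(start_mhz: int, limit_mhz: int, step_mhz: int, stable) -> int:
    """Raise the clock in step_mhz increments until the stability
    check fails, then return the last clock that passed."""
    best = start_mhz
    clk = start_mhz + step_mhz
    while clk <= limit_mhz and stable(clk):
        best = clk
        clk += step_mhz
    return best

# Pretend this particular chip gives out above 1150MHz core:
core = find_max_stable(1000, 1300, 25, lambda mhz: mhz <= 1150)
print(core)  # 1150
# Only after locking in the core would you repeat the search for memory.
```

In practice each `stable()` call is a Valley/Heaven loop plus a game or two, so coarse steps (25-50MHz) followed by finer ones save a lot of time.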


----------



## Clockster

Gigabyte R9 280X Windforce 3 Edition @1150 core, stock mem and cpu

http://www.3dmark.com/3dm11/7350764



Considering that's just a 50MHz bump, I'd say it's pretty damn good.


----------



## Slomo4shO

Quote:


> Originally Posted by *Clockster*
> 
> Gigabyte R9 280X Windforce 3 Edition @1150 core, stock mem and cpu


You should OC the memory to ~1750MHz


----------



## Ashuiegi

Well, a 7970 CF destroys a Titan, so I guess it will destroy a 290X at the same price... Sure, you have a few issues in CF, but you own two cards: you can use them in two different computers, which can be a huge advantage over a single card, plus you can add another one for cheap later. One Titan or one 290X is not good enough to hold 60 fps with high settings at 1440p or above, but a 280X CF is just what you need... so the only reason for these higher-tier cards is to buy more than one for some special setup like six-screen Eyefinity, or just for compute power.

I don't have a problem with new GPUs, but if they spent maybe two years instead of one to develop them and gave us some real performance improvement, I would be happy to buy the top-tier card. Until then, they will never be worth their price.


----------



## Clockster

Quote:


> Originally Posted by *Slomo4shO*
> 
> You should OC the memory to ~1750MHz


Nah, giving the card to a client tomorrow.

Was only testing it while awaiting my pretty 290X cards..whenever they'll arrive


----------



## battleaxe

Quote:


> Originally Posted by *Slomo4shO*
> 
> The latest code *NAFSAVE25COMP* expires 10/22/13


+1 to you buddy!


----------



## battleaxe

Newegg has both Asus models back in stock now.

But seeing how the 290/290x is only a week away, I'm waiting to see what prices they come in at.


----------



## ducknukem86

Asus comes back in stock right after I already bought the Sapphire Vapor-X!

Good thing the Sapphire Vapor-X is already on its way! Hopefully it won't disappoint. I've seen one review that said the voltage was unlocked and one where it was mentioned it was locked. I'll know soon enough.


----------



## battleaxe

Quote:


> Originally Posted by *ducknukem86*
> 
> Asus comes back in stock right after I already bought the Sapphire Vapor-X!
> 
> Good thing the Sapphire Vapor-X is already on its way! Hopefully it won't disappoint. I've seen one review that said the voltage was unlocked and one where it was mentioned it was locked. I'll know soon enough.


Be sure to let us know when it arrives whether it's unlocked or locked. Good to know.


----------



## BackwoodsNC

Quote:


> Originally Posted by *battleaxe*
> 
> Newegg has both Asus models back in stock now.
> 
> But seeing how the 290/290x is only a week away, I'm waiting to see what prices they come in at.


Just ordered two MSI 280Xs and then the ASUS comes back in stock, DANG! Oh well! Gonna have fun with these until 20nm.


----------



## eternal7trance

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Just ordered two MSI 280Xs and then the ASUS comes back in stock, DANG! Oh well! Gonna have fun with these until 20nm.


It's a shame you didn't have more than one newegg account so you could use the promo code twice and save $50


----------



## BackwoodsNC

Quote:


> Originally Posted by *eternal7trance*
> 
> It's a shame you didn't have more than one newegg account so you could use the promo code twice and save $50


Never thought about that. Thanks for the idea, I will attempt that next time.


----------



## black7hought

OP, there is a new review for the XFX R9-280X DD on Fudzilla.

Link


----------



## NinjaToast

Quote:


> Originally Posted by *black7hought*
> 
> OP, there is a new review for the XFX R9-280X DD on Fudzilla.
> 
> Link


Interesting find between the default and OC BIOS on that card... unlocked voltage, but the OC BIOS also runs hotter even at reference clocks and voltage. Very interesting indeed; still looks sexy.


----------



## Phixit

Just installed my new Sapphire Vapor-X R9 280X, sorry for the pics quality (smartphone).


----------



## Modus

I bit the bullet and bought the XFX R9 280x, couldn't handle the stuttering with my GTX 570.


----------



## olliiee

I don't live in the US and I am not seeing the value in the TOXIC or the MATRIX over the lesser non-OC versions. For example, the Dual-X Sapphire is $375, the DCU2 $395, and the Vapor-X $410. In comparison, the MATRIX is $470 and the TOXIC $430. Obviously you probably think these prices are crazy, but stuff in Australia always costs more than in America (a 7970 after price drops starts at $360 for the non-GHz edition, a 7950 at $290).

To me, a 280X at more than $400 just doesn't make sense, as if I spent another $100 on top of a MATRIX 280X I could get two 7950s :S

edit: words


----------



## BackwoodsNC

Quote:


> Originally Posted by *Phixit*
> 
> Just installed my new Sapphire Vapor-X R9 280X, sorry for the pics quality (smartphone).


Is that Voltage locked? I think it is?.....


----------



## Phixit

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Is that Voltage locked? I think it is?.....


I don't think so, I've just set it to 1250mV from 1200mV (stock) in MSI Afterburner.


----------



## ducknukem86

That's excellent news!


----------



## Cool Mike

Received my very hard to find 280x Toxic today. Running 1200 Core 1800 memory in CCC (Stock voltages). The Hynix memory on the toxic is great. I will add volts after the card runs in for a day or two.


----------



## ducknukem86

Toxic looks amazing!


----------



## Cribbs

Quote:


> Originally Posted by *Cool Mike*
> 
> Received my very hard to find 280x Toxic today. Running 1200 Core 1800 memory in CCC (Stock voltages). The Hynix memory on the toxic is great. I will add volts after the card runs in for a day or two.


I heard in a review that the Toxic is voltage locked, is that true?


----------



## Fniz92

Here I am praying for a 290X Toxic, let the gods be fair - for that thing is beyond gorgeous.
1200Mhz on stock voltage is sick!


----------



## Cid

Well, the Toxic's "stock voltage" is 1.256 or something so, you know. Not exactly stock stock.


----------



## Skylark71

Ok guys,

I bought a Gigabyte R9 280X OC, and the card runs 1195MHz on the core.
What's weird is that if I OC the memory by even 1MHz, I see a performance decrease, but I can OC it all the way to 1600MHz.
Am I missing something or what?


----------



## bencher

Quote:


> Originally Posted by *Skylark71*
> 
> Ok guys,
> 
> I bought a Gigabyte R9 280X OC, and the card runs 1195MHz on the core.
> What's weird is that if I OC the memory by even 1MHz, I see a performance decrease, but I can OC it all the way to 1600MHz.
> Am I missing something or what?


In gpuz what brand memory is on your card?


----------



## Skylark71

Quote:


> Originally Posted by *bencher*
> 
> In gpuz what brand memory is on your card?


Elpida, are those really that bad?


----------



## bencher

Quote:


> Originally Posted by *Skylark71*
> 
> Elpida, are those really that bad?


Yes those are teribad.


----------



## leyzar

Quote:


> Originally Posted by *bencher*
> 
> Yes those are teribad.


Do we know which memory brands Sapphire and Asus are using?


----------



## hoevito

Well, I'm new to the overclocking/benchmarking game but I'm coming right along... ran Firestrike for the first time and here are my results. Had my Matrix running at 1220/1700MHz, as that seems to be a nice, stable clock for my particular card... never topped 69°C.

I can get it to run at 1240/1700MHz (6800MHz effective) in Valley and Heaven, but after 72°C or so at those clocks it starts to show tearing and artifacts...


----------



## leyzar

Quote:


> Originally Posted by *hoevito*
> 
> Well, I'm new to the overclocking/benchmarking game but I'm coming right along... ran Firestrike for the first time and here are my results. Had my Matrix running at 1220/1700MHz, as that seems to be a nice, stable clock for my particular card... never topped 69°C.
> 
> I can get it to run at 1240/1700MHz (6800MHz effective) in Valley and Heaven, but after 72°C or so at those clocks it starts to show tearing and artifacts...


Uhmm, is it me or do these seem low in comparison to the Gigabyte scores posted earlier?... I'm sure I'm missing something.


----------



## hoevito

Quote:


> Originally Posted by *leyzar*
> 
> Uhmm is it me or do these seem low in comparison to the gigabyte scores posted earlier ?... im sure im missing something


Different tests if you're referring to the one posted 2 pages back...mine was Firestrike only, normal mode.


----------



## Skylark71

Quote:


> Originally Posted by *bencher*
> 
> Yes those are teribad.


Crap, so basically there's nothing I can do but change the card.
This piece of **** is voltage locked too.

My previous cards were 2 x Gigabyte GTX 660 and those were bad overclockers too, but I decided to give Gigabyte one more chance and this is it.
This sure is my last Gigabyte card.


----------



## 50shadesofray

Anyone else having problems with their MSI 280X's? besides that one unlucky guy that blew a cap...


----------



## leyzar

Quote:


> Originally Posted by *50shadesofray*
> 
> Anyone else having problems with their MSI 280X's? besides that one unlucky guy that blew a cap...


Tbh mate, I never buy MSI out of principle... I consider them a low-quality brand (tons of marketing but little substance). I always advise against MSI, all this from previous interactions with the brand. Go with Sapphire or Asus... or Gigabyte... If you want unlocked voltage, I'm not sure which ones are not voltage locked... maybe someone else can shed some light on this.


----------



## Cool Mike

Yes, the Toxic stock voltage is 1.256V. This is great. I would prefer not to use overclocking software like Afterburner or TriXX, as they just add overhead. Letting my Toxic run in all day today to settle in. I will see how far I can push the core and memory tonight with Afterburner. I read that some are hitting 1900MHz (7600 effective) on the memory; I'm hitting 1800 now.
Not sure if the Toxic is voltage locked. Will see tonight.
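For anyone confused by the two sets of memory numbers being thrown around in this thread (1800 vs. 7200, 1700 vs. 6800, and so on): GDDR5 transfers data four times per memory-clock cycle, so the "effective" figure that spec sheets and some tools quote is simply 4x the actual clock. A quick illustration:

```python
def gddr5_effective_mhz(actual_mhz: int) -> int:
    # GDDR5 is quad-pumped: 4 data transfers per memory-clock cycle,
    # so the advertised "effective" rate is 4x the real clock.
    return actual_mhz * 4

print(gddr5_effective_mhz(1900))  # 7600, the figure quoted above
print(gddr5_effective_mhz(1500))  # 6000, the stock 280X memory spec
```

So a card reporting 1800MHz in Afterburner and one advertised at "7200MHz effective" are running the same memory clock.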


----------



## leyzar

So, Cool Mike, can you definitively confirm that the voltage is locked at 1.256V on the Sapphire Toxic 280X and cannot be modified at all?


----------



## amd655

With new cards you are best burning them in a bit, normal gaming for a week will work out the kinks.


----------



## b3ka

Hmmm, just found that I have a different BIOS on each of my Matrix cards, yet their serial numbers are consecutive.

Look at the voltage, ***?


----------



## leyzar

Oh my... that's some high voltage you got there, friend...


----------



## b3ka

That's just to show that their max voltages are different. I'm not actually running that much, lol.


----------



## leyzar

Wait a damn second... Asus is adjusting the voltage on each card so that it can match the advertised clocks? That can't be... can it?
Because if it actually needs THAT much voltage to reach those clocks, then it would mean the chips are kind of weak... right? I'm just speculating here, don't shoot me.
It would also mean that the chips on the Sapphire Toxic are top notch, as the voltage defaults to 1.256V at 1100MHz / 1150MHz boost.


----------



## b3ka

I'm confused at the moment, as HWiNFO shows that the voltage for both cards is 1.256V under stress, yet GPU Tweak shows it is lower. Who do I believe?


----------



## leyzar

Quote:


> Originally Posted by *b3ka*
> 
> I'm confused at the moment, as HWiNFO shows that the voltage for both cards is 1.256V under stress, yet GPU Tweak shows it is lower. Who do I believe?


I think ASUS GPU Tweak is a piece of crap... always has been.
I would trust HWiNFO64, and I would double-check by installing MSI Afterburner, which is the best GPU OC tool out there. Try that, mate.


----------



## BackwoodsNC

Quote:


> Originally Posted by *50shadesofray*
> 
> Anyone else having problems with their MSI 280X's? besides that one unlucky guy that blew a cap...


I wonder if he flashed the cards to the new BIOS that MSI rolled out? The BIOS that shipped on the cards makes them run extremely hot. I have two cards coming, so I will find out just how hot the stock BIOS runs.


----------



## Greg28

Between the Sapphire R9 280X Vapor-X and the 280X Toxic, which is quieter and cooler?


----------



## Cid

Oddly enough, it seems the Toxic is cooler and quieter despite its higher stock voltage.

http://www.tomshardware.de/amd-radeon-r9-280x-roundup-test,testberichte-241401-6.html

Maybe we should sticky that link in the OP? It's turned up a few times now. It's not exactly scientific, but it gives a rough idea and lets us figure out what sort of pitch or whine the fans make. Pure dB numbers don't count for much if the lower-dB value is a more annoying type of sound to us. Honestly, I wish all reviewers recorded their GPU fans and built up a proper database we could use to help us decide. As long as the methodology remains the same, we can still compare them to each other.


----------



## amd655

Quote:


> Originally Posted by *amd655*
> 
> Has anyone here got hands-on experience with the Sapphire TOXIC 280X?
> 
> My GTX 480 died the other day, rendering my rig useless for gaming... so I am after a cheap but powerful alternative to Nvidia's rather high pricing, and I am not settling for a GTX 670/680.
> 
> Also this worries me... it's incredibly loud in comparison to my 480 (custom cooled).


----------



## eternal7trance

I know it's kind of ugly, but why aren't people buying the HIS version? It's voltage unlocked and has cooling that rivals the 3-slot Asus cooler. I will be getting mine today, hopefully.


----------



## Slomo4shO

Quote:


> Originally Posted by *eternal7trance*
> 
> I know it's kind of ugly, but why aren't people buying the HIS version? It's voltage unlocked and has cooling that rivals the 3-slot Asus cooler. I will be getting mine today, hopefully.


Did you get the turbo or the regular model?


----------



## battleaxe

Quote:


> Originally Posted by *eternal7trance*
> 
> I know it's kind of ugly, but why aren't people buying the HIS version? It's voltage unlocked and has cooling that rivals the 3-slot Asus cooler. I will be getting mine today, hopefully.


Probably 'cause it is kinda ugly. You're right though, it does sit in a case and isn't seen most of the time. Still, when you open the case to show your buddies, it's nice to have a nice-looking piece of hardware to show off, and I'm guessing this is the primary reason most don't get them. It's a good card though.


----------



## bencher

Quote:


> Originally Posted by *leyzar*
> 
> Do we know what brands are sapphire and asus using ?


On my card it's Hynix RAM.
Quote:


> Originally Posted by *Skylark71*
> 
> Crap, so basically there's nothing I can do but change the card.
> This piece of **** is voltage locked too.
> 
> My previous cards were 2 x Gigabyte GTX 660 and those were bad overclockers too, but I decided to give Gigabyte one more chance and this is it.
> This sure is my last Gigabyte card.


have you tried flashing the bios?


----------



## eternal7trance

Quote:


> Originally Posted by *Slomo4shO*
> 
> Did you get the turbo or the regular model?


Just the regular one. They are both exactly the same, except the Turbo has a BIOS with a higher clock.


----------



## Slomo4shO

Quote:


> Originally Posted by *eternal7trance*
> 
> Just the regular one. They are both exactly the same, except the Turbo has a BIOS with a higher clock.


Yes, I am aware

Only the turbo has been reviewed and it seems to overclock to 1200MHz with relative ease. I would love to see your overclock results


----------



## leyzar

Hynix is good... right? Also, what card exactly do you have?


----------



## bencher

Quote:


> Originally Posted by *leyzar*
> 
> Hynix is good... right? Also, what card exactly do you have?


Hynix is the best, I heard.

I have an Asus 7970 DC II. The RAM overclocks like a champ; currently at 1600MHz.

The core didn't overclock well because the voltage was locked. However, I put all my fears aside last night and flashed the BIOS, unlocking the voltage.

Now I am at 1200MHz on the core.


----------



## eternal7trance

Quote:


> Originally Posted by *Slomo4shO*
> 
> Yes, I am aware
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only the turbo has been reviewed and it seems to overclock to 1200MHz with relative ease. I would love to see your overclock results


I will find out today then, since it says it's out for delivery. Going to test my 670 again before I take it out, and then compare it to the new card.


----------



## Skylark71

Quote:


> Originally Posted by *bencher*
> 
> On my card is hynx ram.
> have you tried flashing the bios?


No I haven't. Which BIOS should I flash, or is there one to try?


----------



## bencher

Quote:


> Originally Posted by *Skylark71*
> 
> No I haven't. Which BIOS should I flash, or is there one to try?


Save the BIOS from your card.

Then use this tool to change the memory clocks or anything else you want to change.

Save the BIOS with a different name (no spaces), so you have your original BIOS as a backup.

Then flash using this tool.
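For anyone more comfortable on the command line, the same save/flash steps can also be done with the ATIFlash utility instead of the GUI tools linked above. This is a rough sketch only; the adapter index, filenames, and exact executable name depend on your setup, and you should always keep the untouched backup:

```shell
# 1. Back up the stock BIOS from adapter 0 before touching anything
atiflash -s 0 original.rom

# 2. Edit a renamed COPY of the backup (e.g. with a BIOS editor),
#    then flash the modified file back to the card
atiflash -p 0 modified.rom

# 3. If the card misbehaves afterwards, flash original.rom back the same way
```

If a flash goes bad, the dual-BIOS switch on many of these cards lets you boot from the second BIOS and reflash.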


----------



## battleaxe

Quote:


> Originally Posted by *bencher*
> 
> Save the BIOS from your card.
> 
> Then use this tool to change the memory clocks or anything else you want to change.
> 
> Save the BIOS with a different name (no spaces), so you have your original BIOS as a backup.
> 
> Then flash using this tool.


+1 to you man. Thanks!


----------



## Ashuiegi

Damn, in Europe the Toxic cards are yet to arrive at any reseller; no one has them... I hope I will get mine in less than 2 weeks...


----------



## Phixit

Well, I did some tests with my VAPOR-X and the card is kind of loud starting at around 50% fan speed. I played Crysis 3 this morning with my case on top of my desk right beside me; I removed my headset and it just sounds like a big jet engine.

It also doesn't seem to be voltage locked.


----------



## leyzar

Quote:


> Originally Posted by *Ashuiegi*
> 
> Damn, in Europe the Toxic cards are yet to arrive at any reseller; no one has them... I hope I will get mine in less than 2 weeks...


Yeah... can't find the damn 280X Toxic anywhere in stock... but let us look on the bright side...
If the price is right we can opt for a 290, since we should have pricing very soon... sooner than the Toxic will be in stock, ofc...
AND OFC after all that we will want a 290 Toxic, which will be available next spring in Europe.


----------



## battleaxe

Quote:


> Originally Posted by *leyzar*
> 
> Yeah... can't find the damn 280X Toxic anywhere in stock... but let us look on the bright side...
> If the price is right we can opt for a 290, since we should have pricing very soon... sooner than the Toxic will be in stock, ofc...
> AND OFC after all that we will want a 290 Toxic, which will be available next spring in Europe.


We have to wait til next spring for a 290 toxic? WTH


----------



## Cid

Quote:


> Originally Posted by *Ashuiegi*
> 
> damn in europe toxic cards are yet to arrive at any reseller , no one has them ,......, i hope i will get mine in less then 2 weeks ,....


I've got one store with an ETA of November 4. The others just say "not a clue buuuddeh". And the Sapphire rep here said it'd take 3-4 weeks last week. So yeah, I wouldn't hold my breath I'm afraid.
Quote:


> Originally Posted by *eternal7trance*
> 
> I know it's kind of ugly, but why aren't people buying the HIS version? It's voltage unlocked and has cooling that rivals the 3-slot Asus cooler. I will be getting mine today, hopefully.


- It's ugly, which matters to a lot of people.
- HIS has an unearned bad reputation as a small-time crappy company. No idea why, but ask people who don't keep up with AMD cards and they'll all ask "HIS? Shouldn't I go for ASUS or MSi or something? Aren't they better quality?"
- This is probably just me, because the 1 DVI, 1 HDMI & 2 mini-DisplayPort thing seems standard on AMD cards, but it means I've got to purchase a new mini-DP to DP/DVI cable, which bumps up the price significantly.
- Not everyone wants to OC, so locked or unlocked voltage doesn't matter to them.

But it is probably the best 280X out there, and I'd forgive its horrible, horrible blue PCB if it had 2 DVI ports. Or I'd forgive its lack of DVI ports if the card looked sexy. But as it stands, it's ugly and has the wrong ports, and that's one too many cons for me.


----------



## bencher

Quote:


> Originally Posted by *battleaxe*
> 
> +1 to you man. Thanks!


any success?


----------



## Ashuiegi

I'm not in for a 290 or 290X because I'm sure it won't CF with a 7970, and I won't get a huge upgrade going from a single 7970 at 1280/1850 to a stock-cooled 290... I would rather crossfire my Matrix with a Toxic and be sure to pull off a 1200/1800 CF, which will destroy a single 290.


----------



## tsm106

Quote:


> Originally Posted by *Ashuiegi*
> 
> I'm not in for a 290 or 290X because I'm sure it won't CF with a 7970, and I won't get a huge upgrade going from a single 7970 at 1280/1850 to a stock-cooled 290... I would rather crossfire my Matrix with a Toxic and be sure to pull off a 1200/1800 CF, which will destroy a single 290.


Destroy...


----------



## miklkit

Quote:


> Originally Posted by *BackwoodsNC*
> 
> I wonder if he flashed the cards to the new bios that Msi rolled out? The bios that is on the cards makes it run extremely hot. I have two cards coming so i will find out just how hot the stock bios is.


I flashed the bios in my MSI R9 280X, but not because of overheating. Mine has hit 62C max and is usually around 56C.

This one has the clocks fluctuating between 300 and 500MHz when playing old games, which gives poor performance. I want higher clocks with light loads.


----------



## battleaxe

Quote:


> Originally Posted by *Ashuiegi*
> 
> i m not in for a 290 or 290x because i m sure it wont Cf with 7970 and i won't get a huge upgrade from a single 7970 at 1280 1850 to a stock cooled 290 ,.... i would rather crossfire my matrix with a toxic and be sure to pull off a 1200-1800 CF , which will destroy a single 290


Why not just get a 280x matrix? They look identical right?


----------



## Ashuiegi

Quote:


> Originally Posted by *tsm106*
> 
> Destroy...


From the first dubious bench we can assume it. I can remember the 7970 Toxic CF vs. Titan review, and the Titan was far behind; the 7970 Toxics get close to 1200/1800 with the special BIOS.


----------



## Slomo4shO

Quote:


> Originally Posted by *Ashuiegi*
> 
> From the first dubious bench we can assume it. I can remember the 7970 Toxic CF vs. Titan review, and the Titan was far behind; the 7970 Toxics get close to 1200/1800 with the special BIOS.


And if the 290 (non-X) ends up being $449 (as rumored), then you are paying only $100 less for the factory-overclocked 280X. If the 290 outperforms the stock 280X by 20-25%, then the 25-30% price premium over the Toxic or Matrix is definitely justified.


----------



## Ashuiegi

I already have the Matrix Platinum 7970, that's the point; I don't want to spend 400 or 600 and end up with 2 single cards that can't CF.
It will be a bigger performance boost to go with the Matrix 7970 and Toxic 280X in CF at 1200MHz/1800MHz. I'm pretty sure it will be really close to a stock 290/290X CF, and we have no idea how the 290 really performs and overclocks. Again, from the first dubious review it seems to be overheating and consuming an insane amount of power.

I'm only paying 300€ for the Toxic, so the 290s would cost 50% to 100% more; I don't think the 290 will perform 50% faster than a 7970 at 1200, and I'm sure the 290X won't do twice as well...


----------



## Slomo4shO

Quote:


> Originally Posted by *Ashuiegi*
> 
> I already have the Matrix 7970, that's the point; I don't want to spend 400 or 600 and end up with 2 single cards that can't CF.


It is a good thing that the 7970 Matrix still has a good resale value


----------



## Skylark71

Quote:


> Originally Posted by *bencher*
> 
> Save the BIOS from your card.
> 
> Then use this tool to change the memory clocks or anything else you want to change.
> 
> Save the BIOS with a different name (no spaces), so you have your original BIOS as a backup.
> 
> Then flash using this tool.


+1 for that, thanks man, I'll give it a shot.


----------



## Slomo4shO

Quote:


> Originally Posted by *Ashuiegi*
> 
> i m only paying 300$ for the toxic


Are you buying the card used or something?


----------



## Ashuiegi

No, it's listed at 309 euros where I ordered it, and I have a 13 euro store credit from a previous order; they sent me 16/12 instead of 13/10 fittings and refunded me with credit without asking for the fittings back!


----------



## nastytime

Quote:


> Originally Posted by *BackwoodsNC*
> 
> I wonder if he flashed the cards to the new bios that Msi rolled out? The bios that is on the cards makes it run extremely hot. I have two cards coming so i will find out just how hot the stock bios is.


I did not flash my card with the new BIOS... that MSI card's VRMs run hot for sure. Once that cap popped, I found out they had an update.

lol
I must have had a defective unit. Funny thing is I was still benching stuff with a blown cap. O snap.


----------



## Ashuiegi

A cap that literally blows is a cap that is plugged with the - on the +. It can still take a while for it to blow even when soldered the wrong way, but I'm surprised it would happen at this level of electronics.

I had one blow up on a home-made lab PSU while I was testing the PCB at the end. I had one cap in the wrong way, and after maybe 5 minutes of trying to troubleshoot, the thing blew up and tore apart the resistor next to it; both 0.8mm legs of the resistor were cut straight off at the level of the PCB.

And it was a small 63nF 23V, something like that; still bigger than the ones video cards use.


----------



## Slomo4shO

Quote:


> Originally Posted by *Ashuiegi*
> 
> No, it's listed at 309 euros where I ordered it, and I have a 13 euro store credit from a previous order; they sent me 16/12 instead of 13/10 fittings and refunded me with credit without asking for the fittings back!


309€ = $425 USD.

For all intents and purposes, I'll assume you made the purchase from pccomponentes. The same site has the Sapphire R9 280X Dual-X for 269€, so you are paying a 40€ premium for the Toxic edition, which equates to a difference of $55 USD; that's roughly equivalent to the difference between the $299 Dual-X and $359 Toxic price points observed in the States. Also, the price of the Toxic without VAT is actually 255.37€, which is roughly $352. Since the MSRP pre-VAT seems to be comparable to the US prices, a $450 USD card would end up being roughly 326€ pre-VAT and around 395€ with the 21% VAT. 395/309 = 1.278, which is still a 27.8% premium over the Toxic.

Comparing apples to apples helps
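For anyone following along, the arithmetic above can be sanity-checked in a few lines. The exchange rate here is just the one implied by the post's "309€ = $425", and the $450 price for the 290 is only a rumor:

```python
EUR_TO_USD = 425 / 309   # rate implied by "309€ = $425 USD", about 1.375
VAT = 1.21               # 21% VAT

toxic_eur = 309
toxic_pre_vat_eur = toxic_eur / VAT                 # strip the VAT
toxic_pre_vat_usd = toxic_pre_vat_eur * EUR_TO_USD  # convert to USD

rumored_290_usd = 450
rumored_290_eur = rumored_290_usd / EUR_TO_USD * VAT  # convert, then add VAT

print(round(toxic_pre_vat_eur, 2))            # 255.37
print(round(toxic_pre_vat_usd))               # 351 (the post's $352 used the market rate)
print(round(rumored_290_eur))                 # 396
print(round(rumored_290_eur / toxic_eur, 2))  # 1.28, the ~28% premium over the Toxic
```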


----------



## Ashuiegi

No, it's a French one, materiel.net. Damn, I was comparing with the Swiss franc because I'm from there, and I converted it 1:1, which 2 years ago was accurate; maybe the dollar went down a bit since, I dunno. Around here that's the price for it anyway, give or take 5-10 euros.
I really didn't check the rate, because the Swiss franc used to be 1:1 with the dollar, and euro to Swiss franc used to be 1:1.2.

Edit: right, the dollar went down a little since, and the euro went up. I overlooked the change that 0.1-0.2 would make, but I'm paid in euros atm anyway, so...
I'm paying $408 for it.
If you add the premium and taxes that we pay in Europe, then like it's often said, you can take the price in USD, put a € in front of it, and you have the European price. But I forgot it doesn't work the other way around.


----------



## 50shadesofray

Quote:


> Originally Posted by *miklkit*
> 
> I flashed the bios in my MSI R9 280X, but not because of overheating. Mine has hit 62C max and is usually around 56C.
> 
> This one has the clocks fluctuating between 300 and 500mhz when playing old games, which gives them poor performance. I want higher clocks with light loads.


Definitely use the BIOS that's on the TechPowerUp forum; it solved a lot of throttling issues with the GPU since the VRM was heating up. Plus, the more I game with it, the better it performs... kinda like breaking in a pair of shoes.


----------



## miklkit

Quote:


> Originally Posted by *50shadesofray*
> 
> Definitely use the BIOS that's on the TechPowerUp forum; it solved a lot of throttling issues with the GPU since the VRM was heating up. Plus, the more I game with it, the better it performs... kinda like breaking in a pair of shoes.


I tried it and noticed no difference in old games that load it lightly. I did not know there is an overheating problem as I have not seen it.

I should mention that it replaced a 6970 that was running in the 85-95C range.


----------



## 50shadesofray

Quote:


> Originally Posted by *miklkit*
> 
> I tried it and noticed no difference in old games that load it lightly. I did not know there is an overheating problem as I have not seen it.
> 
> I should mention that it replaced a 6970 that was running in the 85-95C range.


The issue with old games is the beta drivers. Here's a link to the unofficial beta 3 of the 13.11 drivers: http://www.overclock.net/t/1436104/amd-catalyst-13-11-beta3
People have been saying better framerates with DX9; maybe it'll help older versions of DX?


----------



## Jesse D

Soooo.. hard not to grab one of these, but still holding out for the 290x. Hopefully in the next couple days. *twists fingers*

In the mean time... (25 off 250 at the egg... good through midnight... yada yada yada.)

Code:


NAFSAVE25COMP

Here's hoping they have one up and ready on release day of 290x.


----------



## sugarhell

So do we know if the Toxic is voltage locked or not?


----------



## eternal7trance

Finally got my HIS in today and it's doing really well. I haven't touched anything besides the core clock, and at 1.075V I am stable at 1151; I will try to add voltage and see how that works. I have plenty of headroom, since the card at load needs 66% fan speed to keep it at 59C.

It is voltage unlocked, and the dual BIOS on it seems handy in case I mess up a flash.


----------



## BackwoodsNC

The OP should post the MSI R9 280X BIOS update in the first post. Link


----------



## Cool Mike

I will be trying Afterburner with my 280X Toxic within the hour. Hope to report soon on whether the voltage is locked.
Stock, I am running 1200 core and 1800 memory.


----------



## miklkit

Quote:


> Originally Posted by *50shadesofray*
> 
> The issues with old games is the beta drivers. heres a link to the unnofical beta 3 drivers of 13.11 http://www.overclock.net/t/1436104/amd-catalyst-13-11-beta3
> people been saying better framerate with dx9, maybe itll help older versions of dx?


Downloading now, but I suspect it will not do anything for this specific issue. After switching video cards there was a drop in FPS that looks to be caused by the clocks being too low. How is it supposed to perform well at 300-500MHz?


----------



## eternal7trance

Is there something I'm missing with voltage control? I can move the slider all over but the voltage doesn't actually change. Maybe there's no support for it yet?


----------



## hoevito

Quote:


> Originally Posted by *eternal7trance*
> 
> Is there something I'm missing with voltage control? I can move the slider all over but the voltage doesn't actually change. Maybe there's no support for it yet?


I'm having the same issue sadly... the voltage is never where it's set, and it's usually way lower.


----------



## Crowe98

Very close to buying an ASUS R9 280X DCUII TOP, but hearing that they may need BIOS flashing to unlock core volts. Could someone please give me a quick 'tutorial' on how to save and flash the BIOS, as I've never done it before?

Ta.


----------



## Cool Mike

Using Afterburner with my 280X Toxic. GOOD NEWS! THE TOXIC HAS CORE VOLTAGE CONTROL with Afterburner beta 14: voltage control up to 1.3V. Memory voltage control is adjustable but does not operate properly. My sweet spot seems to be 1200 core / 1850 memory at 1.263V.

Temps are in the mid 70s running the Valley bench at 1440p. This card has the best memory overclock I have ever benched. Keep in mind this is at the stock memory voltage of 1.6V. 7400 effective.
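For anyone new to the numbers being thrown around in this thread: the "effective" memory speeds are just the GDDR5 quad-pumped data rate, i.e. the memory clock times four:

```python
def effective_gddr5(mem_clock_mhz: int) -> int:
    # GDDR5 transfers four bits per pin per clock cycle, so the
    # marketed "effective" speed is the memory clock multiplied by 4.
    return mem_clock_mhz * 4

print(effective_gddr5(1850))  # 7400, the "7400 effective" above
print(effective_gddr5(1800))  # 7200
```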


----------



## EnToxication

Quote:


> Originally Posted by *hoevito*
> 
> I'm having the same issue sadly...voltage is never where it's set, and way lower usually.


Undervolt it and you won't be able to run benchmarks. I'm wondering if this is actually an Afterburner glitch or what, because my temps rise as I OC my cards. GPU Tweak and Afterburner are also both reading my GPU load and fan speed all wrong. It's weird.


----------



## eternal7trance

My temp doesn't rise as I raise the voltage. So nothing is happening.


----------



## EnToxication

Quote:


> Originally Posted by *eternal7trance*
> 
> My temp doesn't rise as I raise the voltage. So nothing is happening.


Weird. If you are using Afterburner, go to its folder, open MSIAfterburner.cfg, and try setting the UnofficialOverclockingMode value to 1.

I had to do that for my 580gtx quite a while back.

edit: For me I only have issues when crossfiring cards, but only the readings are wrong.
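For reference, the relevant lines in MSIAfterburner.cfg look roughly like this. This is a sketch from memory: the section name and the exact EULA confirmation string Afterburner requires before unofficial mode takes effect may differ between versions, so check the comments in your own file:

```ini
[ATIADLHAL]
; 0 = official overclocking mode only
; 1 = unofficial overclocking mode, without PowerPlay support
; 2 = unofficial overclocking mode, with PowerPlay support
UnofficialOverclockingMode = 1
; Afterburner ignores the setting above until you paste the
; confirmation sentence it asks for into this key:
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
```

Close Afterburner before editing the file, then relaunch it for the change to apply.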


----------



## Cribbs

Quote:


> Originally Posted by *Cool Mike*
> 
> Using Afterburner with my 280X Toxic. GOOD NEWS! THE TOXIC HAS CORE VOLTAGE CONTROL with Afterburner beta 14: voltage control up to 1.3V. Memory voltage control is adjustable but does not operate properly. My sweet spot seems to be 1200 core / 1850 memory at 1.263V.
> 
> Temps are in the mid 70s running the Valley bench at 1440p. This card has the best memory overclock I have ever benched. Keep in mind this is at the stock memory voltage of 1.6V. 7400 effective.


>7400 Effective
holy crap that's impressive.
Give us some firestrike scores!


----------



## Cribbs

Quote:


> Originally Posted by *eternal7trance*
> 
> My temp doesn't rise as I raise the voltage. So nothing is happening.


The HIS Turbo is $45 cheaper than the Asus here in New Zealand. Please keep us all updated on your adventures with the card; I'm tempted to pull the trigger.


----------



## eternal7trance

Quote:


> Originally Posted by *Cribbs*
> 
> The HIS Turbo is $45 cheaper than the asus here in New Zealand, please keep us all updated in your adventures with the card, I'm tempted to pull the trigger.


The cooling is good like the reviews said. I just can't find a tool that changes the voltage like it says.


----------



## fl62

Just bought an MSI R9 270x last week as an upgrade to an HD 6950. Card ran really well, but decided I should upgrade as far as I could afford to right now.

I returned that card to Fry's for an MSI R9 280x, only to get it home and find it DOA!

Another 70 mile round-trip back to Fry's, and they exchanged it for the last MSI 280x in stock.

This one boots up and runs fine; however, I noticed that it is reporting as a 7970 in both GPU-Z and the driver! The clocks are at 1020/1500.

Has anyone else seen this?

I realize the 280X is just a rebrand, but now even my BIOS is the same......


----------



## Cribbs

Quote:


> Originally Posted by *eternal7trance*
> 
> The cooling is good like the reviews said. I just can't find a tool that changes the voltage like it says.


I have always liked HIS cards; the 7950 IceQ looked like an amazing overclocker, but that blue PCB, and only one DVI port? I'm struggling between the HIS, the Vapor-X, and the Asus.


----------



## miklkit

Quote:


> Originally Posted by *fl62*
> 
> Just bought an MSI R9 270x last week as an upgrade to an HD 6950. Card ran really well, but decided I should upgrade as far as I could afford to right now.
> 
> I returned that card to Fry's for an MSI R9 280x, only to get it home and find it DOA!
> 
> Another 70 mile round-trip back to Fry's, and they exchanged it for the last MSI 280x in stock.
> 
> This one boots up and runs fine, however I noticed that it is reporting as a 7970 in both GPUz and the driver! The clocks are at 1020 / 1500
> 
> Has anyone else seen this?
> 
> I realize the 280x is just a rebrand, but now my BIOS is even the same......


Mine too. Since everyone says it is a buffed up 7970 I thought this was ok.


----------



## NinjaToast

Quote:


> Originally Posted by *fl62*
> 
> Just bought an MSI R9 270x last week as an upgrade to an HD 6950. Card ran really well, but decided I should upgrade as far as I could afford to right now.
> 
> I returned that card to Fry's for an MSI R9 280x, only to get it home and find it DOA!
> 
> Another 70 mile round-trip back to Fry's, and they exchanged it for the last MSI 280x in stock.
> 
> This one boots up and runs fine, however I noticed that it is reporting as a 7970 in both GPUz and the driver! The clocks are at 1020 / 1500
> 
> Has anyone else seen this?
> 
> I realize the 280x is just a rebrand, but now my BIOS is even the same......


Well if it's a 280X, you could flash the BIOS TechPowerUp provided onto it.


----------



## miklkit

Quote:


> Originally Posted by *NinjaToast*
> 
> Well if it's a 280x you could flash the bios techpowerup provided on to it.


Hah! You're right! I just flipped the switch to the newest BIOS, and in MSI AB it now shows as an R9 200 series.

UPDATE: I tried 3DMark just now and got a rating of 81% on a preset with everything about maxed. The last time I tried it with a 6970 it got 51%.
The max temp was 65C @ stock clocks with the new BIOS.

But it must be a corrupted install, as there were no time marks and the hardware stats were incomplete. Downloading a new one now.


----------



## Cool Mike

Firestrike Score - *Sapphire 280X Toxic*

Had to lower the memory speed slightly from 1850 (7400 Eff) to 1800 (7200 Eff). This Hynix memory is great.

Core at 1210 with stock voltage - 1.256V

Running an i7 4930K @ 4.6GHz


----------



## Cribbs

Quote:


> Originally Posted by *Cool Mike*
> 
> Firestrike Score - *Sapphire 280x Toxic
> *
> Had to lower memory speed slightly from 1850 (7400 Eff) to 1800 (7200 Eff) This Hynix memory is Great.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Core at 1210 with Stock Voltage - 1.256V
> 
> Running a I7 4930K @ 4.6Ghz


Haha, damn, that's a better score than what I get running my 780 at stock settings.


----------



## eternal7trance

Quote:


> Originally Posted by *Cribbs*
> 
> I have always liked HIS cards, the 7950 iceq looked like an amazing overclocker, but that blue pcb, and only one dvi port? I'm struggling between the HIS, the VAPOR X, and the Asus


It's not so bad, the mini DP to DVI cord I bought was cheap. I only use two monitors so it doesn't bother me.


----------



## Cribbs

Quote:


> Originally Posted by *eternal7trance*
> 
> It's not so bad, the mini DP to DVI cord I bought was cheap. I only use two monitors so it doesn't bother me.


I think I've decided on the Vapor-X; the dual DVI ports, sexy LEDs, backplate, and decent stock OC are just so nice, not to mention it's one of the cheapest 280Xs in NZ.
http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-280x-vapor-x-review/
According to KitGuru's review it's quiet and runs super cool.

And can you imagine how good these would look in CF?


----------



## Skylark71

Quote:


> Originally Posted by *bencher*
> 
> Save your bios from your card.
> 
> then use this tool to change the mem clocks or anything else you want to change.
> 
> Save the bios with a different name(No space). So you have you original bios as backup.
> 
> Then flash using this tool.


Ok, so far that was a dead end, because the VBE7 tool didn't recognise my VRM type and I was unable to adjust VDDC. My max is 1256mV.
I still get a performance decrease with memory @ 1550 and 1600MHz.
If I raise the power limit and/or TDP, will it give me more headroom, or is it useless with this voltage-locked card?
Thanks,

-Mika-


----------



## Manbish

Quote:


> Originally Posted by *eternal7trance*
> 
> Just the regular one. They are both exactly the same minus the turbo one has a bios that has a higher clock.


eternal7trance! Would you be kind enough to upload the BIOS to TPU? I have the 7970 IceQ Turbo and I'd like to flash it with it. Thanks


----------



## Cid

AAAAH it was too cheap. I found a place that also sold cheap mini-DP to DP cables so everything combined it was still a fair bit cheaper than the DC2T. Maybe I'll eventually find some sort of backplate for the thing. As long as it's silent I'll be happy. If it's loud-ish but cool I can always turn down the fans myself. I just hope it doesn't turn out to be a card from hell like my current one, which is hot and loud no matter what.

I'm talking of course about the HIS 280X IceQ X² Turbo. Turns out if you sell your card cheap enough I'll forgive any sort of superficial flaws. It should get here sometime early next week, maybe even this week. In any case, I should be good for BF4.

Uh, so ugly though.


----------



## melodystyle2003

Do we know which 280X models have unlocked voltage?


----------



## black7hought

Quote:


> Originally Posted by *melodystyle2003*
> 
> Do we know which 280x have unlocked voltage?


I know the XFX 280X DD has unlocked voltage (according to Fudzilla).

Gigabyte 280X is locked

Sapphire 280X Toxic is unlocked

ASUS 280X Matrix is unlocked

Both the Gigabyte and XFX have Elpida memory chips which do not seem good for OC'ing. However, the ASUS Matrix has a good brand of memory for OC'ing.

I'm not sure about the other cards.


----------



## leyzar

Quote:


> Originally Posted by *black7hought*
> 
> I know the XFX 280X DD has unlocked voltage (according to Fudzilla).
> 
> Based on other posts I believe the Gigabyte WindForce 3 280X and ASUS 280X Matrix are unlocked but the Sapphire 280X Toxic is not.
> 
> Both the Gigabyte and XFX have Elpida memory chips which do not seem good for OC'ing. However, the ASUS Matrix has a good brand of memory for OC'ing.
> 
> I'm not sure about the other cards.


no. Giga locked. Toxic unlocked. Matrix unlocked


----------



## Cid

According to these guys the HIS uses Hynix memory, which is the good kind, and seems to be voltage unlocked.
Quote:


> The memory consists of 3 GB GDDR5 SDRAM sitting on a 384 Bit bus width. Hynix H5GQ2H24AFR-ROC is used for the memory modules, which usually overclock quite nicely. These particular memory chips are rated for 1.5 V and 6.0 Gb/s speed, and that's exactly what they are set to. We'll see how far we can take them later in the review.


Quote:


> The HIS R9 280X iPower IceQ X2 Turbo is the flagship R9 series for HIS at the present time. It's factory overclocked and promises voltage control via their iTurbo software.


Source.


----------



## black7hought

Quote:


> Originally Posted by *leyzar*
> 
> no. Giga locked. Toxic unlocked. Matrix unlocked


Thanks, I'll update my post.


----------



## melodystyle2003

Yes HIS Radeon R9-280X IceQ X2 Turbo has unlocked voltage:
"_Next I like to know what the voltage and clock limits are, so I fired up the HIS iTurbo overclocking utility. I was able to adjust the *vCore (1218mV~1337mV)* and *mCore (1500mV~1575mV)* voltages as well as a board power limit pecentage slider (for an extra 20%) thanks to the excellent power management capabilities of this aftermarket design. Clock speeds were adjustable far beyond the speeds I managed to overclock to._"
Link


----------



## ducknukem86

According to another user, the Vapor-X is unlocked too.


----------



## Cool Mike

From my previous post.

My Toxic is unlocked and uses Hynix memory. Running Firestrike and Valley at 1800 (7200 Effective) speeds on memory. Toxic is using top shelf memory.


----------



## eternal7trance

Quote:


> Originally Posted by *Manbish*
> 
> eternal7trance! Would you be kind enough to upload the bios to TPU? I have the 7970 iceq turbo and I'd like to flash it with it. Thanks


I tried to, but they said it's already on there, even though it's not. Maybe it thinks the 7970 version is the same thing.
Quote:


> Originally Posted by *Cid*
> 
> AAAAH it was too cheap. I found a place that also sold cheap mini-DP to DP cables so everything combined it was still a fair bit cheaper than the DC2T. Maybe I'll eventually find some sort of backplate for the thing. As long as it's silent I'll be happy. If it's loud-ish but cool I can always turn down the fans myself. I just hope it doesn't turn out to be a card from hell like my current one, which is hot and loud no matter what.
> 
> I'm talking of course about the HIS 280X IceQ X² Turbo. Turns out if you sell your card cheap enough I'll forgive any sort of superficial flaws. It should get here sometime early next week, maybe even this week. In any case, I should be good for BF4.
> 
> Uh, so ugly though.


It is ugly, but it's quiet, has good cooling, and is voltage unlocked. The rest just depends on your chip luck.


----------



## DeviousAddict

My R9 280x has arrived.
I ran 3D mark Vantage to see how well it does and it didn't disappoint


----------



## Ashuiegi

Looks nice, but they could have fitted 10mm bigger fans on it; it would have been less nice-looking but with better cooling and less noise...


----------



## Amhro

In my opinion that xfx looks horrible


----------



## melodystyle2003

Can we see a valley run pls?


----------



## leyzar

I think the XFX is elegant in its simplicity...
Hey, what brand memory is on it, mate? Elpida or Hynix?


----------



## black7hought

Quote:


> Originally Posted by *DeviousAddict*
> 
> My R9 280x has arrived.
> I ran 3D mark Vantage to see how well it does and it didn't disappoint


Enjoy!

Are you able to test it in FireStrike?


----------



## Modus

Quote:


> Originally Posted by *leyzar*
> 
> I think the XFX is elegant in its simplicity...
> Hey, what brand memory is on it, mate? Elpida or Hynix?


I would really like to know what kind of memory it uses too. I have one ordered.


----------



## battleaxe

Quote:


> Originally Posted by *Amhro*
> 
> In my opinion that xfx looks horrible


I think it looks awesome. Best looking GPU award if you ask me. But I don't see it staying cool very well though...


----------



## VegetarianEater

I think the XFX 280x looks great, but you can't ignore the better performance (speed, temps, and noise) of the competition.


----------



## battleaxe

Quote:


> Originally Posted by *VegetarianEater*
> 
> I think the XFX 280x looks great, but you can't ignore the better performance (speed, temps, and noise) of the competition.


That's true.

Most sales of XFX products are going to come from Best Buy stores I would imagine.

So they are not necessarily an enthusiast product designed for people like us... Still, they look awesome if you ask me and still perform pretty well overall.


----------



## eternal7trance

I really liked the design of the xfx card but the internals of the HIS card sold me.


----------



## ironhide138

Can we get a pic of the XFX card installed? I could never find a good pic online of what the side of the card looks like.


----------



## battleaxe

Quote:


> Originally Posted by *eternal7trance*
> 
> I really liked the design of the xfx card but the internals of the HIS card sold me.


Has anyone unlocked the BIOS on these to adjust the voltage?

What kind of voltage is killing these cards? I saw one post where 1.4V did one in, but that could be a single incident, which doesn't really mean much. (Speaking of the Sapphire card.)


----------



## eternal7trance

Good news: I found that the iTurbo tool from HIS works perfectly with the HIS 280X and lets you adjust VDDC and MVDDC, so when I have more time I am going to see what clocks I can hit.


----------



## DeviousAddict

Quote:


> Originally Posted by *leyzar*
> 
> I think the XFX is elegant in its simplicity ...
> Hey what brand memory on it mate ? elpida or hinix ?


I'd love to tell you, but I don't know where I would find that info. I can't see it in GPU-Z. Where would I look to find that?
Quote:


> Originally Posted by *black7hought*
> 
> Enjoy!
> 
> Are you able to test it in FireStrike?


I ran Fire Strike earlier; link to the results: http://www.3dmark.com/fs/1032764

I've got a second one coming on Saturday, but I have to wait a couple of weeks before I'm home again to connect it up.









Edit: someone asked for a Valley bench.


----------



## DeviousAddict

Quote:


> Originally Posted by *ironhide138*
> 
> Can we get a pic of the XFX card installed? I could never find a good pick online of what the side of the card looked like


Sure can


----------



## bencher

Quote:


> Originally Posted by *DeviousAddict*
> 
> Sure can


What is the overclock like?


----------



## DeviousAddict

Quote:


> Originally Posted by *bencher*
> 
> What is the overclock like?


Haven't done any overclocking with it yet; won't be doing any until I'm home on leave next month. I'll be sure to post when I do, though.


----------



## battleaxe

Quote:


> Originally Posted by *DeviousAddict*
> 
> Sure can


That is a smart looking card man!


----------



## eternal7trance

With a little tinkering I'm doing Valley runs at 1255/1800 so far, so it doesn't seem that bad. I feel like I can go further because I'm maxing out at 63°C, but iTurbo won't let me go any higher than 1.279V.

That's a 255MHz difference from stock, so I can't complain.
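For reference, the math behind that claim is simple. A quick sketch, assuming a 1000MHz stock core clock (implied by the post's 255MHz figure, since 1255 - 255 = 1000):

```python
# Overclock delta and percentage, assuming a 1000 MHz stock core
# clock (not stated outright, but implied by 1255 - 255 = 1000).
stock_mhz = 1000
oc_mhz = 1255

delta_mhz = oc_mhz - stock_mhz
percent = 100 * delta_mhz / stock_mhz
print(f"+{delta_mhz} MHz over stock ({percent:.1f}%)")
```

That is a 25.5% core overclock, which is a healthy result on air for this class of card.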


----------



## DeviousAddict

Quote:


> Originally Posted by *battleaxe*
> 
> That is a smart looking card man!


I know, right! I think it's the neatest-looking card I've ever seen.


----------



## FeelKun

Quote:


> Originally Posted by *eternal7trance*
> 
> With a little tinkering I am doing valley runs at 1255/1800 so far so it doesn't seem that bad. I feel like I can go farther because I'm maxing out at 63c but iturbo won't let me go any farther than 1.279 on the voltage.
> 
> That's a 255mhz difference from stock so I can't complain.


Settings for Valley? ATM I'm hitting 1200/1750 @ 1.275V... Asus DCII 280X. Past 1200 core it artifacts. I can probably go higher on the memory.


----------



## hoevito

Quote:


> Originally Posted by *eternal7trance*
> 
> With a little tinkering I am doing valley runs at 1255/1800 so far so it doesn't seem that bad. I feel like I can go farther because I'm maxing out at 63c but iturbo won't let me go any farther than 1.279 on the voltage.
> 
> That's a 255mhz difference from stock so I can't complain.


I'm right around the same mark after extensive tweaking myself, finally: 1250/1775, topping out at 65°C.

I started playing with the voltage buttons on the card itself, but the values still don't seem to break 1.276 when being monitored. Also, has anyone with a Matrix tried playing with the hotwire cables yet? I would LOVE some pointers/tips since I'm very much a newb when it comes to overclocking!!!


----------



## ironhide138

Quote:


> Originally Posted by *DeviousAddict*
> 
> Sure can


Hmm... I can't tell if I like it or not. I think I would have preferred covered heatsinks. Looks like an ice cream sandwich.


----------



## eternal7trance

Quote:


> Originally Posted by *hoevito*
> 
> I'm right around the same mark after extensive tweaking myself finally. 1250/1775 topping 65*
> 
> I started playing with the voltage buttons on the card itself, bu values still don't seem to break 1.276 when being monitored. Also, has anyone with a Matrix tried playing with the hotwire cables yet? I would LOVE to have some pointers/tips since I'm very much a newb when it comes to overclocking!!!


Well, hopefully there will soon be a working MSI Afterburner release that will let me take the voltage higher. For now I'm stuck with iTurbo.
Quote:


> Originally Posted by *Resme*
> 
> Settings for valley? ATM I'm hitting 1200/1750 @ 1.275v... Asus DC 2 280x. Past 1200 core it artifacts. I can probably go higher on the memory.


Just using the extreme HD option.


----------



## amd655

The XFX is one of the hotter-running cards of the bunch according to reviews, but its looks are nice.


----------



## DeviousAddict

Quote:


> Originally Posted by *amd655*
> 
> The XFX is one of the hotter running cards of the bunch from reviews, but it's looks are nice


Mine's sitting at 34°C at idle. I'll get some readings for you under stress next time I run a bench.


----------



## black7hought

Quote:


> Originally Posted by *DeviousAddict*
> 
> I'd love to tell you but i don't know where i would find that info. I can't see it on GPU-Z. Where would i look to find that?
> I ran fire strike earlier. link to results http://www.3dmark.com/fs/1032764
> 
> I've got a 2nd one coming on Saturday but got to wait a couple of weeks before i'm home again to connect it up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: some one asked for a valley bench.


Thanks for the link and pics. According to the review by Fudzilla the XFX is using Elpida memory chips.


----------



## DeviousAddict

Quote:


> Originally Posted by *black7hought*
> 
> Thanks for the link and pics. According to the review by Fudzilla the XFX is using Elpida memory chips.


You're welcome, and thank you for finding out about the memory chips.
Can't wait to CrossFire this and see what gain I get in the benches.


----------



## hoevito

Feeling better about this card, although I'm going to be VERY tempted to return it and step up to the R9 290X if it comes in at the right price... but I've managed to improve my Fire Strike score by about 200 points compared to before.


----------



## Cribbs

For those that care, I spotted this: http://www.computerlounge.co.nz/components/componentview.asp?partid=20663
I called up the store and was told that they had stock and the 290Xs would be available to purchase in a matter of hours. Not sure if it's the same in the US, though.


----------



## dade_kash_xD

OK, so here's a question I have. I just received 2x XFX DD 280X, but the CrossFire bridges they sent me aren't long enough to connect both cards. Is it bad to leave the second card seated and plugged into the PSU without the CrossFire bridge on it? Also, I can't play any game for more than two seconds without it giving me a DX11 crash.


----------



## eternal7trance

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Ok, so here's a question I have. I just received 2x XFX DD 280x but the Crossfire Bridges they sent me aren't long enough to connect both cards in crossfire. Is it bad to leave the second card seated and plugged into the PSU without having the crossfire bridge on it? Also, I can't play any game for more than 2 seconds without giving me a DX11 crash.


You have to have the bridge hooked up


----------



## dade_kash_xD

Quote:


> Originally Posted by *eternal7trance*
> 
> You have to have the bridge hooked up


I know but what I'm asking is if I leave that second card seated in that slot but unplugged until I get a longer Crossfire Bridge, will that damage the lanes/gpu/etc?


----------



## eternal7trance

Quote:


> Originally Posted by *dade_kash_xD*
> 
> I know but what I'm asking is if I leave that second card seated in that slot but unplugged until I get a longer Crossfire Bridge, will that damage the lanes/gpu/etc?


No, but there's really no point in it being in there if you aren't using it.


----------



## c0ld

Considering getting two of these in CrossFire over a single 290X. Anyone have this card?

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8573836&CatId=7387


----------



## dade_kash_xD

So here is my opinion on the 280X: ABSOLUTE PIECE OF GARBAGE. I can't play a single round of BF3 for more than two minutes before I get a red screen and a system reboot.


----------



## eternal7trance

Quote:


> Originally Posted by *dade_kash_xD*
> 
> So here is my opinion on the 280x. ABSOLUTE PIECE OF GARBAGE. I cant play a single round of bf3 for more than 2 minutes before I get a red screen and system reboot.


Not sure what to say, I switched from my 670 to the 280x and BF3 is just fine on both cards.


----------



## miklkit

Yes I do have an MSI R9 280X.









It is suggested that you flash the BIOS right away, as the BIOS it ships with is a 7970 BIOS and is supposed to run hot. I never noticed any overheating but flashed it anyway.
http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=MSI&model=R9+280X&interface=&memType=&memSize=

This one also seems to need some burn-in time. It started off really slow but has been getting faster daily. It looks to be faster than the two 6970s I had before and is quite a bit less noisy.

So far so good.


----------



## Cribbs

http://www.youtube.com/watch?v=-lZ3Z6Niir4
Well, what a disappointment.


----------



## NinjaToast

Quote:


> Originally Posted by *Cribbs*
> 
> http://www.youtube.com/watch?v=-lZ3Z6Niir4
> Well, what a disappointment.


How? It matches and even beats a Titan in most cases; the only issues are that the cards run hot and use a little more juice. You should explore the news section, where there are more reviews.


----------



## Cribbs

Quote:


> Originally Posted by *NinjaToast*
> 
> How? It matches and even beats a titan in most cases, the only issues with it is the cards run hot and use a little more juice. You should explore the news section where there are more reviews..


I like to take aesthetics into account, and the 290X looks like crap tbh. And with a card running that hot, you have to wonder how much overclocking you will be able to do (keep in mind an OC'd 780 destroys the Titan as well).


----------



## NinjaToast

Quote:


> Originally Posted by *Cribbs*
> 
> I like to take aesthetics into account, and the 290x looks like crap tbh, and with a card running that hot, you have to wonder how much overclocking you will be able to do (keep in mind an oc'd 780 destroys the titan aswell)


Aesthetics shouldn't make a card a disappointment IMO, but to each their own. Reference cards are generally used by those who water cool rather than air cool, and if I'm not mistaken, AMD told one of the reviewers that the cards are perfectly safe running at these temps; how much people believe that is up to them. OC-wise, I'm sure the cards will do quite well once under water, and when the non-reference cards come, air cooling will see a boost in OC territory too, but that's just what I think. I was not referring to the 290X beating the Titan with an OC; I was referring to stock, where it beats it many times.


----------



## FernTeixe

I want to see the R9 290X/290 with a Vapor-X cooler or something. Until then it's hard to know if the card is really good, or just an AMD GTX 780...
My guess is that once it gets custom coolers it will be amazing.
My 7970 overclocked to 1150/1550 gets a 4048 graphics score (always the same score) in Fire Strike Extreme. Almost all reviews of the GTX 780 at stock clocks get around 4400... so if the 290X is faster with this horrible cooler at stock clocks, it will be amazing overclocked.

And when Nvidia lowers prices, AMD lowers them too... so it will be very cheap in less than a year.









sorry my engRish
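Putting a number on that gap, using only the two scores quoted in the post above (4048 for the overclocked 7970, roughly 4400 for a stock GTX 780 in reviews):

```python
# Relative Fire Strike Extreme graphics-score gap.
# Both figures come from the post above, not from my own runs.
hd7970_oc_score = 4048     # 7970 @ 1150/1550
gtx780_stock_score = 4400  # ballpark stock review score

gap_pct = 100 * (gtx780_stock_score - hd7970_oc_score) / hd7970_oc_score
print(f"GTX 780 leads by about {gap_pct:.1f}%")
```

So the quoted numbers put a stock 780 roughly 8-9% ahead of a heavily overclocked 7970, which is the gap the poster expects custom-cooled 290X cards to erase.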


----------



## Cribbs

Quote:


> Originally Posted by *FernTeixe*
> 
> I want to see r9 290x/290 with vapo-x cooler or something. until there it's hard to know if the card is really good, or just an AMD gtx780...
> My guess is that when it get custom coolers it will be amazing.
> my 7970 overclocked to 1150/1550 is getting 4048 graphic score (always same score) in Fire strike extreme. almost all reviews of gtx780 with stock clocks get around 4400... so if with this horrible and with stock clock it's faster.... it will be amazing overclocked.
> 
> and when nvidia lower the price AMD lower it too... so it will be very cheap in less than 1 year
> 
> 
> 
> 
> 
> 
> 
> 
> 
> sorry my engRish


Word is AMD will be allowing vendors to sell units with custom cooling after a while.


----------



## FernTeixe

I think nobody can hold their anxiety, haha... almost two years after the 7970 came out, it's fighting with the GTX 780 in some cases... and when the 680 came out, people said it was way better than the 7970.
So if people wait two or three months, the R9 290X will get some nice coolers and will overclock well...

I don't know, sometimes it's like people get addicted to having the best graphics card, like it's the game itself... for those who work with benchmarks I can understand... for people who just buy cards like a rich lady buys expensive shoes... I guess it's just anxiety.


----------



## Ash568

About to get an R9 280X; are they worth it?


----------



## melodystyle2003

Quote:


> Originally Posted by *Ash568*
> 
> about to get r9 280x are they wotth it ?


Depends on the budget.
The AMD R9 280X/290/290X look like the best value for money.


----------



## rdr09

Quote:


> Originally Posted by *dade_kash_xD*
> 
> So here is my opinion on *MY* 280x. ABSOLUTE PIECE OF GARBAGE. I cant play a single round of bf3 for more than 2 minutes before I get a red screen and system reboot.


fixed.


----------



## Ash568

Low budget, about £280. Just looking for something to max out BF4; my 570 couldn't handle the beta on medium, lol.


----------



## Cribbs

Quote:


> Originally Posted by *rdr09*
> 
> fixed.


I assume you have tried pretty much everything, so just send it back. I got two GTX 780s DOA in a row; sometimes you just get unlucky.


----------



## Kraanipea

Is it OK if I throw out the thick 2mm thermal tape from that big VRM heatsink and replace it with 1mm thermal tape covering the whole heatsink? The reason I'm asking is that I don't know if it's good for the thermal tape to touch the transistors, which are also under the VRM heatsink, and which are the reason they use such thick tape in the first place, so the heatsink wouldn't touch them.
Also, my core temp at idle is sitting at 50°C; everywhere I look, I see them around 30-40. Guess I'm going to have to change the TIM.


----------



## Ashuiegi

Quote:


> Originally Posted by *dade_kash_xD*
> 
> So here is my opinion on the 280x. ABSOLUTE PIECE OF GARBAGE. I cant play a single round of bf3 for more than 2 minutes before I get a red screen and system reboot.


Do you have an Asus TOP card with GPU Tweak? There was a known problem with GPU Tweak and GTX 670 DCUII TOP cards, which were overclocked very high from stock: you'd get driver crashes and red screens. Removing or updating GPU Tweak solved the issue. BF3 was the game giving the buggy 670s the most problems.


----------



## olliiee

Sapphire Dual-X vs. Vapor-X (870 boost 1020 vs. 950 boost 1050 core): is the Vapor-X worth it over the Dual-X?


----------



## dade_kash_xD

Quote:


> Originally Posted by *Ashuiegi*
> 
> do you have a asus top card ? with gpu tweak ? it was a known problem with gpu tweak and gtx 670 dcuii top that were overclock very high from stock you got driver crash and red screen, removing or updating gpu tweak solved the issue , but bf3 was the one giving the most problem to the gtx 670 that were buggy.


No, I have two XFX DD 280Xs. I fixed the crash issue; it was actually my fault. I had set the CPU voltage max to 1.2V @ 4.6GHz, and that was causing the crash.

I was just really upset because the CrossFire bridges XFX sent me are too short to connect my cards, and I can't find any longer CrossFire bridges anywhere.


----------



## b3ka

Quote:


> Originally Posted by *dade_kash_xD*
> 
> no, I have 2 XFX DD 280x. I fixed the crash issue. It was actually my fault. I set the cpu voltage max to 1.2v @ 4.6ghz and that was causing the crash.
> 
> I was just really upset because the crossfire bridges XFX sent me are too short to crossfire my cards and I cant find any longer crossfire bridges anywhere.


How come your CF cable is too short? I bought a Matrix with three-slot spacing for CF, and my motherboard's CF layout doesn't put the cards next to each other, yet the CF cable was perfectly fine.

BTW, running the CF setup at 1.250V right now with 1175/1725.


----------



## dade_kash_xD

The thing is, the XFX DD 280X cards have a fan shroud that sticks up about 0.5" higher than the CrossFire fingers. That's why I'm trying to figure out how long the CrossFire bridges XFX sends are.


----------



## eternal7trance

Quote:


> Originally Posted by *dade_kash_xD*
> 
> The thing is the XFX DD 280x cards have a fan shroud that sticks up like a .5" higher than the Crossfire fingers. Thats why im trying to figure out how long the crossfire bridges xfx sends you are.


How long are the ones they gave you?


----------



## eTheBlack

*Got my R9 280X!*








Works great and I got quite an improvement, except where PhysX is in question.

Proof:

or http://i.imgur.com/qt8GyCu.jpg


or http://i.imgur.com/RaAefGZ.png

Some benchmarks:
CPU: i7 950 @3.8GHz (4 core, 8 thread)
GPU: Gigabyte R9 280X OC (Windforce 3) [|||] MSI TwinFrozr II 470 SLI
RAM: Corsair 12GB (6x 2GB) @1810MHz
OS.: Win7 64bit
Res: 1920x1080

Check Spoilers or check album: Album Here!


Spoiler: Warning: Spoiler!



Gigabyte R9 280X OC (Windforce 3)
Drivers: 13.11 beta3

MSI TwinFrozr II 470 SLI
Drivers: 331.58

*Heaven Benchmark 4.0*
FPS..................: 36.8 ............ n/a
Min/Max FPS......: 7.6 / 82.3 ..... n/a
Score...............: 926 ............ n/a

*Valley Benchmark 1.0*
FPS.................: 45.0 ............ n/a
Min/Max FPS.....: 19.2 / 86.7 ..... n/a
Score..............: 1884 ............ n/a

*3DMark*
Ice Storm.......: 128232 .......... 121251
Cloud Gate......: 21311 ........... 18869
Fire Strike.......: 7279 ............. 3277

*Max Payne 3* (Steam, Fraps)
Min/Max FPS.....: 18.0 / 62.0 ..... 0.0 / 62.0
Avg.................: 58.8 .............. 36.5

*Company of Heroes 2* (Steam, In-game Performance Test)
Min/Max FPS.....: 20.0 / 51.3 ..... 6.59 / 17.9
Avg................: 36.3 ............... 12.8

*Mafia 2* (Steam, In-game Benchmark)
Avg.................: 23.4 ............ 42.3
PhysX Off Avg...: 59.4 ............ n/a

*Tomb Raider (2013)* (Steam, In-game Benchmark)
Min/Max FPS......: 25.6 / 48.6 ..... 22.0 / 40.9
Avg..................: 44.2 .............. 30.5
TressFX On Avg..: 36.0 .............. n/a

*Metro 2033* (Steam, Standalone Benchmark)
Min/Max FPS.....: 7.7 / 263 ...... 4.2 / 178
Avg.................: 42.0 ............ 22.3
PhysX Off.........: 62.0 ............ n/a



And goodbye to my 470 SLI









or http://i.imgur.com/goWvat8.jpg
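For anyone skimming, the 3DMark numbers above work out to the following speedups for the single 280X over the old 470 SLI setup. A quick sketch using only the scores from the post:

```python
# Ratio of R9 280X score to GTX 470 SLI score for each 3DMark test,
# taken straight from the benchmark table posted above.
scores = {
    "Ice Storm": (128232, 121251),
    "Cloud Gate": (21311, 18869),
    "Fire Strike": (7279, 3277),
}
for name, (r9_280x, gtx470_sli) in scores.items():
    print(f"{name}: {r9_280x / gtx470_sli:.2f}x")
```

The CPU-bound Ice Storm test barely moves, but the GPU-heavy Fire Strike run more than doubles, which matches the in-game averages above.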


----------



## amd655

Quote:


> Originally Posted by *eTheBlack*
> 
> *Got my R9 280X!*
> 
> 
> 
> 
> 
> 
> 
> 
> Works great and I got quite of improvement, except when PhysX is in questions.
> 
> Proof:
> 
> or http://i.imgur.com/qt8GyCu.jpg
> 
> 
> or http://i.imgur.com/RaAefGZ.png
> 
> Some benchmarks:
> CPU: i7 950 @3.8GHz (4 core, 8 thread)
> GPU: Gigabyte R9 280X OC (Windforce 3) [|||] MSI TwinFrozr II 470 SLI
> RAM: Corsair 12GB (6x 2GB) @1810MHz
> OS.: Win7 64bit
> 
> Check Spoilers or check album: Album Here!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Gigabyte R9 280X OC (Windforce 3)
> Drivers: 13.11 beta3
> 
> MSI TwinFrozr II 470 SLI
> Drivers: 331.58
> 
> *Heaven Benchmark 4.0*
> FPS..................: 36.8 ............ n/a
> Min/Max FPS......: 7.6 / 82.3 ..... n/a
> Score...............: 926 ............ n/a
> 
> *Valley Benchmark 1.0*
> FPS.................: 45.0 ............ n/a
> Min/Max FPS.....: 19.2 / 86.7 ..... n/a
> Score..............: 1884 ............ n/a
> 
> *3DMark*
> Ice Storm.......: 128232 .......... 121251
> Cloud Gate......: 21311 ........... 18869
> Fire Strike.......: 7279 ............. 3277
> 
> *Max Payne 3* (Steam, Fraps)
> Min/Max FPS.....: 18.0 / 62.0 ..... 0.0 / 62.0
> Avg.................: 58.8 .............. 36.5
> 
> *Company of Heroes 2* (Steam, In-game Performance Test)
> Min/Max FPS.....: 20.0 / 51.3 ..... 6.59 / 17.9
> Avg................: 36.3 ............... 12.8
> 
> *Mafia 2* (Steam, In-game Benchmark)
> Avg.................: 23.4 ............ 42.3
> PhysX Off Avg...: 59.4 ............ n/a
> 
> *Tomb Raider (2013)* (Steam, In-game Benchmark)
> Min/Max FPS......: 25.6 / 48,6 ..... 22.0 / 40.9
> Avg..................: 44.2 .............. 30.5
> TressFX On Avg..: 36.0 .............. n/a
> 
> *Metro 2033* (Steam, Standalone Benchmark)
> Min/Max FPS.....: 7.7 / 263 ...... 4.2 / 178
> Avg.................: 42.0 ............ 22.3
> PhysX Off.........: 62.0 ............ n/a
> 
> 
> 
> And goodbye to my 470 SLI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> or http://i.imgur.com/goWvat8.jpg


You can always keep a 470 for PhysX; I'm sure there's some hack to get it working with an AMD GPU as the dedicated rendering card.









Congrats on the purchase mate!


----------



## dade_kash_xD

I called XFX, and the bridge they send as standard is the 70mm version, so I just ordered the ROG 100mm version. If that doesn't fit, I'm going to return these two 280Xs and get another brand.


----------



## Durvelle27

Hard decision, guys: CrossFire or a single 290X?


----------



## amd655

290X


----------



## leyzar

I consider the 290X a semi-flop, so I would suggest CrossFire 280Xs; any other day I would have suggested a single card.


----------



## Ashuiegi

CrossFire 280X is superior in most games, but you have to deal with CF. Like a lot of people, though, a 95°C card with a reference cooler for 600 bucks doesn't interest me at all. I don't like the way everything is based on a temperature target, and it won't clock the same from card to card; some will run slower than others... which seems like a nightmare for CF: the top card will be so hot that it will limit the CF potential, because of the crappy cooler and the strange way the boost/power is managed.


----------



## tsm106

Quote:


> Originally Posted by *Ashuiegi*
> 
> crossfire 280 x is superior in most game , but you have to deal with CF. but i m like a lot of people a 95c card with a reference cooler for 600 bucks dont interest me at all , i don't like the way everything is based on a temperature and it wont clock the same from card to card , some will run slower then other,.... which seems like a nightmare for CF, the top card will be soo hot that it will limit the CF potential because of the crappy cooler and stange way to manage the boost/power.


Quote:


> Originally Posted by *leyzar*
> 
> I consider the 290x a smeiflop, so i would suggest crossfire of 280xs , any other day i would have suggested a single card.


You two are hilarious. You don't like it, fine; no need to talk smack about the new card. It's pretty ironic though, cuz I have the fastest Tahiti rig on OCN and I'm moving on up. And I'm not dissing Tahiti as I move on.


----------



## rdr09

Imagine a 290X in your custom loop.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> imagine a 290X in your custom loop.


Soon™


----------



## Ashuiegi

We don't talk smack; we just say what we think. And I'm sorry, but a 280X CrossFire gives you better FPS in games than a 290X, and that was his question. Most reviews say the cooler is bad and the temps are a little high for their taste; it's not like I'm the only one thinking that... Sure, if it's just for e-peen and saying you have the best GPU available, then feel free to go for the 290X, but don't come and say it performs better than a 280X CF.


----------



## eTheBlack

Quote:


> Originally Posted by *amd655*
> 
> You can always keep a 470 as PhysX, i am sure there is some hack to get it working with AMD GPU as the dedicated rendering card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats on the purchase mate!


There is, but I'm not sure if it works. If I go that way, I would rather buy some low-end to mid-range 6xx card with a single 6-pin connector; the 470 uses 2x 6-pin...


----------



## sugarhell

Quote:


> Originally Posted by *Ashuiegi*
> 
> we don't talk smack , we just say what we think , and i m sorry but a 280x crossfire give you better fps in game then a 290x , and that was his question , and most review say the cooler is bad ans the temp is a little high for their taste , it s not like i was the only one thinking that,..... sure if it s just for epeen and saying i have the best gpu available then feel free to go for the 290x but don't come and say it s performing better then a 280x CF.


OC'd and under water, yeah, it will perform better than CF 280X. Also, heat, space, and scaling are not perfect. It's not like the Nvidia situation, where you have to give your heart to buy their best single GPU. Here I can get a 290X for $499 or 280X CF for $600. What would you choose?


----------



## amd655

Quote:


> Originally Posted by *eTheBlack*
> 
> There is, but not sure if it is working. If I will go that way then I will rather buy some 6xx low entry to medium entry card with one 6pin cable. 470 uses 2x 6pin...


True that!

470 SLI is still a fast combo; put them up for sale or throw them into another rig, I say.


----------



## leyzar

Nobody is talking smack; the man asked for an opinion, I gave it, end of story.


----------



## Ashuiegi

From the reviews I can also gather that the 290X doesn't OC that well. I hope it will be better under water, but at the same time you can OC a 280X too, and you already know what to expect from it.

On air you can expect something like 1200/1800 for 280X CF vs. 1150/1800 for the 290X, and I'm pretty sure the 290X won't win that one... But again, it's a new card, and a new card is never the best bet for perf/cost; that's not news. I have nothing against the 290X GPU, I just don't like this reference card.

Water is great, but it's not for everyone. I recently built my first custom loop, and I sometimes think about going back to air, just because you can swap parts much more easily and there's far less time/attention/maintenance. Sure, it looks nicer, but time is precious too.

And I'd choose the 280X CF, because even at $100 more you have two cards: you can't split a 290X and make two gaming computers, and you can't sell half of it or add half of it later. Even more so in my case, since I already own a 7970 GHz, so my choice is between going CF for $300 or a single card for $500; it's just obvious.
The only thing I like about the 290X is the TrueAudio feature; that's the only thing I'll miss, if it really reduces CPU load like they said it would.

A stock 7990 is above the 290X in nearly every bench... the 290 is $499 and the 290X is $549, without BF4 or early premiums.
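To put rough numbers behind the 280X-CF-vs-290X debate, here is a sketch of theoretical FP32 throughput at the air clocks mentioned above. The shader counts (2048 for the 280X, 2816 for the 290X) are my addition from the public spec sheets, and real CrossFire scaling is well below 100%, so treat these as ceilings, not predictions:

```python
# Theoretical FP32 GFLOPS = shaders x 2 ops/clock x core clock in GHz.
# Shader counts are from the public specs (my assumption, not the post);
# the core clocks are the air-cooled figures quoted above.
def gflops(shaders: int, core_mhz: int) -> float:
    return shaders * 2 * core_mhz / 1000

cf_280x = 2 * gflops(2048, 1200)    # two 280X cards at 1200 MHz
single_290x = gflops(2816, 1150)    # one 290X at 1150 MHz

print(f"280X CF ceiling: {cf_280x:.1f} GFLOPS")
print(f"290X ceiling:    {single_290x:.1f} GFLOPS")
```

On paper the CF pair has roughly 50% more raw shader throughput, which is why the thread keeps circling back to scaling, heat, and frame pacing as the real deciding factors.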


----------



## Sk1llS

@Cool Mike, any chance of a Toxic BIOS upload? Thanks!


----------



## leyzar

Also take into consideration that it's the first GPU launch ever that kind of twists your arm into buying water blocks, yet doing so voids your warranty... and NO, I do not believe AMD's story that "it's supposed to run at 95°C"... you have to be a little stupid to believe that, but hey, fanboys will be fanboys.
What is also a bit crazy is suggesting that a CrossFire of 280Xs would actually be weaker than a single 290X... seriously?







Where is your proof? Got benches? Because if not, it's pretty much fanboy gargle-talk. And if you are talking price/performance, please remember to add the cost of water blocks to your 290X.
I do not understand why rational, intelligent people can't keep an objective view during new product launches and suddenly transform into groupies.
There is no escaping the fact that the 290X is a rushed product with overheating issues, most likely because of the BF4 situation. No doubt the non-reference designs will fix this.


----------



## DeviousAddict

Those of you with CrossFire 280Xs, mind posting a Fire Strike or Valley bench, please? I have my second card on the way and I want to compare.


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Two more for the block list.


Lol. I agree.


----------



## NinjaToast

Everyone has a different viewpoint on the 290X; however, this discussion is not about that card, so can we get back to the 280X and leave the 290X to the review threads and the official club, please? Comparisons are one thing, but when you start throwing out opinions and calling people fanboys, it becomes a different subject entirely. Quite frankly, OCN is far too familiar with insulting one another while holding "intelligent" conversations where both sides always believe they're right.

It's like watching a circus without the entertainment.


----------



## b3ka

http://www.3dmark.com/fs/1036907


----------



## smaudioz

I've just bought a 280X. What driver are you guys running for it? I've heard the beta one AMD recommends causes black screens or something like that.


----------



## eTheBlack

Quote:


> Originally Posted by *smaudioz*
> 
> I've just bought a 280x, what driver are you guys running for it? I've heard the beta one AMD recommends causes black screens or something like that.


beta3 works great for me, except in BF3 I see green "stuff" trailing when I move the mouse, but that only happens when I'm in a building or an object is in shadow. It is really weird and makes my head hurt; I should play with the BF3 settings a little more to fix that issue.


----------



## smaudioz

Quote:


> Originally Posted by *eTheBlack*
> 
> beta3 for me works great, except in BF3 I see green "stuff" as trail when I move mouse, but that only happens when I'm in building or object is in "shadow". It is really wierd, and my head hurts, I should play little more with BF3 settings to fix that issue.


That doesn't sound good; you shouldn't have things like that happening no matter what settings you use. I'm already doubting my decision to buy this card.


----------



## Cid

Haha, yeah that's like saying your sandwich is great, except you've got this funny feeling in your gut and your vision is slowly fading. And oh, last night you pooped blood. But just a little bit.


----------



## eTheBlack

Quote:


> Originally Posted by *smaudioz*
> 
> That doesn't sound good, you shouldn't be having things like that happen no matter what settings you use. I am already doubting my decision to buy this card


Note: *beta3*


----------



## smaudioz

Yes, so are there any non-beta drivers that work properly available for it?


----------



## NinjaToast

Quote:


> Originally Posted by *smaudioz*
> 
> Yes, so are there any non-beta drivers that work properly available for it?


They are basically rebadged 7970 GHz Edition cards, so you could probably use the WHQL drivers.


----------



## Conissah

I really want to get an MSI R9 280X for my build, but I'm worried about heat. I want to build in an EVGA Hadron Air and don't want my CPU getting too hot. What do you guys think?


----------



## NinjaToast

Quote:


> Originally Posted by *Conissah*
> 
> I'm really wanting to get an MSI R9 280x for my build, but I'm worried about heat. I want to build it in an EVGA Hadron Air, and don't want my CPU getting too hot. What do you guys think?


The heat issue with the MSI 280X is solved, or at least lessened, by flashing the BIOS with a new one MSI provided to TechPowerUp, who in turn provided it to everyone else. It *shouldn't* have heat issues after you flash the card.


----------



## miklkit

MSI R9 280X here. I have tried 13.4, 13.9, 13.10, and am currently using 13.11 B3, and I really could not tell any difference. Tweaking settings does more.

Max temp so far is 65°C with the new BIOS. 3DMark puts it around the 81st percentile at stock.


----------



## Conissah

Quote:


> Originally Posted by *NinjaToast*
> 
> Heat issue with the MSI 280x is solved/lessened by flashing the bios with a new one MSI provided Techpowerup who in turn provided it for everyone else, it *Shouldn't* have heat issues after flashing your card to the new one.


Okay, thanks. But there is still the "issue" of the card heating up the entire computer, because it's an open-style cooler in an SFF PC.


----------



## NinjaToast

Quote:


> Originally Posted by *Conissah*
> 
> Okay thanks. But there is still the "issue" of the card heating up the entire computer because its an open style GPU in a SFF pc.


Pretty much any card you buy is gonna dissipate heat into the case; this particular style just dumps more of it. My advice would be to mount a fan on the outside of the HDD bays, positioned to blow over the card and push the heat out the back. As for which fan, I couldn't tell you the best, but I do recommend Corsair's line; I have a 140mm fan mounted in my 3.5-inch drive bays blowing cool air toward my CPU cooler, and it helps keep temps down.


----------



## black7hought

UPS tracking status says my card is delayed due to a "train derailment"! It was supposed to be here today!

I will be 6950ing a little while longer.


----------



## miklkit

Quote:


> Originally Posted by *Conissah*
> 
> Okay thanks. But there is still the "issue" of the card heating up the entire computer because its an open style GPU in a SFF pc.


Then you need to fix your case airflow. This layout works for me: four intake fans and no exhaust fans.


----------



## NinjaToast

Quote:


> Originally Posted by *black7hought*
> 
> UPS tracking status says my card is delayed due to a "train derailment"! It was supposed to be here today!
> 
> I will be 6950ing a little while longer.


Well, at least you have yours ordered. I won't be ordering mine till next week, maybe (the boss is forgetful about writing checks), and the wait is killing me a little bit.


----------



## smaudioz

So... is anybody using the stable drivers for it? Do they work with it? Also, am I likely to just have to uninstall the Nvidia drivers, swap cards, and install the AMD drivers and it will be fine, or is it usually not that simple when changing from Nvidia to AMD?


----------



## battleaxe

Quote:


> Originally Posted by *smaudioz*
> 
> So... is anybody using the stable drivers for it? Do they work with it? Also am i likely to just have to uninstall nvidia drivers, swap cards and install AMD drivers and it will be fine, or is it usually not that simple if changing from nvidia to amd?


I've had all kinds of trouble in the past when leaving Nvidia drivers installed and then installing AMD drivers also. Last time I had to uninstall the Nvidia drivers first before it would work correctly.

I'm using the latest BETA, working fine.


----------



## BackwoodsNC

I am having one little issue that maybe you guys can help me with. When I try to add voltage and hit Apply, the voltage isn't applied; all that happens is that the P-states are disabled.

The only way I have been able to apply more voltage is by editing the BIOS and flashing the card. Doing it that way, I can't control the memory voltage.

Any thoughts?


----------



## battleaxe

Quote:


> Originally Posted by *BackwoodsNC*
> 
> I am having one little issue that maybe you guys can help me with. When I try to add voltage and hit apply, it does not apply the voltage. All that happens is pstates are disabled.
> 
> The only way I have been able to apply more voltage is by editing the BIOS and flashing the card. Doing it that I can't control the memory voltage.
> 
> Any thoughts??


Voltage is probably locked. Only BIOS flash will remedy that. GTX 660 is same way and that's what I had to do.


----------



## BackwoodsNC

Quote:


> Originally Posted by *battleaxe*
> 
> Voltage is probably locked. Only BIOS flash will remedy that. GTX 660 is same way and that's what I had to do.


Well, I know I have no control over it in Afterburner or Trixx. After some more tinkering, I only lose the power states when I mess with the RAM clocks.

I flashed the BIOS with 1.263V, and here it is. Both showed the same voltage in Valley and GPU-Z; it only changed when I alt-tabbed.



Extreme HD preset


----------



## theilya

Hi gents,

Just ordered 2x MSI 280x.

This is my first AMD card.

Few questions:

1. What FPS can I expect in BF3 with everything on Ultra @ 1440p? (CrossFire)
2. I heard CF doesn't work in fullscreen windowed mode; is this true?
3. What version of MSI Afterburner should I be using? The latest beta?
4. What is the average overclock on these cards? 1150-1200 / 7000?
5. Should I flash the card with the new BIOS only if I'm having heat issues? Does the BIOS provided by TechPowerUp work with all MSI 280X cards? What software do I use to flash the BIOS?
6. Should I be worried with a Corsair 750W PSU? Do I need to upgrade?


----------



## Crowe98

Quote:


> Originally Posted by *NinjaToast*
> 
> Everyone has a different viewpoint on the 290X; however, this discussion is not about that card, so can we get back to the 280X and leave the 290X to the review threads and official club, please? Comparisons are one thing, but throwing out an opinion and calling people fanboys makes it a different subject entirely. Quite frankly, OCN is far too familiar with insulting one another while holding "intelligent" conversations where both sides always believe they're right.
> 
> It's like watching a Circus without the entertainment.


+REP.


----------



## BackwoodsNC

Quote:


> Originally Posted by *theilya*
> 
> 5. should I flash the card with new bios only if I'm having heat issues? the bios provided by techpowerup works with all MSI 280x cards? What software do I use to flash the bios?


Use this to flash your cards with the new BIOS from TPU


----------



## MooseHead

Does anybody know if the Sapphire Vapor-x 280x is voltage unlocked?


----------



## FernTeixe

I don't think you'll need an unlocked Vapor-X if you stay with the stock cooler... the whole Vapor-X line runs at 1.256V, which is a bit high for air cooling.

Anyway, the 7970 Vapor-X was unlocked, so I guess this one is unlocked too. But the Vapor-X needs the Trixx utility to change voltage.


----------



## leyzar




----------



## Cribbs

Quote:


> Originally Posted by *leyzar*


I want two of those so bad; they cost US$500 in NZ


----------



## leyzar

Quote:


> Originally Posted by *Cribbs*
> 
> I want two of those so bad, they cost 500us in NZ


Yeah, me too... but my advice to you is to stay calm and play it smart. Why? The 290 should be priced around 450 bucks, and it should kick derriere so hard! ...I hope...


----------



## EnToxication

Quote:


> Originally Posted by *theilya*
> 
> Hi gents,
> 
> Just ordered 2x MSI 280x.
> 
> This is my first AMD card.
> 
> Few questions:
> 
> 1. What FPS can I expect in BF3 everything on ultra @ 1440? (crossfire)
> *100+ in single player with 2xMSAA; multiplayer is more CPU-dependent.*
> 2. I heard CF doesnt work in fullscreen windowed mode, is this true?
> *Yes*
> 3. what version of MSI after burner should I be using? the latest beta?
> *Yes*
> 4. What is the avg overclock on those cards? 1150-1200 / 7000 ?
> *1175/1650 for me, even with trifire on air. I just got my blocks; going to OC tomorrow. Even at those clocks it overheats like a mofo.*
> 5. should I flash the card with new bios only if I'm having heat issues? the bios provided by techpowerup works with all MSI 280x cards? What software do I use to flash the bios?
> *ATIWINFLASH*
> 6. Should I be worried with corsair 750 watt PSU? do i need to upgrade?
> *Nope, depends on the model; if it's a high-efficiency PSU it shouldn't matter. I was running 17 fans + an X79 platform and 280X trifire all on an HX1000 for a few days and it was stable.*


----------



## Kraanipea

Quote:


> Originally Posted by *MooseHead*
> 
> Does anybody know if the Sapphire Vapor-x 280x is voltage unlocked?


My 280X Vapor-X is voltage locked. The voltage doesn't change with Afterburner or Sapphire Trixx. Also, the temps are crazy high, and the thing is loud as hell under load.


----------



## EnToxication

Have to say, a custom loop keeps temps really low. I don't think these cards are hot at all in general, comparing them to the GTX 580 I had; the stock coolers are garbage. They hit about 43C at the 30-minute mark, 30C idle, MSI model FYI. Going to OC and run Firestrike tomorrow. But downloading Batman right now


----------



## BackwoodsNC

Quote:


> Originally Posted by *EnToxication*
> 
> Have to say, custom loop makes it really low. I don't think these cards are hot at all in general, comparing these to 580gtx I had. Coolers are piece of garbage. They hit about 43C about at 30min mark. 30 idle, MSI model FYI. Going to OC and run firestrike tomorrow. But downloading batman right now


Are you able to keep powerstates enabled when overclocking ram? If I even touch my ram clocks from 1500 to even 1501 I lose p-states

What blocks did you get?


----------



## Cid

Do we have a HIS representative on these here forums? Because it turns out the place I ordered it from was full of it with their "in stock" lies and so my card is on back-order and they said their supplier can't give an ETA either. So maybe someone straight from HIS could.


----------



## Tobiman

Quote:


> Originally Posted by *Kraanipea*
> 
> My 280x Vapor-X is voltage locked. The voltage doesn't change with afterburner nor sapphire trixx. Also the temps are crazy high and the thing is loud as hell under load.


I had that card (the 7970 version) and it was the best performer of all four of my 7970s. I only experienced temps like yours when I was Bitcoin mining or running FurMark (which is pointless as far as stress testing is concerned). Normal gaming at 100% load gave me about 50-60 degrees Celsius depending on ambient, and it never went above that, even when playing Crysis or any other game.
If you want to improve temps, make sure your case has good airflow and your cards are well separated, if you are running CFX.


----------



## eternal7trance

Quote:


> Originally Posted by *Cid*
> 
> Do we have a HIS representative on these here forums? Because it turns out the place I ordered it from was full of it with their "in stock" lies and so my card is on back-order and they said their supplier can't give an ETA either. So maybe someone straight from HIS could.


Sorry to hear that man. It's a really good card from what I've been doing so far.

It's too bad there isn't a Newegg for people outside North America.


----------



## Cid

A EU Newegg would be amazing. PC retailers here rarely cater to the enthusiast. I already had to order mine in the Netherlands because all of the Belgian retailers seem to have made a deal with MSi and ASUS, as those are pretty much the only two brands you can even get here. Some shops just straight up dropped AMD altogether, only offering Intel and nVidia.

Oh well, at least some Dutch shops sometimes have something that isn't MSi or ASUS. Even if that means waiting an undisclosed amount of time to get what you ordered.


----------



## smaudioz

I only just found out, after reading a Newegg review, that the box of the Asus DCU II TOP 280X I've bought says it needs a recommended power supply of 750W minimum, yet only 24A on the 12V rail; and it does, as I have just checked myself. I have a 680W unit, but with over 50A across its four 12V rails. What should I do?


----------



## eternal7trance

Quote:


> Originally Posted by *smaudioz*
> 
> I have only just found out after reading a newegg review that on the box of the Asus DCU II TOP 280x I've bought it says it needs recommended power supply of 750W minimum even though only 24A on the 12V, and it does as I have just checked myself. I have a 680W but with over 50A on the 4 12V rails, What should I do?


Your card only pulls 370ish watts at full load so you should be just fine. I've seen people with 620 watt PSUs from Seasonic that run a 7970 with a 2500k like yours and the 280x from Asus is a little better on power usage.
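To put rough numbers on that headroom argument, here is a sketch in Python. The per-component draw figures (250W GPU, 95W CPU, 75W for board/RAM/drives/fans) are illustrative assumptions, not measurements of any specific build.

```python
# Rough PSU headroom check. The component draw figures used below are
# illustrative assumptions, not measured values.
def psu_headroom(psu_watts, component_draws, margin=0.8):
    """Return (total_draw, ok): ok is True when the estimated total
    stays under `margin` (here 80%) of the PSU's rated capacity."""
    total = sum(component_draws.values())
    return total, total <= psu_watts * margin

# Hypothetical single-280X build on a 680 W unit, as discussed above.
draws = {"gpu": 250, "cpu": 95, "board_ram_drives_fans": 75}
total, ok = psu_headroom(680, draws)
print(total, ok)  # 420 True: well inside 80% of 680 W
```

By this estimate, even a quality 550-600W unit would cope, which is consistent with the 750W sticker being a conservative allowance for low-quality supplies.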


----------



## smaudioz

Quote:


> Originally Posted by *eternal7trance*
> 
> Your card only pulls 370ish watts at full load so you should be just fine. I've seen people with 620 watt PSUs from Seasonic that run a 7970 with a 2500k like yours and the 280x from Asus is a little better on power usage.


Yeah, I saw the actual power consumption figures on some other sites, and they were around what you said. I know companies over-spec PSUs because of bad-quality units, but 750W seems like a massive exaggeration...

Apparently my 570 used up to around 450W at load yet only specced a 550W minimum PSU, so why is the recommended wattage so high for this 280X?


----------



## sugarhell

Wait, a 7970 pulling 370 watts? No way, unless it can bypass the OCP.

The PCB's TDP limit is always 300W: 250W plus the +20% power limit.


----------



## smaudioz

Quote:


> Originally Posted by *sugarhell*
> 
> Wait a 7970 370 watt? No way except if it can bypass the ocp.
> 
> The tdp pcb limit is always 300 watt.250watt+20% powerlimit


----------






## sugarhell

Total system load


----------



## eternal7trance

Quote:


> Originally Posted by *smaudioz*
> 
> Yeah I saw the actual power consumption figures on some other sites and it was around what you said. I know companies over-spec PSU's because of bad quality ones but 750W seems like a massive exaggeration...
> 
> Apparently my 570 used up to around 450W at load yet only specced a 550W minimum PSU, why is the recommended wattage so high for this 280x?


Because they want you to buy a better PSU. Look at the 7970s: they use more power, but some of them list only a 500W recommendation.


----------



## smaudioz

Quote:


> Originally Posted by *sugarhell*
> 
> Total system load


Oh right, you're saying it shouldn't use as much as 370W on its own.


----------



## Ribozyme

Hello fellow OCN'ers,

I am in dire need of some first-hand experiences with the MSI Gaming 280X and the ASUS DC2 280X.
I love the bang-for-buck ratio of the 280X, but I am very picky about noise. Now, I don't really care about noise under load, because you can adjust the fan speed under load as you please.
But the one thing that annoys me is a GPU's idle noise. I owned the MSI 760 Gaming, which was quite quiet but still too audible at idle, I felt. I also owned the MSI 680 TF, but that was way too loud at idle for me. These cards scored 25 dBA and 27 dBA respectively at idle, according to TechPowerUp reviews.
Now, the reviews of the MSI Gaming 280X and ASUS DC2 280X show 28 and 31 dBA at idle respectively.
So that worries me a lot! I don't know how far I can trust or compare these TechPowerUp results, but still.
Anyone want to comment on the idle noise levels of their GPUs? I am looking for the quietest one at idle.
Could the minimum idle fan speed be lowered by a BIOS mod, and is someone here capable of doing that?

My alternative is buying a 3-month-old ASUS DC2 670, which is legendary for low idle and load noise. But I am worried that the 2GB will limit me in the long run.

Techpowerup reviews of the asus and MSI:
http://www.techpowerup.com/reviews/ASUS/R9_280X_Direct_Cu_II_TOP/25.html

http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/25.html

Also wondering why the ASUS scores a lot higher than the MSI, apart from the apparent heat issues. Anyone noticed bad coil whine on their MSI 280X? That seems to be an issue, judging from the Newegg feedback.


----------



## Malik

Guys, I have a problem and I don't know what is wrong. This is my first AMD card, so here we go. I have an Asus R9 280X DirectCU II TOP, and when I play Mafia 2 I get huge FPS drops. I noticed the clocks going down (from 1070MHz to 300-500MHz, on the memory too). What is wrong?

I checked other games, like Far Cry 3, Crysis 3, GRID 2, and Battlefield 3, and everything works great. Only with Mafia 2 do I have this problem.

Drivers: ATI Display Driver 13.101 WHQL for Win7 32/64 (from the Asus site).


----------



## eTheBlack

Quote:


> Originally Posted by *Malik*
> 
> Guys i have problem and i dont know what is wrong. This is my first AMD card, so here we go. I have Asus R9 280X Direct CU 2 TOP and when i play Mafia 2 i have huge FPS drop. I notice that clocks are going down ( from 1070mhz to 300-500mhz, on memory too ). What is wrong ?
> 
> I checked with other games like: Farcry3, Crysis3, Grid2 and Battlefield3 - everything is working great. Only with Mafia 2 i have this problem.
> 
> Drivers: ATI Display Driver 13.101WHQL for Win7 32/64 ( from Asus site ).


PhysX On? Turn it off.


----------



## leyzar

An excellent vid showing off CrossFire configs of the current R9 lineup... too bad the 290 non-X isn't there.

So to sum it up: a CrossFire config of 270Xs will give you 290X-like performance, for those who asked; pretty awesome performance for a lower price than the 290X, if you don't mind the disadvantages of a dual-card setup.
A CrossFire of 280Xs vaporizes the single 290X, and of course a CrossFire of 290Xs is just... godlike.


----------



## Malik

Thanks, it's working now


----------



## eldukay20

Didn't realise there was a 3-slot version of the DC2 TOP when I bought it... oh well


----------



## NinjaToast

^ There isn't a 3-slot version of the 280X DC2 TOP, as far as I've seen.


----------



## battleaxe

The three-slot version is the Matrix, isn't it?


----------



## Tobiman

Yup


----------



## eldukay20

https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5V2


----------



## NinjaToast

Quote:


> Originally Posted by *battleaxe*
> 
> The three slot version is the Matrix isn't it?


That ain't the Matrix though. xD

Quote:


> Originally Posted by *eldukay20*
> 
> https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5V2


They show it on their site, but you don't see it at any retailers. Interesting.


----------



## Jophess

Quote:


> Originally Posted by *NinjaToast*
> 
> That aint the matrix though. xD
> They show on their site but you don't see it at any retailers, interesting.


Superbiiz sells the 3 slot 280x DC2 TOP.


----------



## smaudioz

Quote:


> Originally Posted by *leyzar*
> 
> ofc a Crossfire of 290xs is just... Godlike


Yes, there would just be the small problem of them melting each other


----------



## NinjaToast

Quote:


> Originally Posted by *Jophess*
> 
> Superbiiz sells the 3 slot 280x DC2 TOP.


Ehhhhh, Superbiiz... how much you trust them hangs mostly on how much experience you have with them, though I don't doubt they have the card.


----------



## candy_van

Ewiz/Superbiiz are legit.
I've bought my share of video cards from them over the years, since they often had the best prices when running promo codes.


----------



## MooseHead

Luckily, me getting paid and the ASUS DC2T coming back in stock on Newegg coincided today. I should have the card tomorrow, or Tuesday at the latest, given I live in SoCal.

Just in time for BF4.

Also, don't want to forget to mention: if you use Newegg mobile to make your purchase, you can get 5% off with promo code MBLEMC10G.

I received $15.50 off the cost to sweeten the deal.


----------



## NinjaToast

Quote:


> Originally Posted by *candy_van*
> 
> Ewiz/Superbiiz are legit.
> Ive bought my share of vid cards from them over the years since they often had the best prices whem running promo codes.


^ Hmm... may have to give them a shot sometime then. Dunno what I'd buy from them, but something. xD


----------



## leyzar

Quote:


> Originally Posted by *smaudioz*
> 
> Yes, there would just be the small problem of them melting eachother










True... however, AMD still insists that it's OK. I don't believe it... but if you get those cards under water, then I guess it should be fine...


----------



## eTheBlack

Quote:


> Originally Posted by *Malik*
> 
> Thanks, now its working


No problem. You said BF3 is working fine for you; did you notice any green "stuff" behind objects when you move your mouse? It might not happen all the time. I notice it when I'm in shadows or on a shadow. It's like a green blur effect or something, but I have the blur effect off.

Can you test it? I will try to capture it tomorrow.


----------



## EnToxication

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Are you able to keep powerstates enabled when overclocking ram? If I even touch my ram clocks from 1500 to even 1501 I lose p-states
> 
> What blocks did you get?


Just to make the information public: no, I don't lose power states. What are you using as a reference point? Both GPU Shark and GPU-Z show that the power states are functional. I'm using full EK CSQ blocks with AS5 for TIM, and it's working really well.

Do you have Force Constant Voltage on?


----------



## eldukay20

I got it from Memory Express


----------



## battleaxe

Quote:


> Originally Posted by *eldukay20*
> 
> 
> 
> Didn't realise there was a 3 slot version of the DC2 TOP when i bought it... oh well


I'm wondering if this isn't a 7970 with a new label stamped into the shroud. I didn't think they were making this older-style shroud for the 280X...? The memory is only clocked to 6000, too (instead of 6400 effective), so that supports my theory at least.
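On the 6000-vs-6400 point: GDDR5's "effective" rate is four times the memory clock that tools like GPU-Z report, since the interface moves four bits per clock per pin. A quick sketch of the conversion:

```python
# GDDR5 effective data rate is 4x the reported memory clock
# (DDR signalling on a doubled write clock, i.e. quad-pumped).
def gddr5_effective_mhz(mem_clock_mhz):
    return mem_clock_mhz * 4

print(gddr5_effective_mhz(1500))  # 6000: the "6k" stock clock above
print(gddr5_effective_mhz(1600))  # 6400: the advertised effective rate
print(gddr5_effective_mhz(1750))  # 7000: a common overclock in this thread
```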


----------



## smaudioz

Quote:


> Originally Posted by *battleaxe*
> 
> I'm wondering if this isn't a 7970 with a new label stamped into the shroud? I didn't think they were making this older style shroud on the 280x....? The mem is only clocked to 6k too. So that supports my theory at least. (instead of 6400 effective)


It is weird that they are selling those. Why use a 3-slot cooler if the 2-slot design performs well on the same card?


----------



## eTheBlack

Just noticed these; gonna give one a try tomorrow:

http://www.overclock.net/t/1437010/amd-catalyst-13-11-beta-6
or
http://www.guru3d.com/files_details/amd_catalyst_13_11_beta6_(13_250_18_october_24)_download.html


----------



## EnToxication

I'm having a hard time finding someone to compare to.

Anyone running a trifire setup with 7970s or 280Xs?

http://www.3dmark.com/3dm/1480868?

I'm hitting 16k in Firestrike; wondering what other people are getting.


----------



## Modus

EDIT: Switched to the top DVI port and it worked fine.


----------



## Cid

Too old of an HDMI cable (before 1.4 I believe)? Single link DVI also only supports up to 1200p.
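That single-link limit comes from its 165 MHz pixel-clock cap. As a back-of-the-envelope check (a sketch only: the 12% blanking overhead is an assumption roughly in line with reduced-blanking timings, and real modelines vary):

```python
# Single-link DVI carries at most a 165 MHz pixel clock. Estimate whether
# a mode fits, assuming ~12% blanking overhead (reduced-blanking-ish).
SINGLE_LINK_DVI_MHZ = 165

def fits_single_link(h, v, refresh_hz, blanking_overhead=1.12):
    pixel_clock_mhz = h * v * refresh_hz * blanking_overhead / 1e6
    return pixel_clock_mhz <= SINGLE_LINK_DVI_MHZ

print(fits_single_link(1920, 1200, 60))  # True: 1200p60 squeaks in
print(fits_single_link(2560, 1440, 60))  # False: 1440p60 needs dual-link
                                         # DVI or DisplayPort
```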


----------



## smaudioz

I don't know if this has already been posted in here, but what do you new 280x owners make of this?

http://wccftech.com/amd-preparing-tahiti-xtl-revision-radeon-r9-280x-graphic-card/
Quote:


> The new Tahiti XTL chip would bring lower power consumption, possibly new features such as AMD TrueAudio and full DirectX 11.2 support, and even higher clocks. There would still be a large quantity of Tahiti XT2 chips in the market, but those looking for the XTL-based cards would have to search through their purchases, since there isn't going to be any differentiating factor between the two variants. Pricing of the Radeon R9 280X will remain unaffected, and supply won't be an issue, since AMD always has those left-over Tahiti XT2 chips to rebrand as the R9 280X graphics card.
> 
> Read more: http://wccftech.com/amd-preparing-tahiti-xtl-revision-radeon-r9-280x-graphic-card/#ixzz2imJRCtmn


----------



## leyzar

Quote:


> Originally Posted by *smaudioz*
> 
> I don't know if this has already been posted in here, but what do you new 280x owners make of this?
> 
> http://wccftech.com/amd-preparing-tahiti-xtl-revision-radeon-r9-280x-graphic-card/


First time I've seen this... always nice to get an update, and lower power consumption is always great... but now the questions begin.

Is it actually going to offer more performance / OC better, or the same / worse?
Will it have TrueAudio?
How in the blazes can we be sure of an XTL or XT2 purchase, since there is no way to tell them apart?

edit: +rep for you

edit 2: going to make a separate topic about this


----------



## navit

Quote:


> Originally Posted by *battleaxe*
> 
> I'm wondering if this isn't a 7970 with a new label stamped into the shroud? I didn't think they were making this older style shroud on the 280x....? The mem is only clocked to 6k too. So that supports my theory at least. (instead of 6400 effective)


You got shamed









Quote:


> Originally Posted by *eTheBlack*
> 
> Just noticed, gonna give it a try tommorow:
> 
> http://www.overclock.net/t/1437010/amd-catalyst-13-11-beta-6
> or
> http://www.guru3d.com/files_details/amd_catalyst_13_11_beta6_(13_250_18_october_24)_download.html


My card now reads R9 200


----------



## smaudioz

Quote:


> Originally Posted by *leyzar*
> 
> edit 2 : going to make a separate topic about this


There already is one but it's not in the graphics card forum:

http://www.overclock.net/t/1436949/cl-r9-280x-to-get-updated-gpu-tahiti-xtl-in-a-few-weeks/30#post_21060051


----------



## leyzar

Quote:


> Originally Posted by *smaudioz*
> 
> There already is one but it's not in the graphics card forum:
> 
> http://www.overclock.net/t/1436949/cl-r9-280x-to-get-updated-gpu-tahiti-xtl-in-a-few-weeks/30#post_21060051


Damn it, I already made it... %^&K


----------



## smaudioz

Quote:


> Originally Posted by *leyzar*
> 
> Damn it i already made it.... %^&K


Haha, I actually linked that in the first place, but I thought it better to link straight to an article than to a thread on here, so I edited it.

And to answer one of your questions: you won't be able to tell unless you look at the BIOS. So you would just have to wait a while, over a month, and then hope the store you bought from has new supply in.


----------



## leyzar

Quote:


> Originally Posted by *smaudioz*
> 
> Haha I actually linked that in the first place but I thought better to just link straight to an article than a thread on here so I edited it.


My bad for jumping the gun, mate...


----------



## sidp96

Quote:


> Originally Posted by *eldukay20*
> 
> 
> 
> Didn't realise there was a 3 slot version of the DC2 TOP when i bought it... oh well


I ordered mine from Superbiiz and it came like this too. All software recognizes it as a 280X, and it's sitting at 1070MHz. I was pondering whether to get my money back or not... any thoughts?


----------



## leyzar

Quote:


> Originally Posted by *sidp96*
> 
> I ordered mine from Superbiiz and it came like this too, all software recognizes it as 280x and its sitting at 1070Mhz, I was pondering whether to get money back or not...any thoughts?


I am interested in this as well... shouldn't it have better cooling because of the beefier heatsink?


----------



## smaudioz

Quote:


> Originally Posted by *sidp96*
> 
> I ordered mine from Superbiiz and it came like this too, all software recognizes it as 280x and its sitting at 1070Mhz, I was pondering whether to get money back or not...any thoughts?


If it's not what you want, then send it back; it's that simple. (That is, if they let you return it, and I'm sure they will if you tell them it's not what you thought you ordered.)


----------



## sidp96

Quote:


> Originally Posted by *leyzar*
> 
> I am interested in this aswell.. should it not have better cooling because of the beefier sink ?


Honestly, I really like the card. The brushed-metal backplate is definitely sexy, and I OC'd to 1150MHz on 1.22V; it gets up to around 70C in most GPU-intensive games like Crysis and BF, and 80C in Furmark. I think I'm gonna end up keeping it, unless anyone else has thoughts about this.


----------



## Cid

I believe this is the heatsink they used on the 7970, and seeing as the 280X is basically a 7970, old reviews should provide an answer.


----------



## leyzar

Quote:


> Originally Posted by *sidp96*
> 
> Honestly I really like the card, the brushed metal backplate is definitely sexy and I OC'd to 1150Mhz on 1.22v, gets up to around 70C on most GPU intensive games like Crysis and BF and 80 on Furmark, I think I'm gonna end up keeping it unless anyone else has thoughts about this.


Sounds like you have a decent overclocker there. I would keep it as well, because the Toxic, for example, runs at 1150MHz but at 1.256V... at stock.


----------



## fuark

Quote:


> Originally Posted by *eldukay20*
> 
> 
> 
> Didn't realise there was a 3 slot version of the DC2 TOP when i bought it... oh well


Could someone who has this version please take a photo of the PCB? I hear there is a backplate on this version?


----------



## leyzar

Quote:


> Originally Posted by *fuark*
> 
> Could someone who has this version please take a photo of the PCB? I hear there is a backplate on this version?


Yes, it does have a backplate, and here is the link to the product: http://www.asus.com/uk/Graphics_Cards/R9280XDC2T3GD5V2/#overview
Its name is "R9280X-DC2T-3GD5-V2".


----------



## qdlaty

I found something strange about my MSI 280X Gaming. I tried a few versions of Afterburner, and it always shows my stock core voltage at 1075 mV. Is that normal? I'm thinking it's a bit too low, and maybe that's why I can't OC it past 1050 / 1500.


----------



## smaudioz

Quote:


> Originally Posted by *leyzar*
> 
> Yes it dose have a back-plate and here is the link to the product http://www.asus.com/uk/Graphics_Cards/R9280XDC2T3GD5V2/#overview
> its name is "R9280X-DC2T-3GD5-V2"


It's funny that this page just states all the information for the normal 2-slot version but adds a few images of the old 3-slot cooler.


----------



## leyzar

Quote:


> Originally Posted by *sidp96*
> 
> Honestly I really like the card, the brushed metal backplate is definitely sexy and I OC'd to 1150Mhz on 1.22v, gets up to around 70C on most GPU intensive games like Crysis and BF and 80 on Furmark, I think I'm gonna end up keeping it unless anyone else has thoughts about this.


See? He actually has the card.
And Asus not updating their product info very quickly is nothing new.


----------



## smaudioz

Quote:


> Originally Posted by *leyzar*
> 
> See ? he actually has the card.
> And that Asus dose not update their product info very fast is nothing new


Or at all. I actually took a 7970 Matrix Platinum back the other week because the dimensions on the website differed from the dimensions on the box and it didn't fit. In the end I could have kept it, because I sawed my drive cage in half to make room for a longer card, but I'm glad I returned it anyway: when I took it back, they told me a lot of people had returned them because they artifact out of the box.


----------



## miklkit

Quote:


> Originally Posted by *qdlaty*
> 
> I found something strange about my Msi 280x Gaming. Tried few versions of afterburner and it always shows my stock core voltage at 1075 mv. Is it normal ? I'm thinking its a bit too low and maybe thats why I can't OC it past 1050 / 1500.


Mine shows a core voltage of 1194mv stock.


----------



## navit

The core voltage on my Asus is 1075 mV, just saying as a comparison.


----------



## Cribbs

GRRRR, why are so many 280Xs voltage locked!
I'm pretty close to buying the HIS, just because HIS coolers are amazing and it's voltage unlocked.


----------



## dade_kash_xD

Finally overclocked one of my XFX DD R9 280Xs (I should have my 10cm CrossFire bridge on Monday), and I have to say, I am pretty impressed!

GPU clock: 1190MHz
Memory clock: 1800MHz (7200MHz effective)
Voltage: 1.3V

This is with my i7-4770K @ 4.6GHz and my Corsair Dominator Platinum @ 2400MHz.

*Performance Score:*



Xtreme Score


----------



## Jophess

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Finally overclocked one of my XFX DD R9-280x (should have my 10cm Crossfire Bridge on Monday) and I have to say, I am pretty impressed!
> 
> GPU Clock: 1190mhz
> MEM Clock: 1800mhz(7200mhz)
> Voltage: 1.3v
> 
> This is with my i7-4770k @ 4.6ghz and my Corsair Dominator Platinum @ 2400mhz.


Did you have to do anything special for voltage control to work? Mine just won't change, no matter what I do.

Edit: Never mind... apparently it works now, after changing nothing and trying again.


----------



## eternal7trance

Quote:


> Originally Posted by *Cribbs*
> 
> GRRRR why are so many 280x's voltage locked!
> I'm prettty close to buying the HIS just because of HIS coolers being Amazing, and voltage unlocked.


I would never pick any of the other cards, imo.

Unlocked voltage, and it sits around 65C even if I crank the voltage to 1.279V and turn the VRM voltage up. Once I find a program that lets me go past 1.279V, I'll take it even further. MSI Afterburner still doesn't work with my HIS 280X, so I am using iTurbo.


----------



## dade_kash_xD

Quote:


> Originally Posted by *Jophess*
> 
> Did you have to do anything special for the voltage control to work? Mine just won't change no matter what I do.


Download the latest version of MSI Afterburner and open Settings. Make sure the three voltage options are checked/enabled: Unlock Voltage Control, Unlock Voltage Monitoring, and Force Constant Voltage.

If you're asking whether I flashed the XFX overclock BIOS or whatever, no, I did not.


----------



## theilya

just received my 2x 280x

one card is always 10c hotter than the other...is that normal?


----------



## dade_kash_xD

Quote:


> Originally Posted by *theilya*
> 
> just received my 2x 280x
> 
> one card is always 10c hotter than the other...is that normal?


Which 280x did you buy?


----------



## theilya

MSI, I flashed the bios to the ones posted.


----------



## navit

Well, it's always a hit-or-miss kind of thing really.
Nice looking rig though, 540?


----------



## theilya

yes, sir.

business in the front:










party in the back:


----------



## D3TH.GRUNT

Quote:


> Originally Posted by *theilya*
> 
> yes, sir.
> 
> business in the front:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> party in the back:


I had a chuckle at that







I love that case BTW. I am considering getting one myself.


----------



## theilya

best case I ever owned.

btw, I'm trying to find a stable clock for my cards.

So far it seems to be stable on heaven at 1.3v, 1135/1750

it gets a bit too hot though:
first card 75
second 64

any suggestions?

should I lower the clock / wattage?


----------



## eldukay20

Quote:


> Originally Posted by *fuark*
> 
> Could someone who has this version please take a photo of the PCB? I hear there is a backplate on this version?


----------



## BackwoodsNC

Quote:


> Originally Posted by *EnToxication*
> 
> Just to make information public, No, I don't lose powerstates. What are you using for reference point? Both GPU Shark and GPU-Z show that powerstates are functional. I'm using full EK CSQ blocks and AS5 for TIM and it's working really good.
> 
> Do you have force constant voltage on?


Well, I now know why I was losing power-states. If I change my monitor refresh rate from 144Hz to 120Hz I no longer lose power-states.

Changing GPU voltage in Afterburner is still useless for me. I even did a fresh install of Windows. I can change mem volts just fine, but can't apply higher volts in AB. So I just flash the BIOS with the voltage I want to make it work.

Running 1205MHz at 1.225 volts and 1700 mem at 1.619 volts


----------



## Cribbs

Quote:


> Originally Posted by *theilya*
> 
> best case I ever owned.
> 
> btw, I'm trying to find a stable clock for my cards.
> 
> So far it seems to be stable on heaven at 1.3v, 1135/1750
> 
> it gets bit too hot tho:
> 1 card 75
> second 64
> 
> any suggestions?
> 
> should i lower clock / wattage?


All R9 290Xs are running around 95c under load; I think you'll be fine with those temps


----------



## theilya

another problem I encountered.

I connected my second display as "extended" via HDMI and the max resolution I get is 1680x1050, while the monitor's native resolution is 1920


----------



## MarlowXim

Are the MSI 280Xs voltage locked? I was looking at Newegg reviews stating MSI was still working on Afterburner compatibility. Wondering what kind of overclocks you guys are getting; thinking of picking one up since the prices of 7970s with a reference layout are climbing.


----------



## dade_kash_xD

I would love to be able to buy those matrix backplates to put on the back of my xfx dd 280x cards.


----------



## battleaxe

Quote:


> Originally Posted by *navit*
> 
> You got shamed
> 
> 
> 
> 
> 
> 
> 
> 
> My card now reads R9 200


How do you figure? I didn't buy one. Just a question. Do you have the answer?


----------



## theilya

is this normal for crossfire?



I have ULPS disabled in the registry and in Afterburner
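For reference, the registry half of disabling ULPS is setting EnableUlps to 0 under the display-adapter class key. A sketch of the usual .reg tweak; the numbered subkey (0000, 0001, ...) varies per system, and in CrossFire each GPU has its own subkey, so check which subkeys actually contain EnableUlps first:

```
Windows Registry Editor Version 5.00

; Sketch of the common ULPS tweak -- adjust the 0000 subkey to match your system.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```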


----------



## eternal7trance

Quote:


> Originally Posted by *smaudioz*
> 
> There already is one but it's not in the graphics card forum:
> 
> http://www.overclock.net/t/1436949/cl-r9-280x-to-get-updated-gpu-tahiti-xtl-in-a-few-weeks/30#post_21060051


That's really messed up.


----------



## Ashuiegi

For all the Matrix talk: with my 7970 Matrix Platinum on the stock cooler at a 1280MHz/1850 OC and a custom fan profile, I could keep the card under 70C while gaming, and everything else (VRM, RAM, PCB, VRAM power) stayed under 50C. If you use this card with GPU Tweak you get a ton of info in the monitoring software, way more than with a normal DCUII, and you get more overclocking settings too.


----------



## Ashuiegi

Quote:


> Originally Posted by *dade_kash_xD*
> 
> I would love to be able to buy those matrix backplates to put on the back of my xfx dd 280x cards.


The Matrix is 50% wider than your card; it will look ridiculous... the Matrix is as wide as a DVD box.


----------



## theilya

For some reason, after I restart my computer the overclock gets reset with anything but CCC.

Pretty much, it's showing me 1110 core in MSI, but on the graph it's 1050


----------



## candy_van

Answering really for the quoted below, but sorry I haven't been able to OC + bench guys.
Have had a lot going on in my personal life lately (complicated/busy), will try to contribute as soon as I can manage a quiet evening to myself (man can't wait for that)








Quote:


> Originally Posted by *NinjaToast*
> 
> ^hmm.. May have to give them a shot sometime then, dunno what I'd buy from them but something. xD


Yea no-one should feel sketched by them, business-wise.

I even RMA'd with them once; they 3DMarked the card and sent the screenshots back to me, showing how the card failed.
They then offered me a refund or replacement. As fast as Newegg? Perhaps not, but they are honest, had a great price and offered good service, even with a bum product.


----------



## Amhro

Quote:


> Originally Posted by *navit*
> 
> The core volts on my Asus is 1075, just sayin as a comparison.


ouch, my neck


----------



## grimgnaw

Hey guys,

My first post here, though I've been following you for quite a while.

So yesterday I finally got my Asus R9 280X DCUII and decided to see how it goes. So just a basic OC, since I'm just a noob at this.



I don't think I can get the core any higher, I tried 1230, but there were some artifacts and I stopped the test. Any suggestions?


----------



## Ashuiegi

You can push the memory higher; on my Matrix I can go up to 7400 without any voltage tweak.


----------



## grimgnaw

I can't even reach 7000 with voltage at 1.3. My highest was 1220 core clock and 6900 memory, with a small increase compared to the previous results that I've posted.

Maybe I can try increasing the memory clock and leaving a lower core clock, but I don't think this can bring extra performance.


----------



## Ribozyme

Quote:


> Originally Posted by *grimgnaw*
> 
> Hey guys,
> 
> My first post here, though I've been following you for quite a while.
> 
> So yesterday I finally got my Asus R280x DCUII and decided to see how it goes. So just basic OC, since I'm just a noob at this.
> 
> I don't think I can get the core any higher, I tried 1230, but there were some artifacts and I stopped the test. Any suggestions?


Hello, welcome to OCN! 1210MHz core will give you amazing performance if it's game stable. 1300MHz and up is only for the top 10 percent of GPUs. Could you measure power consumption while gaming with a Kill A Watt or something? Because I figure 1.3V is going to make it suck quite a bit of juice.

Can you tell me a little about your impressions of the card? I'm looking into buying an Asus DC2 280X myself, but the only thing stopping me is the apparently unreasonably high idle noise. Which is stupid, because it is such a capable cooler and they could get away with letting it spin ultra slow at idle. Could you give me your impression of the idle noise in a silent room with your case fans and CPU cooler as low as possible? Thanks mate!


----------



## grimgnaw

I'm going to have to do some more stability tests since I've only ran Firestrike.

I don't have the means to measure the power consumption unfortunately. However, I can tell you that my 500W PSU doesn't seem to be troubled, the CPU being OCed to 4.5 GHz as well.

About the noise, it seems pretty quiet to me, even at full load. I just turned down all the other fans in my case, and the GPU fan is definitely audible, though not disturbing to me. At idle it is covered by the other fans. But I guess this is rather subjective. My guess is that you won't hear it if you put it in a case designed for silence.


----------



## smaudioz

Quote:


> Originally Posted by *Ribozyme*
> 
> Can you tell me a little about your impressions of the card? I'm looking into buying an Asus DC2 280X myself, but the only thing stopping me is the apparently unreasonably high idle noise.


Where did you hear this? All the reviews say this is the quietest one there is.


----------



## Ribozyme

Quote:


> Originally Posted by *grimgnaw*
> 
> I'm going to have to do some more stability tests since I've only ran Firestrike.
> 
> I don't have the means to measure the power consumption unfortunately. However, I can tell you that my 500W PSU doesn't seem to be troubled, the CPU being OCed to 4.5 GHz as well.
> 
> About the noise, it seems pretty quiet to me, especially in full load. I just turned down all the other coolers in my case and the GPU fan is definitely audible, though not disturbing for me. When idle it is covered by the other fans. But I guess that this is rather subjective. My guess is that you won't hear it if you put it in a case designed for silence.


Well I have an inaudible idle build so all I hear in idle is the GPU sound... Guess I'll have to look somewhere else.

Quote:


> Originally Posted by *smaudioz*
> 
> Where did you hear this? All the reviews say this is the quietest one there is.


From techpowerup and hardware.info

http://www.techpowerup.com/reviews/ASUS/R9_280X_Direct_Cu_II_TOP/25.html

Under load it is fine, and you can adjust the noise to your liking by trading some temperature for it, but the idle noise is at its lower limit and you cannot get it lower without modding the BIOS.
31 dBA idle is way too high; other cards idle around 24-25 dBA, and since sound power roughly doubles every 3 dB, 31 dBA at idle is stupid.
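For anyone who wants the rough maths behind that (a sketch; strictly, +3 dB doubles the sound power, and perceived loudness is usually taken to double per ~10 dB):

```python
# Rough check: turn a dB(A) level difference into an acoustic power ratio.
# 10 * log10(2) is about 3.01, which is where the "3 dB per doubling" rule comes from.

def power_ratio(db_difference):
    """Acoustic power ratio corresponding to a level difference in dB."""
    return 10 ** (db_difference / 10)

# 31 dBA idle vs. a typical 25 dBA card: a 6 dB gap
print(round(power_ratio(31 - 25), 2))  # roughly 3.98x the radiated power
```

So a 6 dB gap is roughly four times the radiated power, even if it doesn't sound "four times as loud" to the ear.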


----------



## steelkevin

Just received my Asus DC2T R9 280X. How do I join?

This card is incredible! At idle its fans only spin at 20% and can't be heard at all, and it idles at 33°C. Playing BF3 I removed my headset and could barely hear it, and it only went up to 70°C. I'm just waiting on EK's waterblocks now







.

HWMonitor:


Here's a quick 3DMark 11: http://www.3dmark.com/3dm11/7376300

EDIT: And a 3DMark (2013 ?): http://www.3dmark.com/3dm/1484921

Heaven Screenshot:


I'll start overclocking before tonight but the real fun will start whenever EK decide to finally release their blocks. I LOVE this GPU. My first proper GPU and I'm glad I chose this one. Used to have a weird 560Ti for over a year then sold it and had been running an OEM GTS 240 with a fan defect that made it make an unbearable noise even on idle. I can't believe I'm so close to completing my rig after all these years ^^.

EDIT: and a terrible photo of my rig as of a couple hours ago that I sent to a friend just to show him, I'll take better pics later.


----------



## Ribozyme

Quote:


> Originally Posted by *steelkevin*
> 
> Just recieved my Asus DC2T R9 280X. How do I join ?
> 
> This card is incredible ! @idle its fans only spin @20% and can't be heard at all,


Hmm, either Asus shipped the card with a different BIOS than the one TechPowerUp reviewed, or you have a high tolerance for noise. Could you see the rpm of the fans at idle in Afterburner or the like?


----------



## steelkevin

Quote:


> Originally Posted by *Ribozyme*
> 
> Hmm, either Asus shipped the card with a different BIOS than the one TechPowerUp reviewed, or you have a high tolerance for noise. Could you see the rpm of the fans at idle in Afterburner or the like?


High tolerance for noise? Well, I didn't use to, and that's one of the many reasons I went with watercooling, but I guess that having the GTS 240 and its worn-out fan bearing for so long could've made me more tolerant to noise. I highly doubt it though, as most of the time I have my headset on anyway, and I found my friend's 7870 TF2 made too much noise (way more than this Asus card I have now) before he put his whole rig under water a week ago.
Here's a bad picture of his rig around the same time I took my photo:









AfterBurner does indeed display the fans actual speed and not only in %. Right now it's @37°C and the fan's spinning at 1011rpm (=20%).

EDIT: that link you posted says this card is louder than a 560 Ti. No way. The 560 Ti was quite noisy under load and could be heard at idle.


----------



## Ribozyme

Quote:


> Originally Posted by *steelkevin*
> 
> High tolerance for noise? Well, I didn't use to, and that's one of the many reasons I went with watercooling, but I guess that having the GTS 240 and its worn-out fan bearing for so long could've made me more tolerant to noise. I highly doubt it though, as most of the time I have my headset on anyway, and I found my friend's 7870 TF2 made too much noise (way more than this Asus card I have now) before he put his whole rig under water a week ago.
> Here's a bad picture of his rig around the same time I took my photo:
> AfterBurner does indeed display the fans actual speed and not only in %. Right now it's @37°C and the fan's spinning at 1011rpm (=20%).


Hmm, 1000rpm shouldn't sound too loud. Now if the fans spin at the same speed as they do on the Asus 780 DC2 (as it is the same cooler), I do not see why there is a discrepancy between the idle noise measurements of the two cards, as the Asus 780 DC2 is regarded as very quiet at idle, around 25dbA instead of 31dbA. Will ask a 780 owner what speed his fans spin at.


----------



## steelkevin

Would you perhaps have a good guide / tutorial to overclocking these cards?

That would be great.

Maybe OP could add a datasheet with user names, card, model, clocks, voltage, etc...?
Oh and a signature for this club, let's not forget about that


----------



## BackwoodsNC

Quote:


> Originally Posted by *theilya*
> 
> is this normal for crossfire?
> 
> 
> 
> I have ULPS disables in regestry and afterburner


I had that same issue; mine was my monitor refresh rate. If I set my monitor to 144hz it would not downclock the volts. Changing to 120hz fixed that for me.

You running dual monitors or eyefinity?

Quote:


> Originally Posted by *MarlowXim*
> 
> Are the MSI 280x's voltage locked, since I was looking at Newegg reviews stating MSI was still working on Afterburner compatibility? Wondering what kind of overclocks you guys are getting thinking of picking one up since the 7970 prices are climbing that have a reference layout.


No matter what I move the volts slider to in Afterburner, it does not change. The only way I can change volts is to mod the BIOS.


----------



## Ribozyme

Quote:


> Originally Posted by *steelkevin*
> 
> Would you perhaps have a good guide / tutorial to overclocking these cards ?
> 
> That would be great.
> 
> Maybe OP could add a datasheet with user names, card, model, clocks, voltage, etc...?
> Oh and a signature for this club, let's not forget about that


You don't need a guide to GPU overclocking as it is dead simple: just raise the sliders until it crashes. Not sure about voltage control though, and whether sliding it up really changes the voltages.
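The whole procedure is basically this loop. A toy sketch only; `is_stable` here is a hypothetical stand-in for "run Heaven/Valley and watch for crashes or artifacts":

```python
def find_max_stable(start_mhz, step_mhz, is_stable):
    """Raise the clock one step at a time; back off to the last stable value."""
    clock = start_mhz
    while is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Toy stand-in for a real stress test: pretend this particular card
# artifacts above 1180 MHz.
print(find_max_stable(1000, 15, lambda mhz: mhz <= 1180))  # prints 1180
```

In practice you also bump the voltage slider and re-test at each ceiling, which is why there's no real substitute for patience.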


----------



## navit

Quote:


> Originally Posted by *battleaxe*
> 
> How do you figure? I didn't buy one. Just a question. Do you have the answer?


YES, the answer is I was drunk on the sauce


----------



## steelkevin

Quote:


> Originally Posted by *Ribozyme*
> 
> You don't need a guide to GPU overclocking as it is dead simple. just raise the sliders until it crashes. Not sure about voltage control though and if sliding it up really changes the voltages.


Yeah I know. What I was really asking for was a place where I could see other people's results I guess







.


----------



## BackwoodsNC

Here is another run in Valley (Extreme HD preset) at 1200 core and 1700 mem. I need to download Firestrike....


----------



## Ashuiegi

Firestrike is not always a very good reference for checking GPU performance alone; from one driver to another I've noticed a 1000-point difference. The same goes for SSD vs HDD: even if Firestrike isn't installed on it, just having the OS on an SSD gives you a considerably higher score.


----------



## sugarhell

Just compare graphic score


----------



## battleaxe

Quote:


> Originally Posted by *navit*
> 
> YES, the answer is I was drunk on the sauce


LOL.... awesome.


----------



## smaudioz

Well, I returned mine. In my opinion, what AMD are doing, bringing out the same card with a superior chip and possibly superior features just weeks after the launch, is not on. I doubt I'll be rebuying at a later date either.


----------



## Ashuiegi

Even the graphics score; I went from 9680 to 8000 just by installing on an HDD instead of a PCIe RAID SSD


----------



## sugarhell

Quote:


> Originally Posted by *Ashuiegi*
> 
> Even the graphics score; I went from 9680 to 8000 just by installing on an HDD instead of a PCIe RAID SSD


Corruption. It's not possible on a GPU-limited bench to gain 1000 points by changing to a RAID SSD.


----------



## theilya

I'm very unhappy with my MSI 280x CF.

1. Can't OC using Afterburner, Asus GPU Tweak, or Trixx. Every time I restart my computer and test it out, the clock goes back to stock; even though Afterburner is showing it as overclocked, the monitoring and GPU-Z are showing it running at 1020 (stock).
The only program that keeps the overclock settings after a restart is AMD OverDrive.

2. With both coolers on 70%, one card runs at 80c under load and the other at 70c.

3. Can't OC the voltage either.

4. Max stable OC I was able to get is 1135/1650.

I got 30 days on my refund.

Going to wait for a non-ref 290x or 780 Ti


----------



## eTheBlack

Quote:


> Originally Posted by *sugarhell*
> 
> Corruption. Is not possible on a gpu limited bench to get 1000 pts by changing to raid ssd.


Did he mean OS install or driver install? Either way, it doesn't make any sense.


----------



## eTheBlack

Quote:


> Originally Posted by *theilya*
> 
> I'm very unhappy with my MSI 280x CF
> 
> 1. Can't OC using Afterburner, Asus GPU Tweak, or Trixx. Every time I restart my computer and test it out, the clock goes back to stock; even though Afterburner is showing it as overclocked, the monitoring and GPU-Z are showing it running at 1020 (stock).
> The only program that keeps the overclock settings after restart is AMD overdrive.
> 
> 2. with both coolers on 70% 1 card runs at 80c under load and another 70c
> 
> 3. cant OC voltage either
> 
> 4. max stable OC i was able to get is 1135/1650
> 
> I got 30 days on my refund.
> 
> Going to wait for non ref 290x or 780ti


You must turn off OD if you are using any other OC software.

What load? Game benchmarks or Kombustor? I get that in Kombustor too. Isn't that normal? And that OC is quite good.


----------



## sidp96

So with my 3-slot ASUS 280x I've been able to get 1180MHz on 1.22v (thoughts?), and for some reason MSI AB will let me set higher than 1.22 but Kombustor says 1.22.

Also, whenever I try to OC my VRAM the slightest bit, even with upped volts, the entire PC freezes on a gray screen with vertical black lines and I have to restart


----------



## theilya

Quote:


> Originally Posted by *eTheBlack*
> 
> You must turn off OD if you are using any other OC software.
> 
> What load? Game benchmarks or Kombustor? I get that in Kombustor too. Isn't that normal? And that OC is quite good.


Yeah, I turn off OD.

Pretty much every time I restart my computer, Afterburner shows that my clock is at 1110/1600; however, when I run a benchmark such as Valley and then check GPU-Z and Afterburner monitoring, it shows that my GPU clock is 1020 (stock).

Then I have to go into Afterburner, move the slider a notch left or right, hit apply, and run Valley again. Only then will GPU-Z show it running at 1100/1600.

I have apply-on-startup and everything on.

Maybe it's Windows 8.1...


----------



## grimgnaw

Quote:


> Originally Posted by *sidp96*
> 
> So with my 3-slot ASUS 280x I've been able to get 1180MHz on 1.22v (thoughts?), and for some reason MSI AB will let me set higher than 1.22 but Kombustor says 1.22.
> 
> Also whenever I try to OC my VRAM the slightest bit even with upped volts the entire PC freezes on a gray screen with vertical black lines and I have to restart


1180 on 1.22 sounds very nice. You can go up to 1.3V and see how much you can get the core speed. BTW, I used GPU Tweak for this, though I think MSI AB will do the job if you already have it installed.


----------



## leyzar

Quote:


> Originally Posted by *sidp96*
> 
> So with my 3-slot ASUS 280x I've been able to get 1180MHz on 1.22v (thoughts?), and for some reason MSI AB will let me set higher than 1.22 but Kombustor says 1.22.
> 
> Also whenever I try to OC my VRAM the slightest bit even with upped volts the entire PC freezes on a gray screen with vertical black lines and I have to restart


Again, sounds like a good OC to me; dunno why you have that issue though... I would put my money on the software side of things, as it's not properly optimized for this line of GPUs yet... then again, it is only a rebranded 7970 GHz...


----------



## leyzar

Quote:


> Originally Posted by *smaudioz*
> 
> Well I returned mine. In my opinion what AMD are doing bringing out the same card with a superior chip and possibly superior features just weeks after the launch is not on. I doubt I'll be rebuyi9ng at a later date either.


Yeah, that is not very pleasant... I'm going to wait to see what happens with the 280x XTL chips... it's either that or a 290 non-X for me


----------



## sugarhell

The 280x already uses XTL chips...


----------



## -Droid-

If they are doing this silently it means that not much will change. Maybe a few watts lower consumption, just a bit more OC headroom. It could be for the worse, too. Locked voltage, for example. Worse chips for OC.

I wouldn't hold my breath.


----------



## eTheBlack

Quote:


> Originally Posted by *theilya*
> 
> yeah, I turn off OD.
> 
> pretty much everytime I restart my computer afterburner says shows that my clock is at 1110/1600 however when I run a benchmark such as valley and then check gpu-z and afterburner monitoring it shows that my gpu clock is 1020 (stock).
> 
> Then I have to go into afterburner and move slider notch left or right and hit appply and run valey again. Only then the gpu-z will show that it running at 1100/1600,
> 
> I have apply on start up and everything on.
> 
> Maybe its windows 8.1...


Strange, could be a bug or something, or as you said, Win8.1.

I have Win7 and I didn't notice any issues, well, except for some kind of "green" blur effect in BF3, but I think that might be a beta 3 driver problem; will have to try out beta 6.


----------



## leyzar

Quote:


> Originally Posted by *-Droid-*
> 
> If they are doing this silently it means that not much will change. Maybe a few watts lower consumption, just a bit more OC headroom. It could be for the worse, too. Locked voltage, for example. Worse chips for OC.
> 
> I wouldn't hold my breath.


Valid point... tbh I do not know what to make of this, as we do not have clear info on the differences... just a lot of maybes.
We even have people saying their BIOS says XTL... so... no concrete info yet...
It would be good for AMD to come out and clear up some of the questions regarding this matter


----------



## miklkit

Quote:


> Originally Posted by *theilya*
> 
> I'm very unhappy with my MSI 280x CF
> 
> 1. Cant OC using afterburner, asus gpu tweak or trixx. Everytime I restart my computer and test it out the clock goes back to stock even though afterburner is showing it as overclocked, the monitoring and gpu z is shpwing it as running at 1020 (stock).
> The only program that keeps the overclock settings after restart is AMD overdrive.
> 
> 2. with both coolers on 70% 1 card runs at 80c under load and another 70c
> 
> 3. cant OC voltage either
> 
> 4. max stable OC i was able to get is 1135/1650
> 
> I got 30 days on my refund.
> 
> Going to wait for non ref 290x or 780ti


1. When I was running 2x 6970s I had the same problem. It was AMD CCC doing it. I only used it for presets and it still interfered with MSI AB. I saved the OC to a profile in AB and on startup made sure it was running by clicking "apply".

2. Isn't it normal for the top card to run hotter than the lower card? Mine did.

3 & 4. When I run the sliders over, the voltage goes up, but I have not messed with OCing it yet, so I do not know if the mV stays up under load.


----------



## eldukay20

Quote:


> Originally Posted by *sidp96*
> 
> So with my 3-slot ASUS 280x I've been able to get 1180MHz on 1.22v (thoughts?), and for some reason MSI AB will let me set higher than 1.22 but Kombustor says 1.22.
> 
> Also whenever I try to OC my VRAM the slightest bit even with upped volts the entire PC freezes on a gray screen with vertical black lines and I have to restart


I managed 1200/1550 at stock voltage, but anything higher on my memory causes the grey screen as well.


----------



## brazilianloser

Well, even though I upgraded from a 660 Ti, which is not that far behind, my Sapphire Vapor-X 280X smokes the poor thing out of the water while powering 3 monitors in Eyefinity and a TV above. Very pleased... but at the same time, other than AC3 I don't really have any graphically intense games. Either way, unless you upgraded from a card that sits on equal ground with most 280Xs, you are probably going to be pleased... and for the most part, as long as my games play nice and smooth on my setup, I really don't care if they release a different version of it that will give me maybe a few % more performance... if you are waiting on that, you might as well dish out the money for the 290X, or wait for the 290, and even wait for when they start putting out the ones with custom cooling. Because even though the 290X is a beast, those temperatures from most reviews I have seen are scary if you are going to air cool the little monsters.

Here is my setup all being powered by a single Sapphire Vapor X 280X


----------



## theilya

I figured I should expand on the issues I'm having with MSI 280x CF.

*Voltages:*

I cannot adjust the voltages at all. The top card runs at 1.2v while the bottom one runs at 1.12v. I have 1.3v set in Afterburner. (I tried Trixx and Asus GPU Tweak, same results)

*Core/mem Clock:*

Afterburner, Trixx and Asus GPU all give me the same issue of not saving OC after the computer reboot.

For example, my stock settings are 1050/1500.
I set it to 1100/1600 and run Valley/Kombustor with GPU-Z monitoring the clocks.
It would stay at above-mentioned clock until I reboot the computer.

AFTER THE REBOOT:

The sliders in Afterburner would still show 1100/1600; however, when I run Valley/Kombustor, GPU-Z/monitoring shows both cards running at their stock 1050/1500 even though the sliders are still at 1100/1600. To make it work, I have to move a slider either left or right and click apply again.

CCC: Is the only program that will actually run the 1100/1600 before/after the reboot.

PS: I have OD off while trying the other methods.

*Temps:*

Top card maxes out at 82c while bottom one stays at 70-72c.
Fans 60%-70%



Any help would be appreciated.


----------



## sugarhell

Fail. You are using the official method. Read my sig.

I am pretty sure AB doesn't work with the latest beta


----------



## theilya

So the unofficial overclock will unlock my voltages?

Will it fix the other issues?
I'll give it a shot when I get home tonight.

I'm using the beta 6 drivers


----------



## theilya

Quote:


> Originally Posted by *miklkit*
> 
> 1, When I was running X2 6970s I had the same problem. It was AMD CCC doing it. I only used it for presets and it still interfered with MSI AB. I saved the OC to a profile in AB and on startup made sure it was running by clicking "apply".
> .


Yeah, that's what I'm doing.
Every time I reboot I load the profile and click apply. It won't work otherwise.

How did you fix this issue?


----------



## sidp96

Quote:


> Originally Posted by *eldukay20*
> 
> i managed 1200/1550 at the stock voltage but any higher on my memory causes the grey screen as well.


Weird, I'll have to look into this more; as soon as I go over 1550 it just doesn't agree. I'm sure a VRAM OC doesn't do much at all in terms of game performance, but it would be nice to be able to do it for bragging rights









Also, that's a good OC. I get around 1150 at stock, but over that it required a little more volts.


----------



## miklkit

Quote:


> Originally Posted by *theilya*
> 
> yeap, that what im doing.
> Every time i reboot I load the profile and click apply. It wont work otherwise.
> 
> How did you fix this issue?


I had an MSI and an ASUS with the MSI on top. A few months ago the ASUS died, so I was just limping along on one card until I got this 280X, which appears to be as good as or better than the two 6970s.

I never did fix the problem. I did try Radeon Pro which works great for new games.
http://www.radeonpro.info/


----------



## EnToxication

Quote:


> Originally Posted by *theilya*
> 
> I'm very unhappy with my MSI 280x CF
> 
> 1. Cant OC using afterburner, asus gpu tweak or trixx. Everytime I restart my computer and test it out the clock goes back to stock even though afterburner is showing it as overclocked, the monitoring and gpu z is shpwing it as running at 1020 (stock).
> The only program that keeps the overclock settings after restart is AMD overdrive.
> 
> 2. with both coolers on 70% 1 card runs at 80c under load and another 70c
> 
> 3. cant OC voltage either
> 
> 4. max stable OC i was able to get is 1135/1650
> 
> I got 30 days on my refund.
> 
> Going to wait for non ref 290x or 780ti


For voltage adjustment, there are workarounds. A fellow member who was PMing me successfully hit 1200+ core on his MSI with some patience and work. I barely helped him.

For my OC, I hit 1150/1700 at stock voltage; I'm going to play around with it, possibly next week when I have more time. Considering that they only hit 43C on WC at full load, I think the temperature problem is with the air coolers; load temps are way lower than the GTX 580s I had.

You basically have two massive heaters blowing hot air in a limited space, so either crank the fans up to 100% with some fresh airflow to the area, or you are going to get massive heat buildup in a specific spot.


----------



## Wappo

So I got my Asus 280x and everything works perfectly while gaming. The only issue I'm having is that when it is downclocked I get some weird horizontal flickering while browsing or watching videos. Not sure yet if it's a 120hz monitor problem, a Windows 8.1 driver issue, or a defective card. It happens randomly, maybe once every 5-10 minutes, so I'm not sure if it's the 120hz issue that the 7970s had, but it does look exactly the same.


----------



## andydviking

Hi guys,

I have a quick question for you guys with the R9 280X card. I am wanting to get the ASUS direct cu2 card and wondering if it has a UEFI compliant bios so I can get fast boot/secure boot running etc.


----------



## ducknukem86

Hey, just wanted to tell you that I installed my Sapphire 280x Vapor-X today. MSI Afterburner and Trixx let me change the voltage slider. Does that mean that the card is voltage unlocked?


----------



## Tobiman

Quote:


> Originally Posted by *ducknukem86*
> 
> Hey, just wanted to tell you that i installed my Sapphire 280x Vapor x today. Msi afterburner and Trixx let me change the voltage slider. Does that mean that the card is voltage unlocked?


Do you have the fully unlocked MSI ab? If you do, you can have a go at it and see if the voltage does change.


----------



## eternal7trance

GPUz will also show voltage as well.

I was able to use the slider in MSI Afterburner but my voltage never changed. Thankfully iTurbo works on my card though


----------



## theilya

Is there a way to prevent CCC from starting up?

I feel it's interfering with AB.


----------



## Tobiman

Quote:


> Originally Posted by *theilya*
> 
> is there a way to prevent CCC from starting up?
> 
> i feel its interfering with AB


Turn off Override in Catalyst. That should help with the interference.


----------



## theilya

It's off


----------



## rdr09

Quote:


> Originally Posted by *theilya*
> 
> It's off


If it's one of the startup programs, you can disable it in msconfig: type msconfig in the search bar, go to the Startup tab, and uncheck Catalyst.
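If you'd rather script it than click through msconfig, Catalyst's autostart entry normally lives in the machine-wide registry Run key. A sketch, assuming the usual value name `StartCCC` pointing at `CLIStart.exe` (check the query output first, since the name can differ between driver versions):

```shell
:: List machine-wide startup entries; Catalyst usually appears as "StartCCC".
reg query "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Run"

:: Back up the key before touching it, then remove the Catalyst entry.
reg export "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Run" run-backup.reg
reg delete "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Run" /v StartCCC /f
```

This only stops the CCC tray app from launching at boot; the display driver itself keeps running, and importing the exported .reg file restores the entry.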


----------



## Crowe98

Reasons to purchase the MSI R9 280X Gaming over the ASUS R9 280X DirectCU II TOP, and vice versa.

I can afford both, and I don't want any other cards; for example, I don't like anything from Sapphire despite their benefits, so please just stick to the two above.

Also, one of the reasons I'm considering the MSI card is that it will match my MSI GD65 Z77 motherboard, but I need more reasons to get it over the ASUS card.

All help is appreciated, thanks.


----------



## navit

Asus, hands down, and this is coming from someone who has owned the last two generations of Lightning cards.


----------



## Crowe98

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *navit*
> 
> Asus hands down and this coming from one who has owned the last two gen. Lightning cards.






What makes the ASUS card better than the MSI cards?


----------



## bencher

Quote:


> Originally Posted by *Crowe98*
> 
> 
> What makes the ASUS card better than the MSI cards?


I wouldn't choose the Asus; you would be tied to the horrible GPU Tweak program.

Also, the voltage is locked unless you flash the BIOS.

With MSI, everything is available in Afterburner.


----------



## Crowe98

Quote:


> Originally Posted by *bencher*
> 
> I wouldn't choose the Asus, you will be tied to the horrible gpu tweak program.
> 
> Also voltage is locked unless you flash bios.
> 
> With MSi everything is available in Afterburner.


So I wouldn't be able to use Afterburner at all with the ASUS card? What are the noise levels of the cards like? Which one is quieter?


----------



## theilya

Quote:


> Originally Posted by *rdr09*
> 
> if it is one of the startup programs . . . you can disable it in msconfig. type msconfig in search bar and go under Startup and uncheck Catalyst.


Thanks

******* OCC was not letting me OC anything.
I have Overdrive off and everything, but whenever I activate OCC it resets my clocks to stock...


----------



## navit

Quote:


> Originally Posted by *Crowe98*
> 
> 
> What makes the ASUS card better than the MSI cards?


Well, for the cards in question, the Asus is super quiet, and even in CrossFire it tops out at 60°C. The MSI cards tend to run louder and hotter. Don't get me wrong, I love MSI cards; as I said, I've had a 6970 Lightning and still run my 7970 Lightning in CrossFire with my Asus. I've really been thinking about getting rid of the Lightning in favor of another Asus.
I guess it all comes down to personal preference more than anything else.


----------



## theilya

This is weird: I wasn't able to see the voltage being raised using AB, but it worked in TriXX.

I have the MSI card...

I used unofficial OC mode.


----------



## steelkevin

Who just said the DC2T was voltage locked?
I highly doubt that, because I used AB and my GPU only crashed at 1250MHz/1800MHz and was fine at 1200MHz/1800MHz. I would be very surprised if it could pull those clocks off at stock voltage (I didn't do any proper testing, though, just FurMark).

And why would EK, out of all these 280X cards, choose to make waterblocks for the DC2T if it were voltage locked? That simply wouldn't make sense.

Edit: what's this "unofficial OC" I've seen come up several times in the last ten pages?


----------



## MooseHead

Quote:


> Originally Posted by *steelkevin*
> 
> Who just said the DC2T was voltage locked Oo ?
> I highly doubt that because I used AB and my GPU only crashed at 1250MHz / 1800MHz and was fine @ 1200MHz / 1800 MHz. I would be very surpised if it could pull those clocks off with stock voltages (I didn't do any proper testing though, just FurMark).
> 
> And why would EK, out of all these 280X cards, chose to make waterblocks for the DC2T if it was voltage locked ? That simply wouldn't make sense.
> 
> Edit: what's this "unofficial OC" I've seen come up several times in the last ten pages ?


Ooh, how I hope you're right! Mine is coming in on Monday.


----------



## piranha

I currently own one


----------



## grimgnaw

The DC2T is unlocked up to 1.3V, and it also works with MSI AB.

However, the temps are not as low as you say. Maybe it's the ambient temperature inside my case (31-33°C at idle), but during a BF3 session it goes up to 75°C with the fan at 45% (while the motherboard temp sensor says 47°C).

In the same case (HAF 922) I have a 2500K at 4.5GHz with 1.32V, and it goes as high as 60-62°C.


----------



## piranha

I tried OCing the card to 6200MHz memory and 1200 core, but it didn't improve my score; it was actually worse than OC mode. Otherwise, I was deciding on adding a second one in CrossFire. In 3DMark Fire Strike I got 7051; I don't know if that's any good.
8350 @ 4.3
MSI 970 G43
Patriot 2x 4GB dual channel
Crucial M500 240GB SSD
MSI R9 280X in OC mode
H80 cooler (CPU runs @ 37°C)

I don't get any heat issues.


----------



## piranha

Oh, and my case has six regular 120mm fans, and the H80 has two 120mm fans in push/pull.


----------



## Ashuiegi

If you want a 1250-1300MHz core / 1850MHz memory OC, you should have gone for the Matrix Platinum instead of the DCUII TOP; most of them can do it and keep the card under 70°C with the stock cooler.


----------



## leyzar

While we're on the topic of OC:
what would you guys consider a "safe" 24/7 OC for a 280X that you plan on keeping for, let's say, two years?


----------



## Ashuiegi

I'm happy with 1.25-1.3V on mine, but it's under water and stays under 50°C at load.


----------



## steelkevin

Alright, so the Asus DC2T 280X is indeed voltage locked in Afterburner (at least in the latest version, even though it looks otherwise). Here are a couple of screenshots showing that the voltage didn't change even though I set it higher in AB:




But this actually makes me happy, because it means I've got a pretty good card and will probably be able to fully benefit from putting it under water.








Edit: unless there is no way of unlocking the voltage :'(


----------



## Ashuiegi

Did you try GPU Tweak? It seems the voltage on my two Asus cards can only be adjusted with GPU Tweak.


----------



## steelkevin

Quote:


> Originally Posted by *Ashuiegi*
> 
> did you try with gpu tweak ? , my 2 asus card can only be ajusted by gpu tweak for voltage it s seems


It looks like it works. Thank you.


----------



## Ashuiegi

No problem. I tried for hours to make it work with Afterburner, because I preferred its custom fan profile feature, but I couldn't make it work on either my GTX 670 DCUII TOP or my HD 7970 Matrix...


----------



## sugarhell

Asus cards only work with GPU Tweak...


----------



## grimgnaw

Quote:


> Originally Posted by *steelkevin*
> 
> It looks like it works
> 
> 
> 
> 
> 
> 
> 
> . Thank you.


Wow 1250 core and 7200 memory, that's very nice. Congrats.


----------



## steelkevin

Quote:


> Originally Posted by *grimgnaw*
> 
> Wow 1250 core and 7200 memory, that's very nice. Congrats.


It's probably not stable; it was just for testing purposes, and I won't be seriously overclocking until I get my waterblock (I'm waiting for EK to release them). Plus, the Vcore was at its highest possible setting, 1.3V, so even if it's stable I guess it's not all that impressive, but thanks.


----------



## grimgnaw

I can't get it to run Fire Strike without small artifacts at 1220 and 6900 at maximum voltage, so yours looks pretty awesome.


----------



## Ashuiegi

Max OC on the 7970 Matrix; I get no artifacts in Fire Strike with these settings.

I can do a run at 1300/1850 with the voltage at 1.35V, but I get artifacts and a lower score...


----------



## steelkevin

Quote:


> Originally Posted by *grimgnaw*
> 
> I can't get it to run firestrike without small artifacts at 1220 and 6900 at maximum voltage, so yours looks pretty awesome.


Right now I'm testing: 1170MHz | 7200MHz | 1.2V

No issues with either FurMark or Kombustor. Just ran a Heaven bench without any artifacts or problems of any kind. Now launching Valley (never tried that one either ^^).

EDIT: all good with Valley on Extreme HD too.

EDIT 2: http://www.3dmark.com/3dm11/7383963
No issues there either, except that it scored 204 points less than at stock...
EDIT 3: huh, weird that I got 9110 yesterday at stock and today I'm getting 8400.


----------



## BackwoodsNC

Quote:


> Originally Posted by *theilya*
> 
> this is weird I wasnt able to see voltage being raised using AB, but it worked in Trixx.
> 
> I have MSI card...
> 
> used unofficial OC


I think voltage control is broken in AB for whatever reason. Does anyone besides him and me have the same issue with the MSI 280X?


----------



## Durvelle27

If you want to be added to the list, please PM me the required info, as the thread moves pretty fast and it's tricky to keep up.


----------



## miklkit

I just did a quick and dirty test on this MSI R9 280X: maxed the volts out, ran the clocks up some, and ran the 3DMark demo.
The volts went up and stayed up. It was also too much, and it black-screened during Fire Strike.


----------



## steelkevin

Quote:


> Originally Posted by *Durvelle27*
> 
> If you want to be added to the list please pm me the required info as the thread moves pretty fast and its tricky to keep up.


I would do that, but the thing is the OP seems incomplete. There's no mention of how to join the club or anything :/

Alright, well, I think it's safe to say mine is stable @ 1170 | 7200 at stock voltage.
I've run Heaven, Valley, 3DMark11 twice, and now 3DMark 2013 without a single issue.

Btw, I'd never stayed in front of my monitor long enough to see Fire Strike before. Loved it!

EDIT: (just realized I forgot to re-overclock after a 3DMark11 test)
Here's the 3DMark 2013 link btw: http://www.3dmark.com/3dm/1494866

EDIT 2: something's definitely wrong. Before I had to restore and reinstall the Beta 6 driver, I was getting better scores on both 3DMark11 and 3DMark 2013 than I am now, and the card wasn't even overclocked then.
This was my 3DMark 2013 score yesterday morning: http://www.3dmark.com/3dm/1484921
Compare the two scores; nothing has changed hardware-wise between them.


----------



## BackwoodsNC

Quote:


> Originally Posted by *miklkit*
> 
> I just did a quick and dirty test on this MSI R9 280X and maxed the volts out, ran the timings up some, and ran 3DMark Demo.
> The volts went up and stayed up. It was also too much and it black screened during firestrike.


Umm... that's so weird. I wonder why mine won't apply GPU volts? It applies memory volts but not core volts.


----------



## grimgnaw

Here's a run with Beta 6; it doesn't seem to cause any problems for me. Your score is probably lower than mine due to the CPU.


----------



## steelkevin

Nope, I just noticed there were still three Nvidia entries in the Windows uninstaller. I couldn't remove them with CCleaner, did a bit of research, and found out about Driver Sweeper. Ran it once, rebooted, and fixed the registry issues with CCleaner. That little program just made it onto my must-have-on-a-USB-stick list ^^. I'd run Fire Strike just to compare our scores, but I refuse to pay for a bench, so I'd have to rerun the entire 3DMark suite, and that takes way too long.

I'll run 3DMark when I go eat.

EDIT: well, actually, I have no idea what CPU you have, since it isn't in your signature. But just to clarify: I wasn't saying that a better CPU wouldn't net you a higher score; it obviously would. I was just saying that I lost over 1,000 points between yesterday and today because I hadn't removed Nvidia's drivers properly.
EDIT 2: I want that Asus X GPU-Z now









EDIT 3: good grief, Nvidia drivers and whatnot are harder to remove than viruses!

EDIT 4:

Alright, so after searching all over the place for any Nvidia-related files, I finally removed almost all of it from my computer (almost, because I'm sure there are more Nvidia files hidden somewhere) and ran 3DMark at 1170MHz | 7200MHz | 1.2V:

http://www.3dmark.com/3dm/1495937

As you can see, the scores now look just fine. I'm only 200 points below you, which, depending on your CPU, would be normal.
I'm way ahead of similar systems on Fire Strike too.


----------



## sidp96

So after seeing other people's GPU Tweak, I noticed something weird about mine: for some reason my max voltage and clock limits don't go as high as they should, and I can't raise the voltage past 1.22V.


----------



## steelkevin

Quote:


> Originally Posted by *sidp96*
> 
> So after seeing other people's GPU Tweak I noticed something weird about mine, for some reason my max voltages and clocks do not go as high as they should, and when I want to raise the voltage past 1.22v I can't.


I can slide all the way up to 1.3V, but Kombustor never shows anything other than 1.2V. I guess Kombustor's just buggy. And GPU-Z always shows my Vcore 0.1V lower than what I set it to.


----------



## BackwoodsNC

Quote:


> Originally Posted by *steelkevin*
> 
> I can slide up all the way to 1.3V but Kombustor never shows anything else than 1.2V. I guess Kombustor's just buggy. And GPU-Z always shows my Vcore to be 0.1V lower than what I set it to.


Well, if you set it to 1.3V then GPU-Z says 1.2V, right? Are you sure it's even changing the voltage? I know I can't change the core voltage unless I modify the BIOS.


----------



## sidp96

Quote:


> Originally Posted by *steelkevin*
> 
> I can slide up all the way to 1.3V but Kombustor never shows anything else than 1.2V. I guess Kombustor's just buggy. And GPU-Z always shows my Vcore to be 0.1V lower than what I set it to.


Yeah, I'm going to try out some different programs and see what they say. Someone mentioned that MSI AB doesn't work with ASUS GPUs, but mine seemed to be doing fine while using it; I could change voltage, clocks, etc. But because of Kombustor, I'm not sure the voltage ever actually changed.


----------



## steelkevin

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Well if you set it to 1.3 then gpu says 1.2, right? Are you sure it is even changing the voltages? I know I can't change core voltage unless I modify the BIOS.


At stock voltage GPU-Z displays 1.1V or less; at 1.3V it displays 1.2V.
That's how I know it's being changed.


----------



## sidp96

Quote:


> Originally Posted by *steelkevin*
> 
> @stock voltage GPU-Z displays 1.1V or less. @1.3V GPU-Z displays 1.2V.
> Which is how I know it's being changed.


He's right; it also shows the change in HWMonitor. It's just in Kombustor that it stays at 1.22V for me.


----------



## Durvelle27

List Updated


----------



## steelkevin

Quote:


> Originally Posted by *Durvelle27*
> 
> List Updated


Now update the OP. What you need is:
-How to join
-The full lineup (or remove the partial Sapphire lineup)
-An updated driver section; we're on Beta 6 now
-A Google spreadsheet for overclocks and/or members

etc.


----------



## Modus

From my experience, they tend to overheat. I had the GTX 460 Cyclone from MSI, which was OK, but the GTX 570 Twin Frozr II was terrible.


----------



## ducknukem86

Hey there, I've been messing around with the TriXX utility and my Vapor-X 280X, testing different core clock and memory settings. To test for stability I ran the FurMark burn-in test and also the Unigine Heaven and Valley benchmarks without artifacts. I also found a tool on Google called OCCT to check for errors on the GPU, and I get errors when I run its test. The question is: how reliable is that test, given that I'm not seeing artifacts or stability problems with the other tests?

Edit: I adjusted the voltage to 1.3V, and so far I haven't gotten any more errors in OCCT. By the way, I can now confirm the Sapphire 280X Vapor-X has its voltage unlocked.


----------



## kpo6969

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125475

Gigabyte 280X

Can this description be trusted?
Quote:


> AMD RADEON R9 280X GPU *(Tahiti XTL)*


----------



## smaudioz

Quote:


> Originally Posted by *Modus*
> 
> from my experience they tend to overheat. I had the GTX 460 Cyclone from MSI which was ok but the GTX 570 Twinfrozr 2 was terrible.


I think it's from the Twin Frozr III onward that they've really improved. The Twin Frozr II was also said to have issues with fans dying, so it wasn't a great cooler, and I made sure I got the III when I bought my 570. I'm sure there have been Twin Frozr cards with problems no matter which version was used, but you'll get that with all different types of cooler. People go on about how good the ASUS DirectCU is as well, but you'll also find many people who say it's rubbish because it didn't work well on the card they had. Apparently they're also better on Nvidia cards than on AMD cards.
Quote:


> Originally Posted by *kpo6969*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125475
> 
> Gigabyte 280X
> 
> Can this description be trusted?


I suppose it depends on whether that article saying it will be rolled out in a few weeks' time is to be trusted or not.


----------



## steelkevin

I have no idea why, but I was sure it was the 770 MSI Gaming edition that was badly designed and had heat issues.
Turns out it's the 280X Gaming edition that I'd read about.

Source
Quote:


> Temperatures during gaming are kind of high. We measured these temperatures with an actual game, not a stress test. The card will not go beyond 80°C when a stress test (not even Furmark) is used, because it throttles as the voltage regulator circuitry gets too hot.


----------



## smaudioz

It was a BIOS issue, apparently, and it got fixed. Still, it's going to put a lot of people off buying it; it seems pretty careless of MSI if it was a widespread issue affecting all of them. Apparently, though, review cards are sent to reviewers by AMD themselves, not by the AIB partners, so MSI wouldn't have been able to check that sample before the review site received it. Maybe this is the reason it's one of the cheapest 280X cards: MSI lowered their MSRP to get people to keep buying them after the heat issues were discovered?


----------



## Krisuuu

Hi, overclockers!

My current configuration is a 2500K (not OC'd), a 6850, a CX500W PSU, an HDD, an SSD, 8GB of RAM, etc.
I'm considering buying a new graphics card and have been thinking about the Asus 270X.
A few questions:
Will my PSU be enough for the 270X?
Is the Asus 270X worth buying? I've read it's probably the best card in the ~200€/$/£ price range.

Thanks a lot for any answers!


----------



## piranha

I don't get heat issues, and the MSI is the same price as the other cards, except for the Vapor-X.


----------



## eternal7trance

Quote:


> Originally Posted by *Krisuuu*
> 
> Hi overclockers!
> 
> My current configuration is 2500k (not oc'd), 6850, CX500W, hdd, ssd, 8gb ram etc.
> I've considered of buying a new graphics card. I have thought the Asus 270x.
> Few questions:
> Will my psu be enough for 270x?
> Isn't the Asus 270x worth buying? I have read it is probably the card in a ~200€/$/£ price range.
> 
> Thanks a lot for answers!


For brands, Gigabyte, Sapphire, HIS, and ASUS are fine. I'm not sure if the Gigabyte one is voltage unlocked.

The PSU you have is fine, and you will still have room for overclocking.


----------



## dade_kash_xD

Anybody running 280Xs in CrossFire having issues with dropped frames in games? Most notably, I get FPS drops into the low 30s in Far Cry 3, and in BF3 I get drops into the low 70s. Both games are at Ultra, 1080p.


----------



## navit

Not me; in BF3 I never seem to drop below 90 at any given time.


----------



## bluedevil

Why do I want this card?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125476


----------



## Durvelle27

Updated OP


----------



## MooseHead

Quote:


> Originally Posted by *bluedevil*
> 
> Why do I want this card?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125476


I wish the 280x version looked like this. C'mon Gigabyte!


----------



## brazilianloser

Man, now that Nvidia has announced their 33% price drops... it puts the 770 at the same price as some of the 280Xs, but with three current games for free... Come on, AMD, where are our free games??? If I don't see any in about two weeks, while I'm still under the return policy, I'll be trading in my 280X for a 770 just because of the games.


----------



## sidp96

Quote:


> Originally Posted by *brazilianloser*
> 
> Man now that Nvidia announced their 33% price drops... it puts the 770 at the same price as some of the 280x but with 3 current games for free... Come on AMD where is our free games??? If I don't see any in about two weeks since I will be under the return policy still, I will be trading in my 280x for a 770 just because of the games.


Do you think they'll give current 280X owners free games? I'm kind of bummed about this; I definitely feel where you're coming from.


----------



## steelkevin

In all honesty, with the time EK is taking to release the 280X DC2T blocks, if I were ordering my new GPU today I'd get the 770 DCUII card (300€ now, just like my 280X was) and its waterblock, since the block is already available and, as you said, the card comes with three free games.
I was really expecting AMD to give us BF4 for free, but the EU release is on the 31st and I'll need time to preload the game, so I just pre-ordered from Fast2Play for 33€ with the DLC.

AMD really needs to do something to stay on top. The 770 custom cards are now 270-300€, just like the 280X ones, except Nvidia is giving away three fairly recent or not-yet-released games.
What happened to AMD being the cheapest and giving away loads of free games xD?

If Mantle turns out to be worthless, I'll be mad.


----------



## brazilianloser

Quote:


> Originally Posted by *sidp96*
> 
> Do you think they will give the current 280x owners free games? I'm kinda bummed about this, I definitely feel where you're coming from


If they don't, with the holidays around the corner, they'll no longer have the advantage over Nvidia, which means fewer sales... just one man's opinion, though. My wife was going to get me a second Sapphire 280X for Christmas, but as soon as I read the news this morning I told her to hold off on the purchase, since I might change my mind if AMD doesn't make a move. I'm only debating this because I have up to 30 days to return the card... I'll probably get charged restocking fees, but since I plan on getting two it doesn't really matter, and the three free games would more than make up for it. I mean, after taxes you're looking at almost an extra 200 bucks of value while getting a middle finger from AMD. But hey, Nvidia said they weren't dropping prices, and then they did after seeing the threat from the well-priced AMD cards... now let's hope for the same from AMD.


----------



## sidp96

Quote:


> Originally Posted by *brazilianloser*
> 
> If they don't' with the holidays around the corner, they will no longer have the advantage over Nvidia = less sales... just a one man opinion thou. The wife was going to get me a second Sapphire 280x for Christmas but as soon as I read the news this morning I told her to hold up on the purchase since I might be changing my mind if AMD does not make a move. Only debating this thou since I do have up to 30 days to return the card... probably going to get charged restocking fees but since I plan on getting two it does not matter really and the three free games would make up for it big time. I mean after taxes you are looking at almost a extra 200 bucks of value while you get a middle finger from AMD. But hey NVIDIA said they weren't dropping the prices and they did after seeing the danger from the greatly priced AMD cards... now lets hope for the same thing from AMD.


I too am under the 30-day return policy and am definitely giving the green side some more thought now. Imagine a ~$250 280X causing Nvidia to drop prices as well, creating WWIII in prices... one can only dream.


----------



## brazilianloser

Quote:


> Originally Posted by *sidp96*
> 
> I too am under the 30 day return policy and definitely giving the green side some more thought now, imagine a ~$250 280x, then causing nvidia to also drop, creating WWIII in prices
> 
> 
> 
> 
> 
> 
> 
> , one can only dream


Well, I doubt that, but if AMD just gives us the Never Settle stuff they have on some cards, or BF4, I'm sure the competition will be intense... they just need to give us something, or many like myself will give up on their good attempt at winning gamers away from Nvidia. I'm also hoping maybe they'll drop the 4GB cards a little more, though. Still 400-something :/


----------



## brazilianloser

I'm looking around for some benchmarks that put the GTX 770 in SLI against the 280X in CrossFire, if anyone has some within easy reach...


----------



## Tobiman

Quote:


> Originally Posted by *brazilianloser*
> 
> I am looking around for some benchmarks that puts the 770GTX in SLI vs the 280X in Crossfire if anyone got some at easy reach...


http://www.guru3d.com/articles_pages/geforce_gtx_770_sli_review,4.html
You can compare with 7990.


----------



## brazilianloser

Quote:


> Originally Posted by *Tobiman*
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_770_sli_review,4.html
> You can compare with 7990.


Thank you. I'm studying for an exam and didn't want to hunt down reviews on Google, so that really helped.
And it's nice that it has a three-monitor section; exactly what I needed.


----------



## leyzar

Wait a second here...
An Nvidia price drop of 33%? Who? What? Where?


----------



## black7hought

Finally received my XFX Radeon R9-280X DD.


Spoiler: Warning: Spoiler!


----------



## brazilianloser

Quote:


> Originally Posted by *Tobiman*
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_770_sli_review,4.html
> You can compare with 7990.


Thank you... from what I can see, neither of them does all that well at 5760x1080... :/


----------



## brazilianloser

Anyway, has anyone found a waterblock that definitely fits the 280X yet? I know they said they're just reusing some existing blocks, but I want to see solid proof of ones that actually work... has anyone done theirs already?


----------



## MooseHead

Hey guys,

Just received my Asus DC2T 280X. It works flawlessly out of the box. I'm currently running FurMark to test the clocks; 1180/6800 at stock voltage seems pretty stable, but anything over 1200 so far seems to artifact. Will changing the voltage affect the artifacting at all?


----------



## steelkevin

Quote:


> Originally Posted by *MooseHead*
> 
> Hey guys,
> 
> Just received my Asus DC2T 280x. Works flawlessly out of the box. Currently running furmark testing the clocks. 1180/6800 on stock voltages seems pretty stable. Anything over 1200 so far seems to artifact. Will changing the voltages change the artifacting at all?


Hi,

I have the same card, and with similar clocks I had no issues running many benches, including FurMark and Kombustor, but BF3 showed artifacts after a bit of gameplay, and I had to reboot to get rid of them, as they were all over the screen from the moment they appeared.

Mind you, my core was at 1170MHz, so that's alright, but my memory was at 7200MHz (set randomly).

Right now I'm just running the core at 1120MHz and the memory at 6600MHz, with everything else at stock settings. I don't really need the overclock right now anyway; I'll overclock properly when I get my waterblock.

EDIT: off to try your settings in BF3 and see if my card can keep up ^^


----------



## TheImpZA

I was so close to buying the MSI R9 270X OC Gaming edition, but now that Nvidia has dropped the prices on their cards and added two or three new games, I'd rather wait and see what AMD's response is.

They should really introduce a Never Settle bundle for the R9 series cards. Throw in some slightly older games like Tomb Raider (for TressFX) and BioShock Infinite (to promote the new DLC), and then Battlefield 4 (Mantle), and I'm sold. I also read somewhere that Ubisoft is considering adding Mantle to their next-gen engines... add Mantle support to AnvilNext for AC:IV in December alongside Frostbite 3. Do it.

I really hope that Mantle doesn't end up being a good idea only on paper that dies out for lack of backing. Mantle is the reason I'm going AMD; it makes sense... right?


----------



## eternal7trance

Quote:


> Originally Posted by *MooseHead*
> 
> Hey guys,
> 
> Just received my Asus DC2T 280x. Works flawlessly out of the box. Currently running furmark testing the clocks. 1180/6800 on stock voltages seems pretty stable. Anything over 1200 so far seems to artifact. Will changing the voltages change the artifacting at all?


Sometimes. Usually more voltage lets you reach higher clocks, but it's not guaranteed.

For instance, mine lets me go to 1150 without touching anything, and if I turn the voltage up some I can reach 1250. It just depends on your card.
Quote:


> Originally Posted by *TheImpZA*
> 
> I was so close to buying the MSI R9-270X OC Gaming Edition, but now that nVidia has dropped the prices for their cards and added 2/3 new games, I'd rather wait and see what AMD's response is.
> 
> They should really introduce a Never Settle bundle for the R9 series cards. Throw in some of the slightly older games like Tomb Raider (for TressFX) and Bioshock Infinite (promote the new DLC) and then Battlefield 4 (Mantle), and I'm sold. I also read somewhere that Ubisoft are considering to add Mantle to their next-gen engines...add Mantle support to AnvilNext for AC:IV in December alongside Frostbite 3. Do it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I really hope that Mantle doesn't end up only being a good idea on paper, but because it had no backing, ends up dying out. Mantle is the reason why I'm going AMD, it makes sense...right?


I was going to, but I needed more VRAM for my 1440p resolution, so the 770 is pointless for me; the prices didn't go down much and the games suck.


----------



## Roooker

Just got my XFX R9 280X delivered today. I was super excited because I'm upgrading from a 6950. I decided to give Windows a fresh install because I also bought an SSD. So far so good: Windows is running, the network works, everything's fine.
No CCC up to this point. But after installing the 13.11 Beta 7 driver and restarting my PC, I keep getting a black screen after the Windows loading screen. Can anyone help? If I run the driver cleaner after booting into Safe Mode and restart my PC again, I can see my desktop and everything works.

Some additional information on my system:
2500K
MSI Z77A-GD65 with the newest BIOS
8GB 1600MHz RAM
120GB OCZ Vertex 4 SSD
750W OCZ PSU


----------



## mtcn77

Sapphire R7 [email protected] 1255mhz

Source
Asus R9 [email protected] 1290mhz

Source
Best in AMD Radeon overclocking, afaik.


----------



## rdr09

Quote:


> Originally Posted by *mtcn77*
> 
> Sapphire R7 [email protected] 1255mhz
> 
> Source
> Asus R9 [email protected] 1290mhz
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Source
> Best in AMD Radeon overclocking, afaik.


here is a 7950 @ 1100

http://www.3dmark.com/3dm/1382633


----------



## DizzlePro

update http://uk.pcpartpicker.com/p/1UtMu


----------



## Clement1987

Hi all,

Please help!

Just wondering if you guys can help me with the similar issue shown above that I'm experiencing.

This is my rig:

i5-4670
ASUS H87 ITX
8GB RAM
Gigabyte R9 280X WindForce OC
CM 725W PSU

I am on the latest Beta 7 AMD driver.

When I play Borderlands 2 on high, I don't get any problems. However, if I start playing BF3, I get the gray screen issue above (BF3 was on high settings).

Yesterday was the launch of BF4, so I went to try it out, and after every 10-15 minutes the game freezes/hangs and I have to end the task to kill it (BF4 was on high settings).

I'm not sure what the issue is, as this card should be able to play BF3 or BF4 on ultra.

The Gigabyte card is voltage locked and at factory settings; I've never changed anything.
Core 1100 / Memory 1500 / Voltage 1.256V

My friend with a SAPPHIRE R9 280X is also experiencing the same issue.

Thank you, thank you!


----------



## Arizonian

Quote:


> Originally Posted by *Clement1987*
> 
> Hi all,
> 
> Please help
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ----
> 
> Hi all,
> 
> Just wondering if you guys can help me with the above similar issue that i am experiencing.
> 
> This is my rig.
> 
> i5 - 4670
> ASUS H87 ITX
> 8 GB Ram
> Gigabyte R9 280X Windforce OC
> CM 725w PSU
> 
> I am on the latest BETA 7 AMD driver.
> 
> When i play Borderlands 2 on high, i do not get any problems. However if i start to play BF3, i will get the above gray screen issue.
> BF3 was on high settings.
> 
> Yesterday was launch of BF4 so i went to try it out and after every 10 - 15 mins, the game will freeze/hang and i will have to end task to eliminate the game.
> BF4 was on high settings.
> 
> Not sure what is the issue as this card should be able to play BF3 or 4 on ultra.
> 
> Gigabyte card is voltage locked and factory standard, never change anything.
> Core 1100 / Memory 1500 / Voltage 1.256v
> 
> My friend on SAPPHIRE R9 280x is also experiencing the same issue.
> 
> Thank you Thank you!


Hi, welcome to OCN with your second post.

Your memory is too high to be on stock air cooling. Lower that to 1300 on the memory and your 1100 core clock should be fine. Try that first.

EDIT: just realized I thought you had a different GPU. Still, it could be your overclock being too high. Lower it and see if that does the trick, then go back up incrementally if you like, to find your stable point.


----------



## Clement1987

Quote:


> Originally Posted by *Arizonian*
> 
> Hi Welcome to OCN with your second post.
> 
> Your memory is too high to be on stock air cooling. Lower that to 1300 Memory and your 1100 Core clock should be fine. Try that first.
> 
> EDIT _ just realized I thought you had a different GPU. Still it can be your over clock too high. Lower it and see if that does the trick. Then incrementally go higher if you like to find your stable point.


Thanks Arizonian, wasn't sure where else to go to get some help, haha.

Just want to highlight that I didn't overclock the card, though... it came at 1100/1500/1.256.









I will try lowering it to 1300 and have a go.

Thanks!


----------



## brazilianloser

Quote:


> Originally Posted by *Clement1987*
> 
> Thanks Arizonian, not sure where else to go to get some help haha.
> 
> Just want to highlight that i did not overclock the card though ... it came with 1100/1500/1.256
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will try lowering it to 1300 and have a go.
> 
> Thanks!


I personally wouldn't reduce it. If it comes rated to run at those speeds, it had better do so, or otherwise it's going right back to the store. Just me, though. Having problems overclocking is one thing, but if it's giving you trouble even at stock, I wouldn't bother with it.


----------



## Clement1987

Quote:


> Originally Posted by *brazilianloser*
> 
> I personally wouldn't reduce it. If it comes rated to run at those speeds, it had better do so, or otherwise it's going right back to the store. Just me, though. Having problems overclocking is one thing, but if it's giving you trouble even at stock, I wouldn't bother with it.


I don't quite understand, though, why the problem is occurring only with Battlefield 3 and 4. Borderlands 2 was fine for the two-plus hours that I played.

Also, I have the Gigabyte Windforce edition. My mate with the Sapphire Vapor-X edition is having the same issue!

Not sure if the problem is with the card or with the game... anybody else experiencing this?


----------



## Arizonian

Quote:


> Originally Posted by *Clement1987*
> 
> Thanks Arizonian, not sure where else to go to get some help haha.
> 
> Just want to highlight that i did not overclock the card though ... it came with 1100/1500/1.256
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will try to lower is to 1300 and have a go.
> 
> Thanks!


I apologize; this thread and the other 290X / 290 club thread look so similar that I didn't see you said 280X.









If it's doing it at stock, then there may be an issue with the game itself. Try turning down the game settings and see if that eliminates it; it might be too much for your card. Let us know whether that worked.


----------



## Clement1987

Will give it a go tonight and see how it goes.
Pretty sure the R9 280X can run BF3/4 on high settings; it shouldn't be an issue.

I'll update again later.







Thanks for all the help!


----------



## diggiddi

Quote:


> Originally Posted by *Roooker*
> 
> Just got my XFX R9 280X delivered today. I was super excited because I'm upgrading from a 6950. I decided to give Windows a fresh install because I also bought an SSD. So far so good: Windows running, network works, everything fine.
> No CCC up to this point. But after installing the 13.11 beta 7 driver and restarting my PC, I keep getting a black screen after the Windows loading screen. Can anyone help? If I run the driver cleaner after booting into safe mode and restart my PC again, I can see my desktop and everything works.
> 
> Some additional information on my system:
> 2500K
> MSI Z77A-GD65 with the newest BIOS
> 8 GB 1600 MHz RAM
> 120 GB OCZ Vertex 4 SSD
> 750 W OCZ PSU


I had the same issue with the 13.10 beta driver. It would go to a black screen immediately after login, and the only way to get to the desktop was in safe mode.
I had to reinstall the OS on my 60 GB SSD, and currently my 128 GB Samsung Pro seems to have been formatted. I'm still trying to recover the data; some programs see the data, others don't.
Anyone with TestDisk experience over here?


----------



## Shurtugal

Hey, quick question: how do I update the drivers on my card? Haha, do I do it in Catalyst, or...?








My old card was Nvidia, and I just used GeForce Experience.


----------



## Ashuiegi

I have a GTX 670 that can run every game at 1411 MHz core, but as soon as I try BF3 the drivers crash, even with the stock OC at 1306 MHz. I have to go back to 1150 MHz to even launch a game without the driver crashing. It's perfectly happy doing a Fire Strike run at 1411 MHz, but not BF3...


----------



## Roooker

Just found out that Windows completely crashes after the loading screen. So not only a black screen, but also a Windows crash. The only way to turn the PC off now is to hold the power button for 5 seconds.


----------



## lothop

Hi guys, wondering if anyone else has this issue with the R9 280X:

If I leave my computer on overnight, when I wake up in the morning and turn the screen on / move the mouse, it says "No DVI Signal".
I'm wondering if it's the ZeroCore Power thing with the R9 280X.

So the card "hibernates" but then won't wake up again?
Using driver 13.101-130604a-161593C-Asus
AMD Catalyst Control Center version 2013.0604.1838.31590


----------



## brazilianloser

Quote:


> Originally Posted by *lothop*
> 
> Hi guys, wondering if anyone else has this issue with the R9 280X:
> 
> If I leave my computer on overnight, when I wake up in the morning and turn the screen on / move the mouse, it says "No DVI Signal".
> I'm wondering if it's the ZeroCore Power thing with the R9 280X.
> 
> So the card "hibernates" but then won't wake up again?
> Using driver 13.101-130604a-161593C-Asus
> AMD Catalyst Control Center version 2013.0604.1838.31590


Well, the first thing anyone will probably say is to get the latest beta driver, 13.11 beta 6, something like that. See if that does any good for your woes.


----------



## mtcn77

MSI Gaming R9 280X @ 1280/1600 MHz.
Source


----------



## rdr09

mtcn77, I'm sorry, I just can't let it slide: I'm using the same driver and just a SB @ 4.5. These are what a stock 7970 and 7970 GHz (depending on the model) should be getting in FS...

http://www.3dmark.com/3dm/1393672

http://www.3dmark.com/3dm/1450566

http://www.3dmark.com/3dm/1393868

That OC'ed 280 should be closer to the 780. Much closer.


----------



## mtcn77

Quote:


> Originally Posted by *rdr09*
> 
> mtcn77, I'm sorry, I just can't let it slide: I'm using the same driver and just a SB @ 4.5. These are what a stock 7970 and 7970 GHz (depending on the model) should be getting in FS...
> 
> http://www.3dmark.com/3dm/1393672
> 
> http://www.3dmark.com/3dm/1450566
> 
> http://www.3dmark.com/3dm/1393868
> 
> That OC'ed 280 should be closer to the 780. Much closer.


Well, it isn't, so stop chasing ideals.
P.S.: Perhaps you meant the Titan? I didn't get your point previously, and I don't get it now either.


----------



## rdr09

Quote:


> Originally Posted by *mtcn77*
> 
> Well, it isn't, so stop chasing ideals.
> P.S: Perhaps, you meant the Titan? I didn't get your point previously, I don't get it now either.


The point is that the review site "seems" to be promoting one product at the expense of its readers by fudging numbers. Look, they used the same driver and a faster CPU. OCN members are being misinformed.

Edit: my bad, you are right - it surpassed the 780.

http://www.3dmark.com/fs/978262


----------



## Wabbit16

Hey guys... ordered my MSI TF 280X last night, but since it's on back order I have to wait around two weeks for it to arrive... can't wait!


----------



## kantxcape

Any 290x owner with coil whine problems?


----------



## steelkevin

Quote:


> Originally Posted by *kantxcape*
> 
> Any 290x owner with coil whine problems?


My Asus DC2T 280X has no coil whine issues.


----------



## brazilianloser

Really curious about the price/performance of the non-X 290. I'm running out of time on my 30 days to return my 280, lol.


----------



## leyzar

Quote:


> Originally Posted by *brazilianloser*
> 
> Really curious about the price/performance of the non-X 290. I'm running out of time on my 30 days to return my 280, lol.


http://www.overclock.net/t/1436835/overclockers-amd-radeon-r9-290-performance-revealed


----------



## brazilianloser

Quote:


> Originally Posted by *leyzar*
> 
> http://www.overclock.net/t/1436835/overclockers-amd-radeon-r9-290-performance-revealed


That's pre-release stuff... it can't always be trusted. I want to see the real thing in the hands of real folks from this forum; that's when you get a more rounded picture of the card's true performance. Basically my point was that I hope they come out with it already. But that would probably steal some of their 280X/290X sales, though.


----------



## SupahSpankeh

ASUS DCU here.

Is it totally forked?









I seem to be getting artifacts in GW2. Like, huge raging polygon artifacts. The card never exceeds 60C.





So, uh... ideas? I figure I should test with FurMark. It hasn't been OC'd yet, so it's mint out of the box.


----------



## smaudioz

Quote:


> Originally Posted by *SupahSpankeh*
> 
> ASUS DCU here.
> 
> Is it totally forked?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seem to be getting artifacts in GW2. Like, huge raging polygon artifacts. Card never exceeds 60C.
> 
> 
> 
> 
> 
> So, uh... ideas? I figure I should test with furmark. It's not been OC'd yet, so it's mint out of the box.


I'd RMA it if it's unstable out of the box like that. Is it the TOP edition? The alternative is to turn down the factory OC until the artifacting stops, but that's not what you paid for.


----------



## FeelKun

Quote:


> Originally Posted by *SupahSpankeh*
> 
> ASUS DCU here.
> 
> Is it totally forked?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seem to be getting artifacts in GW2. Like, huge raging polygon artifacts. Card never exceeds 60C.
> 
> 
> 
> 
> 
> So, uh... ideas? I figure I should test with furmark. It's not been OC'd yet, so it's mint out of the box.


I'm getting a similar artifact in GW2... a white flashing line across the screen... it disappears when I minimize and reopen GW2. I'll screenshot it when it happens next.

Same card, btw.


----------



## steelkevin

Now that BF4 is officially out for the Americans, does anybody know of a website that has tested several GPUs on it?

I'd also like to know whether HT makes a difference, and whether that thing about high-speed RAM was true, so I can adjust my overclocks while waiting for Fast2play to finally send out the damn keys ^^.

@the two GW2 players:

Have either of you tried other drivers? I'm asking because I know drivers can sometimes make a difference when overclocking, so I guess they could very well cause artifacting too when your cards are in fact in no way faulty. And keep in mind these are beta drivers.


----------



## SupahSpankeh

Quote:


> Originally Posted by *Resme*
> 
> Getting a similar artifact in gw2... A white flashing line across the screen.... Disappears when I minimize / reopen gw2. I'll screenshot when it happens next.
> 
> Same card btw.


k. I'll keep an eye on it. Is yours glitching anywhere else?


----------



## smaudioz

That's true, on a different driver it could be fine.


----------



## Crowe98

I'm still confused. Why is Battlefield 4 being released now in the USA? Over here in 'Straya it's set to be released on the 31st; why haven't we got it yet?


----------



## smaudioz

Quote:


> Originally Posted by *Crowe98*
> 
> I'm still confused. Why is Battlefield 4 being released now in the USA? Over here in 'Straya it's set to be released on the 31st; why haven't we got it yet?


No idea; it's spread out over around a week to different parts of the world. Japan gets it last, on the 7th of November.

They also have one release date for the European Union and a different one for the UK, even though the UK is part of the European Union.


----------



## Crowe98

Quote:


> Originally Posted by *smaudioz*
> 
> No idea, it's spread out over around a week to different parts of the world. Japan get it last on the 7th of November.
> 
> They also have one release date for the European Union and a different one for the UK even though the UK is part of the European Union


Jesus, sucks to be them.







I still don't get EA's release scheme. Silly...


----------



## SupahSpankeh

Quote:


> Originally Posted by *Crowe98*
> 
> I'm still confused. Why is Battlefield 4 being released now in the USA? Over here in 'Straya it's set to be released on the 31st; why haven't we got it yet?


Different territories have different "sales periods". The US is Tuesday to Tuesday, the UK is Thursday to Thursday, etc.

The idea being, they want to release at the start of a sales period to maximise sales, to get a high placement on the sales charts, and in turn drive more sales.

Dunno about JPN/AUS though.


----------






## battleaxe

So what's the consensus on the best-overclocking 280X with the best memory (Hynix?)

I realize there's a lot of variation, but overall, which are looking like the better cards, all things considered?


----------



## smaudioz

Quote:


> Originally Posted by *battleaxe*
> 
> So what's the consensus for the best OCer 280x with best mem (hynix?)
> 
> I realize there is lots of variation, but overall what is looking like the better cards all things considered.


Well, the Sapphire Toxic is the highest clocked, has the biggest cooler, and gets close to a 780 in some benches. So if you want a guaranteed high clock, that's it, I guess; the Asus Matrix also has a cooler equipped for fairly high overclocking. I know the Asus DirectCU II uses Hynix memory; I don't know which others do.


----------



## TonytotheB

2 x MSI 280X OC editions








Add to ze list pls


----------



## Tobiman

To all those experiencing artifacting at stock clocks in all games, try these steps:

1. Bump the memory clock down to 1300 MHz.
2. Test for artifacts.
3. If you still get artifacts, return the card; the memory is bad.
4. If you don't get artifacts, bump the memory voltage and raise the memory clock in 50 MHz steps until you get back to the stock memory clock.
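For anyone who wants the logic of those steps spelled out, here's a rough Python sketch. It's purely illustrative: `is_stable` is a hypothetical stand-in for you manually running a game or benchmark at each clock and watching for artifacts, not a real API.

```python
# Illustrative only: models the memory-clock test procedure above.
# "is_stable" stands in for a manual artifact check at each clock
# (run a game or stress test and watch for glitches); not a real API.

def find_stable_mem_clock(stock_mhz, floor_mhz=1300, step_mhz=50, is_stable=None):
    """Walk the memory clock up from a known-safe floor toward stock,
    returning the highest clock that passed the stability check,
    or None if even the floor artifacts (step 3: RMA the card)."""
    if is_stable is None or not is_stable(floor_mhz):
        return None  # artifacts even at the floor -> bad memory
    best = floor_mhz
    clock = floor_mhz + step_mhz
    while clock <= stock_mhz:
        if not is_stable(clock):
            break  # first unstable step; keep the last good clock
        best = clock
        clock += step_mhz
    return best

# Example: pretend the card artifacts above 1400 MHz.
print(find_stable_mem_clock(1500, is_stable=lambda mhz: mhz <= 1400))  # -> 1400
```

If it walks all the way back to the stock clock without artifacting, the memory checks out and the voltage bump did its job.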


----------



## battleaxe

Quote:


> Originally Posted by *smaudioz*
> 
> Well the sapphire Toxic is the highest clocked with the biggest cooler and gets close to a 780 on some benches. So if you want a guaranteed high clock then that I guess, the Asus Matrix also has a cooler equipped for fairly high overclocking. I know the Asus DirectCU II uses Hynix memory, don't know which others do.


Okay. I like the Asus DirectCU II best there. The Toxic is so pricey; it doesn't seem worth the extra coin for only a bit more MHz. Good answer. Thanks, man! +1 to you.


----------



## ducknukem86

Here are the pictures of my rig with the Sapphire 280X Vapor-X installed. Do you think a core clock of 1150 MHz and a memory clock of 6400 MHz is OK? So far I haven't ventured further into overclocking.


----------



## FeelKun

Quote:


> Originally Posted by *steelkevin*
> 
> Now that BF4 is officially out for the Americans, does anybody know of a website that has tested several GPUs on it?
> 
> I'd also like to know whether HT makes a difference, and whether that thing about high-speed RAM was true, so I can adjust my overclocks while waiting for Fast2play to finally send out the damn keys ^^.
> 
> @the two GW2 players:
> 
> Have either of you tried other drivers? I'm asking because I know drivers can sometimes make a difference when overclocking, so I guess they could very well cause artifacting too when your cards are in fact in no way faulty. And keep in mind these are beta drivers.


Last night I had beta 2 installed. I uninstalled beta 2, rebooted into safe mode, used Driver Fusion, then rebooted once more and installed beta 7. I'll see if it happens again. I doubt it's a faulty card... it's probably Guild Wars 2.

Quote:


> Originally Posted by *SupahSpankeh*
> 
> k. I'll keep an eye on it. Is yours glitching anywhere else?


Nope. Counter-Strike: GO is fine. BioShock Infinite: OK. TF2: OK.

Stock clocks, btw.


----------



## brazilianloser

Quote:


> Originally Posted by *ducknukem86*
> 
> 
> 
> 
> 
> Here are the pictures of my rig with the Sapphire 280x Vapor X installed. Do you think a Core clock of 1150 mhz and a memory clock of 6400 mhz is ok? so far i haven't ventured more into overclocking


Nice-looking card. I've got one powering my setup atm as well. I haven't actually tried to OC it yet, since I'm waiting for the craziness of low prices and the release of the non-X 290 to see whether I'll keep the little beast or send it right back for something else.


----------



## Pfortunato

Hi guys, do you think an R9 280X will be able to handle 1080p gaming with everything maxed out for about two years, or should I go for at least an R9 290?

Cheers

Sent from my GT-I9505 using Tapatalk


----------



## diggiddi

What CPU and mobo? I'd say get an R9 290/290X.


----------



## brazilianloser

Quote:


> Originally Posted by *Pfortunato*
> 
> Hi guys, do you think that a r9 280x will be able to handle 1080p gaming with everything maxed out for like 2 years or I should go for a r9 290 atleast?
> 
> Cheers
> 
> Sent from my GT-I9505 using Tapatalk


Ahh, it depends on how much you want to spend as well... Two years is a long time the way games are evolving, so if I were planning on keeping the same card that long, I'd personally get a 290X or 780. Then a year from now, when prices are even lower, get a second one to push you another year or two.


----------



## brazilianloser

Not really playable unless I tune some of the settings down... but it sure looks pretty. Clearly not the best screenshot, but hey. Pretty soon I'll either be pushing this with two 280s or maybe two 290s... let's see what AMD brings to the table when they announce the 290.


----------



## grimgnaw

Quote:


> Originally Posted by *SupahSpankeh*
> 
> ASUS DCU here.
> 
> Is it totally forked?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seem to be getting artifacts in GW2. Like, huge raging polygon artifacts. Card never exceeds 60C.
> 
> 
> 
> 
> 
> So, uh... ideas? I figure I should test with furmark. It's not been OC'd yet, so it's mint out of the box.


I experienced something similar on one BF3 map: flickering textures and other similar stuff, at stock clocks and voltages. The temps were OK (around 75C; it's quite hot in that room).


----------



## olliiee

Anyone have an opinion on the Sapphire Vapor-X vs. the MSI or DCUII 280X? I get the feeling the DCUII is the preferred one?

Also, is a 280X really worth $50 more than an MSI TF3 GHz 7970 (that's the price difference in AUS)?


----------



## brazilianloser

Quote:


> Originally Posted by *olliiee*
> 
> Anyone have any opinion on the Sapphire Vapor X vs MSI or DCUII 280x, I get the feeling the DCUII is the preferred?
> 
> also is a 280x really worth $50 more than a MSI TF3 Ghz 7970 (thats the difference in price in AUS)?


Like I said before here... might as well wait and see what the price difference will be between the 280X and the 290. The leaked release date is the 31st and the leaked price is $449 ($150 more than the 280X), but given the latest Nvidia price moves I wouldn't put it past AMD to throw these cards out (the non-custom coolers, that is) for $399 or something of that sort... you never know. Just be patient and wait out this storm; don't be like me. I got the 280X as soon as it went in stock and now I deeply regret it, and depending on the outcome of these next two weeks of price and card wars I might return mine to get something else.


----------



## olliiee

Quote:


> Originally Posted by *brazilianloser*
> 
> Like I said before here... might as well wait and see what the price difference will be between the 280X and the 290. The leaked release date is the 31st and the leaked price is $449 ($150 more than the 280X), but given the latest Nvidia price moves I wouldn't put it past AMD to throw these cards out (the non-custom coolers, that is) for $399 or something of that sort... you never know. Just be patient and wait out this storm; don't be like me. I got the 280X as soon as it went in stock and now I deeply regret it, and depending on the outcome of these next two weeks of price and card wars I might return mine to get something else.


Yeah, I'll probably wait a week or so, but price drops often take longer to come into effect here (they seem to, anyway). I was just wondering about opinions from people who own the 280X, as to whether it's really worth the extra $50 over a GHz 7970.


----------



## Crowe98

Do 280Xs perform better than GTX 680s?


----------



## olliiee

Quote:


> Originally Posted by *Crowe98*
> 
> Do 280x's perform better than GTX 680's?


Yep, they're about equal to a 770 (it varies a few fps depending on whether the title was optimized for AMD or Nvidia). And a 770 is either above or equal to a 680.










At least, that's what I'm finding from my own research into buying one.


----------



## mtcn77

Quote:


> Originally Posted by *olliiee*
> 
> Yep they are about equal to a 770 (varies a few fps depending on who the title was optimized for AMD or Nvidia favored). And a 770 is either above or equal to a 680
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At least from my own research into buying one thats what I'm finding


Come on, 7970s are faster than GTX 780s in circumstances where you start bottlenecking the memory bandwidth enough. The minimum fps rates are better.
http://www.tweaktown.com/tweakipedia/27/amd-radeon-hd-7970-crossfire-benchmarked-at-7860x1440/index.html


----------



## olliiee

Quote:


> Originally Posted by *mtcn77*
> 
> Come on, 7970's are faster than Gtx780's, in circumstances you start bottlenecking the memory bandwidth enough. Minimum fps rates are better.
> http://www.tweaktown.com/tweakipedia/27/amd-radeon-hd-7970-crossfire-benchmarked-at-7860x1440/index.html


Not sure what your point is... he asked if a 280X beats a 680, and I told him the answer based on the reviews I had read. He didn't ask if 7970s in CF beat 780s in SLI at 1440p surround...


----------



## mtcn77

Quote:


> Originally Posted by *olliiee*
> 
> Not sure what your point is.. He asked if a 280x beats a 680 and I told him the answer based on the reviews I had read. He didnt ask if 7970's in CF beat 780's in SLI at 1440p surround...


Well, the point is... give AMD some credit.


----------



## DeviousAddict

I don't think I will bother with a 290X; I'll just run my 280X in Xfire







pretty sure it will be more than enough to last me a couple of years. I'm not going 4K any time soon, not until prices drop significantly anyway. I will be going Eyefinity with three monitors, though, once I find some with a thin enough bezel (I hate big thick black lines between screens).


----------



## Cribbs

Wait, a 33% price drop for Nvidia? Damn, looks like I might as well just do 780 SLI with how prices are in NZ.


----------



## brazilianloser

Quote:


> Originally Posted by *olliiee*
> 
> Yeah I probably will wait a week or so but often the price drops will likely take longer to come into effect here (seem to usually anyway). I was just wondering for an opinion from people who owned the 280x, as to whether it is really worth the extra $50 over a Ghz 7970


I do own one. It's great, don't get me wrong. It's all about the non-X 290's price and performance once it's released.


----------



## brazilianloser

Quote:


> Originally Posted by *DeviousAddict*
> 
> I don't think I will bother with a 290X; I'll just run my 280X in Xfire
> 
> 
> 
> 
> 
> 
> 
> pretty sure it will be more than enough to last me a couple of years. I'm not going 4K any time soon, not until prices drop significantly anyway. I will be going Eyefinity with three monitors, though, once I find some with a thin enough bezel (I hate big thick black lines between screens).


Look at my screens. That's about as thin as it gets without removing the bezels...


----------



## dade_kash_xD

I need your help, guys. I have 2x XFX DD R9 280X in CrossFire and I'm having major issues in BF4. First, I have to raise the voltage at stock clocks to 1.3 V just to play without immediately crashing. Forget about any kind of overclock; even +20 MHz on the core causes an immediate crash. The two kinds of crash I'm getting are "Battlefield 4 has stopped responding", and the game freezing with image stuttering and a high-pitched sound, after which the PC reboots about 10 seconds later.

I'm really confused and frustrated with this. I even took the OC on my CPU down to 4.2 GHz. I don't know what else to do. If I don't resolve this today, I'm going to return both to Newegg and get a single 290X.


----------



## raghu78

First, check one card at a time in all games and benchmarks, including BF4; even if only one card is faulty, you could get crashes. Then, if they work fine as single cards but still crash in CF, you can say it's a driver issue. BTW, are you running 13.11 beta 7?

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

Monitor temps in CF. Keep core temps below 70C and VRM temps below 90C.
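Those two thresholds are easy to turn into a quick sanity check while you're logging; here's a minimal Python sketch (the readings in the example are placeholders - in practice you'd pull the real values from GPU-Z or Afterburner logs):

```python
# Minimal sketch of the CF temperature check above. The readings are
# placeholders; real values would come from a GPU-Z / Afterburner log.

CORE_LIMIT_C = 70  # keep core temps below this in CrossFire
VRM_LIMIT_C = 90   # keep VRM temps below this

def temp_warnings(core_c, vrm_c):
    """Return a warning for each sensor at or over its limit."""
    warnings = []
    if core_c >= CORE_LIMIT_C:
        warnings.append(f"core {core_c}C over {CORE_LIMIT_C}C target: raise fan speed")
    if vrm_c >= VRM_LIMIT_C:
        warnings.append(f"VRM {vrm_c}C over {VRM_LIMIT_C}C target: raise fan speed")
    return warnings

print(temp_warnings(78, 85))  # e.g. a 78C core trips the core warning
print(temp_warnings(65, 80))  # both under target -> []
```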


----------



## ironhide138

So, anyone with a 280X and BF4: how does it run? I'm getting 55-70 fps on high with 2x MSAA and SSAO on a 580... wondering if it's worth upgrading.


----------



## dade_kash_xD

Quote:


> Originally Posted by *raghu78*
> 
> firstly check one card at a time in all games and benchmarks including BF4. even if one card is faulty you could get crashes. then if they are working fine as single card and still crashing in CF you can say its a driver issue.btw are you running 13.11 beta 7
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> monitor temps in CF. keep core temps below 70c and VRM temps below 90c.


I'm running the 13.11 beta 7 drivers. I tested each R9 280X by itself and was able to overclock it like crazy; it was stable in all benchmarks and games. I see that my core temps get into the mid-to-high 70s (75-79C). I'm using MSI AB for all monitoring and OC.

Why is it that I can't run these cards at their stock voltages in CrossFire?


----------



## raghu78

Quote:


> Originally Posted by *ironhide138*
> 
> So anyone with a 280x and bf4, hows it run? I'm getting 55-70fps on high with 2x msaa and ssao, on a 580.... wondering if its worth upgrading


The R9 280X / HD 7970 GHz is a huge upgrade in BF4: more than a 50% improvement. 3 GB of VRAM is recommended for BF4; you can easily play at Ultra with 4x MSAA.









http://www.pcgameshardware.de/Battlefield-4-PC-238749/Tests/Battlefield-4-Beta-Test-Grafikkarten-Benchmarks-1090869/
http://hothardware.com/Reviews/Battlefield-4-Gameplay-and-Performance-Preview/?page=3

Get the Asus R9 280X DCII TOP: the best-reviewed and quietest R9 280X, and around 40% faster than a GTX 580.

http://www.techpowerup.com/reviews/ASUS/R9_280X_Direct_Cu_II_TOP/26.html
http://techreport.com/review/25466/amd-radeon-r9-280x-and-270x-graphics-cards/10
http://hexus.net/tech/reviews/graphics/61013-asus-radeon-r9-280x-directcu-ii-top/?page=10
http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx/11
http://www.hardocp.com/article/2013/10/07/asus_r9_280x_directcu_ii_top_video_card_review/

Noise tests:
http://www.tomshardware.de/amd-radeon-r9-280x-roundup-test,testberichte-241401-6.html


----------



## raghu78

Quote:


> Originally Posted by *dade_kash_xD*
> 
> I'm running the 13.11 beta 7 drivers. I tested each R9 280X by itself and was able to overclock it like crazy; it was stable in all benchmarks and games. I see that my core temps get into the mid-to-high 70s (75-79C). I'm using MSI AB for all monitoring and OC.
> 
> Why is it that I can't run these cards at their stock voltages in CrossFire?


Heat could be affecting stability in CF. Monitor VRM temps, as they are very important in determining stability. Increase the fan speed to bring core temps to 70C and VRM temps below 90C. Also, in CF, first try to get stock speeds stable before trying any overclock.


----------



## Modovich

Hey fellas, I'm looking into getting an MSI R9 280X Gaming edition. Is it a good card? I can get the Asus and the MSI one for the same price, but the Asus one is out of stock. So which do I get?


----------



## phillyd

Can someone with the DC2 version of the 280X measure the distance between the motherboard tray and the widest part of the card?


----------



## NV43

Quote:


> Originally Posted by *phillyd*
> 
> Can someone with the DC2 version of the 280X measure the distance between the motherboard tray and the widest part of the card?


Just checked. It's approx. 15.25 cm from the motherboard tray to the widest part of the card.


----------



## phillyd

Quote:


> Originally Posted by *NV43*
> 
> Quote:
> 
> 
> 
> Originally Posted by *phillyd*
> 
> Can someone with the DC2 version of the 280X measure the distance between the motherboard tray and the widest part of the card?
> 
> 
> 
> Just checked. Its approx 15.25cm from the motherboard tray to the widest part of the card.

Thanks +1

How do you like this card?


----------



## NV43

Quote:


> Originally Posted by *phillyd*
> 
> Thanks +1
> 
> How do you like this card?


I think it's great. It runs at a cool 36C at 20% fan speed for me currently. I haven't seen it go over ~68C playing BF4 (on Ultra, 1080p) with the fans at 40%. It's a huge upgrade over my previous reference MSI 6950; that thing ran at 60C idle with 40% fan speed, which was much louder than anything I've heard from the Asus. I haven't had the time to look at overclocking it yet, though, as I've been busy with school.

Overall, I'm very happy I jumped on it at launch.


----------



## Tobiman

Quote:


> Originally Posted by *dade_kash_xD*
> 
> I need your help guys. I have 2x XFX DD R9-280x in Crossfire and I'm having major issues in BF4. First, I have to up the voltage at stock clocks to 1.3v for me to play without immediately crashing. Forget about any kind of overclock because even 20+ MHz to the core causes an immediate crash. The 2 kinds of crash I'm having are, Battlefield 4 has stopped responding and game freezes image stuttering high pitched sound, 10 seconds later the pc reboots.
> 
> I am really confused and frustrated with this. I even took down the OC on my CPU to 4.2ghz. I don't know what else to do. If I don't resolve this today I'm going to return both to newegg and get a single 290x.


The new AMD website has a form that lets you report specific driver issues; I'd try that first. The other solution is to uninstall the drivers the proper way, linked below: http://www.overclock.net/t/988215/how-to-properly-uninstall-ati-amd-software-drivers-for-graphics-cards
Then reinstall the beta 7 driver. Let us know if you still have the same problems.


----------



## Genma

My toxic finally came in! Unlocked too!


----------



## black7hought

Alright, I need some advice from my fellow OCNers. I left my computer on last night to fold. I started Guild Wars 2 and noticed artifacting immediately. I quit the game, restarted the computer, loaded Guild Wars 2 again, and the problem was fixed. I came back today after letting my computer fold some more and was only able to turn on one monitor; the other remained black. The monitor that was on was flickering rapidly between black and the screen image. I'm thinking of RMA'ing and wondering if I should go with the Gigabyte 280X.

Edit: temps seem to sit around 50-70C while under load and 39C during idle. I'm also running the beta 7 drivers.

Added Screenshot:


----------



## Obakemono

http://www.techpowerup.com/gpuz/ek34h/

2 ea MSI gaming R9-270X in crossfire.


----------



## sidp96

Quote:


> Originally Posted by *black7hought*
> 
> Alright, I need some advice from my fellow OCNers. I left my computer on last night to fold. I started Guild Wars 2 and noticed artifacting immediately. I quit the game, restarted the computer, loaded Guild Wars 2 again, and the problem was fixed. I came back today after letting my computer fold some more and was only able to turn on one monitor; the other remained black. The monitor that was on was flickering rapidly between black and the screen image. I'm thinking of RMA'ing and wondering if I should go with the Gigabyte 280X.
> 
> Edit: temps seem to sit around 50-70C while under load and 39C during idle. I'm also running the beta 7 drivers.
> 
> Added Screenshot:


I too get major artifacting in GW2. I think it's a driver issue, because my OC is completely stable running Kombustor and other games, but right when I start up GW2 there are artifacts like crazy. I'm on the ASUS DC2T @ 1200MHz.

EDIT: Never mind; updated to beta 8 and turned the OC down to 1100MHz, no artifacts. Turns out 1200 is stable enough for constant Furmark and Kombustor, but not for Guild Wars. I would say try downclocking the card; if you get no artifacts then RMA it, because it should be able to run at stock clocks and beyond (to an extent) right out of the box without any issues.


----------



## SupahSpankeh

I'm getting artifacting in GW2.

I'm on BETA 6; I'll see if BETA 8 has the same problem, then contact the seller for a return.


----------



## DeviousAddict

I've played GW2 with no problems at all, however that is with only one card. Once I get home and install my 2nd card I'll see if I get the same problem. Hopefully by then you guys will have found a fix.


----------



## black7hought

I left the comp folding all night and haven't noticed any issues since getting on about 20 minutes ago. I'll monitor everything over the weekend.


----------



## dade_kash_xD

Quote:


> Originally Posted by *raghu78*
> 
> heat could be affecting the stability in CF. monitor VRM temps as they are very important in determining stability. increase fan speed to bring core temps to 70c and VRM temps below 90c. also in CF try and first get stock speeds to work stable before trying any overclock.


What I'm saying is, I have to max the voltage in MSI AB to 1.3v to be able to run those cards at stock clocks.


----------



## HellfireZA

I'm seriously going to virtually smack the next person who answers someone asking "How do you change the voltage?" with "I did the unofficial method".

Like someone else said, FFS just explain how it's done instead of saying "the unofficial way".


----------



## Wabbit16

Quote:


> Originally Posted by *HellfireZA*
> 
> I'm seriously going to virtually smack the next person who answers someone asking "How do you change the voltage?" with "I did the unofficial method".
> 
> Like someone else said, FFS just explain how it's done instead of saying "the unofficial way".


Well, unofficially...









Just kidding, although I would also like to know as my R9 280x is on its way!

EDIT: Hey wait you're the same Hellfire who bought the RAM from me on Carbonite?


----------



## eternal7trance

Quote:


> Originally Posted by *HellfireZA*
> 
> I'm seriously going to virtually smack the next person who answers someone asking "How do you change the voltage?" with "I did the unofficial method".
> 
> Like someone else said, FFS just explain how it's done instead of saying "the unofficial way".


Quote:


> Originally Posted by *Wabbit16*
> 
> Well, unofficially...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just kidding, although I would also like to know as my R9 280x is on its way!
> 
> EDIT: Hey wait you're the same Hellfire who bought the RAM from me on Carbonite?


It depends on what card you bought. For instance, mine needs iTurbo, and you really don't have to do anything special but move the sliders; GPU-Z will show whether your VDDC and MVDDC changed or not.


----------



## DiceAir

OK ladies and gentlemen, I ordered myself 2x Club3D R9 280X Royalking, as they were on special here. They should be here Tuesday at the latest, but hopefully this Friday (tomorrow). I'm playing at 2560x1440@120Hz, so I hope my games can do that on ultra. Oh and BTW, can't wait to see Mantle. If it's anything like they claim it to be, we could see some really nice performance boosts and do 120FPS at 2560x1440 ultra.


----------



## HellfireZA

Quote:


> Originally Posted by *Wabbit16*
> 
> Well, unofficially...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just kidding, although I would also like to know as my R9 280x is on its way!
> 
> EDIT: Hey wait you're the same Hellfire who bought the RAM from me on Carbonite?


Hahaha indeed I am, I have pretty much figured it out though... doing it the "unofficial way"


----------



## SupahSpankeh

Hmm. OK, this is odd.

I've got a 280X DCU TOP and I can't seem to get it higher than 1190MHz. I've set power to 120% (also tried 100%) and I just can't get it past 1200MHz. In fact, as I add more voltage it becomes less stable.

Ideas?


----------



## SupahSpankeh

Oh, and I forgot: there's a review which overclocked their sample. They could bump it to 1.3v using ASUS GPU Tweak, but I can't get past 1.263. I'm baffled by that too!

(not that the additional voltage would seem to help, it's stuck at 1190 as I said)


----------



## Takla

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Oh, and forgot; there's a review which overclocked their sample. They could bump it to 1.3v using ASUS Tweak, but I can't get past 1.263. I'm baffled by that too!
> 
> (not that the additional voltage would seem to help, it's stuck at 1190 as I said)


What's your ASIC score? Probably below 70% if it clocks like that.


----------



## FeelKun

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Oh, and forgot; there's a review which overclocked their sample. They could bump it to 1.3v using ASUS Tweak, but I can't get past 1.263. I'm baffled by that too!
> 
> (not that the additional voltage would seem to help, it's stuck at 1190 as I said)





Spoiler: Warning: Spoiler!







It goes to 1.3V here... Not seeing any artifacts in guild wars 2 on Beta 7. I haven't bought BF4 yet.. I want to crossfire before I do that. I don't like lowering settings


----------



## hoevito

Quote:


> Originally Posted by *HellfireZA*
> 
> Hahaha indeed I am, I have pretty much figured it out though... doing it the "unofficial way"


Please do tell!!! I'm DYING to go above 1.275 volts on my Matrix platinum!!!!


----------



## Crowe98

Would a 4GB 680 outperform an OC'd 280x?


----------



## rdr09

Quote:


> Originally Posted by *Crowe98*
> 
> Would a 4GB 680 outperform an OC'd 280x?


Silicon lottery. I've seen 7950s beat some 680s.


----------



## Crowe98

Quote:


> Originally Posted by *rdr09*
> 
> Silicon lottery. I've seen 7950s beat some 680s.


So.... is that a no?


----------



## rdr09

Quote:


> Originally Posted by *Crowe98*
> 
> So.... is that a no?


A good-clocking 680 will beat a bad-clocking 280 and vice versa. Lottery.


----------



## Crowe98

Quote:


> Originally Posted by *rdr09*
> 
> a good clocking 680 will beat a bad clocking 280 and vice versa. lottery.


Alright, thanks.


----------



## BackwoodsNC

My latest run with the waterblocks on.


----------



## piranha

Decided to go a different direction; got a GTX 780 and will OC that, since the price dropped.


----------



## unimatrixzero

Thought you guys might like an up-close and personal picture of an actual R9 290X.


----------



## MarlowXim

Quote:


> Originally Posted by *BackwoodsNC*
> 
> My latest run with the waterblocks on.


Nice score! I was thinking of picking this card up (the MSI R9 280X) and putting it under water as well. Can you check GPUTool and run its "Test for Stability" artifact test, and see if you get any artifacts after 10 mins or so, since you're running yours at 1.3v with a 1225MHz core? Also, are both cards MSI 280Xs?


----------



## piranha

Couldn't get ahold of one in a store, and I don't like the heat issue / how hot it runs, so yeah. The 280 was decent, but if I'm gonna spend the money I may as well spend it lol


----------



## Shurtugal

Quote:


> Originally Posted by *Crowe98*
> 
> Would a 4GB 680 outperform an OC'd 280x?


Are you from Australia? Lol, PC Case Gear just put up four 680 4GBs at $400 AUD. If so, an XFX or MSI one would be better for the same price.


----------



## Joeking78

I just got a couple of Gigabyte 280X OCs in Crossfire... struggling to find a decent, up-to-date driver, if anyone can point me in the right direction?


----------



## FeelKun

Update on gw2;



Spoiler: Warning: Spoiler!


----------



## BackwoodsNC

Quote:


> Originally Posted by *MarlowXim*
> 
> Quote:
> 
> 
> 
> Nice score I was thinking of picking this card up the MSI R9-280x and putting it under water as well. Can you check GPUTool and do an artifact fact test by doing "Test for Stability" and see if you get any artifacts after 10 mins or so, since your running yours at 1.3v at 1225 MHZ core. Although are both cards MSI 280x?

Nice little test. I ran it and had to back the clocks all the way down to 1200Mhz to stop the artifacts.

Both cards are MSI 280Xs; the serial numbers are identical except for the last number ;-)

Vdroop on these cards is definitely there. I'm seeing max voltages of 1.277, not 1.3. I could probably set vcore to 1.320 and still be safe 24/7 (it would max out around 1.3).

EDIT: Changed the voltage to 1.325 and was only able to get another 5MHz out of the cards without artifacts, so I just settled for 1200MHz at 1.3.
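The vdroop arithmetic above is simple enough to sketch. The 0.023v figure below is just the gap between the set and observed voltages reported in this post (1.300v set vs 1.277v in GPU-Z), not a spec; real droop varies with load, card, and BIOS:

```python
def observed_vcore(set_vcore, droop=0.023):
    """Estimate the voltage seen under load, given the software-set vcore
    and a measured vdroop. 0.023v is taken from the numbers in the post
    above; measure your own card's droop before relying on it."""
    return round(set_vcore - droop, 3)

print(observed_vcore(1.300))  # 1.277 -- matches the GPU-Z reading above
print(observed_vcore(1.320))  # 1.297 -- why a 1.320 set point still "maxes out around 1.3"
```

This is why setting 1.320 in software can still be reasonable for a 1.3v target: the droop eats the difference under load.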


----------



## MarlowXim

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Nice little test. I ran it and had to back the clocks all the way down to 1200Mhz to stop the artifacts.
> 
> Both cards at MSI 280x the serial numbers are identical accept for the last number ;-)
> 
> Vdroop on these cards are there. I am seeing max voltages at 1.277 not 1.3 I could probably set vcore to 1.320 and still be safe 24/7 (max out around 1.3)
> 
> EDIT* Changed volts to 1.325 and was only able to get another 5mhz out of the cards without artifacts. So I just settled for 1200mhz at 1.3


Yeah, it's a good indicator of stability, although you're pumping a lot of voltage to get to 1200MHz. What were your ASIC values? I'm guessing around 54%. Vdroop is a little less compared to the 7970, I think due to the populated VRM and MOSFET positions that weren't on the reference board.


----------



## BackwoodsNC

Quote:


> Originally Posted by *MarlowXim*
> 
> Yeah it's a good indicator of stability although you're pumping alot of voltage to get to 1200 MHZ what were your ASIC values? I'm guessing around 54%? Vdroop is alittle less compared to 7970, I think due to the populated vrm and mosfet that wasn't on reference.


Card one is 66%

Card two is 64%

I could take the core up to around 1240MHz before I noticed artifacts in games such as BF4.


----------



## piranha

Comparing benches between the R9 280X and GTX 780: Fire Strike ran a little better on the GTX, but overall the GTX wasn't that much higher so far, once the 280 was OC'd.


----------



## steelkevin

Hi,

Just wondering if anybody here has watercooled their 280X and could tell us about their temps and OC results, while I painfully wait for the EK-FC R9-280X DCII to finally be released.

It looks like most people are getting the Asus DC2T 280X though, so I guess not many have one under water yet. Who is planning on watercooling their 280X, btw?


----------



## piranha

Price vs performance, I may just return the GTX 780 and get my 280X back...


----------



## SupahSpankeh

Quote:


> Originally Posted by *Takla*
> 
> What's your ASIC score? Probably below 70% if it clocks like that.


71%. That's not great, right? Well, can't beat 1190. Set power to 120% with 1.2v, and the more volts I add the less stable it gets. If I bump to 1200 it gets a little glitchy, then adding voltage makes it worse. This is done in Furmark - is that a bit OTT for OC testing? I want it as stable as possible.
Quote:


> Originally Posted by *Resme*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> It goes to 1.3V here... Not seeing any artifacts in guild wars 2 on Beta 7. I haven't bought BF4 yet.. I want to crossfire before I do that. I don't like lowering settings


I've moved to beta 8; no artifacts (yet). Artifacts seem to be centred on helmet/head regions and are intermittent.
Quote:


> Originally Posted by *Resme*
> 
> Update on gw2;
> 
> 
> 
> Spoiler: Warning: Spoiler!


Ouch. I feel you bro.


----------



## FeelKun

^ What's odd is they disappear on minimize / reopen. I think it's the game personally.


----------



## mtcn77

I guess all this time, Nvidia could hypothetically be paying tribute to AMD to postpone any competition against their expensive series on the driver front.


----------



## FeelKun

Update: Getting Artifacts in league of legends now....

Borderline about to return this card.


----------



## SupahSpankeh

At the factory OC, I'm getting crashes in BF4 on my DCU TOP. It's quite annoying. Graphical lockups.


----------



## Wabbit16

Quote:


> Originally Posted by *Resme*
> 
> Update: Getting Artifacts in league of legends now....
> 
> Borderline about to return this card.


Quote:


> Originally Posted by *SupahSpankeh*
> 
> At the factory OC, I'm getting crashes in BF4 on my DCU TOP. It's quite annoying. Graphical lockups.


This isn't looking too good









Crappy driver support, or just cards that are pushed too hard by AMD from the factory? I ordered a 280X last week... I hope they sort this out soon, or else AMD are going to be seeing many RMAs in the future.


----------



## MarlowXim

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Card one is 66%
> Card two is 64%
> 
> I could take the the core upto around 1240mhz before i notice artifacts in games such as BF4.


Thanks, I think I'll be picking up this card this weekend; hopefully I'll score around the same, just hopefully not 1.3v for 1200MHz. I'd go with whatever frequency you got in GPUTool, since even minor artifacts will result in the driver crashing, even if they're not noticeable, like the odd flickering texture you may not see. I've had it happen in BF3. If you want maximum stability, stick with GPUTool and add about .10v or .05v, give or take.

Although I do recall MSI has a newer BIOS; maybe the reason it's so high is vdroop on the newer BIOS, since the card was said to run too hot. Not sure if you flashed their newer BIOS, but my guess is they may have higher vdroop. On my 7970, 1200MHz was at 1.25v with a vdroop of about -78mV (it fluctuates a bit according to GPU-Z).


----------



## SupahSpankeh

Indeed, it's proving a touch laborious. Very disappointed so far.


----------



## FeelKun

Quote:


> Originally Posted by *SupahSpankeh*
> 
> At the factory OC, I'm getting crashes in BF4 on my DCU TOP. It's quite annoying. Graphical lockups.


I think they sent out cards under-volted... If it's not running at the advertised specs, it's going back. Could just be a bad batch... bad BIOS, idk.

Disappointed myself.

Going to RMA it and use my 6950 then wait to see about the 290(non x)/ nvidia prices.


----------



## SupahSpankeh

Even with a bit more voltage to the stock OC it's not stable in BF4.


----------



## eternal7trance

So after testing for a while, BF4 will let me do 1255 core and stay stable. I had to tone it down vs the other games I was playing, which would let me do 1290ish depending on the title.


----------



## SupahSpankeh

Quote:


> Originally Posted by *eternal7trance*
> 
> So after testing for a while, BF4 will let me do 1255 core and stay stable. I had to tone it down vs the other games I was playing which would let me do 1290ish depending on what I was playing


So jelly. sigh.


----------



## eternal7trance

Quote:


> Originally Posted by *SupahSpankeh*
> 
> So jelly. sigh.


Hey, you can get one right now for $275 on Newegg if you use the $25-off-$250 code.


----------



## SupahSpankeh

Quote:


> Originally Posted by *eternal7trance*
> 
> Hey you can get one right now for 275 on newegg if you use the $25 off 250 code.


I'm in the UK, and I've gotta return this sucker before I get a new one.

Before I do, though: can anyone suggest anything? I've tried:

* Stock OC
* Stock OC with voltage
* OC'd with voltage
* all of the above with the Power Target at +20%.

I'm using ASUS GPU Tweak; I'm hoping that's not causing any issues. Suggestions very welcome.


----------



## FeelKun

Quote:


> Originally Posted by *eternal7trance*
> 
> Hey you can get one right now for 275 on newegg if you use the $25 off 250 code.


What's the code? thinking about getting a 780 right now...


----------



## eternal7trance

Quote:


> Originally Posted by *Resme*
> 
> What's the code? thinking about getting a 780 right now...


NAFSAVE25NOV1R


----------



## VegetarianEater

Ugh, I wish they'd had that code yesterday; the Sapphire Toxic was back in stock... but now it's gone again.


----------



## brazilianloser

Quote:


> Originally Posted by *eternal7trance*
> 
> NAFSAVE25NOV1R


The promo code NAFSAVE25NOV1R cannot be combined with combo deals or gift items attached to products.

It won't work on the 780 due to that...


----------



## Dart06

I used that Newegg code and ordered two Windforce 280X cards. Was going to hold out for the 290, but I think saving $100-150 per card and only losing 10-15% performance is fine. Crossfire scaling looked very good in Battlefield 4, so let's hope it stays that way.

These will be my first AMD cards.


----------



## eternal7trance

Quote:


> Originally Posted by *Dart06*
> 
> I used that Newegg code and ordered two windforce 280x cards. Was going to hold out for the 290 but I think saving 100-150$ per card and only losing 10-15% performance is fine. Crossfire scaling looked very good in Battlefield 4 so let's hope it stays that way.
> 
> These will be my first AMD cards.


If you had more than one account, you could have used the code once for each card


----------



## Dart06

Quote:


> Originally Posted by *eternal7trance*
> 
> If you had more than one account, you could have used the code once for each card


Didn't think of that. Ah well, the $25 isn't THAT important. Still feels like a good deal. After I get rid of my 670, it will be like paying $400 for two 280X cards anyway.


----------



## taem

Quote:


> Originally Posted by *raghu78*
> 
> Get the ASUS R9 280X DCII TOP, the best-reviewed and quietest R9 280X.


Newegg just got them back in stock and I ordered it before stock runs out again. $309.99.

Yeah the reviews make it look like the top 280x ATM.

My only quibble is lack of backplate. I was tempted to get a vapor x just for the backplate.

Edit, oh nice, so glad I browsed this thread just now, went back, canceled order, reordered with that promo code eternal7trance mentioned. Here have a +rep for saving me $25! You are so sweet.


----------



## brazilianloser

Quote:


> Originally Posted by *taem*
> 
> Newegg just got them back in stock and I ordered it before stock runs out again. $309.99.
> 
> Yeah the reviews make it look like the top 280x ATM.
> 
> My only quibble is lack of backplate. I was tempted to get a vapor x just for the backplate.
> 
> Edit, oh nice, so glad I browsed this thread just now, went back, canceled order, reordered with that promo code eternal7trance mentioned. Here have a +rep for saving me $25! You are so sweet.


Well, I have been rocking the Vapor-X, and so far "satisfied" would be the word. 100% usage on the GPU for a few hours playing Crysis 3 today and the highest temp was 62C... on air. Since I'm pushing all settings maxed out I'm not getting 60fps, but the usual 40+ is enough for me, since the plan is to possibly get a second one.


----------



## taem

Quote:


> Originally Posted by *brazilianloser*
> 
> Well I have been rocking the Vapor X and so far satisfied would be the word. 100% usage on the gpu for a few hours playing Crysis 3 today and the highest temp 62... on air... due to me pushing all max out settings I am not getting 60fps but the usual 40+ is enough for me since the plan is to possible get a second one.


I was all set to get a 7970 Vapor-X until the 280X came out. With the 7970s it seemed like that was the best cooler. I still would have gotten the 7970 Vapor-X if the price had been lower, but it's like +$100 over the 280X. I'm so jealous of your backplate tho. That card looks nice. The DC2T looks budget.


----------



## SupahSpankeh

Quote:


> Originally Posted by *taem*
> 
> I was all set to get a 7970 vapor x until the 280x came out. With the 7970s it seemed like that was the best cooler. I still would have gotten the 7970 vapor x if price had been lower but it's like +$100 over the 280x. I'm so jealous of your backplate tho. That card looks nice. The dc2t looks budget.


DC2T cooler is one of the best on the 280X - it's incredibly quiet, even when on full blast.

Regrettably, my card isn't stable... so I guess I'll be returning it for one that is.


----------



## KeepWalkinG

I have the R9 280X Sapphire Dual-X OC and I'm using Sapphire TriXX with the voltage maxed at 1.3v, but in all benchmarks (Valley, Heaven, 3DMark) my voltage reads 1.244, 1.219, 1.225...
My core clock is 1210 and memory is 7300.
What BIOS can I put on my 280X for more voltage control?
I have a good cooling system; my temp maxes out at 69-70C.


----------



## AnToNCheZ

I wouldn't try to push it any further than that if I were you. If your voltage slider is maxed at 1.3 yet you're only ever seeing 1.25, then try pushing your clock a little higher until it is not stable; it will force more voltage if the chip has any legs left in it. I can pass 3DMark 11 at 1190, and a 2-hour loop of Unigine Valley at 1190, but for gaming I will not push it past 1175, just to give it some breathing space. The card is way faster than a 770 at those clocks, at a much nicer price point. I am on the Dual-X also, if you were wondering. Lovely cards; I had the 7950 as well, which was a beast of a card, able to hit 1225 stable and 1200 for gaming.
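The "push the clock a little higher until it is not stable, then back off for gaming" routine described above can be sketched as a simple upward search. `is_stable` here is a hypothetical stand-in for whatever stress test you actually run (Valley, 3DMark, a long game session), and the 15MHz margin mirrors the 1190-benchmark vs 1175-gaming gap in the post:

```python
def find_max_clock(start_mhz, step_mhz, is_stable, headroom_mhz=15):
    """Walk the core clock up in small steps until the stress test fails,
    then back off by a safety margin for 24/7 gaming use."""
    clock = start_mhz
    while is_stable(clock + step_mhz):  # next step still passes the test
        clock += step_mhz
    return clock - headroom_mhz  # settle below the last stable clock

# Toy run: pretend the card artifacts above 1190MHz.
max_gaming_clock = find_max_clock(1100, 10, lambda mhz: mhz <= 1190)
print(max_gaming_clock)  # 1175
```

In practice each `is_stable` check is hours of testing, so larger steps first and a finer pass near the limit saves time.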


----------



## dade_kash_xD

Ridiculously enough, this is what (kinda) fixes the freezing/crashing issue on my setup:

Completely remove the CIM, open Device Manager and remove all video drivers (delete previous versions too), reboot, install 13.11 beta 8, hit reset in MSI AB, re-disable ULPS, and reboot; then everything works fine.

BUT the next time I reboot, I have to do the same thing all over again. WTH! lol. Any ideas what this is, guys? Should I RMA both cards?
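For anyone else stuck re-disabling ULPS: it's controlled by the `EnableUlps` registry value under the display-adapter class key. A minimal sketch of the toggle as a .reg fragment, assuming `0000`/`0001` happen to be your AMD adapters' subkeys (the numbering varies per system, so check in regedit before importing anything like this):

```
Windows Registry Editor Version 5.00

; Subkey numbers 0000/0001 are assumptions -- locate the entries for your
; own AMD adapters under this class key first.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
"EnableUlps"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0001]
"EnableUlps"=dword:00000000
```

A driver reinstall recreates these keys with ULPS enabled again, which would explain why the step has to be repeated after every reinstall.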


----------



## Crowe98

Using a 6850 on the 13.11 beta driver I'm getting stuttering; my GPU usage is up and down. Same with the 13.9 driver as well.

*EDIT: Wrong thread :3*


----------



## DeviousAddict

For those of you, like me, looking for some Crossfire benches to compare against, I found a fairly decent review. Didn't see it linked on the 1st page, so: http://www.tweaktown.com/reviews/5826/msi-radeon-r9-280x-3gb-twin-frozr-oc-in-crossfire-video-card-review/index5.html
I've actually linked page 5, which takes you straight to the Fire Strike results.


----------



## Dart06

Have any of you crossfire'd 280x cards in Battlefield 4? How has your performance been?


----------



## navit

Quote:


> Originally Posted by *Dart06*
> 
> Have any of you crossfire'd 280x cards in Battlefield 4? How has your performance been?


I am in CF with a 7970 / R9 280X and it's working pretty well. I seem to stay above 100 fps for the most part and never go below 70, all on ultra. In single player I was getting the DX error a lot until I put the 280X as the top card, and then that stopped. All running Windows 8.1 with the beta 8 drivers.


----------



## Amhro

I'm still waiting for all 280x to be in stock here








But I still haven't decided which one I should get...
It will probably be the ASUS DCII...


----------



## Dart06

Quote:


> Originally Posted by *navit*
> 
> I am cf with a 7970/ r9 280x and its working pretty good. I seem to stay above 100 fps for the most part and never go below 70 all on ultra. In the single player I was getting the DX error a lot until I put the 280x as the top card and then that stopped. all running windows 8.1 with the beta8 drivers.


I'm going to be playing on my 120hz monitor so it should work well I hope. I got two Windforce models.

First AMD cards but I'm happy the drivers seem more straightforward. Nvidia's naming scheme is annoying.

Also, are you using 4x MSAA and Post?


----------



## Dragon-Emperor

I am confused... I am looking into the R9280X-DC2T-3GD5, that is the exact model number, yet when I look at the 2 places I buy from:
Newegg.ca http://www.newegg.ca/Product/Product.aspx?Item=N82E16814121803
MemoryExpress.com http://www.memoryexpress.com/Products/MX48758

Newegg.ca's card has a 2-slot cooler and 1 DisplayPort, and MemoryExpress.com's has a 3-slot cooler and 4 DisplayPorts, yet they share the same product code.
What's going on here? Is the 3-slot cooler one 'better'?


----------



## navit

Quote:


> Originally Posted by *Dart06*
> 
> I'm going to be playing on my 120hz monitor so it should work well I hope. I got two Windforce models.
> 
> First AMD cards but I'm happy the drivers seem more straightforward. Nvidia's naming scheme is annoying.
> 
> Also, are you using 4x MSAA and Post?


yep


----------



## Cool Mike

Grabbed a second 280x toxic Thursday night. Was available for a short time. Look forward to crossfiring..


----------



## taem

Quote:


> Originally Posted by *Dragon-Emperor*
> 
> I am confused... I am looking into the R9280X-DC2T-3GD5, that is the exact model number, yet when I look at the 2 places I buy from:
> Newegg.ca http://www.newegg.ca/Product/Product.aspx?Item=N82E16814121803
> MemoryExpress.com http://www.memoryexpress.com/Products/MX48758
> 
> Newegg.ca's card is a 2 slot cooler and 1 DP, and Memoryexpress.com's is a 3 slot cooler and 4 DPs, yet they share the same product code.
> What's going on here? Is the 3 slot cooler one 'better'?


Different product codes.

R9280X-DC2T-3GD5 is the 2 slot.

R9280X-DC2T-3GD5 V2 is the 3 slot.

I haven't seen any reviews of the V2, so I know nothing about it. I recall seeing in a thread somewhere that the R9280X-DC2T-3GD5 is actually newer than the V2.


----------



## Dragon-Emperor

Quote:


> Originally Posted by *taem*
> 
> Different product codes.
> 
> R9280X-DC2T-3GD5 is the 2 slot.
> 
> R9280X-DC2T-3GD5 V2 is the 3 slot.
> 
> I haven't seen any reviews of the V2 so I know nothing about it. I recall seeing on a thread somewhere that the R9280X-DC2T-3GD5 is actually newer than the V2.


Ohh, I totally didn't see that V2, as it wrapped to the second line. Good call!
If the V2 is not the newer one, then that's some counterintuitive branding alright. I'd want the one with the quieter cooler.


----------



## taem

Quote:


> Originally Posted by *Dragon-Emperor*
> 
> Ohh I totally didn't see that V2 as it went to the second line. Good call!
> If the V2 is not the newer one, then that's some counter intuitive branding system alright. I'd want the one with the quieter cooler


I think what's going on is that the V2 uses the cooler from the 7970, while the R9280X-DC2T-3GD5 uses the cooler from the GTX 770 or 780. If you look closely at the pics, as one review noted, the cooler doesn't quite fit; it looks like a mod job. But all the reviews are of this 2-slot model and praise the cooler. You might look up reviews of the 3-slot ASUS 7970 for an idea of how the V2 cooler performs.

Btw there are also non TOP models of each that have lower clocks. R9280X-DC2-3GD5 and R9280X-DC2-3GD5 V2.


----------



## miklkit

Eh? The manufacturers are swapping coolers around?









Then will the cooler from an ASUS DirectCU II 6970 fit an MSI R9 280X? The ASUS died not long ago.


----------



## chiknnwatrmln

Seriously considering selling off my 670 ftw and my spare 7950 to buy a 280x toxic.

Hoping to get a decent deal for cyber Monday, do you guys think this card is worth it? It's $350 right now. I could sell my current cards and pocket a bit of cash, all while gaining performance..

Can any owners provide some input?


----------



## FeelKun

Quote:


> Originally Posted by *SupahSpankeh*
> 
> DC2T cooler is one of the best on the 280X - it's incredibly quiet, even when on full blast.
> 
> REgrettably, my card isn't stable... so I guess I'll be returning it for one that is


Newegg Issued my RMA + free return. Love teh egg. I have no clue what I'll be purchasing.... Think I'll sit back and see how the Graphic Wars 2 unfolds... Hopefully no artifacts while i wait


----------



## AnToNCheZ

I would almost sell the 670 and see what you can get out of the 7950 in terms of overclocking. I had a Sapphire Dual-X that hit 1200MHz easily, and when you consider that you have an additional 1GB of VRAM compared to the 670, and that you already have the 7950, you could save some cash.

I have the Sapphire Dual-X 280 now, and am very, very happy with it. They are excellent cards for the price, and although the 670 I replaced my beloved 7950 with was a great overclocker (1255MHz stable), the 280 stomps it while again giving me extra VRAM to play with. I switched back over when I decided to go 1440p with the 27" ASUS screen that I bought. With the 290X not being available in anything but the reference cooler for a while yet, I would say that an OC'd 280 is one of the smartest choices at the moment.


----------



## Dart06

Quote:


> Originally Posted by *AnToNCheZ*
> 
> I would almost sell the 670 and see what you can get out of the 7950 in terms of overclocking. I had a Dual X Sapphire that hit 1200Mhz easily and when you consider the fact that you have an additional 1G of VRAM compared to the 670 and the fact that you already have the 7950 with you, you could save some cash.
> 
> I have the 280 Sapphire Dual X now, and am very very happy with it. They are excellent cards for the price, and although the 670 I replaced my beloved 7950 with was a great overclocker (1255 Mhz stable), the 280 stomps it while again giving me extra VRAM to play with. I switched back over when I decided to go 1440p with the 27" ASUS screen that I bought. With the 290x not being available in anything but that as reference cooler for a while yet, I would say that an OC 280 is one of the smartest choices at the moment.


That, or just get another 7950 and crossfire them.

I don't know if it would be worth the trouble of selling a 670 and a 7950 to get crossfire 280X cards.


----------



## prescotter

Hello Guys,

The R9 280X BIOS collection on TechPowerUp is very limited,

*and I'm searching for a BIOS from a reference-design card.*

The reason is that the single reference BIOS on TPU seems to be from a pre-production / engineering-sample card.

You can make a simple backup of your BIOS file with GPU-Z.
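For anyone who prefers the command line, here's a rough sketch of dumping the BIOS with AMD's ATIFlash utility instead of GPU-Z. The adapter index and output filename are just examples, and flag spelling varies a bit between ATIFlash versions, so treat this as an outline rather than a recipe:

```shell
# List the adapters ATIFlash can see, so you know which index your 280X has
atiflash -i

# Save the BIOS of adapter 0 to a file (use the index reported above)
atiflash -s 0 r9280x_backup.rom
```

In GPU-Z the equivalent is the small chip icon next to the "BIOS Version" field, which exports the ROM via "Save to file...". Either dump can then be submitted to the TPU collection.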


----------



## sidp96

For some reason my 280X gets artifacts at stock clocks with extra voltage in League of Legends, but I can play BF3 Metro with 64 players at 1150MHz on the same voltage with no problem. This artifacting also occurs in Guild Wars 2, yet MSI Kombustor runs flawlessly for however long I want it to.


----------



## wanna_buy

Does anyone have a Gigabyte R9 280X? What color PCB does it have? Blue like in the pictures, or black like their newest GTX 770 cards?


----------



## phillyd

Quote:


> Originally Posted by *wanna_buy*
> 
> Does anyone have Gigabyte R9 280X? What color PCB does it have? Blue like in the pictures or black like their newest GTX 770 cards?


I just installed that GPU in a friend's computer, and if it's the Windforce (3-fan) model, then it has a blue PCB.


----------



## wanna_buy

Why does Gigabyte still continue to use blue PCBs? Their GTX 770 with the black PCB looks so awesome.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *AnToNCheZ*
> 
> I would almost sell the 670 and see what you can get out of the 7950 in terms of overclocking. I had a Dual X Sapphire that hit 1200Mhz easily and when you consider the fact that you have an additional 1G of VRAM compared to the 670 and the fact that you already have the 7950 with you, you could save some cash.
> 
> I have the 280 Sapphire Dual X now, and am very very happy with it. They are excellent cards for the price, and although the 670 I replaced my beloved 7950 with was a great overclocker (1255 Mhz stable), the 280 stomps it while again giving me extra VRAM to play with. I switched back over when I decided to go 1440p with the 27" ASUS screen that I bought. With the 290x not being available in anything but that **** as reference cooler for a while yet, I would say that an OC 280 is one of the smartest choices at the moment.


Thanks for the advice; my 7950 is only okay at overclocking, topping out at 1125/7000.


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Thanks for the advice, my 7950 is only ok at overclocking at 1125/7000.


Your 7950 is as fast as a 7970 GHz Edition, if not faster. That's pretty much a stock 280.


----------



## Pfortunato

Quote:


> Originally Posted by *AnToNCheZ*
> 
> I would almost sell the 670 and see what you can get out of the 7950 in terms of overclocking. I had a Dual X Sapphire that hit 1200Mhz easily and when you consider the fact that you have an additional 1G of VRAM compared to the 670 and the fact that you already have the 7950 with you, you could save some cash.
> 
> I have the 280 Sapphire Dual X now, and am very very happy with it. They are excellent cards for the price, and although the 670 I replaced my beloved 7950 with was a great overclocker (1255 Mhz stable), the 280 stomps it while again giving me extra VRAM to play with. I switched back over when I decided to go 1440p with the 27" ASUS screen that I bought. With the 290x not being available in anything but that **** as reference cooler for a while yet, I would say that an OC 280 is one of the smartest choices at the moment.


Hi, can you tell me more about your Sapphire R9 280X Dual-X, please? I had a Sapphire 7870 XT Dual-X and it was good, with nice overclocks and temps around 70C with an 1100MHz OC on the core, but I sent it in for RMA because of coil whine and I don't trust Sapphire anymore. Can you tell me if your model suffers from coil whine? Does it overclock well? Good temps? Any noise? The store where I bought the 7870 XT is selling it for 253 euros, and I don't want to spend much money.

Cheers


----------



## chiknnwatrmln

Quote:


> Originally Posted by *rdr09*
> 
> your 7950 is as fast as a 7970GHz edition. if not, faster. pretty much a stock 280.


I understand this. However, it's not as fast as a heavily OCed 7970/280X.
Besides, my 7950 needs 1.27V just to reach 1125MHz, and that makes it run hot.


----------



## KeepWalkinG

I have one problem: look, my card's voltage at idle is at max. Why?
http://imageshack.us/photo/my-images/534/mpl7.jpg/


----------



## BackwoodsNC

Quote:


> Originally Posted by *KeepWalkinG*
> 
> I have one problem look my card in idle voltage is on maxx why???


What is your screen size and refresh rate? On mine, if I leave the refresh rate at 144Hz it never drops into the lower p-states; I have to set it to 120Hz.


----------



## VegetarianEater

Quote:


> Originally Posted by *Cool Mike*
> 
> Grabbed a second 280x toxic Thursday night. Was available for a short time. Look forward to crossfiring..


You stole mine! Just kidding. I was waiting for some sort of coupon, which came on Friday, but they were already sold out by then. I also want to crossfire Toxic 280s... (I probably won't be overclocking them; I'll be running my PSU at its limit)


----------



## Dart06

I thought about getting two of the Toxics, but the few % more performance didn't seem worth the $50-60 per card to me.

I don't like to overclock GPUs. I always run them at stock too.


----------



## eternal7trance

If you hold out, Newegg has been putting up $25-off-$250 codes randomly throughout the week lately; the last one was on 11/1.


----------



## KeepWalkinG

Quote:


> Originally Posted by *BackwoodsNC*
> 
> What is your screen size and refresh rate? On mine if I leave the refresh rate at 144Hz it never goes into p-states. I have to set it at 120Hz


My monitor is a Dell U2311H at 60Hz.
How do I change this voltage at idle?


----------



## SupahSpankeh

Quote:


> Originally Posted by *Resme*
> 
> Newegg Issued my RMA + free return. Love teh egg. I have no clue what I'll be purchasing.... Think I'll sit back and see how the Graphic Wars 2 unfolds... Hopefully no artifacts while i wait


I'm sticking with my DCT.

I've got it stable in everything apart from BF4, and the BF4 tech forums are _full_ of PC gamers *****ing about DX failures and crashes. I think it's the game... so I'm happy to wait and see. My retailer did offer an RMA, but I'll be waiting a week or so before sending it back.


----------



## eternal7trance

Quote:


> Originally Posted by *SupahSpankeh*
> 
> I'm sticking with my DCT.
> 
> I've got it stable in everything apart from BF4, and BF4 tech forums are _full_ of PC gamers *****ing about DX failures and crashes. I think it's the game... so I'm happy to wait and see. My retailer did offer an RMA, but I'll be waiting for a week or so before sending it back.


I'm assuming you're using the AMD beta v8 driver that fixed a lot of BF4 issues?


----------



## KeepWalkinG

Do all Sapphire R9 280Xs hold their highest voltage at idle?
Or do only I have this problem? Maybe this is dangerous for my card?
Under load my card's voltage is 1.214-1.225V; only at idle is it 1.288V.


----------



## Dart06

Man, my two Windforce 280Xs shipped. I'm hyped. Hopefully they'll be here on Wednesday.


----------



## FernTeixe

Quote:


> Originally Posted by *KeepWalkinG*
> 
> All R9 280x sapphire in idle voltage is hold highest??
> Or only i have this problem, maybe this is dangerous for my card?
> In load my card voltage is 1.214, 1.225 only in idle is 1.288v


Are you using more than one monitor? Did you try two monitors? If you are using two monitors, that's how it works.
If you tried two monitors and now want to turn one off, go to CCC and disable desktop 2.


----------



## brazilianloser

Quote:


> Originally Posted by *FernTeixe*
> 
> are you using more than 1 monitor? did you tried 2 monitors? if you are using 2 monitors, it's how it work.
> if you tried 2 monitors and now want to turn one off , go to ccc and disable desktop 2












Not sure if that answer made any sense... maybe you quoted the wrong post?


----------



## FernTeixe

Quote:


> Originally Posted by *brazilianloser*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not sure if that answer made any sense... but maybe you quoted the wrong post...???


When I'm using 2-3 monitors my voltage is always at maximum.


----------



## eternal7trance

I use two monitors and my voltage goes down at idle.


----------



## brazilianloser

Quote:


> Originally Posted by *FernTeixe*
> 
> when I'm using 2-3 monitors my voltage is always at maximum.


I see. I've never had that problem, and to be honest this is the first time I've seen someone mention it.
Quote:


> Originally Posted by *eternal7trance*
> 
> I use two monitors and my voltage goes down on idle


Same here. I use four monitors, and the voltages at idle are much lower than when gaming under load, which is clearly right.


----------



## KeepWalkinG

That's it, thanks. I'm using a TV and that's the problem.

When I unplug the TV cable I don't have this problem, and my voltage at idle is 0.900V.


----------



## somethingname

Why are people having all these problems if they're just rebranded GPUs? I wanted to order a 280, but this thread is making me have doubts about ordering it.


----------



## steelkevin

Quote:


> Originally Posted by *somethingname*
> 
> Why are people having all this problems if they're just rebranded GPU's? I wanted to order a 280 but this thread is making me have doubts about ordering it.


I'm not having any problems.

What I guess is really happening is, for starters, that BF4 is full of glitches for most people right now, so people who just bought their 280X think the card is at fault when it really isn't.
The second thing that's probably happening is that people are running beta drivers without fully understanding that a beta driver is exactly that, a beta, and it's bound to be glitchy.

I'm also running a beta driver btw (the 6th).


----------



## rdr09

Quote:


> Originally Posted by *KeepWalkinG*
> 
> This is it thanks , i using TV and this is the problem
> 
> 
> 
> 
> 
> 
> 
> 
> When unplug cable for tv i dont have this problem and my voltage in indle is 0.900


Quote:


> Originally Posted by *somethingname*
> 
> Why are people having all this problems if they're just rebranded GPU's? I wanted to order a 280 but this thread is making me have doubts about ordering it.


The problem of the GPU not idling all the way down when two or more monitors are hooked up? Of course; the GPU is working harder.

My 7950 and my 7970 both do this. I am sure my 4555, 4670, and 7770 would do the same.


----------



## brazilianloser

Quote:


> Originally Posted by *steelkevin*
> 
> I'm not having any problems.
> 
> What I guess is really happening is well for starters BF4 is full of glitches for most people right now so people who just bought their 280X card think the cards at fault when really it isn't.
> The second thing that's probably happening is people are running Beta Drivers without fully understanding that a Beta driver is exactly that, a Beta and it's bound to be glitchy.
> 
> I'm also running a Beta driver btw (the 6th).


Agree... I have yet to touch BF4, and I've had no problems with my Sapphire Vapor-X 280X. So far I've only played AC3, Portal 2, and Crysis 3, all for hours on end, and not a single problem. The only thing I notice is in Skyrim, where objects every once in a while just decide to bang all over the place non-stop, but that was already happening back with my GTX 660 Ti anyway.


----------



## FernTeixe

Quote:


> Originally Posted by *KeepWalkinG*
> 
> This is it thanks , i using TV and this is the problem
> 
> 
> 
> 
> 
> 
> 
> 
> When unplug cable for tv i dont have this problem and my voltage in indle is 0.900


Nice, it seems I understood the question.


----------



## SupahSpankeh

Quote:


> Originally Posted by *eternal7trance*
> 
> I'm assuming you're using AMD Beta V8 that fixed a lot of BF4 issues?


Yup beta 8.


----------



## Modovich

MSI R9 280X in action. After BIOS flashing it's all good, although BF4 crashes here and there on the latest drivers.


----------



## Wabbit16

Quote:


> Originally Posted by *Modovich*
> 
> 
> 
> MSI R9 280X in action. After bios flashing its all good. Altho BF4 crashes here and there, latest drivers.


What BIOS did you flash it with, and why? If you don't mind me asking.


----------



## SupahSpankeh

I got to 1200MHz at 1.238V using ASUS GPU Tweak. Any idea how I make this permanent? I keep having to re-apply it.


----------



## Modovich

Quote:


> Originally Posted by *Wabbit16*
> 
> What BIOS did you flash it with and why? If you don't mind me asking


The stock BIOS on the first batches of MSI R9 280Xs has a problem with VRM temps.


----------



## brazilianloser

Any other Vapor-X 280 owners out there as bothered by this as I am? I know it's necessary, but I've never had a card where I could see it... it just doesn't have that polished look for me.


----------



## taem

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Got to 1200MHz at 1.238v using ASUS Tweak. Any idea how I make this permanent? I keep having to re-apply it.


How are the power draw and temps at that setting? What's your max fan speed set to?

Also, a general question: can I use GPU Tweak to control the ASUS card's settings and also install MSI Afterburner for the on-screen display? I wouldn't be touching Afterburner's card-setting sliders, obviously, but I wonder if that would confuse the card. Would I have to make sure GPU Tweak loads after Afterburner?

Edited because I hit submit too fast.


----------



## somethingname

I wanted the MSI 280X card, but it had quite a lot of complaints about coil whine, which is demonstrated in this vid.

I went ahead and ordered the XFX 280X since it's uber sexy and has a lifetime warranty plus unlocked voltage.


----------



## Yungbenny911

Hey everyone, I was thinking of doing a *R9 280X Vs GTX 770 Bench-Off Thread*, since this question has been asked by multiple people and always results in keyboard battles lol. There would be absolutely no bickering/bias/fanboy/meme posts, only benchmark scores and non-offensive replies to previously posted scores.

Yay or Nay?

I'm asking first because, for the thread to be successful, your benchmarks would be needed, and I would try to make spreadsheets/charts *with the top 10/5 scores for each benchmark* so the thread's scores are more presentable to guests and everyone looking for an upgrade path.

I currently have ideas for games with in-game benchmarks to run, but your suggestions would be much appreciated.


----------



## neo668

Quote:


> Originally Posted by *Yungbenny911*
> 
> Yay or Nay?
> 
> 
> 
> 
> 
> 
> 
> . I'm asking first because, for the Thread to be successful, your benchmarks would be needed, and i would try to make spreadsheets/charts *with the top 10/5 scores for each benchmark*, so the Thread scores can be more presentable to Guests, and everyone looking for an upgrade path.


I have a HIS R9 280X iPower IceQ X² Turbo Boost and I would like to contribute. Just tell me what to do.


----------



## Joeking78

I have crossfire 280Xs which I've benched, so I can contribute.


----------



## battleaxe

Quote:


> Originally Posted by *Yungbenny911*
> 
> Hey everyone, i was thinking of doing a *R9 280X Vs GTX 770 Bench-off Thread* since this question has been asked by multiple people, and always results in keyboard battles lol. There would absolutely be no bickering/bias/fanboy/memes posts, only benchmark scores, and non offensive replies to previous posted scores that would be posted.
> 
> Yay or Nay?
> 
> 
> 
> 
> 
> 
> 
> . I'm asking first because, for the Thread to be successful, your benchmarks would be needed, and i would try to make spreadsheets/charts *with the top 10/5 scores for each benchmark*, so the Thread scores can be more presentable to Guests, and everyone looking for an upgrade path.
> 
> I currently have ideas of games with In-game benchmarks to run, but your suggestions would be much appreciated


Absolutely man! Do it.


----------



## rdr09

Quote:


> Originally Posted by *Yungbenny911*
> 
> Hey everyone, i was thinking of doing a *R9 280X Vs GTX 770 Bench-off Thread* since this question has been asked by multiple people, and always results in keyboard battles lol. There would absolutely be no bickering/bias/fanboy/memes posts, only benchmark scores, and non offensive replies to previous posted scores that would be posted.
> 
> Yay or Nay?
> 
> 
> 
> 
> 
> 
> 
> . I'm asking first because, for the Thread to be successful, your benchmarks would be needed, and i would try to make spreadsheets/charts *with the top 10/5 scores for each benchmark*, so the Thread scores can be more presentable to Guests, and everyone looking for an upgrade path.
> 
> I currently have ideas of games with In-game benchmarks to run, but your suggestions would be much appreciated


I will sub. Is it possible to categorize by CPU and OC, maybe even RAM speed, and stock vs. max OC?

Thanks, Yung.


----------



## Yungbenny911

I'll start creating the tools for the thread. Thanks for your feedback, guys; this will be fun. I think I'll just focus on the GPUs, since I'll be working with the top 10 average scores. I'll also make two categories: under water and on air.


----------



## battleaxe

I don't even have this card, but seeing the data maybe I'll decide to get one.


----------



## chiknnwatrmln

What is the best 280X for overclocking? I really like the Toxic, but I'm currently going through Sapphire's RMA process and I never want to deal with it again.


----------



## eternal7trance

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What is the best 280x for overclocking? I really like the toxic, but I'm currently going through Sapphire's rma process and I never want to deal with it again..


Asus or HIS would be next in line then


----------



## Dart06

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What is the best 280x for overclocking? I really like the toxic, but I'm currently going through Sapphire's rma process and I never want to deal with it again..


The Matrix is probably on the better end, but it's $50 more expensive.

Man my cards are supposed to be here by Thursday but I was hoping for Wednesday. Going to be a long week.


----------



## tsm106

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Yungbenny911*
> 
> Hey everyone, i was thinking of doing a *R9 280X Vs GTX 770 Bench-off Thread* since this question has been asked by multiple people, and always results in keyboard battles lol. There would absolutely be no bickering/bias/fanboy/memes posts, only benchmark scores, and non offensive replies to previous posted scores that would be posted.
> 
> Yay or Nay?
> 
> 
> 
> 
> 
> 
> 
> . I'm asking first because, for the Thread to be successful, your benchmarks would be needed, and i would try to make spreadsheets/charts *with the top 10/5 scores for each benchmark*, so the Thread scores can be more presentable to Guests, and everyone looking for an upgrade path.
> 
> I currently have ideas of games with In-game benchmarks to run, but your suggestions would be much appreciated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Absolutely man! Do it.
Click to expand...

LOL, more 770 shenanigans. Why am I not surprised by this?


----------



## Modus

Anyone running 280X CF? I have one right now for BF4 and it doesn't cut it for 60fps with v-sync on; I can play on high and average 55-60. Do you think a second card will give me the boost needed for ultra with 2x MSAA (other AA is off due to blur) and HBAO? I'm playing at 1440p, btw.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Dart06*
> 
> The Matrix is probably on the better end, but it's 50$ more expensive.
> 
> Man my cards are supposed to be here by Thursday but I was hoping for Wednesday. Going to be a long week.


I don't mind paying extra, I just want a good card from a company with good customer service and an easy rma process.


----------



## Tobiman

I'd stick with the ASUS 280X. It has the PCB of the Matrix, and the cooler is slimmer and better. The cooler on the Matrix is thick for nothing and never performed as well as my Vapor-X in the temps department. The Matrix does clock higher, though.

In ASUS GPU Tweak, when you apply settings, the save button appears. Press it, then select 1, 2, 3, or 4 to choose the profile where you want your settings saved. That way you can apply your settings just by clicking a button.


----------



## Dart06

Quote:


> Originally Posted by *Modus*
> 
> Anyone running 280x CF? I have one right now for BF4 and it doesn't cut it for 60fps with v-sync on, I can play on high and average 55-60. Do you think a second card will give me the boost needed for ultra with 2xMSAA (other AA is off due to blur) and HBAO? I'm playing at 1440p btw.


Yes, crossfire 280Xs should destroy Battlefield 4 at 1440p. You might even get 4x MSAA. I'll see if I can find some benches quickly.


----------



## ImJJames

Is a 280X a worthy upgrade from CF 7850s? Right now with my CF 7850s in Battlefield 4 I'm running 95+ FPS on High and around 80+ FPS on maxed Ultra at 1080p, very smooth with no micro stuttering or problems.


----------



## Dart06

Quote:


> Originally Posted by *ImJJames*
> 
> Is a 280x worthy upgrade from a CF 7850? Right now with my CF 7850 Battlefield 4 I am running 95+ FPS on High and around 80+ FPS on ULTRA MAX 1080P, very smooth no micro stuttering or problems.


With a single 280X, I don't really think the upgrade would be worth it. Probably better to wait.


----------



## SupahSpankeh

So, ASUS GPU Tweak... I've gotten to 1200MHz at 1.238V (does that seem reasonable to everyone?), but I can't get GPU Tweak to re-apply the settings. I often have to fire it up and load the saved profile.

Am I just using the wrong app?


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> LOL, more 770 shenanigans. Why am I not surprised by this?


Meaning what?


----------



## eternal7trance

Quote:


> Originally Posted by *SupahSpankeh*
> 
> So - ASUS Tweak... I've got to 1200MHz at 1.238v (seem reasonable to everyone?) but I can't get ASUS Tweak to re-apply the settings. I often have to fire it up and load the saved profile.
> 
> Am I just using the wrong app?


You could always try Afterburner; it's pretty good at saving stuff.

Also, 1200MHz is pretty decent. I would keep it at that if you can stay stable in BF4 with it.


----------



## Yungbenny911

Hey everyone, I'm done creating the *280X/7970 Vs 770/680* Bench-Off thread.

*LINK TO THE THREAD*

Your participation is needed. The thread is still under construction, but I'll be done with it in some time.


----------



## tsm106

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> LOL, more 770 shenanigans. Why am I not surprised by this?
> 
> 
> 
> Meaning what?
Click to expand...

The guy is obsessed with his 770 and trying to prove it is faster. He's been beaten silly on plenty of occasions and he comes back for more. You guys are like fresh meat for him.


----------



## Dart06

I had Nvidia cards forever but want to try something new. I also only game, and the recent driver issues are annoying. Right now I only care about BF4.

It's not like a 770 matters in BF4. In a month Mantle will be out, and for all we know a 270X could beat a 770 (which would be hilarious).


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> The guy is obsessed with his 770 and trying to prove it is faster. He's been beaten silly on plenty of occasions and he comes back for more. You guys are like fresh meat for him.


tsm106? IDK... honestly, I couldn't care less which is better. Which is better? That's all I'm interested in. I probably won't get one anyway. I'm still happy with my 670s, and I'd probably go up to a 290X or 780 if I really wanted to spend the money. But I don't, so this is interesting for lots of us anyway.


----------



## NightHawK360

I'm looking into buying a R9 280X, which one should I go for?

ASUS DCUII TOP
http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803

Sapphire Vapor X
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202045

Or shell out more cash and get the Sapphire Toxic.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202044

Thanks.


----------



## Pfortunato

Stick with the ASUS: best PCB, cooler, and temps, and most importantly no noise at all, and it OCs well.

Sent from my GT-I9505 using Tapatalk


----------



## PropheticCreed

Hi all. I purchased the Gigabyte GV-R928XOC-3GD model this morning and am just waiting for it to arrive. Has anyone else here purchased this model? If so, what are your thoughts?


----------



## syn17

Does anyone know what sort of frames you'd get with a non-OC 270X vs. an OC 270X in BF4 multiplayer at 1080p on ultra settings and on high settings? Would someone be able to run a test and tell me the frame-rate difference?


----------



## eternal7trance

Quote:


> Originally Posted by *PropheticCreed*
> 
> Hi all. I purchased the Gigabyte GV-R928XOC-3GD model this morning and am just waiting for it to arrive. Has anyone else here purchased the model? If so, what are your thoughts?


A horrible card for overclocking because it comes voltage-locked. If you don't plan on overclocking, then it's great.
Quote:


> Originally Posted by *NightHawK360*
> 
> I'm looking into buying a R9 280X, which one should I go for?
> 
> ASUS DCUII TOP
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803
> 
> Sapphire Vapor X
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202045
> 
> Or shell out more cash and get the Sapphire Toxic.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202044
> 
> Thanks.


The ASUS has better internals, so I would stick with that.

Unless you want something with a guaranteed 1150MHz OC on the core, in which case the Toxic might be better.


----------



## PropheticCreed

Excellent. Despite belonging to OC.net, I'm not really planning on doing anything to the card. I liked the stock boost to 1100MHz for that reason, and the card was $299... so you really couldn't beat it.


----------



## Dart06

Quote:


> Originally Posted by *PropheticCreed*
> 
> Excellent. Despite belonging to OC.net I'm not really planning on doing anything to the card. I liked the stock boost to 1100MHz for that reason and the card was 299...so really couldn't beat it.


Yep. I don't bother with overclocking GPUs either. I have two Windforce 280X cards on the way.

1100MHz on each is plenty for me.


----------



## PropheticCreed

This is my first high-end GPU buy ever. I have gotten away with using a free HD 4870 that I threw a cooler on, and before that it was an ATI X300, so I am very excited for this card. I just need something I can throw into my system and crank out high FPS from the start.

Are you crossfiring the cards to replace your 670s?


----------



## Yungbenny911

Quote:


> Originally Posted by *tsm106*
> 
> The guy is obsessed with his 770 and trying to prove it is faster. He's been beaten silly on plenty of occasions and he comes back for more. You guys are like fresh meat for him.


Haha, well, it would be a sum of the 10 best scores, right? So the thread's name is not *Yungbenny911 Vs 280X owners* lol. You try to make it seem like it's about me. You're entitled to your own opinion, though; that doesn't mean I agree with you. I personally think it would be fun, and it would provide a lot of information most buyers need.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *PropheticCreed*
> 
> Excellent. Despite belonging to OC.net I'm not really planning on doing anything to the card. I liked the stock boost to 1100MHz for that reason and the card was 299...so really couldn't beat it.


Let me know how that card is; I'm considering getting either that, a Matrix, or a Toxic.

First I have to sell the two cards I have now, though. I wish the OCN marketplace didn't have a rep requirement; nobody on eBay seems to be interested.


----------



## Dart06

Quote:


> Originally Posted by *PropheticCreed*
> 
> This is my first high-end GPU buy ever. I have gotten away using a free HD-4870 that I threw a cooler on and before that it was an ati X300 so I am very excited for this card. Just need something that I can throw into my system and crank out high FPS from the start.
> 
> Are you x-firing the cards to replace your 670s?


Yep. I sold each 670 for $225 (the one I have now just sold for that much as well) and ended up paying less than $200 to upgrade. It's a pretty worthwhile upgrade, considering the extra 1GB of video memory and that the 7970 GHz Edition (which is what the 280X is) is noticeably faster than a 670.


----------



## eternal7trance

Quote:


> Originally Posted by *PropheticCreed*
> 
> Excellent. Despite belonging to OC.net I'm not really planning on doing anything to the card. I liked the stock boost to 1100MHz for that reason and the card was 299...so really couldn't beat it.


I do like that it already has an 1100MHz OC, so if you don't plan on overclocking much, it sounds like a great card for you.


----------



## PropheticCreed

@chicken - I'll let you know for sure. It seems like it should be a solid stock performer, especially considering the reference board calls for an 850MHz clock speed. As far as I'm concerned, $299 is money well spent on the card.

@Dart - Ah, good to see that you got decent prices for your cards. I'm interested in the crossfire performance, especially since these cards have nowhere to go but down in price. I think AMD was smart in rebranding the GHz Edition, considering ASUS upped the memory frequency even higher, plus we get to work with the 384-bit memory bus. I was contemplating a 770, but then the 280X was released, and the performance is at least on par for a lot less.

@Eternal - Yup, I like my GPUs stock for the most part; I don't have much time these days to sit and meticulously tweak. I like being able to just jump straight into a game and go. Considering I'll be doing all my gaming at 1080p, this card should just scream right along. I signed up for Star Citizen's alpha test and realised that the HD 4000 series doesn't support DX11, so it was a bit of a facepalm before deciding to bite the bullet and go for the 280X.


----------



## Dart06

Quote:


> Originally Posted by *PropheticCreed*
> 
> @Dart - Ah good to see that you got decent prices for your cards. I'm interested on the crossfire performance, especially since these cards have nowhere to go but down in price. I think AMD was smart in rebranding the GHz edition, considering Asus upped the memory frequency even higher, plus we get to work with the 384-bit memory bus. I was contemplating a 770, but the 280X was released and the performance is at least on-par for a lot less.


I'll let you know how CFX performance is when I get my cards.

I saw no reason to get a 770, as the performance per dollar just isn't at a level I find acceptable.


----------



## eternal7trance

Quote:


> Originally Posted by *PropheticCreed*
> 
> @chicken - I'll let you know for sure. It seems like it should be a solid stock performer, especially considering the reference board calls for 850 MHz clock speeds. As far as I'm concerned, $299 is money well spent on the card
> 
> @Dart - Ah, good to see that you got decent prices for your cards. I'm interested in the crossfire performance, especially since these cards have nowhere to go but down in price. I think AMD was smart in rebranding the GHz edition, considering Asus upped the memory frequency even higher, plus we get to work with the 384-bit memory bus. I was contemplating a 770, but the 280X was released and the performance is at least on par for a lot less.
> 
> @Eternal - Yup, I like my GPUs stock for the most part; I don't have too much time these days to sit and meticulously tweak. I like being able to just jump straight into a game and go. Considering I'll be doing all my gaming at 1080p, this card should just scream right along. I signed up for Star Citizen's alpha test and realised that the HD 4000 series doesn't support DX11, so it was a bit of a facepalm before deciding to bite the bullet and go for the 280X.


It's too bad you didn't purchase it 1-2 days ago. Newegg keeps dropping $25-off-$250 codes.


----------



## Dart06

Quote:


> Originally Posted by *eternal7trance*
> 
> It's too bad you didn't purchase it 1-2 days ago. Newegg keeps dropping $25-off-$250 codes.


Yep, I managed to get $25 off one of my 280X cards on my birthday. Was amazing.


----------



## eternal7trance

Same here, I finally saw them drop it on a day I was ready to buy so I picked up the HIS one


----------



## ImJJames

All the good brands are out of stock ***... is the Gigabyte version good? It already comes overclocked to 1100 MHz.


----------



## eternal7trance

Quote:


> Originally Posted by *ImJJames*
> 
> All the good brands are out of stock ***... is the Gigabyte version good? It already comes overclocked to 1100 MHz.


It's good if you don't plan on overclocking it much


----------



## ImJJames

For overclocking, wouldn't a 7970 generally perform better than a 280X?


----------



## eternal7trance

Quote:


> Originally Posted by *ImJJames*
> 
> For overclocking, wouldn't a 7970 generally perform better than a 280X?


Not really; most 280X cards are just 7970 copies.


----------



## brazilianloser

Finally got BF4, and so far it seems to be too much for the 280X to handle on max, but I already knew that. Still, a max temp of 65°C on air during hours of Eyefinity gameplay on high is not bad. And I haven't even tried to OC the card yet.


----------



## Dart06

Quote:


> Originally Posted by *brazilianloser*
> 
> Finally got BF4, and so far it seems to be too much for the 280X to handle on max, but I already knew that. Still, a max temp of 65°C on air during hours of Eyefinity gameplay on high is not bad. And I haven't even tried to OC the card yet.


With Eyefinity it's too much, but with one monitor, not so much. You should get a second card for Eyefinity.


----------



## chiknnwatrmln

-double post from website lagging-


----------



## chiknnwatrmln

Just got to thinking that eventually I'm going to CF 280Xs... probably Matrices. I found out that they draw about 380 W each at full load, so my PSU (a Corsair TX850), along with my CPU and peripherals, isn't powerful enough...

But I do have a spare CX600 PSU; what would happen if I hooked up one card to that? I don't see why it wouldn't work, since the PSU has 46 A on the 12 V rail, which is more than enough for just one card. I also have a SWEX power switch to switch on the PSU. It would just look kinda goofy with an extra PSU outside my case. Thoughts?
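For what it's worth, the wattage math in this post can be sketched as a quick budget check. This is only a sketch: the ~380 W per-card figure is the poster's number taken at face value, the CPU and peripheral draws are rough guesses, and `psu_headroom` is a hypothetical helper, not anything from a PSU calculator tool.

```python
# Rough PSU budget check for the CrossFire plan above.
# Assumptions: ~380 W per card at full load (the figure quoted above),
# plus estimated CPU and peripheral draw. Numbers are illustrative.

def psu_headroom(psu_watts, loads, margin=0.8):
    """Watts remaining after subtracting the loads from a PSU derated
    to `margin` of its rated output (a common rule of thumb)."""
    return psu_watts * margin - sum(loads.values())

build = {
    "two_280x": 380 * 2,  # worst case: both cards at full load
    "cpu": 125,           # overclocked CPU, rough estimate
    "misc": 75,           # fans, drives, USB, etc.
}

# TX850 powering everything: negative headroom, so it falls short.
print(psu_headroom(850, build))
# Spare CX600 powering one card only (46 A on 12 V is about 552 W): plenty.
print(psu_headroom(600, {"one_280x": 380}))
```

Under these assumptions the 850 W unit comes up about 280 W short for the whole build, while the CX600 has roughly 100 W of spare capacity for a single card, which matches the poster's reasoning about the dual-PSU setup.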


----------



## brazilianloser

Yeah, I know. Planning on that, unless the two upcoming cards impress me. I have till the 18th to return or keep it, so I'm just being patient and seeing how the 290 will be, and possibly a 780 Ti...


----------



## Dart06

Quote:


> Originally Posted by *brazilianloser*
> 
> Yeah, I know. Planning on that, unless the two upcoming cards impress me. I have till the 18th to return or keep it, so I'm just being patient and seeing how the 290 will be, and possibly a 780 Ti...


I was going to hold out for the 290s, but the performance gain isn't worth $150 per card to me.


----------



## brazilianloser

Well, I will still hold out for the real benchmarks to come out, hopefully by the end of this week. With four monitors I need a lot of horsepower... And if the non-X is a good step forward for the extra $100-something, it might be worth it. For me, that is.


----------



## dade_kash_xD

*TO ANYBODY LOOKING TO BUY THE XFX DD R9-280X DO NOT BUY THESE CRAPPY CARDS!!!!!!!!*

Nothing but a terrible experience with these cards. I have spent over 20 hours trying to fix the crashing/freezing problem I am having with them. I have even put my old 6970s in crossfire in my new rig and they work perfectly. It's not my mobo or PSU or anything else. IT'S THE GARBAGE XFX IS PEDDLING! RMA TOMORROW AM!


----------



## brazilianloser

Either way, I was able to OC my Vapor-X to 1100 on the GPU and 1560 on the memory... about an hour of BF4 without any problems at all, with a max temp of 66°C on air. Before the OC, playing on high at 5760x1080 I was getting 30ish fps; after the OC, high 30s and mid 40s. Not bad. Now the bad part is that it gets loud. If I keep these babies I will definitely put them on water.


----------



## eternal7trance

Quote:


> Originally Posted by *dade_kash_xD*
> 
> *TO ANYBODY LOOKING TO BUY THE XFX DD R9-280X DO NOT BUY THESE CRAPPY CARDS!!!!!!!!*
> 
> Nothing but terrible experience with these cards. I have spent over 20 hours trying to fix the crashing/freezing problem I am having with them. I have even put my old 6970's in crossfire in my new rig and they work perfect. It's not my mobo or psu or anything else. IT'S THE GARBAGE XFX IS PEDDLING! RMA TOMORROW AM!


XFX is well known for taking the reference design and using cheap parts. I wouldn't recommend them to anyone anymore.


----------



## dade_kash_xD

I'm returning my 2 XFX DD R9-280X GARBAGE EDITION cards tomorrow morning, first thing. What cards do you guys recommend I get? I would like 2x 280X crossfire, 2x R9-290 (non-X), or a single 290X. Looks like nothing good is in stock. What a dilemma. If only I'd been able to get my 290X, I'd be good right now.


----------



## Jophess

My XFX 280x is sitting happily at 1150/1800.

/shrug


----------



## eternal7trance

Quote:


> Originally Posted by *Jophess*
> 
> My XFX 280x is sitting happily at 1150/1800.
> 
> /shrug


For now...


----------



## ImJJames

Ive never had problem with XFX brands.


----------



## dade_kash_xD

Quote:


> Originally Posted by *Jophess*
> 
> My XFX 280x is sitting happily at 1150/1800.
> 
> /shrug


As a single card they work great, but crossfire is where the issues come up. Actually, I just found a thread on the Guru3D forums where another dude with the same exact cards as me is having the same issues with just one DD 280X.


----------



## Clockster

http://www.techpowerup.com/gpuz/48b3y/

Club3D Royal King Edition

Love the fact that the voltage is unlocked out of the box; it overclocks really well, although the cooler isn't the greatest.


----------



## eternal7trance

Really glad I went with my card now

http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-4.html


----------



## Dart06

Quote:


> Originally Posted by *eternal7trance*
> 
> Really glad I went with my card now
> 
> http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-4.html


Yep those temps are outrageous. No thank you.


----------



## Neo_Morpheus

The 770s are going to drop massively in price; then we can have a competition. I can help out Yungbenny911 with beating you all down with a stick with my basic stock clocks on my 770s.


----------



## steelkevin

350€ for 290s here...

We only get 10 days to return products, so I probably won't be changing for a 290, but now it's going to be on my mind for at least a couple of days x)


----------



## hoevito

Wow... I was going to buy another 280X to crossfire with my Matrix Platinum. With the glowing reviews of the 290, and considering it's not THAT much more expensive than a 280X, would I be crazy in wanting to crossfire a 280X with a 290? I was going to buy a new card this coming Friday, and it was hard enough already trying to decide which 280X to pair with my Matrix; now I REALLY cannot make up my mind...


----------



## DiceAir

My Club3D R9-280X crossfire setup was running at about 96°C when playing BF4, and that's on the default fan speed. I must say the fans were not spinning that fast at full load, but I have an aggressive profile now and my GPU only reaches 71°C in BF4. I have another issue where my FPS is limited to 100. I'm running at 2560x1440 @ 96Hz, playing on ultra with 2x MSAA, HBAO, and post AA off, so I hope either DICE or AMD can resolve this. Any suggestions?

When I get home from work I will try the following:

1. Create a new profile for bf4.exe and use the predefined BF4 CrossFireX setting
2. Disable frame pacing
3. Unpark the CPU cores


----------



## Joeking78

Quote:


> Originally Posted by *DiceAir*
> 
> My Club3D R9-280X crossfire setup was running at about 96°C when playing BF4, and that's on the default fan speed. I must say the fans were not spinning that fast at full load, but I have an aggressive profile now and my GPU only reaches 71°C in BF4. I have another issue where my FPS is limited to 100. I'm running at 2560x1440 @ 96Hz, playing on ultra with 2x MSAA, HBAO, and post AA off, so I hope either DICE or AMD can resolve this. Any suggestions?
> 
> When I get home from work I will try the following:
> 
> 1. Create a new profile for bf4.exe and use the predefined BF4 CrossFireX setting
> 2. Disable frame pacing
> 3. Unpark the CPU cores


Have you got V-Sync enabled?


----------



## DiceAir

Quote:


> Originally Posted by *Joeking78*
> 
> Have you got V-Sync enabled?


Yes, I have it enabled in Battlefield 4 only.


----------



## Joeking78

Quote:


> Originally Posted by *DiceAir*
> 
> Yes i have it enabled on Battlefield 4 only.


That's why you are limited to 100 fps. V-Sync limits your FPS to your monitor's refresh rate.


----------



## DiceAir

Quote:


> Originally Posted by *Joeking78*
> 
> That's why you are limited to 100fps. V-Sync limits your FPS to your monitor refresh rate.


I know that, but my screen is not 100Hz, it's 96Hz, so it should be limited to 96 fps, not 100 fps.
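The logic in this exchange can be sketched as a tiny sanity check: with V-Sync on, sustained fps should not exceed the refresh rate, so a steady 100 fps on a 96 Hz screen points to some other limiter (an in-game or driver frame cap) engaging rather than V-Sync. A minimal sketch; the helper name and the 100 fps candidate cap are illustrative assumptions, not BF4 or driver internals:

```python
# Toy version of the V-Sync argument above: with V-Sync on, sustained
# fps cannot exceed the monitor's refresh rate.

def expected_fps_cap(refresh_hz, vsync=True, other_caps=()):
    """Smallest active frame limit, or None if nothing caps the rate."""
    caps = list(other_caps)
    if vsync:
        caps.append(refresh_hz)
    return min(caps) if caps else None

# At 96 Hz with V-Sync and no other limiter, the ceiling should be 96 fps,
print(expected_fps_cap(96))
# so a steady 100 fps suggests a separate ~100 fps limiter is what is
# actually engaging, not V-Sync.
print(expected_fps_cap(96, vsync=False, other_caps=(100,)))
```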


----------



## Lisjak

Hey guys, I am thinking of a GPU upgrade and thought I would ask here instead of making a new thread. Do you guys think my PC is good enough to utilise a new 280X? Or would I need to upgrade some components?


----------



## DiceAir

I say install the card first and see if you get enough frames for your liking. Then upgrade the CPU, before you waste your money on something you may not even need to upgrade.


----------



## Vakten

Hey guys,

How's the OCing going on your cards? I've got an XFX 280X DD and I've got it running at a modest 1100MHz/1500MHz, and it maxes out at about 60°C after a few hours of gaming, so I'm just wondering what some of you other guys have gotten, to see if I should try pushing it any further.









Enjoy lads!


----------



## hoevito

Quote:


> Originally Posted by *Vakten*
> 
> Hey guys,
> 
> How's the OCing going on your cards? I've got an XFX 280X DD and I've got it running at a modest 1100MHz/1500MHz, and it maxes out at about 60°C after a few hours of gaming, so I'm just wondering what some of you other guys have gotten, to see if I should try pushing it any further.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Enjoy lads!


I can run Valley at 1250/7200 MHz all day long and top out around 67°C on my Matrix Platinum, but I've settled at 1220 MHz for playing actual games. I've been using The Witcher 2 and Crysis 3 for seeking out a stable overclock on my card, as I can only hit 1220 on the core and 7100 on the memory without any signs of artifacting or graphical hiccups. If I push any higher than that, I start getting artifacts and weird crap happening on the screen. Oddly enough, I can play Battlefield 4 completely maxed out (at 1080p) at 1240 core / 7200 memory perfectly fine.

I guess I should be happy with 1220/7100, but I was expecting more...


----------



## Ashuiegi

Quote:


> Originally Posted by *steelkevin*
> 
> Hi,
> 
> Just wondering if anybody here has watercooled their 280X and if they could tell us about their temps and OC results while I painfully wait for the EK-FC R9-280X DCII to finally be released.
> 
> It looks like most people are getting the Asus DC2T 280X though so I guess not many have it underwater yet. Who is planning on watercooling their 280X btw ?


It's the same block as the 7970 DCII; if you wait for retailers to change the name, you might wait a long time.


----------



## Ashuiegi

And I'm pretty sure you won't be able to CF a 280 and a 290 together.


----------



## dade_kash_xD

Just ordered 2x Sapphire R9-290's!


----------



## brazilianloser

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Just ordered 2x Sapphire R9-290's!


Got the auto-notification email for the Asus one, but by the time I got to Newegg it was out of stock already... Thought it wasn't supposed to be out until the end of the week anyway.


----------



## battleaxe

Quote:


> Originally Posted by *brazilianloser*
> 
> Got the auto-notification email for the Asus one, but by the time I got to Newegg it was out of stock already... Thought it wasn't supposed to be out until the end of the week anyway.


The auto-notification is useless when they are going this fast. That dumb thing can take hours or even days longer than reality. Don't waste time with it when the cards are selling this fast.


----------



## Dart06

Quote:


> Originally Posted by *hoevito*
> 
> Wow... I was going to buy another 280X to crossfire with my Matrix Platinum. With the glowing reviews of the 290, and considering it's not THAT much more expensive than a 280X, would I be crazy in wanting to crossfire a 280X with a 290? I was going to buy a new card this coming Friday, and it was hard enough already trying to decide which 280X to pair with my Matrix; now I REALLY cannot make up my mind...


I cannot recommend a 290 to anyone currently. Those temperatures are a big problem. Maybe when non-reference boards are out, but then you'll be paying $50 more a card.


----------



## mooter

Just a word of warning to any owners of the Asus Crosshair V Formula, or Asus Crosshair V Formula-Z motherboard who intend on buying a pair of R9 280x's to use in crossfire.... don't.

I bought two XFX Double Dissipation R9 280x's to replace my old 6950's but the longest crossfire bridge (10cm) isn't long enough to stretch across the x16 PCIe lanes.

XFX don't seem to care and have blamed the motherboard for being "incompatible" with their cards. They don't seem to understand that it's their oversized non-reference cooler causing the problem!

I'm now having to choose between returning the cards for something else, or replacing my motherboard completely. Both at my expense.

Really not very happy but what else can I do?!


----------



## Nastrodamous

I am having a problem with screen flickering on the MSI Gaming 280X. Every once in a while a line will flicker on my screen like it's getting some kind of interference. I have one monitor hooked up via HDMI and the other hooked up via DVI; it only happens on the one with HDMI. I have used an additional graphics card to test, and it only happens on this card.

What do you guys think is going on?


----------



## eternal7trance

Quote:


> Originally Posted by *mooter*
> 
> Just a word of warning to any owners of the Asus Crosshair V Formula, or Asus Crosshair V Formula-Z motherboard who intend on buying a pair of R9 280x's to use in crossfire.... don't.
> 
> I bought two XFX Double Dissipation R9 280x's to replace my old 6950's but the longest crossfire bridge (10cm) isn't long enough to stretch across the x16 PCIe lanes.
> 
> XFX don't seem to care and have blamed the motherboard for being "incompatible" with their cards. They don't seem to understand that it's their oversized non-reference cooler causing the problem!
> 
> I'm now having to choose between returning the cards for something else, or replacing my motherboard completely. Both at my expense.
> 
> Really not very happy but what else can I do?!


"Welcome to XFX support, how can we make excuses not to help you?"


----------



## Koluson

Guys, will an FX-6350 @ 4.5 GHz be a bottleneck for a 280X? Or is it better to buy a 270X / GTX 760?


----------



## brazilianloser

http://www.kitguru.net/components/graphic-cards/zardon/amd-r9-290-review-1600p-4k-and-cf/

There is one review for the 290


----------



## mooter

Quote:


> Originally Posted by *eternal7trance*
> 
> "Welcome to XFX support, how can we make excuses not to help you?"


Oh, you've experienced it too, eh?
Quote:


> Thank you for your feedback, we shall observe the discussions you generate with interest and if this does indeed become a problem reported by more than one person, (at present it is you alone) we shall pass it on to TM to come up with a solution. As we have done everything we can under these circumstances to assist you, and you have rejected every option we have given you, please refrain from using language such as "XFX do not care". Our conversations here clearly prove otherwise, we have saved and documented them along with all actions we have taken to assist you. If you continue to post untruthful statements and neglect to relay what has actually been discussed, you are opening yourself up for libel action should you continue to make untrue accusations in print. Once again, we apologise for the inconvenience you have experienced and the fact that you are obviously very angry, however there is simply nothing further we are able to do. Thank you kindly.


Sapphire sent a 120mm crossfire bridge to one of their 280X customers experiencing the same problem. I guess they care.


----------



## eternal7trance

I won't get the 290 or the 290X until they put some better coolers on them. Pieces of junk atm.


----------



## Dart06

Quote:


> Originally Posted by *eternal7trance*
> 
> I won't get the 290 or the 290x until they put some better coolers on it. Pieces of junk atm


It's kind of like, "Hey, we released these cards at these prices to hurt our competition, but you have to pay $50 more for non-reference for them to be worthwhile." Smart business move, really.


----------



## Jindaman

I was thinking of crossfiring two 280Xs; will a Corsair HX850 be enough for that?


----------



## ImJJames

It's worthwhile as it is; please read the reviews and benchmarks, there are tons out there already. It just released a few hours ago and it's outperforming the 780. For $400 you can't complain; AMD is forcing NVIDIA to cut prices, and if you're crying about that, I feel sorry for you.


----------



## Dart06

Quote:


> Originally Posted by *ImJJames*
> 
> It's worthwhile as it is; please read the reviews and benchmarks, there are tons out there already. It just released a few hours ago and it's outperforming the 780. For $400 you can't complain; AMD is forcing NVIDIA to cut prices, and if you're crying about that, I feel sorry for you.


Realists understand that TDP, temperature, and noise are important in a video card. There are plenty of AMD cards that are a much better value at the moment than a reference volcano.


----------



## black7hought

I dropped my XFX 280X off at UPS for an RMA refund. I've jumped to the 290 club. It is probably overkill for 1080p but I don't plan on buying a new GPU for a couple years.


----------



## ImJJames

Quote:


> Originally Posted by *Dart06*
> 
> Realists understand that tdp, temperature and noise are important in a video card. There are plenty of AMD cards that are a much better value at the moment than a reference volcano.


Realist? You mean a computer grease monkey who has nothing better to do but overclock a card within every inch of its life? In the real world, people buy a graphics card, slap it in, and forget about it. And from a real-world point of view, the R9 290 makes the 780 and the 290X non-existent because of its price/performance ratio.


----------



## eternal7trance

Quote:


> Originally Posted by *ImJJames*
> 
> Realist? You mean a computer grease monkey who has nothing better to do but overclock a card within every inch of its life? In the real world, people buy a graphics card, slap it in, and forget about it. And from a real-world point of view, the R9 290 makes the 780 and the 290X non-existent because of its price/performance ratio.


But taking your same situation, people in the real world want a card they don't have to monitor. You would have to watch this thing like a hawk, on top of probably having to change the fan profile with a 3rd party program to keep it from overheating in your real world computer with 2 fans and horrible cooling.


----------



## Dart06

Quote:


> Originally Posted by *ImJJames*
> 
> Realist? You mean a computer grease monkey who has nothing better to do but overclock a card within every inch of its life? In the real world, people buy a graphics card, slap it in, and forget about it. And from a real-world point of view, the R9 290 makes the 780 and the 290X non-existent because of its price/performance ratio.


Those people that buy a video card and slap it in are going to be angry about how loud it is and that they might need a bigger PSU to support it. They could also be angry at how much hotter the GPU gets.


----------



## ImJJames

Quote:


> Originally Posted by *eternal7trance*
> 
> But taking your same situation, people in the real world want a card they don't have to monitor. You would have to watch this thing like a hawk, on top of probably having to change the fan profile with a 3rd party program to keep it from overheating in your real world computer with 2 fans and horrible cooling.


If you're not going crazy on OC'ing, you don't need to monitor anything; the thing is made to handle 90°C+ for normal usage...


----------



## eternal7trance

Quote:


> Originally Posted by *ImJJames*
> 
> If you're not going crazy on OC'ing you don't need to monitor anything, the thing is made to handle 90C+ for normal usage...


Yes, but cooling in your average computer is not as good as cooling on a test bench, which is what you usually see in reviews.


----------



## ImJJames

Quote:


> Originally Posted by *eternal7trance*
> 
> Yes but cooling in your average computer is not as good as cooling on a test bench which is what you usually see in reviews


Why, because a test bench is an open case? Maybe that will dissipate a few degrees? And because of that, the 290 is terrible in normal cases because it will overheat? Think about what you're saying.


----------



## eternal7trance

Quote:


> Originally Posted by *ImJJames*
> 
> Why because test bench is open case? Maybe that will dissipate a few C's? Because of that the 290 is terrible for normal cases because it will overheat? Think about what you're saying.


Are you forgetting that it's going to suck in hot air inside the case vs cold air on a test bench?

Especially once they become dust ridden


----------



## steelkevin

Quote:


> Originally Posted by *Ashuiegi*
> 
> it 's the same block as the 7970 DCii if you wait for retailer to change the name you might wait a long time


Yeah, right, because I'm stupid and don't ever do any research before saying stuff.
If that were the case, EK wouldn't bother making a new block for it.

Do your research before saying silly stuff.

I'm returning my awesome DC2T 280X. It turned out today was the last day I could claim a free return, so I did.
I'll be getting a 290.
Watercooling it right away, so temps, noise and whatever really won't be an issue for me.









----------



## brazilianloser

Quote:


> Originally Posted by *steelkevin*
> 
> Yeah right because I'm stupid and don't ever do any research before saying stuff.
> If that was the case EK wouldn't bother making a new block for it.
> 
> Do your research before saying silly stuff.
> 
> I'm returning my awesome DC2T 280X. Turned out today was the last day I could claim a free return so I did.
> I'll be getting a 290.
> Watercooling it right away so temps noise and whatever really won't be an issue for me
> 
> 
> 
> 
> 
> 
> 
> .


Same here; I still have until the 18th to return mine. I will initially run it on air with the portable tornado, but I'm also putting it under water, so noise and temps won't be an issue.


----------



## ImJJames

Quote:


> Originally Posted by *eternal7trance*
> 
> Are you forgetting that it's going to suck in hot air inside the case vs cold air on a test bench?
> 
> Especially once they become dust ridden


I know tons of people running a 290X in a closed case with zero problems, man. That is all I am saying.


----------



## Ashuiegi

Quote:


> Originally Posted by *steelkevin*
> 
> Yeah right because I'm stupid and don't ever do any research before saying stuff.
> If that was the case EK wouldn't bother making a new block for it.
> 
> Do your research before saying silly stuff.
> 
> I'm returning my awesome DC2T 280X. Turned out today was the last day I could claim a free return so I did.
> I'll be getting a 290.
> Watercooling it right away so temps noise and whatever really won't be an issue for me
> 
> 
> 
> 
> 
> 
> 
> .


I did, and EK said they would not make a new one because the old one fits; they never said they would make a new one... they will just change the name. Good retailers have already changed the name of the block on their sites, but not all of them will. The same thing goes for the Matrix HD 7970 block, which is now called the Matrix 280X. Do the research on the EK Cooling Configurator and it tells you the 7970 one works...

""EKWB explains that due to the similarities between the older Radeon HD 7970 and the new Radeon R9-280X, and since there was no official reference design for the PCB, most of the AIB partners have decided to either re-use the older reference design (AMD 109-C38637) or proprietary designs, including Asus' DirectCUII boards. EKWB has also renamed the EK-FC7970 Matrix to the EK-FX R9-280X Matrix, just to prevent confusion.""

http://www.tomshardware.com/news/ekwb-r9-280x-ek-fc-water,24633.html


----------



## brazilianloser

That Sapphire 290 in stock is calling my name. But I will be patient and see if the Asus or other more reliable options pop up before the 18th.


----------



## ImJJames

Quote:


> Originally Posted by *brazilianloser*
> 
> That Sapphire 290 in Stock is calling my name. But will be patient and see if the Asus or other more reliable options will pop up before the 18th


They also have the MSI in stock. It's crazy how fast the Asus went out of stock; I checked just a few hours ago and it was still there.


----------



## brazilianloser

Quote:


> Originally Posted by *ImJJames*
> 
> They also have the MSI in stock. It's crazy how fast the Asus went out of stock; I checked just a few hours ago and it was still there.


I woke up at six am and it was out of stock already.


----------



## brazilianloser

And in a review I posted earlier it was said to become widely available only on the 14th... unless it was a typo by the reviewer.


----------



## ducknukem86

I've got to tell you, I'm a bit disappointed with the performance of my 280X. I don't know if it's because I'm playing unoptimized games. Assassin's Creed 3, for example, has framerate dips to 25 fps, very choppy; I had to set AA to normal. Far Cry 3 at 1080p ultra settings with no AA gives me 60 fps most of the time, but when I drive during the night, because of the lights, the framerate dips to 30 fps. Are all games like this? I mean, I thought I was buying a powerful video card; how can the framerate be so inconsistent?


----------



## brazilianloser

Quote:


> Originally Posted by *ducknukem86*
> 
> I've got to tell you, I'm a bit disappointed with the performance of my 280X. I don't know if it's because I'm playing unoptimized games. Assassin's Creed 3, for example, has framerate dips to 25 fps, very choppy; I had to set AA to normal. Far Cry 3 at 1080p ultra settings with no AA gives me 60 fps most of the time, but when I drive during the night, because of the lights, the framerate dips to 30 fps. Are all games like this? I mean, I thought I was buying a powerful video card; how can the framerate be so inconsistent?


Lack of a proper driver and such... But yeah, mine does okay, just not well enough for my Eyefinity setup.


----------



## steelkevin

Quote:


> Originally Posted by *Ashuiegi*
> 
> I did, and EK said they would not make a new one because the old one fits; they never said they would make a new one... they will just change the name. Good retailers have already changed the name of the block on their sites, but not all of them will. The same thing goes for the Matrix HD 7970 block, which is now called the Matrix 280X. Do the research on the EK Cooling Configurator and it tells you the 7970 one works...
> 
> ""EKWB explains that due to the similarities between the older Radeon HD 7970 and the new Radeon R9-280X, and since there was no official reference design for the PCB, most of the AIB partners have decided to either re-use the older reference design (AMD 109-C38637) or proprietary designs, including Asus' DirectCUII boards. EKWB has also renamed the EK-FC7970 Matrix to the EK-FX R9-280X Matrix, just to prevent confusion.""
> 
> http://www.tomshardware.com/news/ekwb-r9-280x-ek-fc-water,24633.html


Digging deeper?
Enjoy









http://www.ekwb.com/news/404/19/ASUS-latest-revision-of-R9280X-DC2-T-to-get-a-Full-Cover-water-block/


----------



## ImJJames

Quote:


> Originally Posted by *brazilianloser*
> 
> And on a review I posted earlier it was said to become widely available on the 14th only... Unless it was a mistype by the reviewer.


AMD's intended launch was the 14th, but they pushed it up to today because of Nvidia's price cut.


----------



## brazilianloser

Love XFX charging 20 bucks more for no reason on the reference card.


----------



## black7hought

They are doing the same on their 290, which pushed me to Sapphire.


----------



## brazilianloser

That's what I meant. The 290.


----------



## brazilianloser

Quote:


> Originally Posted by *black7hought*
> 
> They are doing the same on their 290, which pushed me to Sapphire.


Just having second thoughts about Sapphire after seeing that their RMA service seems to be terrible, according to others here on the forums.


----------



## Ashuiegi

Are the outputs the same? This could be where the slight price difference between the reference cards is.


----------



## black7hought

Quote:


> Originally Posted by *brazilianloser*
> 
> Just having second thoughts about Sapphire after seeing that their RMA service seems to be terrible, according to others here on the forums.


I've seen horror stories about every manufacturer's RMA process on this forum. The only thing I like about XFX besides their 280X design is the lifetime warranty.


----------



## PropheticCreed

Quote:


> Originally Posted by *brazilianloser*
> 
> Just having second thoughts about Sapphire after seeing that their RMA service seems to be terrible, according to others here on the forums.


I find that everyone has an RMA horror story with manufacturer "X." The negative stories are always more verbose than the positive. I look at it this way: if Sapphire/Gigabyte/Asus etc. had consistently poor customer service, with horror stories of RMAs and quality problems, then AMD and nVidia would most likely no longer allow them to produce their products.

I found myself feeling leery about Gigabyte because I've never used a Gigabyte GPU and have heard some horror stories. Then I reminded myself that just because other people may have had a bad experience doesn't mean that I will. I find that generally all of the manufacturers make solid products, and that I need to stop worrying so much.


----------



## black7hought

Yep, the same goes for any product. It is always good to acknowledge the advice but you have to try it for yourself. I'm sure for every 10 complaints, there are 200 users who are happy with their product from company "X".


----------



## brazilianloser

Quote:


> Originally Posted by *PropheticCreed*
> 
> I find that everyone has an RMA horror story with manufacturer "X." The negative stories are always more verbose than the positive. I look at it this way: if Sapphire/Gigabyte/Asus etc. had consistently poor customer service, with horror stories of RMAs and quality problems, then AMD and nVidia would most likely no longer allow them to produce their products.
> 
> I found myself feeling leery about Gigabyte because I've never used a Gigabyte GPU and have heard some horror stories. Then I reminded myself that just because other people may have had a bad experience doesn't mean that I will. I find that generally all of the manufacturers make solid products, and that I need to stop worrying so much.


Yep. To be honest, this 280 I have here is the first non-Nvidia, non-EVGA card I have owned in six... seven years. Always treated greatly there. But so far this one has done well, and I'll keep going with it to see where it takes me. Just waiting on Newegg to stop being lazy and give me my gift card so I can grab one of those 290s.


----------



## brazilianloser

But I'm going to have to put them under water fast... those temps and noise levels are somewhat ridiculous.


----------



## steelkevin

Quote:


> Originally Posted by *Ashuiegi*
> 
> Are the outputs the same? , this could be where the slight price difference between ref card is.


I think you forgot to beg my forgiveness for being so stubborn and ignorant.

It's ok though I can wait.


----------



## Ashuiegi

Quote:


> Originally Posted by *steelkevin*
> 
> I think you forgot to beg my forgiveness for being so stubborn and ignorant.
> 
> It's ok though I can wait.


*** are you talking about?

I was not even talking to you. If you have nothing to say, then just don't.


----------



## taem

Quote:


> Originally Posted by *Pfortunato*
> 
> Stick with the asus, best pcb, cooler, temps and the most important, no noise at all and does oc well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sent from my GT-I9505 using Tapatalk


According to this roundup at least the Asus wins only on noise at load. It's one of the worst on temp and louder than other cards at idle. http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655.html

I got the Asus R9280X DC2T 3GD5 because noise at load is a key issue for me -- 40dB at full load in this review, even lower in other reviews. The 75c temp at load reported by this review doesn't sound very good but other reviews all reported lower temps. Meanwhile all reviews note this card's quietness at load. So it's the right pick for me.

I'm not sure about this review, though. It reports 56.8 dB for the Sapphire Vapor-X at full load. FIFTY-SIX POINT EIGHT DECIBELS!! Who would buy this if that were true? Maybe if you game in the middle of a construction yard. I would not buy a card with a +16 dB noise disadvantage for any amount of performance gain.
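For what it's worth, a +16 dB gap is bigger than it sounds, because decibels are logarithmic. A quick sketch of what the two figures reported in that roundup (40 dB vs 56.8 dB, assuming its numbers are accurate) would imply:

```python
# Rough comparison of the two reported load noise figures
# (40 dB Asus vs 56.8 dB Sapphire, per the Tom's Hardware roundup).
asus_db = 40.0
sapphire_db = 56.8
delta = sapphire_db - asus_db  # ~16.8 dB

# dB is a log scale: +10 dB = 10x the sound power,
# and as a common rule of thumb, roughly twice the perceived loudness.
power_ratio = 10 ** (delta / 10)
loudness_ratio = 2 ** (delta / 10)

print(f"+{delta:.1f} dB = {power_ratio:.0f}x the sound power")
print(f"perceived as roughly {loudness_ratio:.1f}x as loud")
```

So if the measurement were right, the Vapor-X would sound roughly three times as loud as the Asus at load, which is exactly why the number looks suspect next to other reviews.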


----------



## Amhro

Quote:


> Originally Posted by *Koluson*
> 
> guys fx 6350 @ 4.5 will be bottleneck for 280x ? or better buy 270x/ gtx 760.


You will be fine.


----------



## steelkevin

Quote:


> Originally Posted by *Ashuiegi*
> 
> *** are you talking about ?
> 
> i was not even talking to you. if you have nothing to say , then just don't


oh god... I feel sorry for people who rely on your ability to remember stuff.

http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/1240#post_21129326

FYI, replacing profanity with those little stars can and will get you a warning/infraction (go figure...). I don't quite understand the consequences of them yet, but still.


----------



## Algy

Hi all, I'll buy a 280X but I don't know which one. The models available right now are the MSI, the Gigabyte, the Sapphire Vapor-X, and the Asus model with the three-slot cooler (not the Matrix, it's the "V2" model).

I won't be OCing it for now, but in a year or so I will, so I would be glad if anyone could recommend a nice model.

Sorry about my English, lol.

Thx!


----------



## taem

Asus R9280X DC2T 3GD5 arrived. Box cover is ugly but the box inside the box is nice enough to keep and put stuff inside.


----------



## ihatelolcats

Any idea when the 280 is coming out?


----------



## chiknnwatrmln

Anyone have CF 280xs? Just bought two more monitors and I need the horsepower to push all those pixels, but I still need the cards to be efficient enough to be powered by a TX850 with a 3770k and a bunch of fans.

I was considering the Matrix, but that card is absolutely massive and I feel like it would be hard to cool two of those.


----------



## taem

This thing is enormous. Here it is next to an atx mobo



Weighs a ton too.


----------



## Cribbs

Quote:


> Originally Posted by *PropheticCreed*
> 
> I find that everyone has an RMA horror story with manufacturer "X." The negative stories are always more verbose than the positive ones. I look at it this way: if Sapphire/Gigabyte/Asus etc. had consistently poor customer service, with horror stories of RMAs and quality issues, then AMD and nVidia would most likely no longer allow them to produce their products.
> 
> I found myself feeling leery about Gigabyte because I've never used a Gigabyte GPU and have heard some horror stories. Then I reminded myself that just because other people may have had a bad experience doesn't mean that I will. I find that generally all of the manufacturers make solid products and that I need to stop worrying so much.


I'm honestly worried about buying any product that isn't EVGA. I had a friend who bought a second-hand imported 680 with no receipt; when it died he called EVGA and they had him send it to a local store, where he got a refund.
That kind of service is just second to none and is one of the reasons I always lean towards the green team.


----------



## Dart06

Quote:


> Originally Posted by *Cribbs*
> 
> I'm honestly worried about buying any product that isn't EVGA. I had a friend who bought a second-hand imported 680 with no receipt; when it died he called EVGA and they had him send it to a local store, where he got a refund.
> That kind of service is just second to none and is one of the reasons I always lean towards the green team.


Yep I've only ever had EVGA cards. Always worked great. Hopefully Gigabyte continues that tradition for AMD cards for me.

Any time I buy Nvidia, I buy EVGA.


----------



## wreck3d

XFX 280X or ASUS 280X DCUII or Sapphire 280X Vapor-X









I've been with my XFX HD 6870 for 4 years with no problems. The ASUS 280X is quiet, which draws me to it; the Sapphire Vapor-X looks pretty, as does the XFX. Is the ASUS the quietest of the lot? Which one runs the coolest?

I don't plan to overclock, I just want to upgrade my graphics.

Any suggestions?


----------



## AgentHydra

Having some issues with my Asus 280X DC2. Any time I do anything graphics intensive for 30-45 minutes it will freeze and crash to desktop, then artifact like crazy until rebooted (black and white geometric shapes). Happens in BF3, BF4beta, Blacklight Retribution, 3DMark, everything. Running 13.11Beta (original version). Fan is on auto, temps are fine, not overclocked at all.

To me sounds like bad VRAM maybe, any ideas?


----------



## taem

Quote:


> Originally Posted by *AgentHydra*
> 
> Having some issues with my Asus 280X DC2. Any time I do anything graphics intensive for 30-45 minutes it will freeze and crash to desktop, then artifact like crazy until rebooted (black and white geometric shapes). Happens in BF3, BF4beta, Blacklight Retribution, 3DMark, everything. Running 13.11Beta (original version). Fan is on auto, temps are fine, not overclocked at all.
> 
> To me sounds like bad VRAM maybe, any ideas?


Always great hearing about issues with the card UPS just delivered. :|


----------



## PropheticCreed

Quote:


> Originally Posted by *Cribbs*
> 
> I'm honestly worried about buying any product that isn't EVGA. I had a friend who bought a second-hand imported 680 with no receipt; when it died he called EVGA and they had him send it to a local store, where he got a refund.
> That kind of service is just second to none and is one of the reasons I always lean towards the green team.


Nothing wrong with going with what you're used to. A friend of mine only buys EVGA nVidia products, and I've always been very impressed with their quality, both in product and customer service. I am currently using a second-hand XFX HD 4870 and phoned their support after the fan went; the service rep was great and, for free, sent me a new fan assembly as soon as they received a returned/dead card of the same model... (granted, I had gotten a sweet deal on an aftermarket cooler after not being able to wait a week...) so I've been very happy with XFX.


----------



## DiceAir

So I sorted my issue last night by reformatting my PC. So glad it's working; I can play BF4 like a boss. The only issue is the inconsistency of GPU usage: both of my GPUs are not being used the same amount, so this is a bit of a problem, but it's only in BF4, so I'm not too worried, as there are some issues with the drivers and the game.


----------



## eTheBlack

Quote:


> Originally Posted by *DiceAir*
> 
> So I sorted my issue last night by reformatting my PC. So glad it's working; I can play BF4 like a boss. The only issue is the inconsistency of GPU usage: both of my GPUs are not being used the same amount, so this is a bit of a problem, but it's only in BF4, so I'm not too worried, as there are some issues with the drivers and the game.


I think that is how BF4 works; I have the same "issue", but I have sync on (60 fps).


----------



## DiceAir

Quote:


> Originally Posted by *eTheBlack*
> 
> I think that is how BF4 works, I have same "issue", but I have sync on (60 fps)


Game feels smoother with frame pacing off


----------



## eTheBlack

Quote:


> Originally Posted by *DiceAir*
> 
> Game feels smoother with frame pacing off


For me it's smooth either way.


----------



## gkolarov

BF4 works great with two 7950 in cross. Both cards have equal load.


----------



## DiceAir

Quote:


> Originally Posted by *gkolarov*
> 
> BF4 works great with two 7950 in cross. Both cards have equal load.


Maybe it's still a problem with the R9 series of cards.


----------



## Clockster

Quote:


> Originally Posted by *DiceAir*
> 
> Maybe it's still a problem with the R9 series of cards.


Just a driver thing and glad you got it sorted







Joker btw


----------



## syn17

The Sapphire 270X Dual-X card comes with a DVI-to-VGA adapter; is this DVI male to VGA female?


----------



## rdr09

Quote:


> Originally Posted by *DiceAir*
> 
> So isorted my issue last night by reformating my pc. So glad it's working can play BF4 like a boss. Only issue is the inconstancy of gpu usage. both of my GPU's is not being used the same amount so this is a bit of a problem but that's only in BF4 so i'm not to worried as there is some issues with drivers and Game


It could be BF4, but if it is MP, then it could be the CPU. Here's my usage in BF3 with 7950/7970 CrossFire . . .

i7 @ 4.5, HT off (basically an i5)



usage: 7950/97%; 7970/87%

Here is BF4 Beta MP 64 with just a 7970 . . .


----------



## SupahSpankeh

Quote:


> Originally Posted by *taem*
> 
> This thing is enormous. Here it is next to an atx mobo
> 
> 
> 
> Weighs a ton too.


Yeah, but it's a beauty. OC's like a bstard and runs like a dream at amazing temps with very little fan noise relative to duty.

+1 for buying the right card. Love mine.


----------



## AgentHydra

Quote:


> Originally Posted by *AgentHydra*
> 
> Having some issues with my Asus 280X DC2. Any time I do anything graphics intensive for 30-45 minutes it will freeze and crash to desktop, then artifact like crazy until rebooted (black and white geometric shapes). Happens in BF3, BF4beta, Blacklight Retribution, 3DMark, everything. Running 13.11Beta (original version). Fan is on auto, temps are fine, not overclocked at all.
> 
> To me sounds like bad VRAM maybe, any ideas?


Sure enough I tried underclocking the VRAM to 1500Mhz and the problem seems to have disappeared, I think this card will be going back to Asus.

I'm sure this is just an isolated incident though.


----------



## taem

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Yeah, but it's a beauty. OC's like a bstard and runs like a dream at amazing temps with very little fan noise relative to duty.
> 
> +1 for buying the right card. Love mine.


Without having used it yet, this is definitely the right card for me, based on reviews. Even if I decide to go 290, I'll put this in a Node 304 and pair it with a 1080p HDTV; it ought to be perfect in that use. I happen to favor Asus. We all have our specific experiences; for me, it's Asus whenever possible.

I'm looking forward to setting up my new system with this card and my new Korean 1440p. (We speak of the Asus R9280X DC2T 3GD5.) Just have to decide what game to play first - Tomb Raider maybe. Or maybe I'll get Metro Last Light. I figure the first run should be something pretty.

Btw, what utility are you using to OC? I hope to use Afterburner so I can run an OSD with GPU/CPU temps and FPS (HWiNFO tweaked). Hope it works for voltage and remembered settings. I wish all brand utilities had an OSD like AB's.


----------



## ducknukem86

Quick question:

I currently have the Sapphire Vapor-X 280X running with a 750 W 80 Plus Bronze certified Rosewill PSU; do you think I could CrossFire in the future? I also have an i5-4670K at 4.2 GHz at the moment.


----------



## syn17

Anyone had any experience with the MSI 270x? Is it worth buying?


----------



## diggiddi

How well does the Matrix Platinum overclock compared to the DC2 TOP? Can users post max stable OC results for comparison? Also, are the Platinums cherry-picked, or is it still luck of the lottery between the two chips?


----------



## ImJJames

Quote:


> Originally Posted by *ducknukem86*
> 
> Quick question:
> 
> I currently have the Sapphire Vapor-X 280X running with a 750 W 80 Plus Bronze certified Rosewill PSU; do you think I could CrossFire in the future? I also have an i5-4670K at 4.2 GHz at the moment.


750 watts is enough.
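For anyone weighing the same question, here's a back-of-the-envelope TDP sum. The wattages below are rough assumptions (typical board-power figures, not measured draws), and real gaming load sits well under a worst-case TDP total:

```python
# Rough power-budget sanity check for 280X CrossFire on a 750 W PSU.
# All wattages are approximate assumptions, not measurements.
components_w = {
    "R9 280X #1": 250,              # typical board power for a 280X
    "R9 280X #2": 250,
    "i5-4670K @ 4.2 GHz": 110,      # rough estimate for a modest OC
    "motherboard/RAM/SSD/fans": 75,
}

total = sum(components_w.values())
headroom = 750 - total
print(f"estimated worst-case load: {total} W, headroom: {headroom} W")
```

Even this pessimistic sum leaves headroom on a quality 750 W unit, which is why CrossFire 280X on that PSU is generally considered workable.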


----------



## DiceAir

Quote:


> Originally Posted by *rdr09*
> 
> It could be BF4, but if it is MP, then it could be the CPU. Here's my usage in BF3 with 7950/7970 CrossFire . . .
> 
> i7 @ 4.5, HT off (basically an i5)
> 
> 
> 
> usage: 7950/97%; 7970/87%
> 
> Here is BF4 Beta MP 64 with just a 7970 . . .


Mine isn't even at 100% usage; it's at about 80-90%. But even if there is a CPU bottleneck, that doesn't explain why one GPU would be used 10% more than the other.


----------



## TempAccount007

Should I buy the HIS IceQ X² 280x or the ASUS R9280X-DC2T-3GD5?

HIS: http://www.newegg.com/Product/Product.aspx?Item=N82E16814161441
ASUS: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803

The HIS has an 8+8 pin configuration and 9 phases.

The ASUS has a 6+8 pin configuration and 12 phases.

These are the two differences I'm concerned about. Which one will provide the most/best power for overclocking?

Thank you for your help









Edit: Also, the ASUS has unlocked voltages, right? ASUS has a history of locking voltages. What is the max voltage in Trixx?


----------



## Ashuiegi

God, I'm stuck at work and my Toxic has arrived at the parcel relay near me. I cannot think about work anymore; hopefully I'm mostly done for today.









My Matrix HD 7970 can do 1280 MHz core / 1850 MHz VRAM max OC, but it does that on the same voltage as 1100 MHz, and no amount of voltage will make it go further. Even load-line calibration or VRM frequency changes don't help.


----------



## chevZ

Really desperately looking for a benchmark/evidence from anyone with CrossFire R9 280Xs who can confidently confirm all micro-stutter issues are gone and Eyefinity is finally stable. Not to mention it seems like there's not a single review/benchmark that has tested it at Eyefinity resolutions.

I plan on possibly grabbing two of these tomorrow, but I gotta say... the things around the net from the last 12 months about Eyefinity/Xfire (I'll be doing both) are not filling me with confidence..... I want to believe otherwise.


----------



## Ashuiegi

Do you have evidence that SLI and multi-screen gaming have zero issues on Nvidia? If you don't want any problems, buy one card and use one screen; otherwise, whatever the card or brand, you are likely to have problems at some point.
Personally, my GTX 670 has never been able to handle a game plus streaming on another screen without crashing after a while; it has never managed to run BF3 without a crash. I already had stutter issues with a single card due to the boost clock mechanism on Nvidia, which is too complicated and partly broken. That was with one card; I can't tell you about SLI, because after all that I went straight to AMD and I've never had that kind of problem since.
I had similar issues with a 570 previously, and I never plan to buy an Nvidia card again unless it's considerably cheaper than the equivalent AMD.


----------



## chevZ

Quote:


> Originally Posted by *Ashuiegi*
> 
> Do you have evidence that SLI and multi-screen gaming have zero issues on Nvidia? If you don't want any problems, buy one card and use one screen; otherwise, whatever the card or brand, you are likely to have problems at some point.
> Personally, my GTX 670 has never been able to handle a game plus streaming on another screen without crashing after a while; it has never managed to run BF3 without a crash. I already had stutter issues due to the boost clock mechanism on Nvidia, which is too complicated and partly broken. That was with one card; I can't tell you about SLI, because after all that I went straight to AMD and I've never had that kind of problem since.
> I had similar issues with a 570 previously, and I never plan to buy an Nvidia card again unless it's cheaper than the equivalent AMD.


No, I don't, but I know Surround has a much better reputation, and I haven't had *many* issues with it from my point of view; I know nothing about Eyefinity, on the other hand.

Thus, I'm asking the AMD gurus if it's time I give AMD a shot after 5 years







... I just find nothing about Eyefinity + CrossFire + R9 280X setups on the net, and that's scary.

One GPU will not suffice for Eyefinity/Surround - in my budget, anyway. An R9 290X isn't even good enough to run max BF4 settings at 5760x1080 from everything I've read... and two R9 290Xs is a very steep budget jump.
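For scale, the resolutions being debated in this thread work out to quite different pixel loads; a quick sketch relative to a single 1080p panel (just arithmetic on the resolutions mentioned above):

```python
# Pixel counts for the resolutions discussed, relative to single 1080p.
base = 1920 * 1080  # one 1080p panel

resolutions = {
    "1080p": (1920, 1080),
    "2560x1600": (2560, 1600),
    "4800x900 (3x 1600x900)": (4800, 900),
    "5760x1080 (3x 1080p)": (5760, 1080),
}

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP ({px / base:.2f}x 1080p)")
```

A 5760x1080 Eyefinity spread is a flat 3x the pixels of 1080p, which is why single-card setups that are comfortable at 1080p fall short in triple-screen gaming.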


----------



## Ashuiegi

I think problems on an AMD card often escalate into false rumors and generalizations, while problems with Nvidia tend to be downplayed.
It's like with a guy or a girl: a ladies' man is a stallion, while a girl who likes to have fun is a sl&/. I think it's the same thing happening here...


----------



## chevZ

Quote:


> Originally Posted by *Ashuiegi*
> 
> I think problems on an AMD card often escalate into false rumors and generalizations, while problems with Nvidia tend to be downplayed.
> It's like with a guy or a girl: a ladies' man is a stallion, while a girl who likes to have fun is a sl&/. I think it's the same thing happening here...


I agree.

Honestly if I could find a SINGLE person with Xfire/R280X/Eyefinity saying "Yep everything runs amazing and no micro stutter here" that would probably be good enough....


----------



## chiknnwatrmln

Quote:


> Originally Posted by *chevZ*
> 
> No, I don't, but I know Surround has a much better reputation, and I haven't had *many* issues with it from my point of view; I know nothing about Eyefinity, on the other hand.
> 
> Thus, I'm asking the AMD gurus if it's time I give AMD a shot after 5 years
> 
> 
> 
> 
> 
> 
> 
> ... I just find nothing about Eyefinity + CrossFire + R9 280X setups on the net, and that's scary.
> 
> One GPU will not suffice for Eyefinity/Surround - in my budget, anyway. An R9 290X isn't even good enough to run max BF4 settings at 5760x1080 from everything I've read... and two R9 290Xs is a very steep budget jump.


I also want to figure this out, as I might be getting 280Xs for 4800x900.


----------



## brazilianloser

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I also want to figure this out as i might be getting 280x's for 4800x900.


Why not dish out the extra 80-100 bucks and get the 290 (non-X)? For Eyefinity the 4 GB will be a lot more useful than the 3 GB on the 280X... and before too long they will have non-reference cooler versions out for those who don't like the noise and heat.


----------



## Ashuiegi

I'm getting my Toxic 280X tonight and will CF it with my Matrix 7970-P; I will tell you what problems I encounter, if I can get them to work (not even sure how I'm going to make it fit into the case yet, but that will be done).


----------



## chiknnwatrmln

Quote:


> Originally Posted by *brazilianloser*
> 
> Why not dish out the extra 80-100 bucks and get the 290 (non-X)? For Eyefinity the 4 GB will be a lot more useful than the 3 GB on the 280X... and before too long they will have non-reference cooler versions out for those who don't like the noise and heat.


Actually, I had some people suggest that to me. I'm not getting reference cards, and I'm buying on Cyber Monday.

If non-reference 290s are out by then I'll get them; if not, 280Xs. I'm trying to get a good deal on my new cards; they're certainly not cheap, haha.


----------



## Dart06

So far these 280x cards have been a headache. Trying to get certain things figured out.


----------



## Ashuiegi

OK, so the test of the 280X Toxic and 7970 Matrix will have to wait until I drain my WC loop; I couldn't find a way to slide it under the tubing...
I installed it in my secondary rig to test the card, but I still had to remove the 280 rad from the front of my HAF-XB to make it fit.


----------



## ImJJames

Quote:


> Originally Posted by *chevZ*
> 
> Really desperately looking for a benchmark/evidence from anyone with CrossFire R9 280Xs who can confidently confirm all micro-stutter issues are gone and Eyefinity is finally stable. Not to mention it seems like there's not a single review/benchmark that has tested it at Eyefinity resolutions.
> 
> I plan on possibly grabbing two of these tomorrow, but I gotta say... the things around the net from the last 12 months about Eyefinity/Xfire (I'll be doing both) are not filling me with confidence..... I want to believe otherwise.


The latest beta drivers have fixed all micro-stuttering in CrossFire for me.


----------



## ducknukem86

I'm also considering CrossFire, since I want more performance and I already bought one 280X.


----------



## RaphLYC

Hey guys, I just bought my Sapphire Vapor-X R9 280X a couple of weeks ago and want to try out overclocking. Can someone teach me how to OC a video card?
Are these settings good?
Voltage 1200, core 1200, RAM 1600

Default voltage was 1163, core 1070, and VRAM 1550.
And am I getting an average score with these OC settings?


P.S. I also have a problem: I didn't get any crashes when I ran Valley Benchmark 1.0, but when I play BF4 I get a brown screen with black vertical lines. Is the memory set too high, or is there not enough voltage for the core or the memory? If it's the memory, what should I set it to? The default is 1500.


----------



## Pedros

Hey guys, need some help. Have any of you decided, in the last few days, to go CrossFire with 280Xs instead of jumping on the 290 bandwagon?

So, here's the deal... I'm gaming at 2560x1600 (or will be, since I'm building the rig atm)... and I'm kind of tired of waiting for custom-design 290s to appear.

So the call was to go 2 x 290... or just skip the 290s and get 2 x 280X (Matrix Platinum at a great price, 299 EUR each).

But I just can't make up my mind about this...

If so, please tell me your experiences and whether they are a good solution for 2560x1600. (I'm not going 4K or multi-monitor.)


----------



## COMBO2

Quote:


> Originally Posted by *Pedros*
> 
> Hey guys, need some help. Any of you made a decision, in the last days, of going Crossfire with 280x's instead of jumping to the 290 bandwagon?
> 
> So, here's the deal... i'm gaming at 2560x1600 ( or will be since i'm building the rig atm )... and i'm kind of tired waiting for custom design 290's to appear.
> 
> So the call was to go 2 x 290's ... or just skip the 290's and get 2 x 280x ( Matrix Platinum at a great price each, 299 Eur ).
> 
> But i just can't make up my mind about this ...
> 
> If so, please tell me your experiences and if they are a good solution for 2560x1600. ( i'm not going 4k or multi-monitor ).


I would go with 2x reference 290s, and put them under water. Personally, I think that would be the smartest option. The cards will clock pretty high when watercooled, and you can overclock them further. Either that, or a 780 Ti would probably be my next choice. Be careful of micro-stuttering though. I've still heard it's not COMPLETELY fixed up like AMD said it was.


----------



## rdr09

Quote:


> Originally Posted by *COMBO2*
> 
> I would go with 2x reference 290s, and put them under water. Personally, I think that would be the smartest option. The cards will clock pretty high when watercooled, and you can overclock them further. Either that, or a 780 Ti would probably be my next choice. Be careful of micro-stuttering though. I've still heard it's not COMPLETELY fixed up like AMD said it was.


I read otherwise. I read they are smoother than SLI.

@Pedros, at that rez - go with the 290s, but you might have issues pushing both with an i5. Try one first.


----------



## COMBO2

Quote:


> Originally Posted by *rdr09*
> 
> i read otherwise. i read they are smoother than sli.
> 
> @Pedros, that rez - go with the 290s.


I mean, I won't deny that. I actually haven't used Crossfire, I'm using SLI right now though. Just from what I had seen and read, people were still complaining about Crossfire and it's compatibility/microstutter and such.


----------



## rdr09

Quote:


> Originally Posted by *COMBO2*
> 
> I mean, I won't deny that. I actually haven't used Crossfire, I'm using SLI right now though. Just from what I had seen and read, people were still complaining about Crossfire and it's compatibility/microstutter and such.


scroll down sub-title: smoothness . . .

http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/10


----------



## COMBO2

Quote:


> Originally Posted by *rdr09*
> 
> scroll down sub-title: smoothness . . .
> 
> http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/10


Oh right. I was more referring to AMD as a whole but it seems that Crossfire on the 290/X is definitely much better.


----------



## rdr09

Quote:


> Originally Posted by *COMBO2*
> 
> Oh right. I was more referring to AMD as a whole but it seems that Crossfire on the 290/X is definitely much better.


The 7900 series is much better as well now. I have no issues with BF3, C3, and C2; I haven't tried CrossFire in any other game. I want to try BF4, but I am getting ready for my 290.









Edit: on the contrary . . . I see some complaining about stutter on Nvidia due to drivers.


----------



## RaphLYC

Quote:


> Originally Posted by *RaphLYC*
> 
> Hey guys, I just bought my Sapphire Vapor-X R9 280X a couple of weeks ago and want to try out overclocking. Can someone teach me how to OC a video card?
> Are these settings good?
> Voltage 1200, core 1200, RAM 1600
> 
> Default voltage was 1163, core 1070, and VRAM 1550.
> And am I getting an average score with these OC settings?
> 
> 
> P.S. I also have a problem: I didn't get any crashes when I ran Valley Benchmark 1.0, but when I play BF4 I get a brown screen with black vertical lines. Is the memory set too high, or is there not enough voltage for the core or the memory? If it's the memory, what should I set it to? The default is 1500.


Found out that I can't play games with those OC settings even if I bump the voltage up to 1240...


----------



## Ashuiegi

This stutter issue is like the rest: when it's an AMD card it's a sin; when it's an Nvidia card it must be a bad driver install or something. For god's sake, stop looking for excuses. No brand will guarantee you a great experience with two cards, and even with one card things don't always work the way they should, no matter the brand. When you add a second card you don't double the possibility of problems; it goes up exponentially with every piece of hardware you add.


----------



## jimi977

I'm planning to get a Gigabyte 280X. My question is: will Gigabyte update their WindForce cooler and PCB like they did for the GTX 770 or the 270? I just don't like the blue PCB of the 280X.


----------



## Pedros

Thank you guys!
The rig I have here is outdated.

I'm getting either a 4770K or a 4820K... still deciding.


----------



## ducknukem86

Hey guys, I'm in need of advice here. As you know, I got the 280X, but I don't feel happy with it. As soon as the framerate goes below 60 fps, in any game, I experience a certain choppiness or stutter. I thought anything above 30 fps would be acceptable when playing video games, since consoles normally run at 30 fps. I don't get it, but I'm suffering with this. That's why I've started thinking of getting a better video card, but most people I've read say that if you're playing at 1080p a better card (say, a 290) would be overkill, and even those video cards can't sustain 60 fps all the time.


----------



## battleaxe

Quote:


> Originally Posted by *ducknukem86*
> 
> Hey guys, I'm in need of advice here. As you know, I got the 280X, but I don't feel happy with it. As soon as the framerate goes below 60 fps, in any game, I experience a certain choppiness or stutter. I thought anything above 30 fps would be acceptable when playing video games, since consoles normally run at 30 fps. I don't get it, but I'm suffering with this. That's why I've started thinking of getting a better video card, but most people I've read say that if you're playing at 1080p a better card (say, a 290) would be overkill, and even those video cards can't sustain 60 fps all the time.


This video shows you how to fix it. The guy is a bit annoying at first; just stay with him and he goes through the steps for you. It's pretty easy. See if this fixes it - it's for BF3 and BF4 at least; it won't help in other games.

Oops, forgot the link: http://www.youtube.com/watch?v=B6DOBV7QkLU


----------



## ducknukem86

Quote:


> Originally Posted by *battleaxe*
> 
> This video shows you how to fix it. The guy is a bit annoying at first; just stay with him and he goes through the steps for you. It's pretty easy. See if this fixes it - it's for BF3 and BF4 at least; it won't help in other games.
> 
> Oops, forgot the link: http://www.youtube.com/watch?v=B6DOBV7QkLU


Thank you very much! This has helped!


----------



## ImJJames

Quote:


> Originally Posted by *RaphLYC*
> 
> Found out that I can't play games with those OC settings even if I bump the voltage up to 1240...


Curious, what's your ASIC quality? Mine is 69%.

1200 core / 1760 mem @ 1.25 V, stock mem volts. Video


----------



## leyzar

Hey peeps,

@ 280X CrossFire users,

I have been seeing a lot of bad rep about CrossFire issues, micro-stutters and that sort of thing.
How has your experience been?


----------



## Cool Mike

Just installed my second Sapphire 280X Toxic. I was surprised I can still stay at 1200 core and 1800 memory in CrossFire. The Toxic handles the heat very well. The Firestrike run looks good.


----------



## RaphLYC

Quote:


> Originally Posted by *ImJJames*
> 
> Curious, what's your ASIC quality? Mine is 69%.
> 
> 1200 core / 1760 mem @ 1.25 V, stock mem volts. Video


My ASIC is 70.4,
but now I'm running at 1140/1550 with the voltage set at 1170 using MSI AB.
Seems fine with games and the Valley bench.


----------



## jorvie

Hi,

After getting an R9 270X, my screen started going white or grey and completely freezing up at random times. This never happens when gaming, only when just browsing the web or when it's idle.

I upgraded from an HD 7850, and I have used an HD 7870 and a GTX 660 within the last 2 weeks without any problems.

My power supply is this:
http://www.silverstonetek.com/product.php?pid=253&area=en

It was working fine for a couple of days after using the Guru3D driver sweeper, but then I got a blank screen again. I don't know if it's a coincidence, but so far it has only happened while using Chrome.
Could this be a driver issue? I really don't want to RMA my card and wait weeks only to find out that the new card has the same problem and it was just the driver all along.

I have now tried the Catalyst 13.10 drivers from Sapphire's site and I still get the same problem.

The drivers work perfectly fine with my HD 7850, so I doubt it's a driver issue. The problem has never occurred during gaming, so I doubt it's my power supply either.

My system is as follows:

ASUS P8H77-I Mini ITX motherboard
8gb kingston 1600mhz ddr3 memory
Intel i5-3570k processor
Windows 7 Ultimate 64bit
Sapphire DUAL-X R9 270X 2GB graphic card http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=0&psn=000101&lid=1&leg=0
Silverstone ST45SF 450watt powersupply http://www.silverstonetek.com/product.php?pid=253&area=en
AMD Catalyst™ 13.11 Beta 9.2 / AMD Catalyst™ 13.10, both tested, and both give me the same problem
Google Chrome is fully updated to 31.0.1650.57 m


----------



## RaphLYC

I also have a problem: I set the GPU voltage to 1.170 in MSI AB, but the MSI AB hardware monitor says 1.219 and GPU-Z says 1.189.


----------



## Devildog83

Hey guys, just looking for a bit of informed help here.

I have a 7870 Devil and was thinking of two routes: CrossFire with an R9 270X, or get a 280X and maybe CrossFire down the road. With all of the CrossFire issues I keep seeing I am leaning 280X, but I would want to CrossFire it down the road sometime. I know the CrossFire route would give me better frame rates now, but am I looking at too much trouble, or have the CrossFire issues been solved on the Pitcairn cards to the point where it makes sense? It would also be cheaper by $100 or so to get the 280X, but I would have to sell the Devil. Any thoughts? Either way I will be asking to join the club soon.


----------



## NightHawK360

Alright, pulled the trigger on an ASUS R9 280X DCUII and a CM Storm Stryker. Should arrive next week; so excited.


----------



## Modovich

This is my Unigine Valley


----------



## Ashuiegi

Toxic 280X + Matrix HD 7970

At first I had a problem with the old driver (one that doesn't even support the R9 series): my 7970 was always at 100% usage. But as soon as I installed the latest driver everything worked fine. Bonus: after the driver install, both cards are recognized properly and can be adjusted in GPU Tweak.









They are at 1150 MHz / 1650 MHz.
This is a Firestrike run. I don't know why, but in the combined test I don't get any benefit, while in the first two I get nearly 100% scaling:
http://www.3dmark.com/fs/1118212

100+ average fps in BF4 at 1440p ultra; with V-sync my cards are at 50-60% usage each...
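Scaling figures like that are easy to sanity-check from the sub-scores. A minimal sketch of the arithmetic; the fps numbers below are made up for illustration, not taken from the linked run:

```python
# CrossFire scaling sanity check: how close a two-card result comes to a
# perfect 2x of the single-card result. All fps numbers are illustrative.

def scaling(single_fps, crossfire_fps):
    """Scaling expressed as a fraction of a perfect 2x doubling."""
    return crossfire_fps / (2 * single_fps)

print(f"graphics test: {scaling(40.0, 78.0):.1%}")  # near-perfect scaling
print(f"combined test: {scaling(30.0, 31.5):.1%}")  # barely any gain
```

Anything near 100% means the second GPU is pulling its weight in that sub-test; the combined test mixing CPU physics and graphics is the usual place scaling collapses.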


----------



## KeepWalkinG

Can I put the BIOS from the Sapphire Toxic on my Dual-X R9 280X card?


----------



## pyshing

Hey guys, I recently bought a Gigabyte 280X WindForce but my voltage is locked. Where can I download an unlocked-voltage BIOS, for example the Toxic one?


----------



## jimi977

I'm planning to get a Gigabyte 280X OC. Does anybody know if they will update the cooler and PCB like the GTX 770 and 270X? I just don't like the blue PCB; it doesn't go with my rig.


----------



## DeviousAddict

I remember reading in this thread about someone who couldn't CrossFire their two XFX 280X cards, but I can't find the post now.

Does anyone know or remember if he managed to find a bridge that was long enough? I have just received my second card and run into the same problem.









Basically the bridge supplied with the card won't reach over the cooler to the other card, as the bridge connectors sit a lot lower than the cooler.

I need to buy an extended bridge; anyone who knows where I can get one within the UK would be a great help.


----------



## Devildog83

Man, there are a lotta' questions in here and zero answers. Kinda strange for an OCN forum.


----------



## battleaxe

Quote:


> Originally Posted by *DeviousAddict*
> 
> I remember reading in this thread about someone who couldn't Xfire their two XFX 280X card but i cant find the post now.
> 
> Does anyone know or remember if he managed to find a bridge that was long enough? i have just received my 2nd card and come up with the same problem
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Basically the bridge supplied with the card wont reach over their cooler to the other card as the bridge connections sit a lot lower than the cooler.
> 
> I need to buy an extended bridge, anyone who knows where i can one within the UK would be a great help


I'm pretty sure that was dade_kash_xD, but I also think he was unhappy and sold his cards. I do believe he was able to CrossFire though, so maybe he'd be willing to sell you his CrossFire bridge. I could be wrong though, just going on memory. This thread is getting way too long to fish back through.

His name is dade_kash_xD; PM him and see if it's true.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Man, there is a lotta' questions in here and zero answers. Kinda strange for an OCN forum.


With regard to your question about your 7870... you might get an answer here:

http://www.overclock.net/t/1236487/amd-radeon-hd7850-hd7870-owners-club


----------



## Devildog83

I have a Seasonic SS660 X2 PSU; is it enough to power the DC2T or the MSI Gaming 280X?


----------



## Amhro

Quote:


> Originally Posted by *Ashuiegi*
> 
> toxic 280x + matrix hd 7970
> 
> At first I had a problem with the old driver (one that doesn't even support the R9 series): my 7970 was always at 100% usage. But as soon as I installed the latest driver everything worked fine. Bonus: after the driver install, both cards are recognized properly and can be adjusted in GPU Tweak.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> They are at 1150 MHz / 1650 MHz.
> This is a Firestrike run. I don't know why, but in the combined test I don't get any benefit, while in the first two I get nearly 100% scaling:
> http://www.3dmark.com/fs/1118212
> 
> 100+ average fps in BF4 at 1440p ultra; with V-sync my cards are at 50-60% usage each...


damn, these LEDs look amazing, i wish asus DC2 had the same


----------



## BackwoodsNC

Quote:


> Originally Posted by *Devildog83*
> 
> I have a Seasonic ss660 x2 PSU, is it enough to power the DC2T or MSI gaming 280x?


Here you go Link


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> with regard to your ? about your 7870 . . .you might get an answer here . . .
> 
> http://www.overclock.net/t/1236487/amd-radeon-hd7850-hd7870-owners-club


The reason I asked here is because I was thinking of getting the 280X. Is there another thread that covers the 280X? I am a member of the 7870 club and I am there every day, but they don't know the 280X.


----------



## Ashuiegi

You can buy a 120 mm Sapphire CF bridge; in France I know where to find them, but they don't ship internationally...


----------



## DeviousAddict

Quote:


> Originally Posted by *battleaxe*
> 
> I'm pretty sure that was dade_kash_xD, but I also think he was unhappy and sold his cards. I do believe he was able to xfire though, so maybe he'd be willing to sell you his xfire bridge. I could be wrong though, just going on memory. This thread is getting way too long to fish back through.
> 
> his name is dade_kash_xD PM him and see if its true.


Cheers dude, PM'd him just now


----------



## DeviousAddict

Quote:


> Originally Posted by *Ashuiegi*
> 
> you can buy a 120 mm saphire CF bridge , for France i know where to find them but they don't ship international ,....


When you say international, does that mean outside the EU? Because I live in the UK.


----------



## Ashuiegi

Quote:


> Originally Posted by *Amhro*
> 
> damn, these LEDs look amazing, i wish asus DC2 had the same


Yeah, sadly they're gone as soon as I put my waterblock back on.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> The reason I asked hear is because I was think of getting the 280x, is there another thread that covers the 280x? I am a member of the 7870 club and I am there everyday but they don't know the 280x.


CrossFire issues will always exist, imo. It is always preferable to go single GPU (in both camps).

Two 7870s will be more powerful for sure, probably even more powerful than a 290 when OC'ed. I've read one member ask the same question who already owns two 7850s, and the OP stated that there are currently no issues running CrossFire. Zero issues. I've tried CrossFire 7950/7970 and had zero issues, but I only played BF3, C3, and C2; games like FC3 are a different story. My advice to you is to shoot for a 290 for a few more bucks, or a 780 if you want to go green.

edit: in short, the jump from a 7870 to a 280 is really minimal other than the additional VRAM.


----------



## mtcn77

Quote:


> Originally Posted by *rdr09*
> 
> CrossFire issues will always exist, imo. It is always preferable to go single GPU (in both camps).
> 
> Two 7870s will be more powerful for sure, probably even more powerful than a 290 when OC'ed. I've read one member ask the same question who already owns two 7850s, and the OP stated that there are currently no issues running CrossFire. Zero issues. I've tried CrossFire 7950/7970 and had zero issues, but I only played BF3, C3, and C2; games like FC3 are a different story. My advice to you is to shoot for a 290 for a few more bucks, or a 780 if you want to go green.
> 
> edit: in short, the jump from a 7870 to a 280 is really minimal other than the additional VRAM.


7870 CF won't actually be faster, because the higher total memory bandwidth is just another dependency on software control.


----------



## aMaNeCeR

Hi,

Which R9 280X cards have a reference PCB? It's for fitting a full-cover waterblock.

Bye.


----------



## Devildog83

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Here you go Link


Thanks for the link, but it doesn't have my 125 W CPU or the 280X, so it's not much help. Thanks for trying.
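When a calculator doesn't list your exact parts, a rough sanity check is just summing published board powers and leaving headroom. A minimal sketch; every wattage figure here is an illustrative assumption, not a measured value for these parts:

```python
# Rough PSU headroom estimate: sum component board powers, then check that
# the PSU still has spare margin. All wattages below are assumptions.

def psu_headroom(psu_watts, parts, margin=0.30):
    """Return (total_draw, ok): ok is True if the PSU keeps `margin` spare."""
    total = sum(parts.values())
    return total, total * (1 + margin) <= psu_watts

parts = {
    "cpu_125w_tdp": 125,     # assumed 125 W CPU
    "gpu_r9_280x": 250,      # assumed ~250 W board power for an OC'd 280X
    "board_ram_drives": 75,  # assumed motherboard + RAM + drives + fans
}

total, ok = psu_headroom(660, parts)
print(total, ok)  # 450 True -> a 660 W unit has comfortable headroom
```

The 30% margin is just a conservative rule of thumb to cover overclocking spikes and PSU aging; swap in real TDP figures from the spec sheets for your actual parts.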


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> CrossFire issues will always exist, imo. It is always preferable to go single GPU (in both camps).
> 
> Two 7870s will be more powerful for sure, probably even more powerful than a 290 when OC'ed. I've read one member ask the same question who already owns two 7850s, and the OP stated that there are currently no issues running CrossFire. Zero issues. I've tried CrossFire 7950/7970 and had zero issues, but I only played BF3, C3, and C2; games like FC3 are a different story. My advice to you is to shoot for a 290 for a few more bucks, or a 780 if you want to go green.
> 
> edit: in short, the jump from a 7870 to a 280 is really minimal other than the additional VRAM.


Thank you, finally a straight answer. I was thinking about a 290, but the reference ones get kinda hot and I don't know if non-reference will be out of the price range. I think my 7870 Devil with a 270X Devil would be very cool, but I am still on the fence, and it has not been released yet, nor have the non-reference 290 cards. That's why I asked, and thanks for the help.

And yes, 2 x 7870 is faster than 1 x 280X or 7970; I have just never CrossFired and I am hesitant to go there.


----------



## Ashuiegi

Quote:


> Originally Posted by *DeviousAddict*
> 
> When you say international does that mean outside the EU? because I live in the UK


Yep, this site only sells and ships to France and Belgium,
but look for the Sapphire 120 mm CrossFire bridge; I'm sure you can find a retailer in the UK that has it.


----------



## raghu78

Quote:


> Originally Posted by *aMaNeCeR*
> 
> Hi,
> 
> What r9 280x card has reference pcb? Is to put a watercooling full block
> 
> Bye.


HIS Radeon R9 280X IceQ X² 3GB GDDR5 (H280XQM3G2M)
HIS Radeon R9 280X IceQ X² Turbo Boost Clock 3GB GDDR5 (H280XQMT3G2M)
MSI R9 280X Gaming 3G Radeon R9 280X 3GB GDDR5 (V277-053R)
VTX3D Radeon R9 280X 3GB GDDR5 (VXR9 280X 3GBD5-2DHE)
Club 3D Radeon R9 280X royalKing 3GB GDDR5 (CGAX-R928X7O)
Club 3D Radeon R9 280X royalQueen 3GB GDDR5 (CGAX-R928X7)

http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/images/front.jpg

you can confirm it here
http://www.coolingconfigurator.com/


----------



## Devildog83

Quote:


> Originally Posted by *raghu78*
> 
> HIS Radeon R9 280X IceQ X² 3GB GDDR5 (H280XQM3G2M)
> HIS Radeon R9 280X IceQ X² Turbo Boost Clock 3GB GDDR5 (H280XQMT3G2M)
> MSI R9 280X Gaming 3G Radeon R9 280X 3GB GDDR5 (V277-053R)
> VTX3D Radeon R9 280X 3GB GDDR5 (VXR9 280X 3GBD5-2DHE)
> Club 3D Radeon R9 280X royalKing 3GB GDDR5 (CGAX-R928X7O)
> Club 3D Radeon R9 280X royalQueen 3GB GDDR5 (CGAX-R928X7)
> 
> http://www.techpowerup.com/reviews/MSI/R9_280X_Gaming/images/front.jpg
> 
> you can confirm it here
> http://www.coolingconfigurator.com/


What he said.


----------



## aMaNeCeR

Thank you both!!!


----------



## steelkevin

Quote:


> Originally Posted by *aMaNeCeR*
> 
> Hi,
> 
> What r9 280x card has reference pcb? Is to put a watercooling full block
> 
> Bye.


None.

There is no reference R9 280X. I guess (and I do it myself) you could call the HD 7970 reference PCB the R9 280's, though, in which case the previous replies are spot on.
Otherwise you can go with the Asus DC2T (be careful not to get the V2, which doesn't share the same PCB and is a three-slot card) and wait a couple of weeks: EK is making blocks for them. That was what I was going to do, because having "7970" on my 280's block would've bothered me.


----------



## Waleh

Hey guys, how is the 280x doing in bf4 at 1920x1080 (1080p ultra)?


----------



## taem

Quote:


> Originally Posted by *Ashuiegi*
> 
> yeah sadly they are gone as soon as i put my wateblock back on


Wut?! Why buy those cards if you're going to water cool? You can't abandon those sexy things. My DC2T looks like crap with no backplate and I'm air cooling. Wish I could buy those harnesses off you and glue them onto my card.
Quote:


> Originally Posted by *steelkevin*
> 
> Otherwise you can go with the Asus DC2T (be carefull not to get the V2 which doesn't share the same PCB and is a three slot card) and wait a couple weeks. EK is making blocks for them. That was what I was going to do because having a "7970" on my 280's block would've bothered me.


EK is making blocks for the Asus R9 280X DC2T 3GD5? Because I want to buy a backplate.


----------



## Ashuiegi

A single 7970 at 1200/1800 was fine for ultra at 1440p, about 45 fps average, so I guess it will be more than OK for 1080p.


----------



## EmoKid

Hi, please add me to the club









I've got some interesting news (I hope!).
I bought a 7970 from Asus, the non-TOP, non-overvolting one.








I tried to unlock it, without success, until now! I flashed the Asus 280X Matrix BIOS, reinstalled the driver (13.11 Beta 8), and it worked!
My previous attempts at overclocking resulted in BSODs, and I had to settle at 1125/1575. I am now happily running 1175/1600 and still testing.








The card even survived 10 consecutive runs of Heaven 4.0 Extreme









OC1.png 2045k .png file


IMG_20131109_223010.jpg 1436k .jpg file


----------



## steelkevin

Quote:


> Originally Posted by *Waleh*
> 
> Hey guys, how is the 280x doing in bf4 at 1920x1080 (1080p ultra)?


I get 70-90 fps on ultra with my Asus DC2T R9 280X.
V-sync is broken in BF4 though, so while I should be able to maintain a solid 60 fps, it dips as low as 51.

I tried playing on Low @ 1080p just for the sake of it and I was getting an average of 160 fps; max fps was a bit over 200.

I'm on the Beta 6 driver, btw. Beta 8 is said to drastically improve fps (I saw in a 290 test that it improved fps by at least 5, which isn't bad).
Quote:


> Originally Posted by *taem*
> 
> Wut?! Why you buy those cards if you're going to water cool? You can't abandon those sexy things. My DC2T looks like crap with no backplate and I'm air cooling. Wish I could buy those harnesses off you and glue them onto my card.
> EK is making blocks for the Asus R9280X DC2T 3GD5? Because I want to buy a backplate.


I haven't heard anything about backplates, and I don't think they systematically make backplates to match their waterblocks, but you could probably get one made by someone (dwood if you're in the US).

Weird question to ask, btw. I bought the 280X to watercool, which is why I took the DC2T even though it was 300€ instead of 250€ for a regular Sapphire or a good HIS card.
In the end though, once I saw the 290's performance I just couldn't do anything but return the 280X (sending it back on Tuesday) to get one and immediately slap a waterblock on it. After all, adding 50€ for a 20-25% stock increase in performance isn't negligible, especially considering I don't plan on upgrading any time soon. Add to that the fact my 280X couldn't overclock well, while the 290 is limited by bad cooling, so it'll truly benefit from watercooling.


----------



## Waleh

Quote:


> Originally Posted by *steelkevin*
> 
> I get 70-90 fps on ultra with my Asus DC2T R9 280X.
> V-sync is broken in BF4 though, so while I should be able to maintain a solid 60 fps, it dips as low as 51.
> 
> I tried playing on Low @ 1080p just for the sake of it and I was getting an average of 160 fps; max fps was a bit over 200.
> 
> I'm on the Beta 6 driver, btw. Beta 8 is said to drastically improve fps (I saw in a 290 test that it improved fps by at least 5, which isn't bad).
> 
> I haven't heard anything about backplates, and I don't think they systematically make backplates to match their waterblocks, but you could probably get one made by someone (dwood if you're in the US).
> 
> Weird question to ask, btw. I bought the 280X to watercool, which is why I took the DC2T even though it was 300€ instead of 250€ for a regular Sapphire or a good HIS card.
> In the end though, once I saw the 290's performance I just couldn't do anything but return the 280X (sending it back on Tuesday) to get one and immediately slap a waterblock on it. After all, adding 50€ for a 20-25% stock increase in performance isn't negligible, especially considering I don't plan on upgrading any time soon. Add to that the fact my 280X couldn't overclock well, while the 290 is limited by bad cooling, so it'll truly benefit from watercooling.


Perfect, thank you! +rep. So getting a 120 Hz monitor is actually advantageous, because I can pull above 60 fps with this card in BF4? But are you also at a smaller res, or is it 1920x1080?


----------



## RaphLYC

Sapphire R9-280X VAPOR-X OC benchmark
Driver 13.11 beta 8
This card is voltage unlocked.
Default: 1070/1550, GPU voltage 1163, memory voltage 1500.
Temps after OC: min 33, idle 40, max 71 °C.
Overclocked: 1160/1600, GPU voltage 1169 (I set 1120 but AB displays 1169), memory voltage 1519 (I set 1520 but AB displays 1519).
http://s1188.photobucket.com/user/raf_ilt/media/OC1_zps3a0fdb65.png.html
3D MARK
http://s1188.photobucket.com/user/raf_ilt/media/OC2_zps13cef00a.png.html
3D MARK 11
http://s1188.photobucket.com/user/raf_ilt/media/OC3_zpsb164a6fb.png.html
VALLEY BENCHMARK (EXTREME HD)
http://s1188.photobucket.com/user/raf_ilt/media/OC4_zps54dcc606.png.html

PC SPEC:
i7 3770K ES @ 4.6
CORSAIR H100
Asus P8-Z77 V PRO
2xSSD 4xHDD
SUPER FLOWER LEADEX PLATINUM 1000W
NZXT SWITCH 810


----------



## steelkevin

Quote:


> Originally Posted by *Waleh*
> 
> Perfect, thank you! +rep So, getting a 120 Hz monitor is actually advantageous because I can pull above 60 fps with this card on Bf4? But, you're also at a smaller res? Or, is it 1920x1080


Nope, 1080p







.

Makes a massive difference, seeing how I used to play BF3 on a surprisingly well-OC'ed GTS 240 that could only pull over 60 fps @ 720p with everything set to the lowest settings possible. All that on the same screen as now (a 24" 1080p monitor that doesn't actually belong to me but to my brother, who'll be taking it back in a couple of weeks). I don't know much, if anything at all, about how 120 Hz monitors perform, but I'll either get myself one of those 24" 144 Hz Asus monitors or a Ducky Shine keyboard. I'd like to hear your experience compared to a regular 60 Hz monitor. I've heard that to really benefit from a 120 Hz monitor you should reach for 90 fps, for what that's worth. To be honest, you could probably achieve that playing on High presets.
All in all it's a great card and you should definitely get it if it's in your budget. No matter what card you come from, it won't disappoint. It's dead silent at stock, whether idle or in game, and delivers more than enough for 1080p gamers.


----------



## taem

Quote:


> Originally Posted by *Waleh*
> 
> Perfect, thank you! +rep So, getting a 120 Hz monitor is actually advantageous because I can pull above 60 fps with this card on Bf4? But, you're also at a smaller res? Or, is it 1920x1080


Don't you have to apply an unofficial patch to overclock a display with AMD GPUs? Or is that only CrossFire? Or is my info dated? The patch I know of breaks HDCP.

Edit: oh, nm, you mean actually paying the money for a 120 Hz display. I just assume everyone is cheap like me and that you meant overclocking a Korean PLS.


----------





## TempAccount007

Quote:


> Originally Posted by *Devildog83*
> 
> Man, there is a lotta' questions in here and zero answers. Kinda strange for an OCN forum.


I think the more experienced users will be going with at least a 290 if they are upgrading.

If my 280X is a dog with a low ASIC (like my previous 7950), I will probably return it and go the 290 route. By then there should be a good selection of 290s.


----------



## Darkchild

Quote:


> Originally Posted by *Devildog83*
> 
> Thank you, finally a straight answer. I was thinking about a 290 but the reference ones get kinda hot and I don't know if non-ref will be out of the price range. I think my 7870 Devil with a 270x Devil would be very cool but I am still on the fence and it's has not released yet nor has the 290 non-ref cards.. That's why I asked and thanks for the help.
> 
> And yes, 2 x 7870 is faster than 1 x 280x or 7970, I have just never X-Fired and I am tentative to go there.


Don't leave us, Devil!! lol, just kidding. I've run CrossFire for the better part of 4 years. When I had my 5770s I never knew what micro-stutter was, as I didn't experience it on that build. Even currently on my 7870s I haven't had the micro-stutter that's been so bad across the net. To be honest, I've tested my current rig with and without frame pacing enabled and I don't see a difference. I think micro-stutter doesn't affect all builds; I think it affects mismatched configurations, such as running a 7850 with a 7870, or cards that have boost, as the clocks will probably almost never match, which adds variance to the config. Meaning, if card A is boosting to say 1150 but card B is only boosting to 1090, then that will affect things like GPU usage variations. I'm the type that has to keep my clocks the same; I always set clocks according to the slowest card. So if card A can OC to 1250 (my Asus) and card B can only OC to 1175 (my PowerColor), then both my cards will be set to 1175. I've actually tested 1250 with 1175 and it was bad; I could tell it wasn't running as smooth as it does when the clocks are the same, not to mention it didn't help performance any having the Asus at 1250.


----------



## 856Media

Just joined the ranks! Returned my 270X. Ordered the Sapphire R9 280X (purple box): http://www.amazon.com/gp/product/B00FLMKNE0/ref=oh_details_o00_s00_i00?ie=UTF8&psc=1

So far I am liking it. I know once I go to the AMD 8-core 8350 CPU, I'll be enjoying it even more. (BF4 gamer)

I can't get any OC'ing inside of Catalyst to work correctly. Fan control works great, but any adjustment up from 1050 MHz doesn't play well and I get DX errors. :/


----------



## Darkchild

If anybody in this thread was curious about how much faster CrossFire 7870 GHz is, here are some bench results:


----------



## 856Media

This Dual-X version of the Sapphire R9 280X is apparently already OC'd. 1020/1500 seems to be the ceiling for my card; I'm unable to push further, even with the full 20% power increase.


----------



## Darkchild

Quote:


> Originally Posted by *856Media*
> 
> This Dual X Version of the Sapphire R9 280X is apparently already OC'd. Seems to be the ceiling for my card. maxed at 1020/1500. Unable to push further, even with the full 20% in power increase.


time to up the voltage


----------



## ImJJames

Quote:


> Originally Posted by *Darkchild*
> 
> If anybody in this thread was curious about how much faster crossfire 7870ghz is heres some bench results:


When I had my 2x 7850s on air, they never passed 68 °C. I find it funny that my min and max are higher but you have the higher score.


----------



## 856Media

Quote:


> Originally Posted by *Darkchild*
> 
> time to up the voltage


How would I go about doing so? The voltage slider is locked. I have never upped the voltage on any GPU I have owned. Can you walk a noob through it?


----------



## ImJJames

Quote:


> Originally Posted by *856Media*
> 
> This Dual X Version of the Sapphire R9 280X is apparently already OC'd. Seems to be the ceiling for my card. maxed at 1020/1500. Unable to push further, even with the full 20% in power increase.


Go to settings and unlock it.


----------



## 856Media

After unlocking the voltage adjustment inside the settings, the slider is still locked, even after rebooting... Any ideas? (MSI AFTERBURNER)


----------



## taem

ASIC 60.4 on my Asus DC2T :|

Should I return it? That seems bad to me but how bad is it?


----------



## Ashuiegi

ASIC is supposed to matter only for air overclocking, but I just think it's total bullcrap, and it's not even that important, I would say. I have a Matrix card at 60% ASIC and it can OC to 1280/1850; my Toxic card is at 69% but only OCs to 1250/1800.


----------



## ImJJames

Quote:


> Originally Posted by *Ashuiegi*
> 
> ASIC is supposed to matter only for air overclocking, but I just think it's total bullcrap, and it's not even that important, I would say. I have a Matrix card at 60% ASIC and it can OC to 1280/1850; my Toxic card is at 69% but only OCs to 1250/1800.


This. I find it funny that all the graphics cards I have had that were under 70% ASIC were the best overclockers.


----------



## taem

Quote:


> Originally Posted by *ImJJames*
> 
> This, I find it funny all the graphic cards I have had that were under 70% ASIC were best overclockers.


My understanding is, a lower ASIC score means you can overclock more, but with more heat generated, meaning it's suited to a WC system. But if you want air-cooled overclocking, you need a high ASIC score. I'm air cooling.

Btw, what settings are best for Unigine Heaven for general comparison purposes on this thread?


----------



## Ashuiegi

My Matrix 7970 at 60% ASIC can run at 1280/1850 on air at stock voltage just fine, under 70 for the GPU and under 50 for VRM, RAM and board.
Plus, the GPU-Z ASIC reading might not be that accurate; I would trust the manufacturer to bin the right chips for the factory-OC'ed cards.


----------



## ImJJames

Quote:


> Originally Posted by *taem*
> 
> My understanding is, lower ASIC score means you can overclock more, but with more heat generated, meaning suitable for a WC system. But if you want air cooled overclocking, you need a high ASIC score. I'm air cooling.
> 
> Btw what setting is best for unigine heaven for general comparison purposes on this thread?


For Heaven it would be Ultra, 1080p, Extreme tessellation.


----------



## Amhro

Quote:


> Originally Posted by *856Media*
> 
> After unlocking the voltage adjustment inside the settings, the slider is still locked, even after rebooting... Any ideas? (MSI AFTERBURNER)


I guess your card has locked voltage.


----------



## 856Media

Quote:


> Originally Posted by *Amhro*
> 
> I guess your card has locked voltage.


RAAAAGGEEEE..

What information could I post here to confirm that?


----------



## Ashuiegi

Do you have the latest driver and the correct program for your brand of 280X?
You can't change the voltage of most Asus cards with MSI Afterburner, for example.


----------



## 856Media

Ugh... the stupid card is so big I had to unplug my CD drive to get it to fit. I didn't use ANY of the tools included on the CD. I am using the most up-to-date beta drivers from AMD, 13.11 Beta 9.

Guess I have to venture to Best Buy in search of a SATA elbow.


----------



## Ashuiegi

You can get the correct program online too; just go to your manufacturer's site. But if you use MSI Afterburner with an Asus or Sapphire card, for example, I'm not surprised it doesn't work.


----------



## 856Media

Thanks man. Looks like I need to use the Sapphire Trixx Software. Will report back with findings.


----------



## 856Media

Stock is 1020/1500 @ 1.162 V.

Pushed to 1150/1500 @ 1.200 V + 20% power. I was able to get into a game (BF4) but it crashed moments later. Think I should push the voltage more?


----------



## TempAccount007

It was bound to be brought up...

A high ASIC quality means you can overclock with less voltage; low-ASIC cards require more.

HOWEVER, high-ASIC cards use more CURRENT; low-ASIC cards use less.

Therefore high-ASIC cards are hotter than low-ASIC cards at the same voltage; however, both run at the same temps at the same clock speed.

The ONLY time a low-ASIC card will be a problem is if your temps are good but you can't increase the voltage any more to achieve better overclocks (happened to me). High-ASIC cards don't have that problem. Temp is often the limit of performance for high-ASIC cards, while voltage is often the limit for very low-ASIC cards. In addition, because low-ASIC cards demand more from their VRM, they are more likely to fail sooner.

Having just returned a card with a 57% ASIC, I can assure you guys that ASIC quality (at least at the ends of the range) can make a HUGE difference.

In total, I have probably spent a full 12 hours looking up ASIC information. I returned my card early last week, so I had some time to really look into it.

This is the single best post I could find on the subject:
http://forum.beyond3d.com/showpost.php?p=1344188&postcount=44
Quote:


> This is normal for a given silicon process (say 0.13u) to have such variation that some transistors in certain chip die have shorter channel length (less than 0.13u) or some have longer channel length. Those that have shorter channel length _*HIGH ASIC*_ have faster intrinsic speed and can run as fast when smaller Vcore is applied (pros). On the other hand (cons), due to the lower threshold voltage which draws higher leakage current and generates more heat at the same higher Vcore, these chips can run as fast at a low Vcore as the higher Vcore rate chips, but they will max out at a lower Vcore compared to the higher Vcore rated siblings.


In case you were wondering, I haven't found any trend in ASIC distribution between card manufacturers. You are just as likely to receive a given ASIC quality regardless of how much you spent.

Finally, this is why I went with the 280X DirectCU II. My card comes on Wednesday, but even if I get stuck with a low-ASIC chip (though I would probably return it), the card has more phases (12 in total) to provide higher voltage without vdroop... hopefully.
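The voltage/leakage trade-off described above can be sketched numerically. This is a toy model only; every constant below is invented for illustration (no real 280X numbers), and it just shows the shape of the argument: dynamic power scales with V², while a leakier (high-ASIC, in this post's terms) chip pays a bigger leakage term at any given voltage but holds the same clock at a lower Vcore.

```python
# Toy model of the ASIC/leakage trade-off. All constants are illustrative
# assumptions; only the qualitative shape of the result matters.

def chip_power(freq_mhz, vcore, leak_amps, cap=0.1):
    """Very rough GPU power model: dynamic (~ C*f*V^2) plus leakage (~ I*V)."""
    dynamic = cap * freq_mhz * vcore ** 2
    leakage = leak_amps * vcore
    return dynamic + leakage

# At the SAME voltage, the leakier (high-ASIC) chip draws more power, i.e. runs hotter:
high_asic = chip_power(1100, vcore=1.20, leak_amps=25)
low_asic = chip_power(1100, vcore=1.20, leak_amps=10)
print(high_asic > low_asic)  # True

# But the high-ASIC chip can hold the same clock at a lower Vcore, which is
# where it claws the power back:
high_asic_undervolted = chip_power(1100, vcore=1.10, leak_amps=25)
print(high_asic_undervolted < low_asic)  # True
```

That matches the quoted beyond3d explanation: shorter-channel (high-ASIC) silicon is faster at low Vcore but leaks more, so which chip runs hotter depends on whether you compare at equal voltage or at equal clock.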


----------



## ImJJames

Quote:


> Originally Posted by *856Media*
> 
> Stock is 1020/1500/1162v
> 
> Pushed it to 1150/1500/1200 + 20% power - was able to get into a game (BF4) but it crashed moments later. Do you think I should push the voltage more?


If you're still on stock voltage then heck yeah...increase that baby.


----------



## 856Media

Quote:


> Originally Posted by *ImJJames*
> 
> If you're still on stock voltage then heck yeah...increase that baby.


Any thoughts on what a safe ceiling might be?


----------



## rdr09

Quote:


> Originally Posted by *856Media*
> 
> Any thoughts on what a safe ceiling might be?


Depends on your temps. Use HWiNFO64 to monitor both the GPU core AND the VRMs (just as important). On air, I suggest staying away from 1.3v for the core; the max I would venture with proper temps is 1.27v. For the memory, 1.6v should be plenty.

Are you still using the Phenom? If you are, then I suggest keeping the GPU at stock and overclocking the Phenom to the moon.
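Those rules of thumb can be sketched as a quick sanity check (the thresholds are just the forum numbers from this thread, not AMD specs, and the helper name is made up):

```python
# Sanity-check an overclock attempt against the rough air-cooling limits
# suggested in this thread (~1.27-1.30 V core ceiling, ~1.6 V memory,
# 70 C core, hot VRMs). Forum rules of thumb, not AMD specifications.

def oc_warnings(core_v, mem_v, core_temp_c, vrm_temp_c):
    warnings = []
    if core_v >= 1.30:
        warnings.append("core voltage at/above 1.30 V -- too high for 24/7 on air")
    elif core_v > 1.27:
        warnings.append("core voltage above 1.27 V -- risky on air")
    if mem_v > 1.60:
        warnings.append("memory voltage above 1.6 V -- should not be needed")
    if core_temp_c >= 70:
        warnings.append("core at/above 70 C -- cards reported to act up here")
    if vrm_temp_c >= 90:
        warnings.append("VRM running hot -- watch it in HWiNFO64")
    return warnings

print(oc_warnings(core_v=1.25, mem_v=1.55, core_temp_c=65, vrm_temp_c=80))  # []
```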


----------



## steelkevin

I didn't even know anything about ASIC quality before I saw it mentioned in here. For people wondering, GPU-Z can read your card's ASIC quality: just left-click the icon in the top-left corner of GPU-Z and choose "Read ASIC Quality". A window then pops up with your card's ASIC quality and what it means.

Mine is 63.7%, which means it would've done better under water


----------



## TheImpZA

What are your guys' thoughts on the Sapphire Vapor-X R9 270X? I would go for the Toxic, but that's too big for my case. Anyone here own a 270X Vapor-X? (Too many X's in this post.) Should I go for MSI or Gigabyte instead?

It's my first time going AMD, so I'm a little nervous. I'll be buying a card in December, after Mantle has launched, to see if it's worth it for the performance increase. Also, hopefully by then they'll have Never Settle packages for the R9 series cards.

I'm running at 900p, so a 270X should last a while right?

Is it just me, or is there a distinct lack of reviews for the 270X when compared to other cards?


----------



## KeepWalkinG

When I overclock my Sapphire Dual-X R9 280X with Sapphire Trixx I have one problem.
After the PC sleeps, the voltage in Sapphire Trixx resets to the stock 1.200v, and every time I have to set it back to 1.3v, which is my voltage for 1220 core and 7200 memory.
How can I make the voltage stay put after sleep???

THANKS !!!


----------



## ImJJames

Quote:


> Originally Posted by *TheImpZA*
> 
> What are your guys' thoughts on the Sapphire Vapor-X R9 270X? I would go for the Toxic, but that's too big for my case. Anyone here own a 270X Vapor-X? (Too many X's in this post.) Should I go for MSI or Gigabyte instead?
> 
> It's my first time going AMD, so I'm a little nervous. I'll be buying a card in December, after Mantle has launched, to see if it's worth it for the performance increase. Also, hopefully by then they'll have Never Settle packages for the R9 series cards.
> 
> I'm running at 900p, so a 270X should last a while right?
> 
> Is it just me, or is there a distinct lack of reviews for the 270X when compared to other cards?


Well, the 270X is just a rebranded 7870... not really anything to talk about considering it's been out for a while now. And don't worry, a 270X with some overclocking will run any game out right now on max graphics @ 900p, no problem.


----------



## mtcn77

Quote:


> Originally Posted by *TheImpZA*
> 
> What are your guys' thoughts on the Sapphire Vapor-X R9 270X? I would go for the Toxic, but that's too big for my case. Anyone here own a 270X Vapor-X? (Too many X's in this post.) Should I go for MSI or Gigabyte instead?
> 
> It's my first time going AMD, so I'm a little nervous. I'll be buying a card in December, after Mantle has launched, to see if it's worth it for the performance increase. Also, hopefully by then they'll have Never Settle packages for the R9 series cards.
> 
> I'm running at 900p, so a 270X should last a while right?
> 
> Is it just me, or is there a distinct lack of reviews for the 270X when compared to other cards?


I consider the 270X recommendable compared to the HD 7870. The memory specifications are chosen for workable latencies at higher frequencies, so I'm not surprised that LANoc reports no benefit from memory overclocking.
The Sapphire 270X Toxic is a great board; the next best they tested is the Asus DirectCU II 270X.
1-Asus DC Mini Gtx 670,
2-Sapphire T 270x,
3-MSI Gtx 760 H,
4-Asus DC 270x,
6-Nvidia Gtx 760,
7-Gigabyte WF 270x,
8-EVGA SC Gtx 760.


----------



## Devildog83

Quote:


> Originally Posted by *mtcn77*
> 
> I consider the 270X recommendable compared to the HD 7870. The memory specifications are chosen for workable latencies at higher frequencies, so I'm not surprised that LANoc reports no benefit from memory overclocking.
> The Sapphire 270X Toxic is a great board; the next best they tested is the Asus DirectCU II 270X.
> 1-Asus DC Mini Gtx 670,
> 2-Sapphire T 270x,
> 3-MSI Gtx 760 H,
> 4-Asus DC 270x,
> 6-Nvidia Gtx 760,
> 7-Gigabyte WF 270x,
> 8-EVGA SC Gtx 760.


I am waiting for the 270X Devil, which was announced recently. My second choice would be the 270X Hawk @ $220. If the Devil is too expensive, the Hawk it is.


----------



## MarlowXim

Picked up an MSI 280X to Crossfire with my 7970. I can say that with this cooler you will need ample ventilation; I had to give the card enough spacing to have dual 140mm fans blowing on it.

The PCB is black, much like the Gigabyte UD4 series, and it looks to be the reference design.

ASIC Quality: 65.8%
Current overclock on air: 1200/1600 @ 1.244v (up from 1.231v)
Temperature @ max fan RPM: 60 to 61 degrees

GPUTool artifact testing, 15 minutes - 0 artifacts
Tested stability: Metro Last Light, 15 runs passed, and BF3 campaign for 10-20 mins. Will test 64-player Metro multiplayer for further stability testing. (Update: tested in 64p Metro.)

Very curious how this would do on water; it currently requires a lot less voltage than my 7970 to get to 1200. One of the fans is defective though, squeaking or scratching much like a loud hard drive. Probably not enough silicone lubrication in the fan bearings from the factory.


----------



## BackwoodsNC

Quote:


> Originally Posted by *856Media*
> 
> Any thoughts on what a safe ceiling might be?


Max safe for 24/7 is 1.3v. Gotta keep the temps below 70c or it will start acting up on an overclock. These cards are temp sensitive.


----------



## ImJJames

Quote:


> Originally Posted by *MarlowXim*
> 
> Picked up an MSI 280X to Crossfire with my 7970. I can say that with this cooler you will need ample ventilation; I had to give the card enough spacing to have dual 140mm fans blowing on it.
> 
> The PCB is black, much like the Gigabyte UD4 series, and it looks to be the reference design.
> 
> ASIC Quality: 65.8%
> Current overclock on air: 1200/1600 @ 1.231v
> Temperature @ max fan RPM: 60 to 61 degrees
> 
> GPUTool artifact testing, 15 minutes - 0 artifacts
> Tested stability: Metro Last Light, 15 runs passed, and BF3 campaign for 10-20 mins. Will test 64-player Metro multiplayer for further stability testing.
> 
> Very curious how this would do on water; it currently requires a lot less voltage than my 7970 to get to 1200. One of the fans is defective though, squeaking or scratching much like a loud hard drive. Probably not enough silicone lubrication in the fan bearings from the factory.


Post a Valley benchmark please with those clocks in Crossfire.


----------



## TempAccount007

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Max safe 24/7 is 1.3v. Gotta keep the temps below 70c or it will start acting up on overclock. These are temp sensitive


I wonder if temps increase resistance, so you need more voltage to compensate.

Just a thought.


----------



## taem

I'm not thrilled about temps on my Asus DC2T 3GD5. Idles at 38c, which seems high.

In my one gaming session so far, Dead Island took temp to 75c. But I think something weird was going on, game froze, returned an error message about shadows, then unfroze, and temps dropped to low 60s. Load was high 90s. If mid 70s temp is what I'm going to get with this card, that's too high. I'll try some different games tomorrow.

Fans never went over 45%, and they're quiet, which was why I picked this model, so on that score, card is great. I can't hear the fans at 45% over my case fans.

Incidentally side fan on intake lowers temps by a few degrees. Haven't tried exhaust.


----------



## Ashuiegi

Same with the Toxic: with the stock fan settings it's dead quiet, but it will go over 70 in long gaming sessions. I used a custom fan profile on my Toxic, but it's the top card in CF so that was mandatory.


----------



## RaphLYC

How much faster is a Vapor-X R9 280X overclocked to 1170/1600 in gaming? I am not sure if it's worth it to overclock it...


----------



## ImJJames

Quote:


> Originally Posted by *RaphLYC*
> 
> How much faster is a Vapor-X R9 280X overclocked to 1170/1600 in gaming? I am not sure if it's worth it to overclock it...


I am guessing that model's stock clock is 1000MHz? If so, 1180 would be a pretty significant boost in FPS for gaming.


----------



## iksklld

I ordered myself 2x Club3D R9 280X RoyalKing as they were on special here. They should be here by Tuesday at the latest.


----------



## MarlowXim

Quote:


> Originally Posted by *ImJJames*
> 
> Post a valley benchmark please with those clocks
> 
> 
> 
> 
> 
> 
> 
> crossfire


Not sure if it's a good score or not... but it's a very pretty benchmark.


----------



## ImJJames

Quote:


> Originally Posted by *MarlowXim*
> 
> Not sure if it's a good score or not... but it's a very pretty benchmark.


That's pretty good; that's about 80% more performance than my single 7970 OC'ed to 1200/1710.


----------



## RaphLYC

Quote:


> Originally Posted by *ImJJames*
> 
> I am guessing that model's stock clock is 1000MHz? If so, 1180 would be a pretty significant boost in FPS for gaming.


The stock is 1070.


----------



## steelkevin

Now what's going on here ?

How is 70°C for a GPU too hot ? It's right where it needs to be for a silent air-cooled GPU. If you're not happy with silent / Air-cooled GPUs you've got two options:
- Crank your GPU's fans up
- Go watercooling

There is nothing wrong with those cards people "complained" about on page 141. I would only ever be worried if an air-cooled GPU which was quite loud was pulling over 80-85°C while gaming.


----------



## Ashuiegi

I know from experience that cards of mine that were in a badly cooled case, and on which I never bothered changing the fan settings, died far too soon.
I like to keep them under 70 even in CF. I don't believe it's the GPU, which can handle those temps easily, but other components on the board might age faster at higher temps.


----------



## Joeking78

I'm going to have to check my ASIC when I get home. I think I have a good set of cards because in Crossfire I can get to 1200MHz core on stock voltage (mine are voltage locked).

I have a question though; I think I used to know the answer before I took a 5-year break. Are core clock and memory clock linked? As in, you can push your core to 1200MHz and leave your memory clock alone and it will bench fine, but pushing your memory slightly with the same core clock can crash a benchmark?

I can run 1200 core/1500 mem (stock memory), but putting just 50MHz on the memory with the same core clock crashes; however, leaving the core at stock (1100) and pushing memory to 1750MHz also benches fine.


----------



## battleaxe

Quote:


> Originally Posted by *taem*
> 
> I'm not thrilled about temps on my Asus DC2T 3GD5. Idles at 38c, which seems high.
> 
> In my one gaming session so far, Dead Island took temp to 75c. But I think something weird was going on, game froze, returned an error message about shadows, then unfroze, and temps dropped to low 60s. Load was high 90s. If mid 70s temp is what I'm going to get with this card, that's too high. I'll try some different games tomorrow.
> 
> Fans never went over 45%, and they're quiet, which was why I picked this model, so on that score, card is great. I can't hear the fans at 45% over my case fans.
> 
> Incidentally side fan on intake lowers temps by a few degrees. Haven't tried exhaust.


You need to set up a fan profile in Afterburner. If you are only running 45%, that's why your temps are going above 70c. Sounds like you've got lots of room to get those temps down.
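The kind of custom Afterburner fan profile being recommended is just a piecewise-linear curve over temperature; a minimal sketch of the idea (the temperature/speed points are examples only, not recommended values):

```python
# Piecewise-linear fan curve, the same idea as a custom Afterburner profile.
# Points are (temp C, fan %) pairs; example values only -- tune to your card.

CURVE = ((40, 30), (60, 50), (70, 75), (80, 100))

def fan_percent(temp_c, curve=CURVE):
    if temp_c <= curve[0][0]:
        return curve[0][1]                      # floor: quiet at idle
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:                        # interpolate inside this segment
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]                         # ceiling: 100% past the last point

for t in (35, 50, 65, 75, 90):
    print(t, "C ->", fan_percent(t), "% fan")
```

The point of the steep segment near the top is exactly what's being argued in this thread: the fans stay quiet at idle but ramp hard before the card reaches the temps where these chips start acting up.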


----------



## BackwoodsNC

Quote:


> Originally Posted by *steelkevin*
> 
> Now what's going on here ?
> 
> How is 70°C for a GPU too hot ? It's right where it needs to be for a silent air-cooled GPU. If you're not happy with silent / Air-cooled GPUs you've got two options:
> - Crank your GPU's fans up
> - Go watercooling
> 
> There is nothing wrong with those cards people "complained" about on page 141. I would only ever be worried if an air-cooled GPU which was quite loud was pulling over 80-85°C while gaming.


I can only speak from my experience with mine. With a 1205 core clock I started getting artifacts at about 70-72c. With that same overclock and the cards under water, I no longer get those artifacts. YMMV!!!


----------



## battleaxe

Do you guys know of a good way to cool the VRM's without a water block?


----------



## DiceAir

Quote:


> Originally Posted by *taem*
> 
> I'm not thrilled about temps on my Asus DC2T 3GD5. Idles at 38c, which seems high.
> 
> In my one gaming session so far, Dead Island took temp to 75c. But I think something weird was going on, game froze, returned an error message about shadows, then unfroze, and temps dropped to low 60s. Load was high 90s. If mid 70s temp is what I'm going to get with this card, that's too high. I'll try some different games tomorrow.
> 
> Fans never went over 45%, and they're quiet, which was why I picked this model, so on that score, card is great. I can't hear the fans at 45% over my case fans.
> 
> Incidentally side fan on intake lowers temps by a few degrees. Haven't tried exhaust.


75C is still fine, I wouldn't worry. My GTX 570, for example, ran at 90C+ for about a month and I was playing every night; nothing happened. They are designed to handle the load. I'm sure Asus wouldn't ship a fan profile like that if it weren't safe.


----------



## battleaxe

Quote:


> Originally Posted by *DiceAir*
> 
> 75C is still fine, I wouldn't worry. My GTX 570, for example, ran at 90C+ for about a month and I was playing every night; nothing happened. They are designed to handle the load. I'm sure Asus wouldn't ship a fan profile like that if it weren't safe.


It all depends on where the card is designed to throttle. It may not hurt the card, but if it's throttling, that's no fun to deal with. My 670s throttle at 75, so I try to keep them below 70 at all times; they run better that way too. The fan profile in MSI AB is the way to go for protecting these things and keeping performance in check. There's no reason not to use it other than some extra fan noise when gaming, and it can still be set up to run silent at idle.


----------



## neXen

My 7950s throttled hard when temps went higher than 70 Celsius.

I picked up a Sapphire R9 280X Vapor-X.

Do these also throttle around 70?

I was thinking of slapping an Arctic Accelero Xtreme III on it; does anyone have any experience or insight on this?


----------



## gnemelf

I've got the PowerColor 280X running a small OC of 1100/1600.


----------



## taem

Quote:


> Originally Posted by *steelkevin*
> 
> Now what's going on here ?
> 
> How is 70°C for a GPU too hot ? It's right where it needs to be for a silent air-cooled GPU. If you're not happy with silent / Air-cooled GPUs you've got two options:
> - Crank your GPU's fans up
> - Go watercooling
> 
> There is nothing wrong with those cards people "complained" about on page 141. I would only ever be worried if an air-cooled GPU which was quite loud was pulling over 80-85°C while gaming.


It's 75c, not 70c, and it hits that temp almost immediately. Just going to the game menu screen takes the temp to 50c. On the plus side, it hits 75c and stays there; it doesn't go higher, at least not after about 90 minutes. So the heatsink and fans on the Asus DC2T 3GD5 are as well designed as reviews say.

And like I said, it dropped to 63c after that error message. FPS didn't change, so I don't think it was a heat-driven glitch and throttling. The only throttling I've heard of with the 280X is the Gigabyte or MSI (forget which) at 90c on the original BIOS; a new BIOS fixes the throttling.

As to whether 75 is high, I think so. I know a lot of guys go as high as 90 and consider 100+ the danger zone. But 75 to me, at stock, is too high. What happens when I bump that up to a 1200 clock? I was hoping for 60s at stock, 70 max.

Gonna try some different games later today. Unigine Heaven for an hour or so took temps to 67c, which is more to my liking. I don't want to crank up the fans; that defeats the purpose of my Fractal Design Define R4 build. As it is, with the GPU fans so nice and quiet, I want to replace the NZXT case fans with lower-CFM/pressure but quieter fans.


----------



## steelkevin

Quote:


> Originally Posted by *taem*
> 
> It's 75c not 70c, it hits that temp almost immediately. Just going to game menu screen takes temp to 50c. On the plus side it hits 75c and stays there, doesn't go higher, at least not after about 90 minutes. So the heat sink and fans on the Asus DC2T 3GD5 are as well designed as reviews say.
> 
> And like I said, it lowered to 63c after that error msg. Fps didn't change so I don't think it was a heat driven glitch and throttling. Only throttling I've heard of with the 280x is the gigabyte or msi (forget which) at 90c with original bios. New bios fixes the throttling.
> 
> As to whether 75 is high, I think so. I know a lot of guys go as high as 90, and consider 100+ as danger zone. But 75 to me, *at stock*, is too high. What happens when I bump that up 1200 clock? I was hoping for 60s at stock, 70 max.
> 
> Gonna try some different games later today. Unigine Heaven for an hour or so took temps to 67c which is more to my liking. I don't want to crank up the fans, defeats the purpose of my Fractal Design Define R4 build. As it is with the gpu fans so nice and quiet I want to replace the nzxt fans with lower cfm/pressure but quieter fans.


Well, there we have it. The card isn't running at stock; it's already overclocked. There's being demanding and there's being delusional. Cards that run @ 60°C while remaining silent and being slightly overclocked do not exist.
You're asking for too much. You can't have it all.


----------



## battleaxe

Do you guys know of a good way to cool the VRM's without a water block?

Bump


----------



## BackwoodsNC

Quote:


> Originally Posted by *battleaxe*
> 
> Do you guys know of a good way to cool the VRM's without a water block?
> 
> Bump


Heatsinks and good airflow


----------



## battleaxe

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Heatsinks and good airflow


Well, yes, of course. But I remember seeing some heatsink kits for the VRMs on here somewhere, and I have no idea where to look. Sorry, that's what I meant. Some of the guys were attaching sinks to the RAM and VRMs.


----------



## kpo6969

Quote:


> Originally Posted by *gnemelf*
> 
> I've got the Powercolor 280x running a small oc of (1100/1600)


Would you recommend your card? I haven't seen a review for it anywhere yet. Nothing I've read on the 7970 version was too good. The only real difference I saw between the two was that the 280X has a backplate.


----------



## taem

280x Vapor X owners, talk to me about the card. How loud is it?

I'm probably going to return my Asus. I'm happy with the card -- it's so quiet! -- but it's too long by about 3mm and I have to turn the drive cage so it blocks front fans. The Vapor X is shorter by 7mm so that would fit. So would MSI and Gigabyte, but I like the look of the Vapor X. My only qualm is reviews noting loud noise -- 56.8dB at full load vs Asus at 40.8. http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-5.html


----------



## MoNtHeR2

Can anyone help me? I have an R9 280X Windforce GPU and an AMD FX-8350 @ 4.4GHz CPU. When I play BF4 my GPU usage drops to 50% and that makes the game laggy. What's wrong? I don't think it's bottlenecked by the CPU!


----------



## Devildog83

Quote:


> Originally Posted by *MoNtHeR2*
> 
> Can anyone help me? I have an R9 280X Windforce GPU and an AMD FX-8350 @ 4.4GHz CPU. When I play BF4 my GPU usage drops to 50% and that makes the game laggy. What's wrong? I don't think it's bottlenecked by the CPU!


Post a sig rig, it will help us help you.


----------



## MoNtHeR2

Quote:


> Originally Posted by *Devildog83*
> 
> Post a sig rig, it will help us help you.


What does "sig rig" mean? If you mean my PC specs, they are:
FX-8350 @ 4.4GHz
990XA-UD3
8GB DDR3
R9 280X Windforce


----------



## Devildog83

Quote:


> Originally Posted by *MoNtHeR2*
> 
> What does "sig rig" mean? If you mean my PC specs, they are:
> FX-8350 @ 4.4GHz
> 990XA-UD3
> 8GB DDR3
> R9 280X Windforce


On the black line at the top of the page on the right there is a link called rigbuilder, click that and build your rig so folks can see what you have.

What driver version are you using?


----------



## MoNtHeR2

Quote:


> Originally Posted by *Devildog83*
> 
> On the black line at the top of the page on the right there is a link called rigbuilder, click that and build your rig so folks can see what you have.
> 
> What driver version are you using?


13.11v9.2 beta


----------



## Devildog83

Quote:


> Originally Posted by *MoNtHeR2*
> 
> 13.11v9.2 beta


Have you disabled overdrive? In the CCC.


----------



## MoNtHeR2

Quote:


> Originally Posted by *Devildog83*
> 
> Have you disabled overdrive? In the CCC.


Should I?


----------



## Devildog83

Quote:


> Originally Posted by *MoNtHeR2*
> 
> Should I?


I did, because I was told it causes issues if you are running another clocking tool; the Graphics Overdrive could interfere. I just thought it might help.


----------



## Devildog83

http://www.brightsideofnews.com/news/2013/11/8/amd-radeon-r9-290-blowing-the-doors-off-the-competition.aspx

Hmmmmmm!!!


----------



## darkelixa

Heya,

I currently have a GTX 770 2GB video card with an i5 4670K, 16GB DDR3 RAM, a 120GB Samsung 840, a 2TB WD Black HDD, and a 750W FSP Gold PSU. My 770 gives me a lot of stuttering in games like Final Fantasy and Crysis 3, so I'm here to change ships.

Would it be a good upgrade to go with an XFX 7970 3GB card for $315?

http://www.pccasegear.com/index.php?main_page=product_info&products_id=25637&zenid=86cc50390ca7f88ec8c5e6e5b3bbfd75

Or are the R9 280X versions a lot better? They are about $70 more.


----------



## ImJJames

Quote:


> Originally Posted by *darkelixa*
> 
> Heya,
> 
> I currently have a GTX 770 2GB video card with an i5 4670K, 16GB DDR3 RAM, a 120GB Samsung 840, a 2TB WD Black HDD, and a 750W FSP Gold PSU. My 770 gives me a lot of stuttering in games like Final Fantasy and Crysis 3, so I'm here to change ships.
> 
> Would it be a good upgrade to go with an XFX 7970 3GB card for $315?
> 
> http://www.pccasegear.com/index.php?main_page=product_info&products_id=25637&zenid=86cc50390ca7f88ec8c5e6e5b3bbfd75
> 
> Or are the R9 280X versions a lot better? They are about $70 more.


The 280X is a rebranded 7970, so just get the cheaper of the two. Will you get a performance increase over the 770? No, they are pretty equal.


----------



## Devildog83

Quote:


> Originally Posted by *darkelixa*
> 
> Heya,
> 
> I currently have a GTX 770 2GB video card with an i5 4670K, 16GB DDR3 RAM, a 120GB Samsung 840, a 2TB WD Black HDD, and a 750W FSP Gold PSU. My 770 gives me a lot of stuttering in games like Final Fantasy and Crysis 3, so I'm here to change ships.
> 
> Would it be a good upgrade to go with an XFX 7970 3GB card for $315?
> 
> http://www.pccasegear.com/index.php?main_page=product_info&products_id=25637&zenid=86cc50390ca7f88ec8c5e6e5b3bbfd75
> 
> Or are the R9 280X versions a lot better? They are about $70 more.


Check out the post above. For $85 more you could have a 290. I would wait for the non-reference cards though; they should have much better cooling solutions.


----------



## darkelixa

A 290 here in Australia starts at $499, which is $184 more than the original 7970. Does the 925MHz clock speed matter on the $315 card, or is it worth spending extra for a 1000MHz card?

Something has to be better than the GTX 770's stuttering in video playback. When I use my missus's 7870 2GB this issue does not happen at all.


----------



## hoevito

Quote:


> Originally Posted by *darkelixa*
> 
> A 290 here in Australia starts at $499, which is $184 more than the original 7970. Does the 925MHz clock speed matter on the $315 card, or is it worth spending extra for a 1000MHz card?
> 
> Something has to be better than the GTX 770's stuttering in video playback. When I use my missus's 7870 2GB this issue does not happen at all.


After overclocking my Matrix Platinum (which comes clocked at 1100MHz/6400MHz already) to 1220/7000MHz, I got around a 9-12% gain in frame rate over stock clocks, depending on the game. From all that I've gathered, a well-clocked 7970/280X is pretty close to GTX 780 territory in quite a few cases. I would imagine a card that can clock to around 1200MHz would get you at least 20% more performance than one around 900MHz, so yeah, I'd say they're worth the extra money unless you decide to step up to the 290...
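For reference, the quick arithmetic behind that 9-12% figure (clock ratio only; real gains depend on the game):

```python
# Ideal (best-case) scaling from a core overclock is just the clock ratio;
# real games usually land at or slightly below it.
stock_core, oc_core = 1100, 1220        # MHz, Matrix Platinum stock -> OC
ideal = oc_core / stock_core - 1
print(f"ideal core scaling: +{ideal:.1%}")   # ~ +10.9%
# The observed 9-12% brackets this, so these games scale nearly 1:1 with core clock.
```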


----------



## darkelixa

My budget doesn't really stretch that far for a 290 unless they go down in price soon, so aim for a 1000MHz 7970/280, ya reckon?


----------



## Ashuiegi

If you get a 280X you can pick up a second one later for a reasonable price; that's not an option with the 290.
Anyway, if you want a 290, wait for the custom versions.


----------



## darkelixa

I was looking at the HIS Radeon R9 280X; they are $385.

http://www.scorptec.com.au/computer/51913-his-r9280x-3gd5-t

They have a 1000MHz frequency. Are they a good brand, or is Gigabyte better? My current GTX 770 is a Gigabyte, so I'm a bit hesitant about buying another.


----------



## hoevito

Quote:


> Originally Posted by *darkelixa*
> 
> My budget doesn't really stretch that far for a 290 unless they go down in price soon, so aim for a 1000MHz 7970/280, ya reckon?


The ASUS R9280X-DC2T (1070mhz), Sapphire Toxic (1150mhz), Matrix Platinum (1100mhz), and SAPPHIRE 100363VXSR Vapor X (1070mhz) are the cream of the crop in 280x land I would say...they are 4 of the highest clocked 7970/280x's and as such offer the most overclocking headroom compared to the others...can't go wrong with any of them really!!!


----------



## darkelixa

My last Sapphire card, a 7870, started to artifact after a few weeks (a huge problem Sapphire had with that batch of cards). Is that problem still present in their new lineup? I know Sapphire didn't solve the issue until the 7970.


----------



## hoevito

Quote:


> Originally Posted by *darkelixa*
> 
> My last Sapphire card, a 7870, started to artifact after a few weeks (a huge problem Sapphire had with that batch of cards). Is that problem still present in their new lineup? I know Sapphire didn't solve the issue until the 7970.


I can't speak about Sapphire's specifically, but as was mentioned earlier, these 7970/280Xs are pretty sensitive to heat, ESPECIALLY when overclocked!!! Some of the other users here, I would imagine, are NOT setting up custom fan profiles when overclocking, which easily explains why some are having issues with their cards. If you do decide to get one and do any overclocking, DEFINITELY set up a custom fan profile to keep your card running as cool as possible; that will also keep artifacts to a minimum. I pretty much keep my fans running at 50% all the time and I never have any issues unless I push my overclock too hard in graphically demanding games.

I can run a stable overclock with absolutely zero issues up to 1240/7200mhz on Battlefield 4 multiplayer for example, but can only go up to 1220/7000 in the Witcher 2.

Oh yeah, I use The Witcher 2 as a benchmark and to test stability with my overclocks. I was running Heaven, Valley, and Firestrike like everyone else swears by, but while the card would pass them all with flying colors at, say, 1240/7200MHz, even looping for 30-40 minutes, it would artifact within the first 20 seconds of a cutscene in The Witcher 2 and possibly crash within about 5 minutes of gameplay. Just food for thought, I guess...


----------



## MoNtHeR2

Quote:


> Originally Posted by *Devildog83*
> 
> I did because I was told it causes issues if you are running another clocking tool. The Graphics Overdrive could interfere. I just thought it might help.


It didn't help.


----------



## diggiddi

Quote:


> Originally Posted by *hoevito*
> 
> After overclocking my Matrix Platinum (which comes clocked at 1100MHz/6400MHz already) to 1220/7000MHz, I got around a 9-12% gain in frame rate over stock clocks, depending on the game. From all that I've gathered, a well-clocked 7970/280X is pretty close to GTX 780 territory in quite a few cases. I would imagine a card that can clock to around 1200MHz would get you at least 20% more performance than one around 900MHz, so yeah, I'd say they're worth the extra money unless you decide to step up to the 290...


Do you know how long the Crossfire bridge is on your 280X: 100, 107, or 120mm?


----------



## Joeking78

Quote:


> Originally Posted by *MoNtHeR2*
> 
> Can anyone help me ? .... i have r9 280x windforce gpu ...and amd fx 8350 @ 4.4 ghz cpu ..... when i play bf4 my gpu usage drops to 50% and that makes the game laggy ...whats wrong ??....i dont think its bottlenecked by cpu !!


I have the same card (two of them in Crossfire). What are your temps at load? How is your case airflow? It could potentially be an airflow/temp problem.

Also, did you follow the correct procedure for driver install? i.e. remove via control panel, restart, run driver sweeper, restart, install new drivers?

Have you setup a BF4 profile in CCC? Do any of the CCC & BF4 video settings conflict? I used to experience issues with BF3 if I set vsync in CCC & in BF3 video settings...right now all my settings in CCC are application controlled.

However, I think your problem is hardware/driver related, not settings/config related.


----------



## Joeking78

Quote:


> Originally Posted by *hoevito*
> 
> The ASUS R9280X-DC2T (1070mhz), Sapphire Toxic (1150mhz), Matrix Platinum (1100mhz), and SAPPHIRE 100363VXSR Vapor X (1070mhz) are the cream of the crop in 280x land I would say...they are 4 of the highest clocked 7970/280x's and as such offer the most overclocking headroom compared to the others...can't go wrong with any of them really!!!


The Windforce is 1100/1500, and my version can get to 1200/1750 on stock voltage; the Gigabyte card is a very good one too, although it is voltage locked. The DC2T isn't voltage locked, and I think the Toxic & Matrix are also voltage unlocked.


----------



## darkelixa

So Gigabyte is better than XFX and HIS?


----------



## hoevito

Quote:


> Originally Posted by *darkelixa*
> 
> So Gigabyte is better than XFX and HIS?


Gigabyte>HIS for sure

Gigabyte's core clock is 150MHz higher out of the box and its boost clock is 100MHz higher (1000/1100 vs 850/1000)
3-year warranty vs 2-year for the HIS
The Gigabyte heatsink and fan design, at least to me, looks much better than the HIS's

The XFX looks pretty nice aesthetically to me, but it just doesn't look like it will cool well enough for a decent overclock, which is the same issue the HIS would have in my eyes. Personally, I feel the Asus DC2T is probably the best all-around 280X, as it comes clocked at 1070 on the core and 6400MHz on the memory, which is quite a bit higher than other 280Xs; most others have memory clocks of 6000MHz. The Matrix Platinum, while a sweet card, is a space-hogging, power-sucking BEAST. If you could actually find a Sapphire Toxic anywhere, and if they weren't so expensive, I'd say it's probably the best 280X money can buy... but you're just $40 short of a 290 at that point (in the US)...


----------



## kpo6969




----------



## Joeking78

Quote:


> Originally Posted by *darkelixa*
> 
> So gigabyte is better than xfx and His?


The Gigabyte card is good, but like I said it's voltage locked; no matter what you do you cannot unlock voltage control, so you're restricted in how far you can push by the stock voltage.

Mine will get to 1200 core and 1700-1750 memory, but it got so hot at those clocks (over 100°C in Crossfire) that I did the GPUMOD to water cool them.


----------



## Warl0rdPT

To Asus 280X DC2T owners: is it possible to adjust the card voltage using Afterburner? Or do you have to use Asus GPU Tweak?

I would like to keep Afterburner because of the in-game OSD (and the author just confirmed that he is working on an x64 version that will be compatible with x64 games and will be exclusive to Afterburner for 9 months).

And I don't think it makes sense (or is even possible) to have both GPU Tweak and Afterburner at the same time.


----------



## neXen

Quote:


> Originally Posted by *hoevito*
> 
> Gigabyte>HIS for sure
> 
> Gigabyte core clock is 150MHz higher out of the box and the boost clock is 100MHz higher (1000/1100 vs 850/1000)
> 3 year warranty vs 2 year for the HIS
> The Gigabyte heatsink and fan design at least to me looks much better than the HIS
> 
> The XFX looks pretty nice aesthetically to me, but it just doesn't look like it's going to cool very well for a decent overclock, which is the same issue the HIS would have in my eyes. Personally though, I feel that the Asus DC2T is probably the best all-around 280X, as it comes clocked at 1070 on the core and 6400MHz on the memory, which is quite a bit higher than other 280Xs, as most others have memory clocks at 6000MHz. The Matrix Plat, while a sweet card, is a space-hogging, power-sucking BEAST. If you could actually find a Sapphire Toxic anywhere and they weren't so expensive, I'd say it's probably the best 280X money can buy...but you're just $40 short of a 290 at that point (in the US)...


Wut?

The HIS 7970 was one of the best overclockers i have seen


----------



## tsm106

Quote:


> Originally Posted by *neXen*
> 
> Wut?
> 
> The HIS 7970 was one of the best overclockers i have seen


It's funny that you guys think brand has something to do with the silicon lottery. This urban myth is a lie.


----------



## MoNtHeR2

I have GPU usage drops while playing BF4!! I have an AMD FX-8350 CPU.


----------



## mtcn77

Quote:


> Originally Posted by *MoNtHeR2*
> 
> I have GPU usage drops while playing BF4!! I have an AMD FX-8350 CPU.


I'm not an expert at what this software does, but why don't you try running at a higher timer resolution? AFAIK Windows is not a real-time OS; inputs are processed at scheduled intervals.
This program called "HyperMatrix" is speculated to improve latency down to 0.5 ms, maybe improving the CPU duty cycle as well.

HyperMatrix.zip 6k .zip file
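I can't vouch for what that zip does either, but if anyone wants to see how coarse their OS scheduler tick actually is before and after a tool like that, here's a rough cross-platform Python sketch (the function name is mine, nothing to do with HyperMatrix) that estimates the effective sleep granularity by timing many minimal sleeps:

```python
import time

def avg_min_sleep_ms(samples=50):
    """Estimate the OS timer granularity: request a 1 ms sleep many
    times and average how long the scheduler actually takes to wake
    us back up. On stock Windows this has historically been as coarse
    as ~15.6 ms; with the timer resolution raised it drops toward
    1-2 ms."""
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(0.001)  # the OS may round this up to its tick
        total += time.perf_counter() - start
    return total * 1000 / samples

print(f"average minimal sleep: {avg_min_sleep_ms():.2f} ms")
```

Run it before and after installing the timer tweak; if the average drops a lot, the tweak is actually doing something on your machine.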


----------



## Ashuiegi

It doesn't work very well; plus, Afterburner tends to forget your custom fan settings and leave the fans not spinning at all. It happened to me with 2 Asus cards, both AMD and Nvidia, and I'm never using Afterburner again.
You can adjust the clocks but not the voltage in Afterburner with an Asus card. You can run them both, but I think it could cause some crashes.
I just think the latest version of GPU Tweak is now far superior to MSI Afterburner, especially for an Asus card.


----------



## taem

Quote:


> Originally Posted by *Ashuiegi*
> 
> It doesn't work very well; plus, Afterburner tends to forget your custom fan settings and leave the fans not spinning at all. It happened to me with 2 Asus cards, both AMD and Nvidia, and I'm never using Afterburner again.
> You can adjust the clocks but not the voltage in Afterburner with an Asus card. You can run them both, but I think it could cause some crashes.
> I just think the latest version of GPU Tweak is now far superior to MSI Afterburner, especially for an Asus card.


Well, like that guy said, the appeal of Afterburner is the OSD.

Do you or anybody else know whether you can install Afterburner alongside Asus GPU Tweak just for the OSD? You'd leave the AB tweak settings alone of course. But I wonder if that would confuse the system, with one tweak app reporting different settings than the other? Or would whichever app loads last override the first?

Really wish we could get a standalone OSD app that reports temps etc. I know AB uses RivaTuner for the OSD; can you just install that? I don't want to experiment and blow up my card lol. Some guy posted somewhere that he used 3 tweak apps and the voltage increases became cumulative, though I'm not quite sure how that would be possible.


----------



## Ashuiegi

I've run both of them together and I had trouble knowing which was doing what, so I gave up on it. But that was with an old version of GPU Tweak that was buggy (which is why I tried Afterburner in the first place); I think most of the problem came from that.
The 2 apps will run together, but I can't tell you what will happen when tweaking the card; I used Afterburner because at the time GPU Tweak's custom fan profile was even worse than Afterburner's.
It won't break your card anyway, that's for sure.


----------



## Lubed Up Slug

Does anyone have any 280x single and crossfire benchmarks in battlefield 4 preferably at 1080p?


----------



## taem

Quote:


> Originally Posted by *Lubed Up Slug*
> 
> Does anyone have any 280x single and crossfire benchmarks in battlefield 4 preferably at 1080p?


Lots of benches out there, mostly from the beta though. Here's one for a single Asus 280X: http://www.hardwareheaven.com/reviews/1851/pg5/asus-radeon-r9-280x-directcu-ii-top-graphics-card-review-battlefield-4.html

65 avg at 1080p maximum. Min does drop below 60, to 55.


----------



## Lubed Up Slug

Quote:


> Originally Posted by *taem*
> 
> Lots of benches out there, mostly for beta though. Here's one for single Asus 280x http://www.hardwareheaven.com/reviews/1851/pg5/asus-radeon-r9-280x-directcu-ii-top-graphics-card-review-battlefield-4.html
> 
> 65 avg at 1080p maximum. Min does drop below 60, to 55.


Thanks, I also went out and found some but I couldn't find any 280x crossfire benchmarks, there will probably be some when mantle is released for bf4.


----------



## Crowe98

Thinking of waiting for Nvidia's 800 series... not too sure yet.


----------



## darkelixa

They can't even get the 7 series drivers right, so I wouldn't be in a rush to buy an 8 series anytime soon lol


----------



## Crowe98

Quote:


> Originally Posted by *darkelixa*
> 
> They can't even get the 7 series drivers right, so I wouldn't be in a rush to buy an 8 series anytime soon lol


Apparently the 8 series is something to look forward to, with new technologies or whatever.


----------



## Durvelle27

If you want to be added to the OP please PM required information


----------



## Rar4f

Is the R9 280X Matrix more suited to overclocking than a Toxic? The build quality of the Matrix seems better than the Toxic's.

The Toxic has a guaranteed 1100 MHz, but I believe if you're lucky with overclocking the Matrix will take better advantage of it.


----------



## sixor

What's a good way to test the Gigabyte 270X?

And what OC values are people getting???????


----------



## light70

Hi there!

I'm about to buy an R9 280X but I don't know which one to get; I think the MSI is one of the better ones but I can't find one at 250 euro.

So I'm thinking about the Vapor-X or the Asus DirectCU II standard (850MHz, 1000MHz boost; this is the NON-TOP version).
Which of the two would you suggest? Especially, I would like to know which one has the better VRM situation.


----------



## Rar4f

Quote:


> Originally Posted by *light70*
> 
> Hi there!
> 
> I'm about to buy an R9 280X but I don't know which one to get; I think the MSI is one of the better ones but I can't find one at 250 euro.
> 
> So I'm thinking about the Vapor-X or the Asus DirectCU II standard (850MHz, 1000MHz boost; this is the NON-TOP version).
> Which of the two would you suggest? Especially, I would like to know which one has the better VRM situation.


Are you going to do moderate or extreme overclocking?


----------



## hoevito

Quote:


> Originally Posted by *Rar4f*
> 
> Is the R9 280X Matrix more suited to overclocking than a Toxic? The build quality of the Matrix seems better than the Toxic's.
> 
> The Toxic has a guaranteed 1100 MHz, but I believe if you're lucky with overclocking the Matrix will take better advantage of it.


On water maybe, but on air the most I can get out of my Matrix that's completely stable with everything I throw at it is 1220/7000mhz. I can get it to loop Valley at 1250, Heaven at 1240, and play Battlefield 4 all day at 1240, but modded to hell Skyrim and the Witcher 2 can only run at 1220. Your mileage may vary, as they say...

The Toxic boosts to 1150, and the Matrix to 1100, so you may have had those backwards by the way...

Personally, if I could've found any Toxics in stock at the time I would've gone with one instead of a Matrix. The Matrix is a HUGE card, so if you're eyeing it remember that it takes up 3 whole PCI slots...and crossfiring them will be a pain even IF you have the space...


----------



## taem

Anyone know how many power phases the various cards pack? I'm finding it hard to get this info except for the Matrix, Toxic, and DC2T. I'm especially interested in the Club3D royalKing and Sapphire Vapor-X.

I'd also love to hear comments from anyone who got a Club3D 280x.


----------



## Arizonian

Hello everyone,

I've recently had some discussion with thread starter - *Durvelle27*

If you've submitted proof of your GPU and haven't been added yet but would like to be in the OP of the club thread: please send a PM to *Durvelle27* with your GPU manufacturer and series and he will add you.


----------



## TempAccount007

Quote:


> Originally Posted by *hoevito*
> 
> Gigabyte>HIS for sure


Gigabyte is voltage locked. Why anyone would buy a voltage locked card is beyond me.

HIS is voltage unlocked and has more power phases (it also requires 2x 8 pin for power).

The HIS would have been my second choice after the ASUS 280X DirectCU II.

Don't get the Gigabyte. You will have a performance ceiling from the voltage lock from day one.

Also, all the HIS hate I have seen around forums is misplaced. They make excellent cards.


----------



## kpo6969

Newegg has the CLUB 3D royalKing 280X listed now:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814550011


----------



## TempAccount007

Trying to figure out the truth of this post:

http://www.overclock.net/t/1439854/various-amd-r9-290-reviews/740_20#post_21160437
(thanks kpo6969 for providing the link)

I think he is forgetting to count phases that are not for the core.


----------



## kpo6969

http://www.overclock.net/t/1439854/various-amd-r9-290-reviews/750#post_21160437


----------



## light70

Quote:


> Originally Posted by *Rar4f*
> 
> Are you going to do moderate or extreme overclocking?


Moderate; I have seen that 1150/1200 can be achieved if you get a lucky chip.


----------



## Ashuiegi

You can go higher easily; my Toxic goes to 1250 in Fire Strike, and my old 7970 Matrix went up to 1280 clean and 1320 with awful artifacts, all of that on air.


----------



## DizzlePro

has anyone managed to unlock the voltage on a sapphire toxic 280x?


----------



## Ashuiegi

I have not tried to raise it yet; mine does 1250 on stock voltage, but it should work with Sapphire TriXX.


----------



## DizzlePro

Also, does the 280X support TrueAudio?


----------



## kpo6969

No


----------



## DizzlePro

I'm planning to get a 280X next week but I'm stuck between the Sapphire Toxic & the Asus DC2T; does anyone recommend one over the other?

I also heard that the Toxic is quite loud when the fan speed is past 70%.


----------



## Joeking78

Quote:


> Originally Posted by *Lubed Up Slug*
> 
> Does anyone have any 280x single and crossfire benchmarks in battlefield 4 preferably at 1080p?


With 2 in Crossfire at max settings I can maintain 60fps with V-sync (not that hard tho tbh)...without V-sync I get 150+ (200 if I look at the sky) most of the time, dropping to around 115 when things get crazy on a map with lots of action.

I only ran a single card once in BF4; I can't remember the exact FPS but it maxed around 90, dipping to the high 60s/70s.


----------



## Joeking78

Quote:


> Originally Posted by *TempAccount007*
> 
> Gigabyte is voltage locked. Why anyone would buy a voltage locked card is beyond me.
> 
> HIS is voltage unlocked and has more power phases (it also requires 2x 8 pin for power).
> 
> The HIS would have been my second choice after the ASUS 280X DirectCU II.
> 
> Don't get gigabyte. You will have a performance ceiling due to voltage lock from day one.
> 
> Also, all the HIS hate I have seen around forums is misplaced. They make excellent cards.


The Gigabytes aren't too bad in terms of overclocking; even voltage locked I can get my core to 1200MHz and memory to 6280. I can't push the core or memory any higher without crashes, but I can drop the core to 1180 and push the memory to 6500...both options bench fine, but that is the limit...I would love to unlock the voltage.

But temps are definitely an issue even with the 3-fan cooler...1200 core was not possible with the stock cooler, but when I replaced the cooler with the GPUMOD water cooling it dropped my load temps by 40°C+ and 1200 core passed several runs of 3DMark.


----------



## TempAccount007

Quote:


> Originally Posted by *DizzlePro*
> 
> I also heard that the Toxic is quite loud when the fan speed is past 70%.


Noise comparison videos of some 280x cards (including the two you mentioned):
http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-6.html


----------



## light70

Could anyone tell me which is better between the Asus DirectCU II standard (non-TOP version) and the Vapor-X? I've seen a review reporting that the Vapor-X keeps decent temperatures even when overclocked (about 1200MHz at 79°C), while the Asus manages to reach 1200 but at 89°C. It also reported that the Asus was way more silent than the Vapor-X; indeed, the Sapphire had its fan at 70% while the Asus only reached 49%. I would also like to know which card has the better power phases!


----------



## Warl0rdPT

Quote:


> Originally Posted by *taem*
> 
> Well like that guy said, the appeal of Afterburner is the osd.
> 
> Do you or anybody else know, can you install Afterburner alongside Asus GPU Tweak just for the osd? You'd leave the AB tweak settings alone of course. But I wonder if that would confuse the system, with one tweak app reporting different settings than the other? Or would whichever app that loads last override the first?
> 
> Really wish we could get a standalone osd app that reports temps etc. I know AB uses rivatuner for the osd, can you just install that? I don't want to experiment and blow up my card lol, some guy posted somewhere that he used 3 tweak apps and the voltage increases became cumulative, though I'm not quite sure how that would be possible.


Yeah, I wouldn't mind using GPU Tweak but I would like to have an OSD. I know Afterburner needs RivaTuner Statistics Server (that's what actually draws the OSD), but I don't think GPU Tweak would provide RTSS with the needed info, since you configure what's displayed inside Afterburner and not RTSS.

Also, running both at the same time smells like asking for trouble...


----------



## KeepWalkinG

Why does my 280X's voltage return to stock after my computer goes into sleep mode?
I overclocked my card with Sapphire TriXX and set the voltage to 1.275; after the PC sleeps the voltage is 1.200.
Any idea how to change this behavior?


----------



## Mach 5

So I've decided to jump over to the red team and ordered myself a HIS 280X. It was delivered today and I can't wait to get home and put it through its paces!

Edit: 1500th post!


----------



## DeviousAddict

OK, I have finally got a Crossfire bridge that will fit my cards. Now I really want to fit it, but I can't turn my PC off at the moment as I'm doing some file converting. Does anyone know if I can connect the bridge while the PC is running?


----------



## battleaxe

Quote:


> Originally Posted by *DeviousAddict*
> 
> OK, I have finally got a Crossfire bridge that will fit my cards. Now I really want to fit it, but I can't turn my PC off at the moment as I'm doing some file converting. Does anyone know if I can connect the bridge while the PC is running?


I definitely would not do that. Can't say I have a definitive answer, but that's a good way to fry the input/output tabs on the cards. That's a no-no on most electronics.


----------



## Devildog83

Quote:


> Originally Posted by *battleaxe*
> 
> I definitely would not do that. Can't say I have a definitive answer, but that's a good way to fry the input/output tabs on the cards. That's a no-no on most electronics.


Except for fans, I flip the switch on the back of the PSU when I unplug or plug in anything, so I would say NO!


----------



## taem

Quote:


> Originally Posted by *DizzlePro*
> 
> I'm planning to get a 280X next week but I'm stuck between the Sapphire Toxic & the Asus DC2T; does anyone recommend one over the other?
> 
> I also heard that the Toxic is quite loud when the fan speed is past 70%.


The DC2T fan maxed out for me at 45% under full GPU load, so a straight comparison seems unfair if the Toxic is at 70%.

The DC2T also hit 75°C at gaming load. So it's quiet, but there is a bit of gimmickry maybe.

Anyone have a Toxic? Run the fan at 45% and report the temp. Conversely, someone with a DC2T run the fan at 70% and check noise and temp. I returned my DC2T because I had to remove the HDD cage to fit it and I need the cage.


----------






## DeviousAddict

Hey people.
Well, I connected the Crossfire bridge (with the system off lol) and ran a Fire Strike bench, check it out: http://www.3dmark.com/fs/1141818

I think it's a pretty good score personally.

Edit: it's a little less than double my score with a single 280X: http://www.3dmark.com/fs/1032764


----------



## VegetarianEater

Is it just me or has it been almost 2 weeks since the Toxic has been in stock? Unless I'm missing them at 4 AM or something, I haven't seen them in a while... I really want to get at least one for now (and probably a second later, maybe with a games bundle or something). Granted, I haven't bought all the parts for my system yet since I'm waiting for Black Friday, but something like the Toxic might not be in stock then...


----------



## battleaxe

Quote:


> Originally Posted by *DeviousAddict*
> 
> hey people.
> Well i connected the Xfire bridge (with the system off lol) and i have run a fire-strike bench, check it out :- http://www.3dmark.com/fs/1141818
> 
> I think it's a pretty good score personally


Okay, stupid question.

Where do I download Fire Strike from? Can I have a URL please?

Feel like such a DA asking, but I've been looking for a bit now, so WTheck.


----------



## DeviousAddict

Quote:


> Originally Posted by *battleaxe*
> 
> Okay, stupid question.
> 
> Where do I download Fire Strike from? Can I have a URL please?
> 
> Feel like such a DA asking, but I've been looking for a bit now, so WTheck.


No worries dude
If you have Steam just search for 3DMark; they have a free demo which runs all the tests but is limited to basic settings, or you can buy it for £18.99.

Link: http://store.steampowered.com/app/223850/?snr=1_7_15__13


----------



## battleaxe

Quote:


> Originally Posted by *DeviousAddict*
> 
> No worries dude
> If you have Steam just search for 3DMark; they have a free demo which runs all the tests but is limited to basic settings, or you can buy it for £18.99.
> 
> Link: http://store.steampowered.com/app/223850/?snr=1_7_15__13


Okay, I have Steam and have 3DMark 11, but I'm not sure how to run Fire Strike.

I can run 3DMark 11.

See, I told you I'm a DA.... lol


----------



## DeviousAddict

Quote:


> Originally Posted by *battleaxe*
> 
> Okay, I have Steam and have 3DMark 11, but I'm not sure how to run Fire Strike.
> 
> I can run 3DMark 11.
> 
> See, I told you I'm a DA.... lol


It's not in 3DMark 11; it's just called 3DMark. If you go to the link I put in the previous post it should take you to Steam's product page. Just click the download demo button and you're all good.


----------



## battleaxe

Quote:


> Originally Posted by *DeviousAddict*
> 
> It's not in 3DMark 11; it's just called 3DMark. If you go to the link I put in the previous post it should take you to Steam's product page. Just click the download demo button and you're all good.


Okay, downloading now. Didn't realize there was a difference. I must have only downloaded the mark 11 one.


----------



## DeviousAddict

no worries lol. Glad I could help


----------



## DeviousAddict

Vantage score for Xfire XFX R9 280X:
http://www.3dmark.com/3dmv/4865156


----------



## Theroty

How is everyone enjoying their cards? I love my Matrix!


----------



## Ashuiegi

Crossfire 280Xs will give you 60 fps at 1440p and 1600p in BF4 ultra easily (at least when OCed to around 1150/1650). BF4 actually runs better than BF3; you can go look at benches from BF3 and add 10 fps, and you've pretty much got your number...at least if you have Windows 8; otherwise go buy it, because even on a laptop that plays BF4 on low, Windows 8 made a huge change.
I don't understand the fuss about BF4, since if you could run BF3 maxed you are 100% sure to run BF4 maxed too...


----------



## sidp96

http://www.guru3d.com/news_story/amd_alters_never_settle_program_with_battlefield_4_and_thief.html

sigh...


----------



## Devildog83

Quote:


> Originally Posted by *VegetarianEater*
> 
> Is it just me or has it been almost 2 weeks since the Toxic has been in stock? Unless I'm missing them at 4 AM or something, I haven't seen them in a while... I really want to get at least one for now (and probably a second later, maybe with a games bundle or something). Granted, I haven't bought all the parts for my system yet since I'm waiting for Black Friday, but something like the Toxic might not be in stock then...


----------



## light70

Quote:


> Originally Posted by *light70*
> 
> Could anyone tell me which is better between the Asus DirectCU II standard (non-TOP version) and the Vapor-X? I've seen a review reporting that the Vapor-X keeps decent temperatures even when overclocked (about 1200MHz at 79°C), while the Asus manages to reach 1200 but at 89°C. It also reported that the Asus was way more silent than the Vapor-X; indeed, the Sapphire had its fan at 70% while the Asus only reached 49%. I would also like to know which card has the better power phases!


anyone please??


----------



## Ashuiegi

I managed to grab a Toxic. I ordered it, they said 7 to 14 days, and I got it about 10 days later, but they sold all of them before they were in stock; they were all on preorder and the product page always says 7 to 14 days.
That's in Europe (France), but I guess if you really want one you need to preorder it.


----------



## VegetarianEater

Quote:


> Originally Posted by *Devildog83*


Same question or just bumping me? Either way, thanks lol.

To the other guy: here in the US we couldn't preorder the Toxic cards (I don't think), but either way at the time I wasn't even thinking about buying one; I was set on the Asus. I decided, however, that I liked the looks of the Toxic more (backplate + LEDs), and despite the fact that 2 of them will max out my PSU I really want to buy them (I was also considering that I could possibly get one of those $25-off codes to make them cheaper).


----------



## Ashuiegi

By preorder I meant ordering it before it's actually in stock.


----------



## VegetarianEater

Yeah, I can't do that either; I could have on Amazon about 2 weeks ago, but not anymore...


----------



## MoNtHeR2

Please help me guys. I got an R9 280X Windforce and an AMD FX-8350 @ 4.4GHz,
but my GPU usage and core clock keep dropping, making BF4 unplayable. What is wrong??
http://postimg.org/image/43o2kolad/


----------



## grim.

I have an Asus 280X DirectCU II TOP and am getting really bad artifacting, making BF4 unplayable. This is after already RMAing my Matrix Platinum for the same reason!

Anyone else having problems with this card on Battlefield 4?


----------



## Rar4f

Quote:


> Originally Posted by *grim.*
> 
> I have an Asus 280X DirectCU II TOP and am getting really bad artifacting, making BF4 unplayable. This is after already RMAing my Matrix Platinum for the same reason!
> 
> Anyone else having problems with this card on Battlefield 4?


Do you get artifacts with other games?


----------



## grim.

I haven't given any other games a long session yet; I tried DayZ but that's glitchy at the best of times. I'm going to try out BF3 and BioShock Infinite tomorrow. It only seems to happen after about 2 hours or so, though. I just uploaded a vid to YouTube, and in doing so noticed that there are a few people on both Nvidia and AMD suffering from artifacts, so hopefully it's just BF4 being buggy.

The 7970 Matrix Platinum was doing it in BF3 too, and after only 20 minutes of gameplay, also in benchmarking software like Unigine Heaven 4. But this 280X, although the same chip, seems to be a better card so far. Hopefully I won't have to change it, because I honestly don't know what else to try.


----------



## maxti6

I don't know exactly what it could be, but ever since I installed my R9 270X my audio has improved, especially in DVD movies and such. For example, I put in Rockshow (McCartney live '76) and the smoke and effects after the first song and into "Jet" were mindblowing, with the audio much improved and unlike what I am used to.
I don't know if it is the included Radeon audio driver that is doing this; I can't confirm it.
Along with that, the picture has improved dramatically without any adjustments needed beyond what I did in the initial driver setup.
I really don't think the Radeon did anything to the effects, but the way they stand out and swirl like a perfect picture made my eyes realize something.


----------



## Rar4f

Quote:


> Originally Posted by *grim.*
> 
> I haven't given any other games a long session yet; I tried DayZ but that's glitchy at the best of times. I'm going to try out BF3 and BioShock Infinite tomorrow. It only seems to happen after about 2 hours or so, though. I just uploaded a vid to YouTube, and in doing so noticed that there are a few people on both Nvidia and AMD suffering from artifacts. *So hopefully it's just BF4 being buggy.*
> 
> The 7970 Matrix Platinum was doing it in BF3 too, and after only 20 minutes of gameplay, also in benchmarking software like Unigine Heaven 4. But this 280X, although the same chip, seems to be a better card so far. Hopefully I won't have to change it, because I honestly don't know what else to try.


Sounds like it.

EDIT:
Could everyone who owns an R9 280X check their BIOS and report back to this thread which chip you got? The XT2 or the new XTL?


----------



## Devildog83

Quote:


> Originally Posted by *maxti6*
> 
> I don't know exactly what it could be, but ever since I installed my R9 270X my audio has improved, especially in DVD movies and such. For example, I put in Rockshow (McCartney live '76) and the smoke and effects after the first song and into "Jet" were mindblowing, with the audio much improved and unlike what I am used to.
> I don't know if it is the included Radeon audio driver that is doing this; I can't confirm it.
> Along with that, the picture has improved dramatically without any adjustments needed beyond what I did in the initial driver setup.
> I really don't think the Radeon did anything to the effects, but the way they stand out and swirl like a perfect picture made my eyes realize something.


I hear the R9 series cards have TrueAudio, which is supposed to enhance the audio; at least that's what I heard.


----------



## Rar4f

Quote:


> Originally Posted by *Devildog83*
> 
> I hear the R9 series cards have TrueAudio, which is supposed to enhance the audio; at least that's what I heard.


The 260X and the 290s have TrueAudio; the rest do not.


----------



## Durvelle27

*List Updated*


----------



## Devildog83

Is there an ETA for the 290 non-reference cards? I really hate reference cards, and I am still considering a 290 over crossfiring a 7870 and a 270X. If it's not out soon I will just go with dual Devils and then sell them later and get the 290 or 290X.


----------



## TempAccount007

Quote:


> Originally Posted by *Mach 5*
> 
> So I've decided to jump over to the red team and ordered myself a HIS 280X. It was delivered today and I can't wait to get home and put it through its paces!
> 
> Edit: 1500th post!


Can you tell me the max voltage you are able to set?

If it is more than 1.3 volts then I may be ditching my ASUS 280X DirectCU II for a HIS 280X.

Also, I find it weird that HWiNFO64 shows 1.2v under load, no matter what I set it to. My HIS 7950 would show the voltage I set (with vdroop) in HWiNFO64.

GPU Tweak shows 1.25v even though I set it to 1.3v. I'm not sure if I should believe it or not. Regardless, I wasn't expecting a 50mV vdroop when I bought the card with more phases.

ASIC is 68% btw.

All in all, meh.


----------



## hoevito

Finally went Crossfire!!! Matrix Plat on bottom, DC2 Top on top lol...

New and old Firestrike scores as well

I did have some issues though, and I have a few questions. Asus GPU Tweak causes my system to crash when Crossfire is enabled in AMD Catalyst Control Center, and my system immediately freezes if GPU Tweak boots with Windows on startup. After uninstalling GPU Tweak, everything works fine. Does anyone know why that might be? I'm on the latest Catalyst 13.11 beta 9.2 drivers. What gives? Is there any other program I could use to control both cards that won't cause my system to freak out like that?


----------



## Rar4f

Quote:


> Originally Posted by *hoevito*
> 
> Finally went Crossfire!!! Matrix Plat on bottom, DC2 Top on top lol...


Easy or you'll hurt someone with all that awesomeness


----------



## diggiddi

Nice setup hoevito! I thought it was impossible to CFX the Matrix Platinum with another GPU; I recall reading that somewhere. Also, what is the distance between your PCIe slots and the length of the CFX bridge? Thanks.


----------



## BackwoodsNC

Quote:


> Originally Posted by *diggiddi*
> 
> Nice setup hoevito! I thought it was impossible to CFX the Matrix Platinum with another GPU; I recall reading that somewhere. Also, what is the distance between your PCIe slots and the length of the CFX bridge? Thanks.


That's why you put the Matrix on the bottom.


----------



## diggiddi

Quote:


> Originally Posted by *BackwoodsNC*
> 
> That's why you put the Matrix on the bottom.


I see, so it only works with the Matrix Plat on the bottom. Drat! That changes everything. Will it work with a 107 or 120mm CFX bridge?


----------



## hoevito

Quote:


> Originally Posted by *diggiddi*
> 
> Nice setup Hoevito, I thought it was impossible to CFX the Matrix Platinum with another GPU, I recall reading that somewhere, also what is the distance between your Pcie slots and the length of the CFX bridge, thanks


Yeah, due to the Matrix being a space hogging beast, I stays on the bottom. I wouldn't have minded Crossfiring two Matrix's, but as they'd sit right on top of one another and with literally no more than a few millimeters between them...is wasn't worth the risk. Since I bought my Matrix Platinum first, I soooooo wanted to crossfire with a Sapphire Toxic, but since you cannot find them anywhere in stock I opted for a DC2 Top instead. I just got everything hooked up just a few hours ago...and I must say it was a PAIN at first to get everything working right.

I found out that, for some reason, Asus GPU Tweak would cause MASSIVE problems when Crossfire was enabled. With Crossfire disabled, I could run fine off the top card but not the bottom, and since I had GPU Tweak set to run at startup, my entire computer would freeze and crash on boot!!! I had to fiddle with it for an hour before I isolated the issue to GPU Tweak, as everything worked fine using just Catalyst Control. So basically I uninstalled GPU Tweak and went with MSI Afterburner, and everything runs well now. Oddly enough, Battlefield 4 is choppy as hell now, so I'm still working on figuring that out, but all the other games I've tested run fine.

New Valley scores...


----------



## hoevito

Quote:


> Originally Posted by *diggiddi*
> 
> I see, so it only works with Matrix Plat on bottom drat! that changes everything, will it work with a 107 or 120mm CFX bridge?


I don't know what length bridge I'm using... but I will say that I could NOT use the bridge that came with the Matrix, and had to use the one that came with the DC2 instead. The Matrix bridge had these little tabs on the ends of the bridge connectors that wouldn't let me seat it all the way onto the cards. I also upgraded the fans on my Kraken X60, took the stock X60 fans off, and positioned them in the case to blow directly at the cards. I have one in front of both, blowing air mainly at the gap between the top and bottom card, and the other on the bottom blowing up at the Matrix. Boy am I glad I did, because when they're both running they throw off a lot of heat!!!

And to answer your question, there's a one pci slot gap between them.


----------



## Rar4f

^ A toxic and Matrix Platinum in same case?
That would be Totrix!


----------



## diggiddi

Quote:


> Originally Posted by *hoevito*
> 
> I don't know what length bridge I'm using... but I will say that I could NOT use the bridge that came with the Matrix, and had to use the one that came with the DC2 instead. The Matrix bridge had these little tabs on the ends of the bridge connectors that wouldn't let me seat it all the way onto the cards. I also upgraded the fans on my Kraken X60, took the stock X60 fans off, and positioned them in the case to blow directly at the cards. I have one in front of both, blowing air mainly at the gap between the top and bottom card, and the other on the bottom blowing up at the Matrix. Boy am I glad I did, because when they're both running they throw off a lot of heat!!!
> 
> And to answer your question, there's a one pci slot gap between them.


What about the bridge that came with your mobo? Ever tried that one?


----------



## hoevito

Quote:


> Originally Posted by *diggiddi*
> 
> What about the bridge that came with your Mobo? ever tried that one


The Maximus VI Hero only comes with an SLI bridge... no Crossfire bridge in the bundle, unfortunately. Either way, the one that came with the TOP works fine!!!


----------



## VegetarianEater

Quote:


> Originally Posted by *Rar4f*
> 
> ^ A toxic and Matrix Platinum in same case?
> That would be Totrix!


Someone in this thread has already done this, I believe...


----------



## diggiddi

Quote:


> Originally Posted by *Rar4f*
> 
> ^ A toxic and Matrix Platinum in same case?
> That would be Totrix!


or Matrox


----------



## Ashuiegi

Been there, done it, works like a charm.


----------



## diggiddi

Ashuiegi, are you using GPU Tweak too? What speeds are you running at, and which driver?


----------



## kpo6969

MSI Afterburner 3.0.0 Beta 17

http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


----------



## KeepWalkinG

Quote:


> Originally Posted by *kpo6969*
> 
> MSI Afterburner 3.0.0 Beta 17
> 
> http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


http://imageshack.us/photo/my-images/585/az05.jpg/

Where is this voltage control ?


----------



## radier

Quote:


> Guys, I've just noticed that in beta 17 database we unlocked voltage control for reference VGA BIOS only. So those who flash third party BIOS are out of luck. We'll remove reference BIOS restriction shortly and upload updated new version.
> In meanwhile, while revised beta 17 is not uploaded experienced users may unlock voltage control in current beta 17 by adding the following lines to hardware profile file (.\Profiles\VEN_1002.......cfg):
> 
> [Settings]
> VDDC_IR3567B_Detection = 30h
> VDDC_IR3567B_Output = 0
> VDDCI_IR3567B_Detection = 30h
> VDDCI_IR3567B_Output = 1


http://forums.guru3d.com/showpost.php?p=4700906&postcount=2


----------



## DeviousAddict

Hey peeps, Ive updated my build log with pics of my 280X's in Cross-Fire if anyone's interested: http://www.overclock.net/t/1363985/build-log-my-1st-ever-intel-build-56k-warning


----------



## BackwoodsNC

Quote:


> Originally Posted by *kpo6969*
> 
> MSI Afterburner 3.0.0 Beta 17
> 
> http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


Thanks for the post! Finally voltage control with afterburner.

*Tried that, and adding those lines to the settings, and still no voltage control. I'm having to use Trixx for voltage control and MSI Afterburner for the on-screen display. Seems to be working OK at the moment.


----------



## KeepWalkinG

When I clock my Sapphire Dual-X R9 280X with Sapphire Trixx, I have one problem.
After the PC sleeps, the voltage in Sapphire Trixx goes back to the stock 1.200 V, and every time I have to set it back to 1.3 V, which is my voltage for 1220 core and 7200 memory.
How do I change this so my voltage stays put after sleep???

THANKS !!!


----------



## Ashuiegi

I use GPU Tweak for monitoring only and Sapphire Trixx to set clocks, but both can work, with the occasional error like the secondary GPU staying at 100%; you just quit and reopen the program and it's fixed. Since the Matrix is the secondary card, I can see it immediately from the LED color change, and it only happens at startup anyway.

It's also a 280X and a 7970, not 2x 280X. I use the latest drivers available from AMD as of a week ago; I think it's beta 9.2 or something like that (I'm not at home atm).
I know my 7970 Matrix can do 1280 core / 1850 mem in games at stock voltage, and 1320 core at 1.3 V, but it artifacts a bit.
My Sapphire 280X Toxic can do 1250 core at stock voltage in Fire Strike, and I haven't really tried to OC it much yet.

I usually run the CF in games at 1150/1650. I wanted to downclock them to 1100 because I simply had too much power for 60 Hz 1440p in BF4 on ultra; I quickly tried dropping to 1100, but the Toxic didn't like it and I had a strange crash.
I need to take more time to test the Toxic alone, and I have to redo my WC loop too; I just received my 1080 Nova 60mm rad. While I drain and test my main rig I will put the Toxic in my P67/2500K and try to OC it the proper way; after that I can tell you which program works best.
Atm I've really only gotten the CF working and played a bit of BF4; I got it the same weekend BF4 came out.







But it works great in BF4; I'm over 100 fps most of the time on ultra, but I just V-sync it to save load on the cards. I need to try overriding with supersampling to use more of the headroom.
I have to try with 2 CF bridges too, just to see if anything happens.

Another pic of it with the Matrix in angry 100%-load pink (I really think plain red would have looked better).


----------



## Amhro

Quote:


> Originally Posted by *Ashuiegi*
> 
> Another pic of it with the Matrix in angry 100%-load pink (I really think plain red would have looked better).











This makes me want to pick the Matrix 280X instead of a 290.


----------



## Ashuiegi

Maybe they will do a 290 or 290X Matrix in a while, lol. There is actually a good chance of that if they don't go over to Nvidia for the next one; this card has been selling relatively well considering its higher price, and it must be quite a good deal for Asus in the end. Imagine a 20-phase 290X with a triple-slot, double-100mm-fan cooler.


----------



## DomesticViolenz

Here is mine!
The only issue I have is that as soon as I enable CF, the 2nd GPU's usage goes up to a stable 99%, even at idle.
Any fixes for this?


----------



## DeviousAddict

Just done a 3D mark 11 run to compare to yours ^^
You've got 5000 points on me lol


----------



## tsm106

Quote:


> Originally Posted by *radier*
> 
> Quote:
> 
> 
> 
> Guys, I've just noticed that in beta 17 database we unlocked voltage control for reference VGA BIOS only. So those who flash third party BIOS are out of luck. We'll remove reference BIOS restriction shortly and upload updated new version.
> In meanwhile, while revised beta 17 is not uploaded experienced users may unlock voltage control in current beta 17 by adding the following lines to hardware profile file (.\Profiles\VEN_1002.......cfg):
> 
> [Settings]
> VDDC_IR3567B_Detection = 30h
> VDDC_IR3567B_Output = 0
> VDDCI_IR3567B_Detection = 30h
> VDDCI_IR3567B_Output = 1
> 
> 
> 
> http://forums.guru3d.com/showpost.php?p=4700906&postcount=2
Click to expand...

That's for Hawaii.


----------



## KeepWalkinG

When I clock my card with Sapphire Trixx, my idle voltage is at max. Why?
I don't use constant voltage or any other software for the graphics card.
http://imageshack.us/photo/my-images/692/jskt.jpg/


----------



## Ashuiegi

Is the GPU usage at 100% or high too? I have one of my cards sometimes going to 100% usage instead of idle, and I fix it by exiting the OC program and opening it again. But my issues come from the sync-cards option, I believe.


----------



## KeepWalkinG

No, in the picture you can see the card is at 0% usage.
When I'm not using Sapphire Trixx, my voltage at idle is 0.900 V and the temperature is 31-32 degrees.
But voltage control for this card is only in Sapphire Trixx.

Is this dangerous for the card or not?
Under load my temperature is 72-73 max over 30 minutes of the Valley 1.0 benchmark.


Can this high voltage harm the card?


----------



## Ashuiegi

If you click on default and save the profile, does the card go back to 1.3 V at idle? Or is it only when OCing? That's strange if that's the case.
I would need to do some testing with my Toxic when I get back home to help you more.


----------



## KeepWalkinG

If I click default (reset and apply), idle is 1.2 V.








Only after sleep mode is my idle voltage the normal 0.900 V.
The problem is that after sleep mode my voltage setting always reverts to the stock 1.2 V.
Sleep mode resets all my settings in Sapphire Trixx.


----------



## Ruckol1

Hey guys quick question -

I am in the market for a new GPU. I'm leaning towards the 270X, but there are a good number of people suggesting a 7950 over it. I thought they were pretty much the same, so wouldn't it be better to get the newer version - sort of a refreshed 7950?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Ruckol1*
> 
> Hey guys quick question -
> 
> I am in the market for a new GPU. I'm leaning towards the 270X, but there are a good number of people suggesting a 7950 over it. I thought they were pretty much the same, so wouldn't it be better to get the newer version - sort of a refreshed 7950?


270X = rebranded 7870, which is inferior to the 7950.

You're much better off with a 7950: higher memory bandwidth, more VRAM, a better GPU...

Get a 7950 with decent cooling and OC the crap out of it.

Here's a comparison. Keep in mind the 7950 there is at 800 MHz; average OCs for them run from 1100 MHz and up if you have good cooling.

http://gpuboss.com/gpus/Radeon-R9-270X-vs-Radeon-HD-7950
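As a rough, hedged sanity check on that OC headroom: performance scales at best linearly with core clock, so pushing a 7950 from the 800 MHz reference clock to a typical 1100 MHz OC is an upper bound of about 37.5% (real games gain less; the clocks here are just the ones quoted above):

```python
def max_clock_scaling_gain(stock_mhz, oc_mhz):
    """Upper-bound performance gain from a core overclock,
    assuming perfect (linear) scaling with clock speed."""
    return (oc_mhz - stock_mhz) / stock_mhz

# 7950 at the 800 MHz reference clock vs. a typical 1100 MHz OC
gain = max_clock_scaling_gain(800, 1100)
print(f"up to {gain:.1%} faster")  # up to 37.5% faster
```

In practice the realized gain is lower because memory bandwidth and CPU limits don't scale with the core clock.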


----------



## Ruckol1

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> 270x = rebranded 7870 which is inferior to 7950.
> 
> You're much better off with a 7950, higher RAM bandwidth, more VRAM, better GPU...


Thanks. What would be the equivalent of the 7950 in the R9 series? R9280, 280X?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Ruckol1*
> 
> Thanks. What would be the equivalent of the 7950 in the R9 series? R9280, 280X?


As of now there is no equivalent.

I'm not sure what the specifics are on the 280; if I had to guess, I'd say it's a lower-binned 280X, which would make it a rebranded 7950.


----------



## Ashuiegi

Maybe try disabling sleep mode. I will give it a try Saturday to see what it does on my system; I didn't notice the problem, but I have sleep mode disabled.


----------



## DomesticViolenz

Quote:


> Originally Posted by *DeviousAddict*
> 
> Just done a 3D mark 11 run to compare to yours ^^
> You've got 5000 points on me lol


The i7 is giving me lots of points in the Physics and combined scores.
Try the latest driver... amd_catalyst_13.11_betav9.2.
Here are my GPU Tweak settings.


----------



## Amhro

Quote:


> Originally Posted by *Ruckol1*
> 
> Thanks. What would be the equivalent of the 7950 in the R9 series? R9280, 280X?


Go for 280x


----------



## MoNtHeR2

Please help: I get GPU usage and GPU clock drops while playing BF4. What should I do to get constant usage???
http://postimg.org/image/g6gizmqrz/


----------



## TempAccount007

Quote:


> Originally Posted by *Amhro*
> 
> Go for 280x


Or wait for a 7950 to go for $180 after rebate around Black Friday or Cyber Monday. There's a high chance of that happening.

If you get a new 7950 for $180, you will have perhaps the best performance per dollar out there... far better than a 280X, or even a 7970 that might go on sale for $250 after rebate.

I would make sure you get a card with a VRM heatsink, though. The cheapest ones (usually PowerColor) don't have one, from what I've heard.


----------



## Rar4f

Anyone know the cfm of R9 280X Toxic fans? Having trouble finding the info.


----------



## Devildog83

Quote:


> Originally Posted by *Ruckol1*
> 
> Thanks. What would be the equivalent of the 7950 in the R9 series? R9280, 280X?


If you get a Hawk, Devil, or Toxic 270X, you will be looking at close to 7950 performance; maybe not out of the box, but it would be very easy to get there with a very slight overclock. A few more bucks gets the 280X. The best bang-for-buck card is the 290, but unless you need a reference card to put a waterblock on, I would wait a bit for the non-reference cards. They will be out soon.


----------



## darkelixa

Has anyone played the new Final Fantasy: A Realm Reborn with these graphics cards? With my 770 GTX I get a lot of stuttering in video playback on the intro video and menu screen, yet in game it seems to run OK. Looking at switching teams very soon; the 7950s have gone on a huge sale here, $269 for an XFX 7950.

http://www.pccasegear.com/index.php?main_page=product_info&products_id=25636&zenid=66f2655de93b77955cfc85e79dff4a21

Or should I just save for another month or so and buy an R9 280X HIS 3GB?


----------



## DomesticViolenz

Quote:


> Originally Posted by *DomesticViolenz*
> 
> Here is mine !
> The only issue i have is as soon as i enable CF the 2nd GPU usage goes up to a stable 99% even if idle.
> Any fixes for this ?


Problem solved: I had the latest driver installed, which has the CF application profiles integrated, plus the standalone application profile. After removing the standalone one, the 1st GPU stopped throttling and the 2nd GPU's load went back to normal.


----------



## mikeyzelda

I'm about to pull the trigger on one of this cards:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202045

or

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125475

Can anyone tell me which one is quieter? I read some reviews... some say the Gigabyte is loud, others that the Sapphire sounds like a jet. Dunno which one to trust; it's always one or the other.


----------



## darkelixa

I'm close to buying a HIS R9 280X.

http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25341

Unless a 7970/7950 is more worth it on dollar-to-performance value; the 7970 is $319, the 280X $375, the 7950 $269.

What would you people recommend for Final Fantasy: A Realm Reborn?


----------



## Rar4f

Quote:


> Originally Posted by *darkelixa*
> 
> I'm close to buying a HIS R9 280X.
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25341
> 
> Unless a 7970/7950 is more worth it on dollar-to-performance value; the 7970 is $319, the 280X $375, the 7950 $269.
> 
> What would you people recommend for Final Fantasy: A Realm Reborn?


I recommend one of these 280Xs if you buy a 280X:

www.amazon.com/ASUS-MATRIX-R9280X-P-3GD5-Graphics-Cards/dp/B00FW47Y5I/ref=sr_1_1?ie=UTF8&qid=1384475267&sr=8-1&keywords=r9+280x+matrix
www.amazon.com/Sapphire-DL-DVI-I-SL-DVI-D-PCI-Express-11221-01-40G/dp/B00FLMKR78/ref=sr_1_6?ie=UTF8&qid=1384475161&sr=8-6&keywords=r9+280x+toxic

Use the spare money you'll save (by not buying the HIS R9 280X) on some decent case fans to help cool the cards.


----------



## darkelixa

Nowhere in Aus has those Sapphire cards in stock, and they are a lot more in price, almost $500 Australian compared to $375 for the HIS; and it's $449 for that Asus one here, with none in stock either.


----------



## Rar4f

Quote:


> Originally Posted by *darkelixa*
> 
> Nowhere in Aus has those Sapphire cards in stock, and they are a lot more in price, almost $500 Australian compared to $375 for the HIS; and it's $449 for that Asus one here, with none in stock either.


Sorry, I thought it was American dollars.








In that case I'd go with the HIS.


----------



## darkelixa

HIS is just as good a brand as Asus, aren't they? I've never purchased one of theirs before.


----------



## darkelixa

Also, the HIS needs two 8-pin PCIe connectors, whereas the others only need one 6-pin and one 8-pin.


----------



## Rar4f

Quote:


> Originally Posted by *darkelixa*
> 
> HIS is just as good a brand as Asus, aren't they? I've never purchased one of theirs before.


It doesn't seem like a bad card based on what i read from reviews.


----------



## taem

Quote:


> Originally Posted by *darkelixa*
> 
> Also the His needs two, 8 pin pcie connectors as the others only need 1x6pin and 1x8 pin


The Vapor-X also takes two 8-pins. The Matrix and Toxic probably do as well.

I don't think the HIS is a bad card at all from the reviews I've read; at least the Turbo model isn't. It cools well and is quieter than the Vapor-X. I might have considered the HIS Turbo except it's really long, like 300mm.

The Asus TOP and Vapor-X use Hynix memory though, and afaik the other cards use Elpida. I've always heard Hynix OCs better than Elpida, for whatever that's worth.


----------



## Rar4f

Quote:


> Originally Posted by *taem*
> 
> Vapor x also takes 2 8 pins. Matrix and Toxic probably do also.
> 
> I don't think the HIS is a bad card at all from the reviews I've read, at least the turbo model isn't. Cools well, quieter than the vapor x. I might have considered the HIS turbo except it's really long, like 300mm.
> 
> Asus TOP and Vapor X use Hynix memory though, and afaik the other cards use elpida. I've always heard Hynix OCs better than elpida for whatever that's worth.


Are you sure the other cards don't use Hynix mem?


----------



## darkelixa

Wow, I didn't see it was that long!! It should fit in my Lian Li PC-71A FB.

The Gigabyte is only 285mm long, but the 770 GTX I currently have is a Gigabyte, and I'm having no luck with Final Fantasy: A Realm Reborn, so I wasn't sure if I should go with that brand again.


----------



## taem

Quote:


> Originally Posted by *Rar4f*
> 
> You sure other cards don't use Hynix mem?


I'm pretty sure the Gigabyte uses Elpida, and I think the MSI does also. To flip it, I know for sure the Vapor-X and DC2T use Hynix; I've seen the pics.
Quote:


> Originally Posted by *darkelixa*
> 
> Wow I didnt see it was that long!! Should fit in my Lian Li pc-71a fb
> 
> The gigabyte is only 285 long but the 770gtx i have is currently gigabyte and im having no luck with final fantasy a realm reborn so wasnt sure if i should go that brand again


The gigabyte is 270mm. http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-3.html


----------



## Rar4f

Quote:


> Originally Posted by *taem*
> 
> I'm pretty sure the gigabyte uses elpida, and I think msi does also. To flip it, I know for sure the vapor x and dc2t use Hynix, I've seen the pix.
> The gigabyte is 270mm. http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-3.html


What about Toxic and Matrix cards?


----------



## TempAccount007

I don't think you will ever reach a point where overclocking the memory increases performance.


----------



## diggiddi

Quote:


> Originally Posted by *kpo6969*


What about the DCII Top V2? Any info on the innards? It looks just like an actual 7970.

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8589219&CatId=8754


----------



## taem

I was just told the MSI and the HIS use Hynix also. You have to assume the Toxic and Matrix do too. So I was totally off base; it may be that only the Gigabyte uses Elpida. But the Gigabyte definitely uses Elpida. http://www.techpowerup.com/mobile/reviews/Gigabyte/R9_280X_OC/31.html Edit to add: the Club3D 290X uses Elpida, so you'd assume their 280X does also.
Quote:


> Originally Posted by *TempAccount007*
> 
> I don't think you will ever reach a point where overclocking the memory increases performance.


Yeah, I agree, which is why I said "for whatever that's worth." But it seems important to some. And this board being what it is, aren't we obligated to overclock whatever can be overclocked?


----------



## KeepWalkinG

When I overclock my memory to 7000 (effective):
ROP pixel fillrate: 39 GPixel/s
Memory bandwidth: 336 GB/s
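Those two figures are consistent with simple arithmetic, assuming Tahiti's 32 ROPs, its 384-bit memory bus, and the 1220 MHz core clock mentioned earlier in the thread (a back-of-the-envelope sketch, not vendor specs):

```python
def pixel_fillrate_gpixels(rops, core_mhz):
    # Each ROP can retire one pixel per clock: 32 ROPs * 1220 MHz = 39.04 GPixel/s
    return rops * core_mhz / 1000.0

def mem_bandwidth_gbs(bus_width_bits, effective_mtps):
    # Bytes per transfer times transfers per second: (384/8 bytes) * 7000 MT/s = 336 GB/s
    return (bus_width_bits / 8) * effective_mtps / 1000.0

print(pixel_fillrate_gpixels(32, 1220))  # 39.04
print(mem_bandwidth_gbs(384, 7000))      # 336.0
```

This is why the bandwidth readout jumps with a memory overclock: it is just bus width times the effective transfer rate.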


----------



## radier

*To anyone with MSI 270X HAWK or Gaming.*

Could you check the OC capabilities with the new Afterburner, which unlocks voltage regulation?

BR,


----------



## DomesticViolenz

Quote:


> Originally Posted by *KeepWalkinG*
> 
> When overclock my memory on 7000:
> ROPs - 39 Gpixel/s
> BusWidth - 336gb/s


My max stable memory overclock was 6600 MHz, at a 1205 MHz GPU clock and 1.3 V.


----------



## KeepWalkinG

What is the name of your memory chip? Mine is SK Hynix.


----------



## sixor

Giga 270X here:
1215 GPU
1600 VRAM

What do you think?

I don't understand:

using 1500 on the VRAM gives me a better score (just a little better), but come on, that's 100 MHz lower??????????


----------



## radier

This is how GDDR5 works. There are built-in error-correction mechanisms. Because of them, the memory can seem to run just fine while error correction is actually kicking in, reducing performance to keep the memory stable.
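A minimal toy model of that effect (my own illustrative sketch, not actual GDDR5 EDC internals): raw throughput rises with clock, but every detected error forces a retransmission that burns cycles, so a higher clock with retries can end up slower than a lower, error-free one.

```python
def effective_throughput(clock_mhz, retry_fraction):
    """Toy model: raw throughput scales with clock, but a fraction of
    cycles is spent re-sending data that failed its error check."""
    return clock_mhz * (1.0 - retry_fraction)

stable = effective_throughput(1500, 0.00)  # error-free at 1500 MHz
pushed = effective_throughput(1600, 0.10)  # 1600 MHz, but 10% retries
print(stable > pushed)  # the higher clock ends up slower overall
```

The retry fraction is a made-up illustrative number; the real mechanism is CRC-based detection with retransmits, but the benchmark symptom is the same: a "stable-looking" memory overclock that scores lower.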


----------



## lothop

Quote:


> Originally Posted by *grim.*
> 
> I have an Asus 280x directcuII TOP, and am getting really bad artifacting making bf4 unplayable. This is after already rma'ing my matrix platinum for the same reason!
> 
> Anyone else having problems with this card on Battlefield 4?


I'm having this issue also: bad tearing. Just installing it into my 2nd system to rule out anything other than a crapped-out card.

Quote:


> Originally Posted by *Rar4f*
> 
> Do you get artifacts with other games?


Getting artifacts in browser games and VLC, and random crashes while watching movies.

ASUS tech support has been really good about it though, with quick replies to all of my questions/queries.


----------



## TempAccount007

Quote:


> Originally Posted by *lothop*
> 
> I'm having this issue also bad tearing.


What's the ASIC quality? If it's really low, it's possible that the stock clock is too high for the stock voltage.

Also, have you tried increasing the voltage?

Edit: screen tearing isn't a symptom of something wrong with the card or its settings, but artifacts usually are.


----------



## lothop

Quote:


> Originally Posted by *TempAccount007*
> 
> What's the ASIC quality? If it's really low, it might be possible that the stock clock is too high for the stock voltages.
> 
> Also, have you tried increasing the voltage?
> 
> Edit: screen tearing isn't a symptom of something wrong with the card or its settings, but artifacts usually are.


VDDC 1.494 max running FurMark.
How do I check ASIC quality?


----------



## taem

Quote:


> Originally Posted by *lothop*
> 
> VDDC 1.494 max running FurMark.
> How do I check ASIC quality?


GPU-Z. Right-click the upper-left corner; the drop-down menu has an option to check the ASIC quality.


----------



## Orifiel

Hey,

I am about to get 2 of these http://www.asus.com/us/Graphics_Cards/R9280XDC2T3GD5/#overview

or a single asus GTX780 OC direct cu ii....

I just need opinions on the Crossfire: are there any problems? Which chipset is the Asus 280X carrying, XT2 or XTL? There is huge demand for these cards, and if I place my order on Monday it's going to take a week to arrive, maybe more; there is no stock yet, lol.

bellow some more info about the case and psu


Spoiler: Warning: Spoiler!



Whatever I get on Monday is going into a HAF XM case http://www.coolermaster.com/product/Detail/case/mid-tower/haf-xm.html (my case is still in the box; I bought it for my new rig, and I hope it's more than enough). Here is a video review of the case http://www.youtube.com/watch?v=5SM4EVGaXek

and a Corsair AX850 PSU... I just need your real-life experiences with Crossfire. If I buy them, I will stick with this setup for the next few years, and I want it to be good.


----------



## Rar4f

Quote:


> Originally Posted by *Orifiel*
> 
> Hey,
> 
> I am about to get 2 of these http://www.asus.com/us/Graphics_Cards/R9280XDC2T3GD5/#overview
> 
> or a single asus GTX780 OC direct cu ii....
> . [/spoiler]




Crossfire is better, but it will ask for more cooling and power.
And though I have heard that CrossFire has some issues, I have also heard AMD say they will work on giving it better support, and recent drivers have shown exactly that.


----------



## Devildog83

Quote:


> Originally Posted by *Rar4f*
> 
> Crossfire is better, but it will ask for more cooling and power.
> And though I have heard that CrossFire has some issues, I have also heard AMD say they will work on giving it better support, and recent drivers have shown exactly that.


It really looks like more than 2x 280X is a waste of money; 3x and 4x are not much better. Not that I would go there anyway, but that's kinda interesting.


----------



## Orifiel

That's why I gave info about my case and the PSU... Is that not enough? (Read the spoiler.)


----------



## Devildog83

Quote:


> Originally Posted by *Orifiel*
> 
> Thats why i gave info about my case and the psu... Is not enough? (read the spoiler)


Edit: I was talking about the graph that shows 2x, 3x, and 4x 280Xs; there is barely any difference in the scores at all. Unless you were running those tests with an AX850, I don't think I was referring to you. I was only making a comment about 280Xs in Crossfire, tri-fire, and quad-fire.


----------



## Orifiel

well I think I said 2









I am dyslexic, but I triple checked, I said 2...

after edit : ok sorry for the confusion...


----------



## Devildog83

Quote:


> Originally Posted by *Orifiel*
> 
> well I think I said 2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am dyslexic, but I triple checked, I said 2...


It's cool I was talking about the graph.


----------



## Rar4f

Quote:


> Originally Posted by *Orifiel*
> 
> Thats why i gave info about my case and the psu... Is not enough? (read the spoiler)


The PSU should be sufficient, yes. If you're not fond of the case, may I suggest a different one?
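For what it's worth, a rough power-budget check behind that answer; the wattages are ballpark assumptions (roughly 250 W per Tahiti-class card, 150 W for CPU, board, and drives), not measured figures:

```python
def psu_sufficient(psu_watts, gpu_watts, gpu_count, rest_watts, headroom=0.20):
    """True if the PSU covers the estimated DC load plus a safety margin."""
    load = gpu_watts * gpu_count + rest_watts
    return psu_watts >= load * (1 + headroom)

# Two ~250 W cards plus ~150 W for the rest: ~650 W load, 780 W with 20% margin
print(psu_sufficient(850, 250, 2, 150))  # True: an AX850 has room to spare
```

The 20% headroom is a common rule of thumb so the PSU isn't sitting at its limit under sustained load.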


----------



## Orifiel

But the case is big; it can fit 3 cards. Did you see the video review? It can hold water cooling, or 2x 200mm exhaust fans. It's bigger than a mid tower. You can also add 2 fans to cool the video cards directly.

http://www.youtube.com/watch?v=5SM4EVGaXek

It's totally brand new; I haven't even opened the box. I could exchange it, but it's a really nice case, with hot-swap bays, etc.


----------



## Rar4f

Quote:


> Originally Posted by *Orifiel*
> 
> but the case is big, it can fit 3 cards, did you see the video review? It can hold water-cooling, or 2x200mm exhaust fans. Its bigger than mid tower. You can also add 2 fans to cool directly the video cards.
> 
> http://www.youtube.com/watch?v=5SM4EVGaXek


It's not a bad case, but in my opinion you might be better off with this case:



If you take full advantage of the case, you will have options for up to 7x 140mm fans.


----------



## lothop

Quote:


> Originally Posted by *taem*
> 
> Gpu-z. Right click upper left corner and the drop-down box has an option to check ASIC.


68%


----------



## TempAccount007

Quote:


> Originally Posted by *Orifiel*
> 
> What chpset the asus 280x is wearing? XT-2 or XTL?


I have the card. I might check if someone tells me how to.


----------



## Rar4f

Quote:


> Originally Posted by *TempAccount007*
> 
> I have the card. I might check if someone tells me how to.


Go into the BIOS; the info should be there.


----------



## TempAccount007

Quote:


> Originally Posted by *Rar4f*
> 
> Go to Bios, the info should be there


I need more info than that.


----------



## Rar4f

Quote:


> Originally Posted by *TempAccount007*
> 
> I need more info than that.


www.ehow.com/how_8206263_access-gpu-bios.html
If the BIOS info has "XTL" in it, then you know you've got the XTL chip.


----------



## TempAccount007

Quote:


> Originally Posted by *Rar4f*
> 
> www.ehow.com/how_8206263_access-gpu-bios.html
> If Bios info has "XTL" in it, then you know you got XTL chip .


I've got an ASRock Z77 Extreme4 LGA 1155 motherboard, and I don't see GPU settings in the BIOS.


----------



## Orifiel

The only thing we know is this: http://rog.asus.com/forum/showthread.php?39121-Rumour-AMD-s-working-on-an-R9-280X-with-Tahiti-XTL-GPU

But many say that users can check their BIOS information and see what they got. That's why I asked.


----------



## Ashuiegi

I think in its price range the HAF XM is a good case, but for air cooling I would try to stretch the budget to a Corsair Air 540 or a SilverStone Raven/Fortress.


----------



## TempAccount007

Quote:


> Originally Posted by *Orifiel*
> 
> The only thing we know is this: http://rog.asus.com/forum/showthread.php?39121-Rumour-AMD-s-working-on-an-R9-280X-with-Tahiti-XTL-GPU
> 
> But many say that users can check their BIOS information and see what they got. That's why I asked.


I heard the source is a French model.


----------



## Orifiel

Doesn't GPU-Z provide this info?


----------



## Dragon-Emperor

Hey guys,
If you had to choose between the MSI 280x gaming and the Asus 280x DCUII, which one would you choose?
From my local store, both are $239.
The MSI gaming only has a single DVI port, where as the Asus has 2.
The MSI gaming has 2 display ports, where as the Asus has 1.
I believe MSI has better customer service with RMA when compared to Asus (I hope this is never needed though).
But the Asus seems to be the more silent, better cooled unit according to reviews from Anandtech and Tomshardware.

I think I touched all the bases, but is there anything I'm missing between the two cards that could make one more desirable?
Silence is really important to me, but I had some might bad luck with Asus 7950 DCUII fans making a faint grind noise at low rpms.


----------



## Devildog83

Quote:


> Originally Posted by *Dragon-Emperor*
> 
> Hey guys,
> If you had to choose between the MSI 280X Gaming and the Asus 280X DCUII, which one would you choose?
> At my local store, both are $239.
> The MSI Gaming has only a single DVI port, whereas the Asus has 2.
> The MSI Gaming has 2 DisplayPorts, whereas the Asus has 1.
> I believe MSI has better customer service for RMAs compared to Asus (I hope that's never needed, though).
> But the Asus seems to be the more silent, better-cooled unit, according to reviews from AnandTech and Tom's Hardware.
> 
> I think I touched all the bases, but is there anything I'm missing between the two cards that could make one more desirable?
> Silence is really important to me, but I had some mighty bad luck with Asus 7950 DCUII fans making a faint grinding noise at low rpm.


Do you mean $339? If you can get those for $239 I would buy all of them I could get my hands on.


----------



## Rar4f

^ Lol i would buy two of them!


----------



## Devildog83

To answer the question, which I forgot to do: both of those cards have had good reviews, both overclock nicely, and both are relatively quiet, so IMHO it's six of one and half a dozen of the other. I think the DCII TOP looks the best and is supposed to have Asus's best silicon, but that's up to you.


----------



## Warl0rdPT

I just got one Asus 280X TOP DCII









It's as silent as my old Gigabyte GTX 460 (Windforce).

For now the only negative is that I had to go back to using screws on the expansion slots for this card; in the HAF 932 the plastic clips bump into the card's "backplate" and I can't close them. So screws it is... at least it's better secured.


----------



## TempAccount007

Quote:


> Originally Posted by *Devildog83*
> 
> DCII TOP is supposed to get Asus' best silicon


There have been three posts about ASIC quality of the DC2T and none have been above 70%.

http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/1380_20#post_21162698
http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/1600_20#post_21200739
http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/1520_20#post_21186539

Warl0rdPT let us know the ASIC quality when you get it.


----------



## Warl0rdPT

Mine is 67.8%


----------



## steelkevin

Which explains why EK, out of all the available 280X models, would choose to make a block for the DC2T only.








Mine was a bit above 60% by the way.


----------



## TempAccount007

So four out of four DC2T's have an ASIC quality below 70%. And three of them are within .2% of each other. Interesting. Chip binning perhaps?

Edit: Also, the whole idea that low ASIC cards do better under water is wrong. They actually have LESS leakage than high ASIC cards.


----------



## taem

Quote:


> Originally Posted by *TempAccount007*
> 
> There have been three posts about ASIC quality of the DC2T and none have been above 70%.
> 
> http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/1380_20#post_21162698
> http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/1600_20#post_21200739
> http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/1520_20#post_21186539
> 
> Warl0rdPT let us know the ASIC quality when you get it.


I don't know if mine is one of the posts you reference; mine was 60. So... high-ASIC chips are going into the Matrix boards? I know ASIC isn't a simple linear scale, but from what I can gather most users, erroneously or not, seem to consider higher ASIC to mean a better card. Is there any link between ASIC and temps? Because I thought my DC2T had higher-than-normal temps. Reviews on more aggressive systems were reporting 31°C idle and 70°C max load; I was getting 40°C idle and 75°C gaming load. My system cooling is good: 4670K idling in the low 20s, mid 40s after 2+ hours of Prime95, mobo at <30°C.


----------



## Ashuiegi

No, the Matrix cards have low ASIC, at least the 7970 did. They even said it was on purpose, because those chips did better under LN2 and water.


----------



## Devildog83

The stuff I read from Asus must have meant the best chips for water. Guess you can't believe everything you read, even from the so-called pros.

Mine.


----------



## taem

I don't understand this idea of Matrix cards doing better on water. You design a 3-slot monster air cooler and charge a premium for it, with the idea that the end user is going to remove and toss that cooler and put a block on it? I thought water users went with reference boards.


----------



## TempAccount007

IMO the only draw of Matrix is the 1.4v


----------



## alancsalt

Quote:


> Originally Posted by *jorvie*
> 
> Hi,
> 
> After getting a r9 270x my screen started going white or grey and completely freezes up at random times. This never happens when gaming, only when just browsing the web or if its idle.
> 
> I upgraded from a hd 7850 and I have used an hd 7870 and gtx 660 within the last 2 weeks without any problems.
> 
> My powersupply is this
> http://www.silverstonetek.com/product.php?pid=253&area=en
> 
> It was working fine for a couple of days, after using guru3d driver sweeper, but then I got a blank screen again. I don't know if it's a coincidence but so far it has only happened while using chrome.
> Could this be a driver issue? I really don't want to RMA my card and wait weeks only to find out that the new card has the same problem and it was just a driver issue.
> 
> I have now tried the Catalyst V13.10 drivers from sapphire site and I still got the same problem.
> 
> The drivers work perfectly fine with my HD 7850 so I doubt it's a driver issue. The problem has never occurred during gaming so I doubt that it's because of my power supply.
> 
> My system is as follow
> 
> ASUS P8H77-I Mini ITX motherboard
> 8gb kingston 1600mhz ddr3 memory
> Intel i5-3570k processor
> Windows 7 Ultimate 64bit
> Sapphire DUAL-X R9 270X 2GB graphic card http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=0&psn=000101&lid=1&leg=0
> Silverstone ST45SF 450watt powersupply http://www.silverstonetek.com/product.php?pid=253&area=en
> AMD Catalyst™ 13.11 Beta9.2 / AMD Catalyst™ 13.10 both tested and still give me the same problem
> Google chrome is fully updated to 31.0.1650.57 m


Possibly?

If you're using BitDefender 2010 (and perhaps earlier) do this:

Right-click the BitDefender icon in the system tray (bottom right of the task bar)
Click on Settings to open the User Interface Settings window/box (top right)
Switch the UI to Expert mode
Close the User Interface Settings box
Click on Antivirus
Click on the Shield tab
Click on Advanced Settings to open BitDefender Active Virus Control Settings.
You need to add Chrome to the Exclusions list.
Click on the "+" button to bring up the Open executable files window/box.
Navigate your way through the folder hierarchy to C:\Documents and Settings\[Your Windows User Account]\Local Settings\Application Data\Google\Chrome\Application\Chrome.exe
Click on the Open button

Chrome is now in the Exclusions list and should now work perfectly.


----------



## Ashuiegi

It was more for LN2 than water; part of the bundle was for LN2, and at the time the Matrix 7970 came out, it broke a few records with extreme cooling.
I can remember a review that said the ASIC was low; they asked Asus, and Asus replied it was intended that way. The chips will clock lower on air, but the best ones under LN2 are nearly always low ASIC... plus, from what I can see of the max OCs in this thread, even on air it means nothing.
I have a Matrix 7970 at 60% ASIC that does 1280 MHz at stock voltage on air in Firestrike; my 280X Toxic is at 69% ASIC and it does 1250 MHz at stock voltage on air.


----------



## TempAccount007

Quote:


> Originally Posted by *Ashuiegi*
> 
> I can remember a review that said the ASIC was low; they asked Asus, and Asus replied it was intended that way.


Could you link a source?

Also, I thought the whole idea was that a leaky chip would perform well under water/ln2. Low ASIC chips aren't leaky. It could be that low ASIC chips perform better under extreme cooling... but it's not because they are leaky.


----------



## Ashuiegi

It was a review from when the Matrix 7970 came out, over a year ago, so I could try, but I'm not sure I'll find it; I went through a ton of them before buying mine, so I have no idea which one it was.
I'm rebuilding my main rig's watercooling loop today, so I won't have the time; maybe later in the week if I have spare time at work.

On the OC-on-air part: I say that because my 280X and my 7970 can do better at stock voltage than most cards in this thread, and the Matrix is at 60%, the Toxic at 69%, which are considered low. But my Matrix at 60% ASIC can do up to 1280 MHz at stock voltage and stay under 70°C no problem.
Also, from one install to another I've noticed GPU-Z doesn't read the same ASIC for the same card; my Matrix read 59% the first time I checked it, and now it says 60%. For all these reasons I think ASIC can't give any useful info on a card for us consumers. Maybe it helps the manufacturer choose cap and resistor values or something like that.


----------



## Warl0rdPT

What program are you guys using to test OC for artifacts? GPUTool?


----------



## Durvelle27

*List Updated*

Remember guys if you want to be added you have to pm the required info


----------



## sidp96

Info for list


----------



## TonytotheB

PM'd










MSI 280X Gaming Edition x 2 crossfired


----------



## Durvelle27

*List Updated*


----------



## TempAccount007

sidp96 PM'd me the ASIC of his card. It is 69%.

5 out of 5 DC2T's have an ASIC below 70%. 4 out of 5 are within 1% of each other.

Seems to me like there is binning going on.
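For what it's worth, the DC2T ASIC figures quoted in this thread so far can be summarized with a quick script; this is a minimal sketch that treats the posters' self-reported GPU-Z readings (67.8, ~60, 69, 66, 68) as the sample, so it says nothing rigorous about Asus' actual binning:

```python
# ASIC quality readings for the DC2T as reported in this thread.
# Self-reported GPU-Z figures, not a proper survey of the bin.
readings = [67.8, 60.0, 69.0, 66.0, 68.0]

mean = sum(readings) / len(readings)
spread = max(readings) - min(readings)

print(f"mean ASIC: {mean:.1f}%, spread: {spread:.1f}%")
print("all below 70%:", all(r < 70 for r in readings))
```

With this sample every reading lands below 70%, which is what makes the binning theory tempting, even if five cards is far too few to prove it.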


----------



## KeepWalkinG

Is there a way to remove vdroop on an R9 280X with the NCP4206?


----------



## TonytotheB

Quote:


> Originally Posted by *TempAccount007*
> 
> sidp96 PM'd me the ASIC of his card. It is 69%.
> 
> 5 out of 5 DC2T's have an ASIC below 70%. 4 out of 5 are within 1% of each other.
> 
> Seems to me like there is binning going on.


Mine is 66% for both my cards


----------



## Durvelle27

*List Updated*


----------



## Warl0rdPT

I'm having a problem while browsing with Firefox: sometimes the screen flickers and refreshes (it looks like it gets corrupted but quickly refreshes back to normal); this happens within a few ms.

It's not a problem with the site, since it happens on every page (this one included), and it happens a couple of times every hour...

Any idea what's going on or how to fix it?


----------



## neXen

Quote:


> Originally Posted by *Warl0rdPT*
> 
> I'm having a problem while browsing with Firefox: sometimes the screen flickers and refreshes (it looks like it gets corrupted but quickly refreshes back to normal); this happens within a few ms.
> 
> It's not a problem with the site, since it happens on every page (this one included), and it happens a couple of times every hour...
> 
> Any idea what's going on or how to fix it?


Are you using beta drivers?


----------



## Warl0rdPT

yes, amd_catalyst_13.11_betav9.2

I only installed the drivers, not CCC.


----------



## neXen

Quote:


> Originally Posted by *Warl0rdPT*
> 
> yes, amd_catalyst_13.11_betav9.2
> 
> I only installed the drivers, not CCC.


Try the most recent release drivers.

I had a similar issue with beta drivers.

I removed the GPU, uninstalled all GPU drivers, reseated the GPU, and installed the most recent release drivers.

Haven't had an issue in weeks.


----------



## ILLmatik94

Quote:


> Originally Posted by *sidp96*
> 
> Info for list


How are you able to adjust VDDC and memory voltage? I have the same card but GPU Tweak is only showing core/memory clocks and core voltage.


----------



## Devildog83

I am in, http://www.techpowerup.com/gpuz/dquhm/


----------



## TempAccount007

Quote:


> Originally Posted by *ILLmatik94*
> 
> How are you able to adjust VDDC and memory voltage? I have the same card but GPU Tweak is only showing core/memory clocks and core voltage.


I would like an answer as well. Also sidp96 what is the max core voltage for you? Does it stick if you monitor core voltage via gpu tweak? TELL US YOUR SECRETS


----------



## Durvelle27

*List Updated*


----------



## diggiddi

Any DC2T at 1300 MHz core yet? Or is that the domain of the Matrix Platinums and Toxics?


----------



## sidp96

Quote:


> Originally Posted by *TempAccount007*
> 
> I would like an answer as well. Also sidp96 what is the max core voltage for you? Does it stick if you monitor core voltage via gpu tweak? TELL US YOUR SECRETS


Believe it or not, I've monitored the core voltage and it does not stick there. That OC is stable for benching and stressing, but I have to raise it to 1250 if I want to play games without artifacts, which is weird because I get none in Kombustor. Due to the DCU2T-V2 being slightly different from the normal version, I can only raise my core voltage up to 1263.









Also, in order to get access to VDDC and other OC features, you have to go to Settings in GPU Tweak, then Tune; there are checkboxes to choose what you want to show.


----------



## TempAccount007

Quote:


> Originally Posted by *diggiddi*
> 
> Any DC2T at 1300 MHz core yet? Or is that the domain of the Matrix Platinums and Toxics?


Not me. I have tried flashing Matrix 7970 and 280X BIOSes to get more than 1.3 V, but haven't had success. Even when I set the voltage to 1.3 V, the GPU Tweak monitor shows 1.275 V as the max voltage before vdroop.

Best I can do is 1220mhz at 1.23 - 1.25 volts with vdroop and an ASIC quality of 68%. Temps at 100% fan (I play with headphones) are 60 - 63 °C.

Higher voltage or ASIC quality is needed to go further.


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> I am in, http://www.techpowerup.com/gpuz/dquhm/


That is FUGGIN SEXY!









If there's no non-reference R9 290 coming before January, I'm definitely gonna grab the Devil for my console build.


----------



## TempAccount007

Quote:


> Originally Posted by *sidp96*
> 
> Believe it or not, I've monitored the core voltage and it does not stick there, that oc is stable for benching and stressing but I have to raise it to 1250 if I want to play games without artifacts, which is weird because I get none on Kombustor. Due to the DCU2T-V2 being slightly different than the normal version, I can only raise my core voltage up to 1263
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, in order to get access to VDDC and other oc features you have to go to settings in GPU tweak and go to tune, there are checkboxes to choose what you want to show.


Ahhh, OK, I get it. I don't have the V2, so I can't see/change VDDCI or memory voltage. I would recommend flashing a different BIOS to see if you can raise your max voltage, but I haven't had any luck doing so with mine.


----------



## chiznitz

Hey all, I'm currently in the market for a 280X. I was looking at the HIS iPower IceQ models. There are 2 models available; the only difference I see is that one has a higher stock clock. Is there something about that card that will make it overclock higher, or can the cheaper $299.99 version reach the same clocks?

Which would be best to go with? Is there a completely better option out of the $299-320ish cards?

I'm on an amd fx 8350 and will be upgrading to a sabertooth motherboard soon.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> That is FUGGIN SEXY!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If there's no non-reference R9 290 coming before January, I'm definitely gonna grab the Devil for my console build.


If there was I may have grabbed the 290 myself as it is cheaper than the 2 cards I have and I could X-fire that later.

Firestrike.


----------



## diggiddi

I don't think the 280X can reach 290X performance unless extreme cooling is used; maybe someone else can chime in.


----------



## TempAccount007

Just played Paracel Storm... vdroop'd down to 1.2v on 100% load, even though I set VDDC to 1.3v.

To avoid artifacts, I have to lower clocks until the card is stable at 1.2 V. Paracel Storm is pretty much the only time my card is at a sustained 100% load where vdroop gets REALLY bad. For whatever reason, Furmark at 1200 MHz shows no artifacts and a smaller 70 - 80 mV vdroop despite 100% load.

Moral of story... more phases (DC2T marketing) don't mean less vdroop.
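The droop figures here are just set voltage minus sustained load voltage; a tiny helper makes the comparison explicit, using the 1.30/1.20 and 1.30/1.23 readings from this post:

```python
def vdroop_mv(set_v, load_v):
    """Voltage droop in millivolts: set VDDC minus sustained load VDDC."""
    return round((set_v - load_v) * 1000)

# Readings from the post above:
print(vdroop_mv(1.30, 1.20))  # Paracel Storm at 100% load -> 100 mV
print(vdroop_mv(1.30, 1.23))  # Furmark at 1200 MHz -> 70 mV
```

The point of writing it out is that the worst-case droop (100 mV) is what sets your stable clock, regardless of how many VRM phases the box advertises.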


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> If there was I may have grabbed the 290 myself as it is cheaper than the 2 cards I have and I could X-fire that later.
> 
> Firestrike.


Is that OC'd? If so, it seems low.

http://www.3dmark.com/fs/655248


----------



## Devildog83

It's at 1200/1400 - here is 1224/1400 with 4.8



Something's goofy here. I have lower GPU, CPU, and combined scores, but my overall score is higher.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> It's at 1200/1400 - here is 1224/1400 with 4.8
> 
> 
> 
> Something's goofy here. I have lower GPU, CPU, and combined scores, but my overall score is higher.


Higher OC than me but still a lower score by a good margin

My run was at 1200/1450


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> If there was I may have grabbed the 290 myself as it is cheaper than the 2 cards I have and I could X-fire that later.
> 
> Firestrike.


Hoping PowerColor pulls off a 290X/290 Devil edition.


----------



## darkelixa

When I play FFXIV: A Realm Reborn with my i5 4670K, Z87-UD3H, 16GB RAM, 128GB Samsung SSD, and Nvidia GTX 770 2GB, I get weird choppiness. The frame rate stays at 60+ FPS with vsync on, but on the monitor it looks like a little screen tearing / like the frame rate skips a beat. Will changing to this Gigabyte mainboard and an AMD 8350 solve the issue, or is it more of a GPU issue? I see a lot of problems with Nvidia GPUs and that game, so I don't know if I should jump ship and just buy an AMD video card.


----------



## Roaches

Quote:


> Originally Posted by *darkelixa*
> 
> When I play ffxiv a realm reborn with my i5 4670k, z87-ud3h 16gb ram,128gb samsung ssd nvidia 770gtx 2gb i get weird choppiness with my fps, the frame rate stays at 60+fps with vsync on but on the monitor it looks like a little screen tearing/looks like the frame rate jumps a beat, will chaning to this gigabyte mainboard and an amd 8350 solve the issue, or is it more so a gpu issue? I see alot of problems with nvidia gpus and that game so idk if i should change ship and just buy an amd video card


MMOs are very CPU intensive... My GTX 680 runs the benchmark smooth as butter, even though I never played the game...

You should create a sig rig so everyone can see your specs; it makes it easier to troubleshoot... Also, screenshots and charts of your frame rate over time and GPU usage help as well.


----------



## DiceAir

So I own 2x Club3D R9 280X royalKing edition cards. I would like to know if it's safe to overclock them. My full-load temp on GPU1 is about 81°C, and that's when playing BF4 with ultra everything [email protected]. It's just that I don't see any memory heatsinks installed, so I'm a bit scared I'll damage them. Wish I'd gone with 2x Sapphire Toxic cards.


----------



## smartdroid

Quote:


> Originally Posted by *Durvelle27*
> 
> Is that OC'd if so it seems low
> 
> http://www.3dmark.com/fs/655248


Actually they are both low results, even a single 280X can get in that ballpark.


----------



## DiceAir

Quote:


> Originally Posted by *smartdroid*
> 
> Actually they are both low results, even a single 280X can get in that ballpark.


That CPU is the bottleneck. Change to an Intel Haswell setup to power them.


----------



## smartdroid

Yes, I know; I used to have an FX-8350 myself, but even the graphics scores are not that high.


----------



## Rar4f

Has anyone dismantled their R9 280X and applied aftermarket thermal paste? If so, did it help a lot?


----------



## JoeChamberlain

Would like to join. Got the below card and I should have waited for the Sapphire TOXIC to come in stock, my card just won't overclock very high at all (1110/1700), still happy with the performance though...


----------



## Rar4f

Quote:


> Originally Posted by *JoeChamberlain*
> 
> Would like to join. Got the below card and I should have waited for the Sapphire TOXIC to come in stock, my card just won't overclock very high at all (1110/1700), still happy with the performance though...


Threaten the GPU.

"If you don't go higher then no thermal paste or air cooling for you young man!"

And watch as the clockspeed reaches a titan level lol.


----------



## DiceAir

Should I overclock my Club3D R9 280X royalKing edition? I have 2 of them in Crossfire. Temps in BF4 on ultra = 83°C max.


----------



## Rar4f

Quote:


> Originally Posted by *DiceAir*
> 
> Should i overclock my Club3d R9-280x Royal king edition? I have 2 of them in Crossfire. temps on bf4 ultra = 83C max


What resolution are you playing at, and what monitor have you got?


----------



## DiceAir

Quote:


> Originally Posted by *Rar4f*
> 
> What resolution are you playing at and monitor do you got?


2560x1440 @ 96Hz. Qnix qx2710


----------



## Rar4f

Quote:


> Originally Posted by *DiceAir*
> 
> 2560x1440 @ 96Hz. Qnix qx2710


If there is a game you play often and you're not reaching the FPS you want (60/96 FPS) at maximum settings, then yes, you should definitely overclock.
Make sure you've got plenty of cooling though.
Also make sure you have profiles that fit the games you want to play. For example, if you can get 60 FPS at maximum settings in Skyrim at 1000 MHz core and 1550 memory, then switch to that profile.
What would be the point of playing Skyrim at the highest frequency possible (e.g. an overclock of 1250 core / 1700 memory) when 1000 core and 1500 memory could get me the same performance?


----------



## DiceAir

Quote:


> Originally Posted by *Rar4f*
> 
> If there is a game that you play often, and your not reaching fps that you want (60 /96 fps) at maxium settings, then yes you should definetly overclock.
> Make sure you got plenty cooling though.
> Also make sure that you have profiles that fits games you want to play. For example if you can get 60 FPS and maxium settings for Skyrim on 1000 MHz core and 1550 memory, then switch to that profile.
> What would be the point be for me to play Skyrim at highest frequency possible (1250core/1700 memory), when e.g 1000 core and 1500 memory could get me the same performance


I have a Corsair Air 540 with 3x Cougar Vortex in front and a Corsair H100i. The CPU is not overheating, but the top GPU can hit 81-85°C max, and I feel that's too much for this case.


----------



## Rar4f

Quote:


> Originally Posted by *DiceAir*
> 
> i have corsair air540 with 3x cougar vortex in front and corsair h100i. CPu not overheating but top gpu can go 81C-85C max and i feel for this case it's to much


Like I said, if you don't need more performance for the games you play, there is no point in overclocking.
Something you can do to tackle the temperature is buy better thermal paste (if you haven't already) and apply it to the GPUs.


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Is that OC'd if so it seems low
> 
> http://www.3dmark.com/fs/655248


Is this about right? This is at stock, 1180/1400.

http://www.3dmark.com/3dm/1666206

It must be the new Beta 9.2 driver or something, but I still don't understand how the graphics and physics scores are higher while the combined and overall are lower. It could be how the CPU is overclocked, I don't know, but I am very happy with the graphics score, and the gameplay at 1200/1400 is freakin' awesome: no lag, no stutter, no flickering, and everything jumps out at you. I can see things I would never have been able to see before. I played BF3 at well over 100 FPS average; the low was about 85 FPS.


----------



## Devildog83

Maybe a bit of help here. I have everything synced up, or at least I think so, but my cards are not running the same clocks. One is at 1218/1399 and one is at 1100/1250, i.e. stock. The 7870 is at stock. What am I missing?
Edit: Fixed. I just put the 7870 on top and now I can set both cards.

http://www.3dmark.com/3dm/1668033


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> Maybe a bit of help here. I have everything synced-up, or at least I think so but my cards are not running the same clocks. 1 is at 1218/1399 and 1 is at 1100/1250 or stock. the 7870 is at stock. What am I missing?
> Edit: Fixed, I just put the 7870 on top and now I can set both cards.
> 
> http://www.3dmark.com/3dm/1668033


Much better


----------



## Durvelle27

*List Updated*


----------



## taem

Could folks with a Vapor X 280x report on temps and fan noise?


----------



## Mackem

I have a 280X. One of my monitors randomly turned itself off and back on again.

Also, I can't adjust the gamma in StarCraft II; no matter how high or low I put the slider, it refuses to brighten/darken. I'm reinstalling Windows when I get my SSD tomorrow anyway, so hopefully that will solve it.

EDIT: Hmm, the screen just went off and came back on again. Using 13.11 Beta 9.2. Any ideas what could be causing it?


----------



## darkelixa

This is what my GTX 770 does, so I have had enough and am changing to the AMD side.

The choppy video is without recording with Fraps, and the second is what it looks like while recording.

Here is the choppy one:

http://www.youtube.com/watch?v=RNbujPOPwIQ&feature=youtu.be

Here is the one while recording with Fraps; oddly, if you record, it runs smooth??

http://www.youtube.com/watch?v=KahPtO21Ekk&feature=youtu.be

Is Gigabyte or HIS the better of the AMD vendors?


----------



## Roaches

Quote:


> Originally Posted by *darkelixa*
> 
> this is what my 770gtx does so I have had enough and am changing to the AMD SIDE
> 
> The choppy video is not recording with fraps and the second is what it looks like while recording
> 
> Here is the choppy one,
> 
> http://www.youtube.com/watch?v=RNbujPOPwIQ&feature=youtu.be
> 
> Here is the a one while recording with fraps, oddly if you record it runs it smooth??
> 
> http://www.youtube.com/watch?v=KahPtO21Ekk&feature=youtu.be
> 
> Is the gigabyte or His the better side of the AMD vendors?


Looks like microstutter, since you're displaying high FPS; it's the same issue I had with Metro: Last Light a few months back before optimized drivers....
Show us a pic of your graphics card and model.... And Afterburner readings of your GPU usage when running the game (see below for example).



If you don't have MSI Afterburner and GPU-Z, I'd recommend installing them.... they're great for tracking your GPU health, etc....


----------



## darkelixa

I will install MSI Afterburner when I go home for lunch. So the only way to get proper performance in a game is to wait for a driver?


----------



## darkelixa

Also, the FPS drops in the video when not recording, while the Fraps one stays at 60 FPS?


----------



## Luckael

Can I run my Sapphire R9 280X OC with just one PCIe power cable? My PSU is a Corsair RM650.

The PSU side of the cable is 8-pin, and the other end of the cable is 2x 6+2-pin.

Thanks in advance.
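Not an answer on the RM650 specifically, but as a rough sanity check of the connector limits involved (75 W from the PCIe slot, 75 W per 6-pin, 150 W per 8-pin, per the PCIe spec); the ~250 W board-power figure below is an assumed worst case for an overclocked 280X, not a measured number:

```python
# Rough PCIe power-budget check for feeding a 280X from one
# daisy-chained cable with two 6+2 ends. Connector limits come
# from the PCIe spec; the 250 W draw is an assumed worst case.
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # one 6-pin connector
EIGHT_PIN_W = 150  # one 8-pin connector

assumed_draw_w = 250  # hypothetical OC'd 280X board power

budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # card with 6-pin + 8-pin inputs
print(budget, budget >= assumed_draw_w)
```

Whether one physical cable should carry both connectors' current is a separate question about the cable's wire gauge, so check Corsair's documentation for the RM650 rather than relying on this arithmetic alone.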


----------



## darkelixa

GPU usage in windowed borderless only goes to about 50%, and it's still jittery. I cannot for the life of me get smooth gameplay in fullscreen in this game on it.

Anyone with an R9 280X play FFXIV?


----------



## Roaches

Quote:


> Originally Posted by *darkelixa*
> 
> 
> 
> Gpu usage on windowed borderless, only goes to about 50% still jittery. Can not for the life of me get smooth gameplay on fullscreen in this game on it.
> 
> Anyone with a r9 280x play ffxiv?


Really odd, because my sister's rig with a GTX 660 in her X58 system runs FFXIV smooth with custom max settings....

My brother's rig runs FFXIV fine as well with his XFX 7850 DD, with minor visual artifacts after long hours of play when temps start to reach the low 70s Celsius...

Have you gotten around to installing MSI Afterburner? It's harder to read the charts from GPU-Z than from MSI Afterburner...
I normally use GPU-Z for VRM sensors in a stress-test scenario...


----------



## darkelixa

I had MSI Afterburner installed, yes, but forgot to take a screenshot with it; the graphs did look pretty much the same, maxing out at about 50% usage. Maybe the game is not optimised to run well on the 7-series cards, as my mate says he can run the game on max settings with no issues on his GTX 660 Ti.

I'll probably just jump the gun, buy a new graphics card, and cross my fingers it works!! I have an old 5850 I could try out, but I doubt it would handle fullscreen settings as it's an old card?


----------



## Roaches

Quote:


> Originally Posted by *darkelixa*
> 
> I had msi afterburner installed yes but forgot to take a screenshot with it, yes the graphs did look pretty much the same, maxing out at about 50% usuage. Maybe the game is not optimised to run well on the 7 series cards I think as on my mates 660gtx ti he can run the game on max settings no issues he said.
> 
> Ill prob just jump the gun and buy a new graphics card and cross my fingers it works!! I have an old 5850 i could try out but i doubt it would do fullscreen settings as its an old card?


Could be a driver issue. My bro's PC is running a Phenom II X4 965 at stock and it's smooth as butter with his XFX 7850 DD....
I'd have expected better from Nvidia; maybe you can try posting a topic in the Nvidia General forums on OCN to see if anyone else with a GTX 7XX is having the same issue as you....


----------



## oenone

I have an X6 1090T processor

and am planning to buy an R9 280X.

Will the processor bottleneck the video card?

Or should I settle for a 270X instead?


----------



## darkelixa

FFXIV running at 60 FPS, standard desktop quality, windowed borderless.


----------



## Prospector

Hi there. My friend bought an MSI R9 270X Hawk edition, and it has only one issue: the famous 2D flickering. It's really annoying; we tried many things to fix it (clean format, clean driver install from the start), and at a random moment you can see 2-3 refresh-rate changes in less than 10 seconds! Also, no BIOS has been released from MSI to fix this issue. Can you suggest any solution for this problem?


----------



## Devildog83

Quote:


> Originally Posted by *Prospector*
> 
> HI there. My friend bought Msi R9 270X Hawk edition,and it has only one issue. The famous flickering at 2D,its really annoying because we tried many things to fix this (clean format,clean driver from the start) and at a random momment you can see 2-3 refresh rates at less than 10seconds! Also no bios from Msi has been released to fix this issue. Can you suggest any solution for that problem???


I had the flickering with my 7870 but when I added the 270x and installed the new beta 9.2 driver it's all gone. I don't know for sure if it's the driver or not. If he doesn't have the beta 9.2 try that.


----------



## Prospector

Yes, I installed 13.11 Beta 9.2. I was hoping too that the problem would be solved, but it happens again and again.. :/ I think it's a common issue with some 270X/280X cards, and some companies released a BIOS fix but MSI didn't; that's what I'm pissed off about..


----------



## Devildog83

Quote:


> Originally Posted by *Prospector*
> 
> Yes i installed 13.11 beta 9,2,i was hopping too that the problem will be solved,but ge it happens again and again.. :/ I think its a common issue with some 270x/280x and some companys released bios fix and msi didnt,that what am pissed off..


Where did he buy it? He may be able to RMA it and get a different one. I know it's $40 more, but the 270X Devil is very sweet.


----------



## Amhro

Quote:


> Originally Posted by *oenone*
> 
> i have an x6 1090t processor
> 
> planning to buy a r9 280x
> 
> will the processor bottleneck the video card ?
> 
> or should i settle for a 270x instead?


Yes, you will be fine; any overclock will help, however.
I have a similar CPU and am planning to go for a 280X, or even a 290 if non-reference cards make it out this month.


----------



## Bugdriver

Hey Guys,

first Post - so hello from outer Space.







I've had a Gigabyte 280X Windforce OC Edition Rev. 1.0 for two weeks now, F31 BIOS (no update so far), 13.11 Beta V9.2 - no problems, apart from Battlefield / Bugfield 4. Just a little question for you guys: my VDDC voltage is just 1.094 V (GPU-Z) under heavy load - anybody here who has the same reading? Looks a little low to me compared to other OC'd 280X cards.

Thanks, Marc


----------



## diggiddi

Quote:


> Originally Posted by *oenone*
> 
> i have an x6 1090t processor
> 
> planning to buy a r9 280x
> 
> will the processor bottleneck the video card ?
> 
> or should i settle for a 270x instead?


Overclock that 1090t, Minimum 4.1GHZ if possible, but the higher the better


----------



## darkelixa

For the fun of it, pulled out my 770gtx , put the 5850 back in and voila, after installing the drivers game runs at 60fps no issues. Start up windows sounds also do not work with the 770 installed but do when i put the 5850 in. I think its definatly time to invest in a new ati card


----------



## TempAccount007

Quote:


> Originally Posted by *Bugdriver*
> 
> Hey Guys,
> 
> first Post - so hello from outer Space.
> 
> 
> 
> 
> 
> 
> 
> I've had a Gigabyte 280X Windforce OC Edition Rev. 1.0 for two weeks now, F31 BIOS (no update so far), 13.11 Beta V9.2, with no problems apart from Battlefield / Bugfield 4. Just a little question for you guys: my VDDC voltage is just 1.094 (GPU-Z) under heavy load. Anybody here with the same reading? Looks a little low to me if I compare it to other OC 280X cards.
> 
> Thanks, Marc


I get up to 80mV of vdroop on my DC2T card. So whatever Gigabyte locked your card's voltage to, minus vdroop, equals your VDDC under load.
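That arithmetic can be sketched in a few lines. Note the 1.174 V locked set-point below is a hypothetical figure chosen for illustration; only the 80 mV droop and the 1.094 V GPU-Z reading come from the posts above:

```python
# Sanity check on a load VDDC reading: the card sees its locked set-point
# minus whatever vdroop it suffers under load.
def load_vddc(set_voltage_v: float, vdroop_mv: float) -> float:
    """Observed VDDC under load: locked set-point minus droop."""
    return set_voltage_v - vdroop_mv / 1000.0

# A hypothetical ~1.174 V locked set-point minus 80 mV of droop lands
# right on the 1.094 V that GPU-Z reported.
print(round(load_vddc(1.174, 80), 3))  # 1.094
```

So a reading like 1.094 V under load doesn't necessarily mean a low set-point; it may just be normal droop.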


----------



## GTR Mclaren

QUESTION for 270X users:

Is the voltage locked on your cards right now?

I have been reading some reviews; with a small OC the card can keep up with a 760, like only 5% behind.

But those reviews mention that the voltage is locked, waiting for an update to "unlock" it.

Is it now?

I ask because with unlocked voltage the 270X could maybe equal or pass a 760.

At $50 less, I would then consider the 270X more.


----------



## Devildog83

Quote:


> Originally Posted by *GTR Mclaren*
> 
> QUESTION for 270X users:
> 
> Is the voltage locked on your cards right now?
> 
> I have been reading some reviews; with a small OC the card can keep up with a 760, like only 5% behind.
> 
> But those reviews mention that the voltage is locked, waiting for an update to "unlock" it.
> 
> Is it now?
> 
> I ask because with unlocked voltage the 270X could maybe equal or pass a 760.
> 
> At $50 less, I would then consider the 270X more.


Mine is locked, but it's clocked so high already that I am running close to stock anyway. It reads about 1.260V at full load and has hit almost 140W power draw at that. My 7870 at 1.30V draws about 130W at full load.


----------



## Crowe98

What would you guys do? The site I currently use is selling off their 7950s: $279 for a Sapphire 7950 Vapor-X OC Boost.


----------



## Devildog83

Quote:


> Originally Posted by *Crowe98*
> 
> What would you guys do? The site I currently use is selling off their 7950s: $279 for a Sapphire 7950 Vapor-X OC Boost.


http://www.youtube.com/watch?v=tW_hIJRJsxo


----------



## mikeyzelda

Here's mine: Gigabyte 280X OC. Forgot to take a pic, hope a screen of GPU-Z will be enough. Seems I got one of them "new" ones.


----------



## darkelixa

I'm looking to buy the HIS R9 280X http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25341

It says it needs a minimum 750W PSU, which I have. Will that be big enough to power the card, or do I have to buy a bigger PSU? I see Asus and Gigabyte list much lower PSU requirements.

http://www.pccasegear.com/index.php?main_page=product_info&cPath=15_1470&products_id=18459

That's my PSU


----------



## Roaches

Quote:


> Originally Posted by *darkelixa*
> 
> I'm looking to buy the HIS R9 280X http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25341
> 
> It says it needs a minimum 750W PSU, which I have. Will that be big enough to power the card, or do I have to buy a bigger PSU? I see Asus and Gigabyte list much lower PSU requirements.
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=15_1470&products_id=18459
> 
> That's my PSU


You should be fine, 750W is overkill for a single GPU

See
http://www.tomshardware.com/reviews/radeon-r9-280x-r9-270x-r7-260x,3635-18.html


----------



## Crowe98

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *darkelixa*
> 
> I'm looking to buy the HIS R9 280X http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25341
> 
> It says it needs a minimum 750W PSU, which I have. Will that be big enough to power the card, or do I have to buy a bigger PSU? I see Asus and Gigabyte list much lower PSU requirements.
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=15_1470&products_id=18459
> 
> That's my PSU






Straya'. Me too.


----------



## Devildog83

Quote:


> Originally Posted by *darkelixa*
> 
> I'm looking to buy the HIS R9 280X http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25341
> 
> It says it needs a minimum 750W PSU, which I have. Will that be big enough to power the card, or do I have to buy a bigger PSU? I see Asus and Gigabyte list much lower PSU requirements.
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=15_1470&products_id=18459
> 
> That's my PSU


Yes, that's good if it's anything close to a Corsair or Seasonic Gold. The power draw is just a bit over 200W. I am drawing almost 290W with a 270x and a 7870 in X-Fire, with a Seasonic 660 Platinum and a 125W TDP CPU, and I am doing just fine. I don't know the brand but it should be OK.

Edit: I am not a power supply expert, but I looked closer at the specs for your PSU and noticed that it has four 12V rails. From what I hear a single 12V rail is better for stable power delivery. It's something to think about.


----------



## BenDer-X

Im soo in love with 280X but currently facing minor issues before buying the card









Hope everything will work well *keeping fingers crossed*


----------



## candy_van

Quote:


> Originally Posted by *Devildog83*
> 
> Yes, that's good if it's anything close to a Corsair or Seasonic Gold. The power draw is just a bit over 200W. I am drawing almost 290W with a 270x and a 7870 in X-Fire, with a Seasonic 660 Platinum and a 125W TDP CPU, and I am doing just fine. I don't know the brand but it should be OK.
> 
> Edit: I am not a power supply expert, but I looked closer at the specs for your PSU and noticed that it has four 12V rails. From what I hear a single 12V rail is better for stable power delivery. It's something to think about.


Single vs. multi-rail setups make no difference; there are a few excellent articles on this by jonnyGURU (an excellent PSU source, btw).
As for the initial question, even a quality (key word) 450W unit would be sufficient; if there are no plans to go CF down the road, you won't need more than a good 500W unit.


----------



## PsYcHZ

Quote:


> Originally Posted by *mikeyzelda*
> 
> Here's mine: Gigabyte 280X OC, forgot to take a pic, hope a screen of gpu-z will be enough, seems i got one of them "new" ones


How do you know if it's one of the newer ones? Just interested, my 280X has exactly the same info in GPU-Z. Oh, and what ASIC quality is yours?


----------



## darkelixa

So it's not the amperage of the 12V rail but just the pure wattage of the PSU? It's a 750W Gold PSU with a maximum output of 720W, with 18A on each of its 4 rails, so that would be 60 amps combined?


----------



## candy_van

No, amperage still matters; I was saying that multi vs. single rail doesn't (they're just two different ways to get the chicken cooked).

As said before, your PSU will be more than enough for a 280X; you could even go for a CF setup and be fine.
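The rail arithmetic from the question above can be sketched quickly. The 18 A, 4-rail, and 720 W figures are taken from the post; the point is that combined 12 V capacity is capped by total wattage, not by summing the per-rail ratings:

```python
# Back-of-the-envelope 12V capacity check for a multi-rail PSU.
RAILS = 4
AMPS_PER_RAIL = 18        # per-rail rating, from the PSU label
MAX_COMBINED_W = 720      # maximum combined output quoted for the unit

raw_amps = RAILS * AMPS_PER_RAIL                     # 72 A if every rail could peak at once
combined_amps = min(raw_amps, MAX_COMBINED_W / 12)   # capped by total wattage: 60.0 A

print(raw_amps, combined_amps)  # 72 60.0
```

So the "60 amps" figure in the question is right: 720 W / 12 V = 60 A combined, even though the rails sum to 72 A on paper.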


----------



## mikeyzelda

Quote:


> Originally Posted by *PsYcHZ*
> 
> How do you know if its one of the newer ones? Just interested, my 280x has exactly the same info in gpu-z. Oh and what asic quality is yours?


The first 280Xs released showed Radeon HD 7970 as the name in GPU-Z, with a release date of Dec 2011, if I'm not mistaken. My ASIC quality? 67.2%, looks a bit low, doesn't it? But the GPU runs flawlessly for me, so I don't care.


----------



## ignus1212

Quote:


> Originally Posted by *taem*
> 
> Could folks with a Vapor X 280x report on temps and fan noise?


I just got this today. It runs pretty quiet (fan speed tops out around 40-54% without tweaking), around 37°C idle (I live in a tropical country), 60-62°C load (Valley benchmark), and I'm seeing 55°C-ish in normal gaming (Guild Wars 2).


----------



## taem

Quote:


> Originally Posted by *ignus1212*
> 
> I just got this today. It runs pretty quiet (fan speed tops out around 40-54% without tweaking), around 37°C idle (I live in a tropical country), 60-62°C load (Valley benchmark), and I'm seeing 55°C-ish in normal gaming (Guild Wars 2).


Thanks for the info. I want this card, but some reviews were saying it's really noisy. Those temps are great. I live in a temperate region on the coast; with my Asus TOP I was getting 38-40°C idle and 75°C gaming load, in a system that idles in the low 20s for CPU and MB. Some guy earlier was telling me I was delusional for wanting a stock overclocked 280X that runs in the 60s, but it sounds like that's what you're getting.


----------



## TempAccount007

Quote:


> Originally Posted by *taem*
> 
> Thanks for the info. I want this card, but some reviews were saying it's really noisy. Those temps are great. I live in a temperate region on the coast; with my Asus TOP I was getting 38-40°C idle and 75°C gaming load, in a system that idles in the low 20s for CPU and MB. Some guy earlier was telling me I was delusional for wanting a stock overclocked 280X that runs in the 60s, but it sounds like that's what you're getting.


At 100% fan speed the max my DC2T ever reaches is 63 with the core at 1220MHz @ 1.3V.

If you want those temps without fan speed/noise and without water, then I would say you are delusional.


----------



## taem

Quote:


> Originally Posted by *TempAccount007*
> 
> At 100% fan speed the max my DC2T ever reaches is 63 with the core at 1220MHz @ 1.3V.
> 
> If you want those temps without fan speed/noise and without water, then I would say you are delusional.


You're not the first to tell me that, like I said. But that guy I quoted is getting unbelievable temps at moderate fan speed. And there are reviews that report the thermal and acoustic performance I'm looking for.

Tom's Hardware has the HIS Turbo gaming in the high 60s at 41.7dB.

Kitguru has the Vapor-X gaming at 63°C at 34.3dB. (Too good to be true, no?)

(Can't provide links, this post disappears every time I switch tabs on my iPad; the reviews are easy to find.)

In any event I think the Asus TOP I returned was a bad performer: 40°C idle and 75°C gaming load in a cool system? That's not completely out of line with reviews, but it's at the poorer end of the spectrum. Almost all the reviews report 30-32°C idle, though gaming load seems a consistent 72-75°C at 42% fan speed, which is inaudible.

Edit to add: I'm talking about running stock, not OC'd like you're doing. I was getting 40/75°C at the stock 1070MHz, 1.2V.


----------



## bloodyredd

I would like to ask if my PSU would be able to handle this setup since I plan to upgrade my video card by december.

CPU: Intel i5-3300
GPU: Sapphire 280X Toxic
Mobo: ASRock z77 Pro 3
SSD: Intel 330 120gb
HDD: Seagate 500Gb

FANS: 4x Corsair SP120 and 1 Stock fan

PSU: Seasonic M12II 520w Bronze


----------



## candy_van

Quote:


> Originally Posted by *bloodyredd*
> 
> I would like to ask if my PSU would be able to handle this setup since I plan to upgrade my video card by december.
> 
> CPU: Intel i5-3300
> GPU: Sapphire 280X Toxic
> Mobo: ASRock z77 Pro 3
> SSD: Intel 330 120gb
> HDD: Seagate 500Gb
> 
> FANS: 4x Corsair SP120 and 1 Stock fan
> 
> PSU: Seasonic M12II 520w Bronze


Absolutely, it will handle that without a problem


----------



## TempAccount007

@taem

I can understand your desire for the best possible acoustics and temperatures. These cards aren't cheap and no one should keep a card they aren't satisfied with.

My card is an Iron Egg product, so I don't have to commit to it until the end of January.

Are you going to go with a different brand? Let us know how the replacement is.


----------



## taem

Quote:


> Originally Posted by *TempAccount007*
> 
> @taem
> 
> I can understand your desire for the best possible acoustics and temperatures. These cards aren't cheap and no one should keep a card they aren't satisfied with.
> 
> My card is an Iron Egg product so I don't have to commit to it until the end of January,
> 
> Are you going to go with a different brand? Let us know how the replacement is.


I always go Iron Egg for purchases like this, and I should have held on to my card a while.

At this point I think I've read every 280X review, and I think the Asus TOP is the pick of the litter, so I might go back to it. Maybe I'll get one that's slightly cooler. The Vapor-X is the only other one that interests me (sexy backplate!!), but most of the reviews put it at near 60dB at load, with Kitguru being an outlier. I think the stock fan curve ramps to 70% very quickly though; maybe if I set it under 50% I could get near-TOP quietness without temps getting too high? That's why I ask users for info on the card. I wish the custom 290s would come out soon so I can see if I want to go that route instead. I'm also limited by my case: the Toxic and HIS are too long, and even the Asus forces me to rotate the HDD cage.

But the Asus TOP is a really nice card. I dunno whether seeking 60s temps is delusional, but looking for a better deal than this card might be.


----------



## darkelixa

Had a bit of a panic attack earlier lol

My BIOS says that my +12V rail is running at 12.24V

When I use the software HWMonitor it says 3.36V on the +12V rail

Is this just a software bug, or is the PSU/graphics card going bad??


----------



## ignus1212

Quote:


> Originally Posted by *taem*
> 
> You're not the first to tell me that, like I said. But that guy I quote is getting unbelievable temps at moderate fan speed. And there are reviews that report the thermal and acoustic performance I'm looking for.
> 
> Tom's Hardware has the HIS Turbo gaming in high 60s at 41.7dB.
> 
> Kitguru has the Vapor X gaming at 63c at 34.3dB. (Too good to be true, no?)
> 
> (Can't provide links, this post disappears every time I switch tabs on my ipad, reviews are easy to find.)
> 
> In any event I think the Asus TOP I returned was a bad performer, 40c idle and 75c gaming load in a cool system? That's not completely out of line with reviews but it's at the poorer end of the spectrum. Almost all the reviews report 30-32c idle. Though, gaming load seems a consistent 72-75c at 42% fan speed which is inaudible.
> 
> Edit to add, I'm talking about running stock not oc'd like you're doing. I was getting 40/75c at the stock 1070mHz 1.2v.


Maybe it's a bad TIM application by the factory? I have a PowerColor 6950 with unlocked shaders that was running 90°C on full load but got down to the 70-80s after reapplying the TIM.


----------



## toshevopc

Does anyone know anything about the R9 280 release date, or anything related to it?


----------



## Rar4f

Quote:


> Originally Posted by *toshevopc*
> 
> Does anyone know anything about the R9 280 release date, or anything related to it?


It has been released







Long time ago it was.


----------



## Warl0rdPT

What frequency are you guys being able to achieve on a 280X?


----------



## DizzlePro

Has anyone been able to unlock their Toxic's voltage yet?


----------



## Ashuiegi

I'm pretty happy with my Toxic's stock cooler. It sits above my Matrix, which is under water, so it has maybe 1.5cm of clearance or less, and on the stock fan profile I can keep it under 70 @ 1150 with no extra noise.
I managed to up the voltage to 1.3V in Trixx, but I had issues with my Matrix and couldn't sync the cards, so I just gave up on it for the moment; I couldn't get a higher clock from it anyway.
In CF I can run them at 1220MHz, but alone my Toxic does 1250, and the Matrix 1280, on stock voltage.
My Matrix has not been able to achieve a higher clock by going from the stock 1.256V to 1.3V, so I'm not even sure it's worth it.


----------



## Devildog83

Even my 270X Devil is voltage locked. Even at stock volts it can get up there on the memory, although I will never use it because of X-Fire.


----------



## diggiddi

Quote:


> Originally Posted by *Ashuiegi*
> 
> I'm pretty happy with my Toxic's stock cooler. It sits above my Matrix, which is under water, so it has maybe 1.5cm of clearance or less, and on the stock fan profile I can keep it under 70 @ 1150 with no extra noise.
> I managed to up the voltage to 1.3V in Trixx, but I had issues with my Matrix and couldn't sync the cards, so I just gave up on it for the moment; I couldn't get a higher clock from it anyway.
> In CF I can run them at 1220MHz, but alone my Toxic does 1250, and the Matrix 1280, on stock voltage.
> My Matrix has not been able to achieve a higher clock by going from the stock 1.256V to 1.3V, so I'm not even sure it's worth it.


Did you ever run them in CF on air? If so, what frequencies and temps?


----------



## toshevopc

Quote:


> Originally Posted by *toshevopc*
> 
> Does anyone know anything about the R9 280 release date, or anything related to it?


Quote:


> Originally Posted by *Rar4f*
> 
> It has been released
> 
> 
> 
> 
> 
> 
> 
> Long time ago it was.


I mean the replacement for the Radeon 7950: just the R9 280 (not the 280X).


----------



## taem

Quote:


> Originally Posted by *ignus1212*
> 
> Maybe it's a bad TIM application by the factory? I have a PowerColor 6950 with unlocked shaders that was running 90°C on full load but got down to the 70-80s after reapplying the TIM.


I'd love to reapply TIM, but AFAIK it almost always kills the warranty. Some cards even have stickers or some other system for making sure you can't take the cooler off. I've heard you can call XFX and get their permission to reapply TIM, or even put a water block on, while keeping your warranty. It's something I would do as soon as the warranty expired, though. That's only 1 year, 2 at most, these days anyway.


----------



## darkelixa

Putting an order in next week for an R9 280X, just need to decide between Gigabyte, XFX, or HIS.


----------



## Devildog83

Quote:


> Originally Posted by *darkelixa*
> 
> Putting an order in next week for an R9 280X, just need to decide between Gigabyte, XFX, or HIS.


Don't you like Asus, MSI, or PowerColor?


----------



## Warl0rdPT

When I stress test with Kombustor my PC just reboots after a few seconds. What do you guys think, PSU?


----------



## NinjaToast

Just ordered my Asus DC2T today, I am excite.


----------



## Ashuiegi

Yes, in the first test both cards were on air. I was at 70 on the Matrix on the bottom and 75 on the Toxic on top with the stock fan profile; they both run at 1150/1650 for everyday use.
Now only the Matrix is under water; the Toxic still has the stock cooler.


----------



## SupahSpankeh

Hmm.

Was rock solid at 1.238V/1200MHz (no mem OC) and +20% power, but this morning I'm really not. Backed clocks down to 1170 but it still crashes to desktop in a lot of games, or locks up in Furmark.

What do you think's happened? Have I lightly damaged the card or something?

EDIT: ASIC quality is still 71%, if that helps...


----------



## SupahSpankeh

Update: Got it pretty stable at 1150/1.25v.

This is really odd. Any advice?


----------



## hoevito

Just figured I'd share... the 9.4 beta has made a pretty significant improvement to my GPU temps. I'm currently running crossfire with a DC2 TOP on top and a Matrix Platinum on bottom; with the DC2 being the primary card, and one full slot separating them, it would spike around 95°C when overclocked to 1150MHz. These new drivers brought my max temp down to 78°C!!! A pretty nice improvement I must say, with no other changes made to my system other than the drivers. Anyone else have any similar experiences after updating?


----------



## Rar4f

That is nice!


----------



## diggiddi

@Ashuiegi Thx for feedback

@Hoevito good for ya







I still can't install the newer drivers because of BSODs, so I'm stuck on the WHQL release


----------



## sonic2911

Newegg doesn't have the BF4 bundle with the Asus 280X -,- do you guys know which store does?


----------



## Rar4f

Quote:


> Originally Posted by *sonic2911*
> 
> Newegg doesn't have the BF4 bundle with the Asus 280X -,- do you guys know which store does?


XFX and the Sapphire Dual-X are the only cards that were officially said to come with BF4.


----------



## radier

MSI too.

Sent from my GT-N7000 using Tapatalk


----------



## radier

http://event.msi.com/vga/2013/R9_series_FB4/

Sent from my GT-N7000 using Tapatalk


----------



## Shurtugal

Does anyone know what the temps would be with two Matrix cards (next to each other, 3-slot cards in 6 slots)?
At stock clocks, would they stay under 80 degrees Celsius?
Cheers


----------



## darkelixa

With my Lian Li PC-A71B I don't think the HIS will fit into the PCIe slot, as it has a tool-less holder and the clips may hit the top of the cooler, so I might have to go with a different design. I think Gigabyte are the only ones that don't have a weird cooler that sticks up very high near where you screw the card down.


----------



## SupahSpankeh

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Hmm.
> 
> Was rock solid at 1.238v/1200MHz (no mem OC) and +20% power but this morning I'm really not. Backed clocks down to 1170 but still crashes to desktop in a lot of games, or locks up in furmark.
> 
> What do you think's happened? Have I lightly damaged the card or something?
> 
> EDIT: Asic quality is still 71% if that helps...


Shabumply?


----------



## mAs81

Hey guys, I ordered this card:
MSI Radeon R9 280X OC Gaming edition 3GB..
I know it's not the best of the 280 bunch, but it was the nearest to my funds..
Any thoughts??


----------



## oenone

Will Project Mantle only support the new R9, R7, etc. models of ATI video cards?

Or will it also support older models like the 5870?


----------



## TonytotheB

7XXX series and RX only


----------



## Devildog83

Quote:


> Originally Posted by *oenone*
> 
> Will Project Mantle only support the new R9, R7, etc. models of ATI video cards?
> 
> Or will it also support older models like the 5870?


I know it's supposed to be an open platform, but I am not sure you will be able to unless they release driver or BIOS updates to do so, which I doubt. At some point even NVIDIA could take advantage of it with future cards.


----------



## Devildog83

Quote:


> Originally Posted by *TonytotheB*
> 
> 7XXX series and RX only


That's good to know, my 7870 and 270x will be able to take advantage of it.


----------



## Rar4f

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys, I ordered this card:
> MSI Radeon R9 280X OC Gaming edition 3GB..
> I know it's not the best of the 280 bunch, but it was the nearest to my funds..
> Any thoughts??


I saw some good temps under load for the Twin Frozr. So if that's the cooler you got, then nice. The card looks nice as well.


----------



## mAs81

Quote:


> Originally Posted by *Rar4f*
> 
> I saw some good temps under load for the Twin Frozr. So if that's the cooler you got, then nice. The card looks nice as well.


I think that's the one:
http://www.msi.com/product/vga/R9-280X-GAMING-3G.html
Those were my thoughts exactly, even though the Asus 280X is a lot quieter at full load
(still 20-30 euros more expensive)


----------



## Rar4f

Quote:


> Originally Posted by *mAs81*
> 
> I think that's the one:
> http://www.msi.com/product/vga/R9-280X-GAMING-3G.html
> Those were my thoughts exactly, even though the Asus 280X is a lot quieter at full load
> (still 40 euros more expensive)


If you use headphones, noise will be irrelevant. The R9 280X Toxic, which is one of the best R9 280Xs if not the best, makes some noise.
I also read that the MSI R9 280X was equipped with the new Tahiti chip (XTL) early on. XTL was said to be more power efficient and to run cooler.


----------



## mAs81

Quote:


> Originally Posted by *Rar4f*
> 
> If you use headphones, noise will be irrelevant. The R9 280X Toxic, which is one of the best R9 280Xs if not the best, makes some noise.
> I also read that the MSI R9 280X was equipped with the new Tahiti chip (XTL) early on. XTL was said to be more power efficient and to run cooler.


Thanks man, did not know that....
As for the headphones, you're spot on!! My last GPU was an Asus Radeon 5770!!!
It was almost like I had a jet airplane in my tower!!!! Decent graphics though.....


----------



## Rar4f

Quote:


> Originally Posted by *mAs81*
> 
> Thanks man, did not know that....
> As for the headphones, you're spot on!! My last GPU was an Asus Radeon 5770!!!
> It was almost like I had a jet airplane in my tower!!!! Decent graphics though.....


What price did you buy the MSI for again? And what are the prices for the Matrix and Toxic 280Xs?


----------



## Warl0rdPT

Quote:


> Originally Posted by *sonic2911*
> 
> Newegg doesn't have bf4 bundle with the asus 280x -,- do u know what store have guys?


I don't think Asus has any 280X bundle with BF4


----------



## mAs81

Quote:


> Originally Posted by *Rar4f*
> 
> What price did you buy the MSI for again? And what are the prices for the Matrix and Toxic 280Xs?


Well, I live in Greece, so things are a little overpriced over here..
I bought the MSI for 339€; the Matrix starts from 365€, and I can't find the Toxic at any sellers here at the moment..
So, yeah, you get the idea, right?


----------



## Rar4f

Quote:


> Originally Posted by *mAs81*
> 
> Well, I live in Greece, so things are a little overpriced over here..
> I bought the MSI for 339€; the Matrix starts from 365€, and I can't find the Toxic at any sellers here at the moment..
> So, yeah, you get the idea, right?


Yeah, it's expensive here too. I asked because I thought that if the price difference between the MSI and a Matrix/Toxic was not big, you could get one of those instead.


----------



## mAs81

Quote:


> Originally Posted by *Rar4f*
> 
> Yeah, it's expensive here too. I asked because I thought that if the price difference between the MSI and a Matrix/Toxic was not big, you could get one of those instead.


Well, the price difference between the two isn't that high per se, but I prefer to use those extra euros on buying more RAM, if you know what I mean...
What really bugs me is that, unfortunately, even though my MB says it supports Xfire, the second PCIe slot is x4 (my MB is a Gigabyte Z87M-D3H),
so even when the prices go down, no crossfire for me... :'(


----------



## Rar4f

Quote:


> Originally Posted by *mAs81*
> 
> Well, the price difference between the two isn't that high per se, but I prefer to use those extra euros on buying more RAM, if you know what I mean...
> What really bugs me is that, unfortunately, even though my MB says it supports Xfire, the second PCIe slot is x4 (my MB is a Gigabyte Z87M-D3H),
> so even when the prices go down, no crossfire for me... :'(


I think the second slot is just less efficient.
The x16 slot is PCI Express 3.0, while the x4 is PCI Express 2.0.

How significant the difference is, I do not know, but it definitely seems like it supports Xfire.

EDIT: Here is a performance pic
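To put the x16 3.0 vs. x4 2.0 comparison in rough numbers, the theoretical link bandwidths can be computed from the standard per-lane rates (PCIe 2.0 is 500 MB/s per lane with 8b/10b encoding, PCIe 3.0 roughly 985 MB/s with 128b/130b). Note this is raw link bandwidth, not game performance:

```python
# Theoretical PCIe link bandwidth from the standard per-lane data rates.
PER_LANE_MB_S = {"2.0": 500.0, "3.0": 984.6}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Aggregate one-direction bandwidth of a link, in MB/s."""
    return PER_LANE_MB_S[gen] * lanes

x16_gen3 = link_bandwidth("3.0", 16)   # ~15754 MB/s
x4_gen2 = link_bandwidth("2.0", 4)     # 2000 MB/s
print(round(x16_gen3 / x4_gen2, 1))    # roughly an 8x gap in raw bandwidth
```

In practice the gap in game FPS is far smaller than the raw bandwidth gap, since GPUs rarely saturate the link, which is why the benchmark comparisons discussed here show only modest losses in most titles.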


----------



## mAs81

Quote:


> Originally Posted by *Rar4f*
> 
> I think the second slot is just less efficient.
> The x16 slot is PCI Express 3.0, while the x4 is PCI Express 2.0.
> 
> How significant the difference is, I do not know, but it definitely seems like it supports Xfire.
> 
> EDIT: Here is a performance pic


Well, it says so on the box too, but I always thought that for efficient Xfire the PCIe speed has to be the same..
Well, I never had 2 GPUs, so I'm kind of a noob, to tell you the truth.. Anyway, I guess I'll cross that bridge when the time (and the money) comes..
Hey, thanks for your input, man...


----------



## BackwoodsNC

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys, I ordered this card:
> MSI Radeon R9 280X OC Gaming edition 3GB..
> I know it's not the best of the 280 bunch, but it was the nearest to my funds.
> Any thoughts??


Good reference board. I have two of them.


----------



## Rar4f

Yeah, I want to Xfire too in the future. The case I will have is excellent: a Fractal Define R4. With an adapter I can install an HDD in the second 5.25" bay, and install two SSDs (if I ever buy another) behind the motherboard.
With this I can remove the top and lower HDD cages, allowing even better airflow to 2x R9 280X.

Epic case is epic


----------



## mAs81


Hmmm.. According to this, the difference is very small, 2-3 fps max..
Thanks dude, you got my hopes up for the future!!!!!! (I'll need a new PSU, though)









@BackwoodsNC
Thanks man, I'd like to think so too..

What temps do you have????


----------



## mAs81

Quote:


> Originally Posted by *Rar4f*
> 
> Yeah, i want to Xfire too in future. The case i will have is excellent. A Fractal R4 Define. I can with adapter install a HDD in secon 5.25 bay, and install two SSDs (if i ever buy another) behin the motherboard.
> With this i can remove the top and lower HDD cage, allowing for even better air flow to 2x R9 280X.
> 
> Epic case is epic


I'm currently building in a Corsair 350D, and I believe there is enough room for 2 280Xs for sure..
Good luck man, epic case is epic indeed!!!!!


----------



## Rar4f

Quote:


> Originally Posted by *mAs81*
> 
> I'm currently building in a Corsair 350D, and I believe there is enough room for 2 280Xs for sure..
> Good luck man, epic case is epic indeed!!!!!


That's a nice case too! And thanks


----------



## nortrop

Quote:


> Originally Posted by *mAs81*
> 
> 
> 
> Hmmm.. According to this, the difference is very small, 2-3 fps max..
> Thanks dude, you got my hopes up for the future!!!!!! (I'll need a new PSU, though)


You should note that those results are at full x16 speed in both 3.0 and 2.0, for the sake of equal conditions.

TPU had a more exhaustive test with a GTX 680 and a 7970 at all PCIe speeds.

At 2.0 x4 there are some performance hits, some more severe than others, but it appears it mostly depends on how well optimised the game is, e.g. BF3.


----------



## mAs81

Accidental doublepost..Sorry


----------



## mAs81

Quote:


> Originally Posted by *nortrop*
> 
> You should note that those results are at full speed x16 in 3.0 and 2.0 for the sake of equal conditions.
> 
> TPU had a more exhaustive test with a GTX680 and a 7970 at all PCIE speeds
> 
> At 2.0 x4 there are some performance hits, some more severe than others, but it appears it mostly depends on how well optimised the game is, i.e BF3.


Thanks for the link nortrop, I get what you're trying to say..
So, in your opinion, if I Xfire 2 GPUs, one in an x16 slot and another in an x4 slot,
will I downgrade the "speed" of both GPUs? Meaning, would it be better to have one GPU with better results, or will it only affect benchmarks in some games? I'm kind of a noob, never had 2 GPUs before :-D


----------



## nortrop

Quote:


> Originally Posted by *mAs81*
> 
> Thanks for the link nortrop, I get what you're trying to say..
> So, in your opinion, if I Xfire 2 GPUs, one in an x16 slot and another in an x4 slot,
> will I downgrade the "speed" of both GPUs? Meaning, would it be better to have one GPU with better results, or will it only affect benchmarks in some games? I'm kind of a noob, never had 2 GPUs before :-D


Like I said, it all depends on the game. Some titles scale well in crossfire, others don't. At x16/x4 the overall performance would surely be less than that at x8/x8 or x16/x16. Whether it would be 1fps or 50fps it all depends on the game and driver support/optimisation.

Here is some more food for thought, despite the article being a bit dated it's a fair representation.

Your motherboard is not an optimal choice for crossfire. Even if it was, multi GPU poses more tradeoffs than benefits, compared to single GPU (more heat, noise, higher power consumption, microstutter, driver support...). If I was you I'd stick to a single top end card.

Edit

Initially I wanted to link this OCN user test, but couldn't find it, so here it is: *Behemoth777's Crossfire x16,x4 vs x8,x8 Comparison*
Quote:


> Conclusion: If your looking to go crossfire, or even thinking about doing it in the future, make sure you go with a board that supports x8,x8 crossfire. It is a much better experience and also frees up your other pci-e slots for sound cards, raid cards, ssd's, etc.


----------



## mAs81

@nortrop
thanks again,very interesting articles..
I guess I'll stick to a single gpu for now,at least with this motherboard...


----------



## bangbangbowman

I'm in need of a huge upgrade in the GPU area. I'm eyeing the 280X, but I wanted to get a feel for what the best brand is for it. I'm open to other options as well. Thanks


----------



## Rar4f

Quote:


> Originally Posted by *bangbangbowman*
> 
> I'm in need of a huge upgrade in the GPU area. I'm eyeing the 280x but I wanted to get a feel for what the best brand is for it. I'm welcome to other options as well. Thanks


Most of the brands are decent, with two of them (in my opinion) coming off as the best:

Asus Matrix PLATINUM and Sapphire Toxic - top two, build quality and frequency (1100 MHz out of box).

Then come other cards that are good as well, but not as good as the two I mentioned:
Sapphire Vapor-X, Gigabyte Windforce, HIS, Asus, etc.

So if you want the best R9 280X, it's either Matrix Platinum or Toxic.
The Matrix comes with a mousepad as well.



Then again these two cards aren't cheap, and you should only get one if it's appropriate for you.


----------



## TonytotheB

Go Toxic. I had real problems with the ASUS 7970 Matrix Plat and I will never go back to them!


----------



## Rar4f

Quote:


> Originally Posted by *TonytotheB*
> 
> Go Toxic. I had real problems with the ASUS 7970 Matrix Plat and I will never go back to them!


What were the problems?


----------



## TonytotheB

Quote:


> Originally Posted by *Rar4f*
> 
> What were the problems?


I bought two for crossfire and both developed problems even at stock settings of 1100 GPU clocks. Artifacting and temps of 90+. I RMA'd them and bought two 280Xs


----------



## Rar4f

Quote:


> Originally Posted by *TonytotheB*
> 
> I bought two for crossfire and both developed problems even at stock settings of 1100 GPU clocks. Artifacting and temps of 90+. I RMA'd them and bought two 280Xs


And you had sufficient cooling?


----------



## TonytotheB

Quote:


> Originally Posted by *Rar4f*
> 
> and you had sufficient cooling?


Well, I had two MSI 280Xs in the same case and they never went above 60. So yes, I would say so. I won't go near ASUS again. There are loads of forum posts about problems with the MP 7970.


----------



## Rar4f

Quote:


> Originally Posted by *TonytotheB*
> 
> Well I had two MSI 280Xs in the same case and they never went above 60. So yes, I would say so. I won't go near ASUS again. There are loads of forum posts about problems with the MP 7970


According to a forum post by a user in late October, there was no sign of the new Tahiti XTL chip in the Matrix card. The transition from the XT2 chip to the XTL was said to happen in November.
But the MSI R9 280X (along with HIS) showed signs of having the XTL chip already in late October.
And what did the XTL chip promise? Better cooling and power efficiency.

Plus, the Matrix Platinum runs 100 MHz (core) and 400 MHz (memory) higher than the MSI R9 280X.
So perhaps the reason you had issues with your cards is that you did not have sufficient cooling, given that the Matrix has higher frequencies and that the MSI R9 280X may already have had the XTL chip.

How many intake and exhaust fans did you have, and how big were they?


----------



## taem

Quote:


> Originally Posted by *TonytotheB*
> 
> Well I had two MSI 280Xs in the same case and they never went above 60.


Seriously?! That seems absurdly low. I too wonder what your system cooling setup is. I've read pretty much every 280X review out there and not a single one reports a game load temp that low, and that's with a single card. Did you tweak card fan speeds to a constant 100%?


----------



## bangbangbowman

So is getting a 290 over a 280X worthwhile for the price difference? I want to get a card that will serve me well for quite some time


----------



## TonytotheB

Quote:


> Originally Posted by *taem*
> 
> Seriously?! That seems absurdly low. I too wonder what your system cooling setup is. I've read pretty much every 280X review out there and not a single one reports a game load temp that low, and that's with a single card. Did you tweak card fan speeds to a constant 100%?


My case was the HAF XB. Same conditions and same fan profile for both. I always have fans at 80% when gaming, using an MSI AB profile. I am basing my suggestion on my experience. I won't touch ASUS again. I bought mine before reading the countless forum posts about crappy temps on the Matrix. I don't rate that cooler at all. My 280Xs never went above 60 at 80% fans and 99% usage in modified BF4.


----------



## Rar4f

Quote:


> Originally Posted by *bangbangbowman*
> 
> So is getting a 290 over a 280x worthwhile for the price difference? I want to get a card that will do me great for quite some time


What kind of monitor will you have? 1080p or 1440p?


----------



## bangbangbowman

Quote:


> Originally Posted by *Rar4f*
> 
> What kind of monitor will you have? 1080 or 1440?


My monitor currently only goes to 1680 x 1050 (it's 6 years old or so). I plan on getting a 1080p one, I guess; I'll be shopping for one on Cyber Monday most likely


----------



## Rar4f

Quote:


> Originally Posted by *TonytotheB*
> 
> My case was the HAF XB. Same conditions and same fan profile for both. I always have fans at 80% when gaming, using an MSI AB profile. I am basing my suggestion on my experience. I won't touch ASUS again. I bought mine before reading the countless forum posts about crappy temps on the Matrix. I don't rate that cooler at all. My 280Xs never went above 60 at 80% fans and 99% usage in modified BF4.


The HAF XB (nice case) has only 2x 120mm intake fans and no exhaust fans. Assuming you added no more fans, it's no wonder your Matrix Platinums in crossfire overheated.
Quote:


> Originally Posted by *bangbangbowman*
> 
> my monitor currently only goes to 1680 x 1050 (ts 6 years old or so) I plan on getting a 1080p i guess, I'll be shopping for one on cyber monday most likely


For a 1920x1080 monitor, an R9 280X will be a good pick. But if you can get a good bargain on an R9 290, then get one.
However, an R9 280X will do well for most games at 1080p.


----------



## bangbangbowman

I guess there is no chance of them going on sale for bf/cyber monday since they just came out


----------



## Rar4f

Quote:


> Originally Posted by *bangbangbowman*
> 
> I guess there is no chance of them going on sale for bf/cyber monday since they just came out


Can't hurt to be on the lookout.
Another good card for 1080p is the GTX 770. I wouldn't recommend getting a 770 over an R9 280X unless it's really cheap. The EVGA GTX 770 with ACX cooling is a good pick. And if you don't plan to overclock, you can try to get one of the overclocked versions.

Nvidia has a new technology called G-Sync. It helps with stutter, lag and tearing. It's quite innovative, I hear.
To make use of this technology you need a particular monitor (a G-Sync monitor) with a GTX 650 Ti Boost card or better.
G-Sync monitors will be released (dunno when), but one monitor already on the market that you can add the G-Sync module to is the ASUS VG248QE.
It's a good monitor.

With all this said, if you can get a really good offer on a GTX 770, you can then buy an ASUS VG248QE, or hold off on buying a monitor until G-Sync monitors are released and buy one then.
You will then be able to have G-Sync technology.
The R9 280X may get a performance boost as a result of the new API called Mantle that AMD will launch in December. But this promised performance boost has not been demonstrated yet, in contrast to G-Sync, which people have already experienced.

You can't go far wrong with either a GTX 770 or an R9 280X. But if you go with a GTX 770 because it's super cheap:
1. Try to get an EVGA ACX one.
2. Either buy an ASUS VG248QE monitor and wait for the G-Sync module to be released by Nvidia, which you can buy and install into the monitor, OR stick with your current monitor until monitors with the G-Sync module built in are released and buy one of those.


----------



## Ashuiegi

The Toxic is 1150 out of the box...


----------



## Amhro

Quote:


> Originally Posted by *Rar4f*
> 
> Most of the brands are decent, with two of them (in my opinion) coming off as the best:
> 
> Asus Matrix PLATINUM and Sapphire Toxic - top two, build quality and frequency (1100 MHz out of box).
> 
> Then comes other cards which are good as well but not as good as the two i mentioned:
> Sapphire Vapor X, Gigabyte Windforce, HIS, Asus , etc
> 
> So if you want the best R9 280X, it's either Matrix Platinum or Toxic.
> The Matrix comes with a mousepad as well.
> 
> 
> 
> Then again these two cards aren't cheap, and you should only get one if it's appropriate for you.


Yeah, the Matrix and Toxic are the best 280Xs you can get; however, adding another 20€ to that price will get you a 290..







That's why I decided to go with 290, just waiting for non-ref


----------



## Warl0rdPT

Quote:


> Originally Posted by *mAs81*
> 
> Well, I live in Greece, so things are a little overpriced over here..
> I bought the MSI for 339€, the Matrix starts from 365€, and I can't find the Toxic at any sellers here at the moment..
> So, yeah, you get the idea, right?


But you also live inside the EU, so why not buy from any other shop that sells them cheaper, in Germany or Spain? You won't pay any extra taxes, and some shops even have free P&P.


----------



## Rar4f

Quote:


> Originally Posted by *Ashuiegi*
> 
> the toxic is 1150 out of the box ,.....


Without boost.


----------



## Amhro

Quote:


> Originally Posted by *Warl0rdPT*
> 
> But you also live inside EU, so why not buy from any other shop that sells them cheaper, like Germany,Spain? You won't pay any extra taxes and some shops even have free P&P.


Any shop tips? I'm also interested in this


----------



## mAs81

Quote:


> Originally Posted by *Warl0rdPT*
> 
> But you also live inside EU, so why not buy from any other shop that sells them cheaper, like Germany,Spain? You won't pay any extra taxes and some shops even have free P&P.


Well, to tell you the truth, I did a little research but couldn't find any deal that was in my favor at the time...
Don't get me wrong though, I shop online for my PC all the time, but for something like a GPU, where a ton of things could go wrong, I prefer the local retailers, where I can go afterwards and pick a fight with them face to face...!!!


----------



## Rar4f

If you live in EU, this site is good to look for cheaper products:
www.kelkoo.com


----------



## mAs81

Quote:


> Originally Posted by *Rar4f*
> 
> If you live in EU, this site is good to look for cheaper products:
> www.kelkoo.com


Kudos Rar4f, bookmarked for future reference


----------



## Shurtugal

Quote:


> Originally Posted by *Amhro*
> 
> Yeah, matrix and toxic are best 280xs you can get, however, adding another 20€ to that price will get you a 290..
> 
> 
> 
> 
> 
> 
> 
> That's why I decided to go with 290, just waiting for non-ref


Yeah, I was a tad disappointed when the 290s were only $20 more than my Matrix








Anyhow, happy with mine


----------



## Ashuiegi

Boost is really not a thing anymore; they just downclock if they get really too hot. Mine always runs at idle or 1150, and at 75°C it's still at 1150. So its base clock is 1150, exactly like the Matrix's is 1100.


----------



## Warl0rdPT

Quote:


> Originally Posted by *Amhro*
> 
> Any shop tips? I'm also interested in this


Before I bought mine I took a look at these two:

Germany: mindfactory.de
Spain: pccomponentes.com

I ended up buying in Portugal because the price difference for the 280X TOP wasn't that big; bought it for ~300€.


----------



## Rar4f

The Matrix is a premium card. If you compare the Matrix with a reference R9 290, you will see that you get a noisy card that looks like crap and probably has a mediocre cooler.
Right now there is an $83 difference between a Matrix R9 280X and a reference R9 290 in my part of the world. If an aftermarket R9 290 is released, it will be a $115 difference.
The R9 280X Toxic vs. reference R9 290 price difference is $106. An aftermarket 290 should make the difference $138.

So bear in mind that comparing a reference R9 290 with an aftermarket R9 280X is not a fair comparison. The closest thing to a reference R9 280X is a Dual-X R9 280X.
The Dual-X R9 280X costs $180 less than a reference R9 290.
But I agree, the Matrix is expensive. You gotta admit it looks like a beast, though


----------



## mAs81

Quote:


> Originally Posted by *Warl0rdPT*
> 
> Before I bought mine I took a look at this two
> 
> germany: mindfactory.de
> spain: pccomponentes.com
> 
> I ended up buying in Portugal because the price difference for the 280X TOP wasn't that big, bought it for ~300€


Bookmarked and bookmarked and... totally depressed now...
Never heard of any of these shops.. If I had, I might have saved a euro or two in the end..
Oh well.. I guess that's why I signed up at OCN, to find info like this from guys who know better.. Thnx!!


----------



## darkelixa

Been trying everywhere in Australia to buy a Sapphire R9 280X, but no one has them in stock and the next shipment is next year. Is the Sapphire R9 270X Toxic any good, or is it worth waiting for the 280X to come back into stock next year?


----------



## Durvelle27

Remember, if you want to be added to the list, PM the required info


----------



## Rar4f

Quote:


> Originally Posted by *darkelixa*
> 
> Been trying everywhere in Australia to buy an sapphire r9 280x but no one has them and stock and next shipment of stock is next year, is the r9 270x sapphire toxic any good or is it more worth the wait for the 280x to come into stock next year?


I could buy and ship you an R9 280X Toxic if it's doable without an expensive bill for both of us. Then again, warranty support becomes a hassle, but if prices in Australia are very expensive compared to my resident country, it might not be a bad idea









EDIT: Nope, Australia and my resident country have roughly the same prices.


----------



## nortrop

Quote:


> Originally Posted by *darkelixa*
> 
> Been trying everywhere in Australia to buy an sapphire r9 280x but no one has them and stock and next shipment of stock is next year, is the r9 270x sapphire toxic any good or is it more worth the wait for the 280x to come into stock next year?


Any particular reason it has to be a Sapphire? HIS has one of the best solutions right now and the price isn't that bad.

Regarding the 270x - don't buy something that you don't want.


----------



## Amhro

Quote:


> Originally Posted by *Rar4f*
> 
> Matrix is premium card. If you compare Matrix with R9 290 reference, you will see that you get a noisy card that looks like crap and probably has a mediocre cooler.
> Right now there is a 83$ difference between a Matrix R9 280X and a reference R9 290 in my part of the world. If a aftermarket R9 290 is released, it will be $115 difference.
> The R9 280X Toxic and R9 290 reference price difference is $106. Afermarket 290 should make the difference $138.
> 
> So bear in mind comparing a reference R9 290 with a aftermarket R9 280X is not right. The closest to a reference card R9 280X is a Dual X R9 280X.
> Dual X R9 280X costs $180 less than a reference R9 290.
> But i agree, the Matrix is expensive. But you gotta admit it looks like a beast


In my country, the Matrix 280X is 340€ while the Sapphire 290 is 365€. But yeah, reference and non-reference cards shouldn't be compared.

Btw, checked those sites; the price difference is minimal, hmm


----------



## Rar4f

^ R9 290 all the way then


----------



## ShadowMAN280x

Hey everybody. Joined up the FX 8320/8350 club and since I got a R9 280x I'd like to post my findings here too.

Got my FX 8320 max stable from 3.5GHz to 4.56GHz with 8GB of 1906MHz memory
*NOW*
ASUS R9280X-DC2T-3GD5 Radeon R9 280X 3GB 384-bit for $309 on newegg.com

Give it great props on fan noise/overclockability/looks/heat. Comes with great software to up your voltages/fan profiles/GPU&Memory clock speeds.

Gonna start doing extreme testing with it today. Gonna reach that 90C mark







Right now I'm at a stock 1.2V with a 1140MHz GPU clock and 6448MHz memory clock. With the FurMark stress test I get 80°C after 15 minutes with the fan at 70%.

*QUESTION:*
Can anybody recommend good stress test software that catches artifacts well or is FurMark it?


----------



## Amhro

Quote:


> Originally Posted by *ShadowMAN280x*
> 
> Hey everybody. Joined up the FX 8320/8350 club and since I got a R9 280x I'd like to post my findings here too.
> 
> ASUS R9280X-DC2T-3GD5 Radeon R9 280X 3GB 384-bit for $309 on newegg.com
> 
> Give it great props on fan noise/overclockability/looks/heat. Comes with great software to up your voltages/fan profiles/GPU&Memory clock speeds.
> 
> Gonna start doing extreme testing with it today. Gonna reach that 90C mark
> 
> 
> 
> 
> 
> 
> 
> Right now I'm at stock 1.2v at 1140Mhz CPU clock and 6448Mhz Memory Clock. With FurMark stress test I get 80c after 15 minutes with fan at 70%.
> 
> *QUESTION:*
> Can anybody recommend good stress test software that catches artifacts well or is FurMark it?


Battlefield 4


----------



## Ashuiegi

Yeah, BF4 or BF3 for games, and the second test of Fire Strike for a bench.

FurMark is totally useless for OCing or getting any real-world information; all it does is max out your board's power and heat it up...


----------



## ShadowMAN280x

Got into problems quickly. Can't get reliable software. FurMark doesn't test for artifacts too well; my card was deemed stable at a 1200 GPU clock, which I know it isn't. And the software OCCT, which I found in the HD 7970 forum, I can't get to work even on stock settings.


----------



## sonic2911

use heaven or valley


----------



## ShadowMAN280x

Found a great program called EVGA OC Scanner X that seems to work flawlessly for picking up artifacts on my Asus R9 280X

http://www.evga.com/ocscanner/

Key features:
Multiple tests for maxing out the GPU's memory
Multiple tests for maxing out the GPU core clock
Settings to shut off if a set temperature is reached or an artifact is found
Creates a log file of the run


----------



## Warl0rdPT

Quote:


> Originally Posted by *ShadowMAN280x*
> 
> *QUESTION:*
> Can anybody recommend good stress test software that catches artifacts well or is FurMark it?


GPUTool, OCCT


----------



## zainy

I was just on Newegg, and saw that Gigabyte just came out with a revised 280x. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125490 From the picture, the PCB looks black, but it's hard to tell. What do you guys think? Plus, it said that it comes with BF4. Very tempting!


----------



## TheRoot

Quote:


> Originally Posted by *zainy*
> 
> I was just on Newegg, and saw that Gigabyte just came out with a revised 280x. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125490 From the picture, the PCB looks black, but it's hard to tell. What do you guys think? Plus, it said that it comes with BF4. Very tempting!


I think it's great


----------



## darkelixa

I have a Lian Li PC-A71FB case, which has the toolless PCI-E holders. The HIS R9 290X will hit the holders and won't fit in the case, so I have to go with a lower-profile GPU like the Sapphire Toxic or, I don't know, a Gigabyte?


----------



## NinjaToast

Quote:


> Originally Posted by *zainy*
> 
> I was just on Newegg, and saw that Gigabyte just came out with a revised 280x. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125490 From the picture, the PCB looks black, but it's hard to tell. What do you guys think? Plus, it said that it comes with BF4. Very tempting!


Actually quite easy to tell that it is indeed black, which is fantastic because blue looks awful.


----------



## darkelixa

Also, the toxic comes with a very sexy and sturdy backplate


----------



## Roaches

Quote:


> Originally Posted by *zainy*
> 
> I was just on Newegg, and saw that Gigabyte just came out with a revised 280x. http://www.newegg.com/Product/Product.aspx?Item=N82E16814125490 From the picture, the PCB looks black, but it's hard to tell. What do you guys think? Plus, it said that it comes with BF4. Very tempting!


It has a new PCB and shroud, but it still uses the old 3-heatpipe-and-fin array from the old Windforce 3, not the new version with 6 heatpipes. http://www.gigabyte.com/products/product-page.aspx?pid=4845#ov

R9 280X Rev 2




450W Windforce 6 heatpipes and fin array


----------



## Rar4f

That new Gigabyte LOOKS nice


----------



## Modovich

Guys, I got problems with my 280X. I bought it in a country across the border, so if I need to send it in for repair (replacement) I need to be sure they can't say "It's not a problem, eff you"

Video 1: https://www.youtube.com/watch?v=e0qg-EMhuSE
Video 2: https://www.youtube.com/watch?v=7ENJfghnP4s
Oil leakage: http://i.imgur.com/VF5fQti.jpg


----------



## Devildog83

I have a problem too: every time I enable X-Fire now, the second card stops working. The fans stop and everything. I loaded the new driver, which had some other issues, so I went back to beta 9.2, and then this. I reloaded 9.4 and still this. What the heck, how does X-Fire just stop working? The card will come back to life if I disable X-Fire.


----------



## sonic2911

Quote:


> Originally Posted by *Roaches*
> 
> It has a new PCB and shroud but it still uses the old 3 heatpipe and fin array from the old Windforce 3 and not the new version with 6 heatpipes. http://www.gigabyte.com/products/product-page.aspx?pid=4845#ov
> 
> R9 280X Rev 2
> 
> 
> 
> 
> 450W Windforce 3 heatpipes and fin array


which is better?


----------



## Roaches

Quote:


> Originally Posted by *sonic2911*
> 
> which is better?


Seeing that only the PCB and shroud have changed on the 280X Rev 2, we'll have to wait for reviews and full PCB shots to tell how it improves on the first revision. However, the WF3 450W coolers (which have 6 heatpipes) are only found on Nvidia cards for the time being, ranging from the GTX 760 and 770 to the 780, whereas the 280X Rev 2 is still using 3 heatpipes, judging by a careful look at the cooler on Gigabyte's product page.

It just looks better than the plastic-shroud version, and I expect cooling to be no different, other than that it might use Tahiti XTL chips, which are said to be more efficient than Tahiti XT.


----------



## Mestoth

I am having the same issue as described in this post of Clement 1987 post 950.

ASUS r9-280X CU TOP II -> comes STOCK overclocked from reference design. 100% stable @ desktop, running Heaven, FURMARK, 3d mark (Win 8.1).
Peak temp of about 75-78 degrees C with stock cooling.

750W Antec High Current Gaming PSU
i5 4670 3.4GHz
ASRock Extreme6 Z87 mobo
G.Skill 1866 DDR3 RAM, 16GB
Brand new system, brand new install onto a Samsung 840 EVO SSD.

The moment I start playing ARMA 3 multiplayer (sometimes on load, other times after a couple of minutes): hard lock of the PC, vertical colour lines in whatever colour was on screen at the time, or a black screen. No Windows error log\ARMA error log etc., just a "Windows unexpectedly shut down" note in the admin tool log.

It also occurs in World of Tanks and EU IV (after prolonged play). I am not sure if it's drivers; however,
I have tried all the recent AMD drivers (Driver Sweeper in between, in safe mode), and have also tried the "ASUS recommended" WHQL version (13.152 if I recall correctly).

Memtest86+ single-core mode passes 4 hours of testing
Memtest86+ SMP crashes on test 7 (seems to be a known bug on UEFI boards, per their forums)
Passmark Memtest86 UEFI beta 5.0 passes the multicore\SMP mode testing regime.
Prime95 runs for hours with no problems.

Getting pretty close to RMA'ing the card... but ASUS RMA in Australia is a PITA.

Any ideas?


----------



## NinjaToast

I got my Asus 280x today I AM EXCITE, will definitely be fiddling with OCing in the next few days, but now PICS!

Excuse the quality, using my phone.


Spoiler: Warning: PICS!!



Dat chicken Scratch hand writing.


That size comparison (my 7850 on the top of course)


Inside the case


----------



## Tobiman

Quote:


> Originally Posted by *Mestoth*
> 
> I am having the same issue as described in this post of Clement 1987 post 950.
> 
> ASUS r9-280X CU TOP II -> comes STOCK overclocked from reference design. 100% stable @ desktop, running Heaven, FURMARK, 3d mark (Win 8.1).
> Peak temp of about 75-78 degrees C with stock cooling.
> 
> 750W Antec High current gaming PSU,
> i5 4670 3.4
> ASROCK extreme6 Mobo z87
> Gskill 1866 DDR3 ramm, 16gb.
> Brand new system, brand new install onto samsung 840 EVO SSD.
> 
> Moment I start playing ARMA3 Multiplayer (sometimes on load, others after a couple of mins), Hard lock of PC, vertical colour lines in whatever colour was on the screen at the time, or black screen. No windows error log\arma error log etc, just "windwos unexpectadly shutdown" note in the admi ntool log.
> 
> Also occours on World of Tanks, and EU IV (after prolonged play). I am not sure if its drivers, however
> I have tried all the recent AMD drivers (Drive sweeper in between in safemode), and have also tried the "ASUS reccommended" WHQL version (13.152 if I recall correctly).
> 
> Memtest86+ single core mode passes 4 hours of testing
> Memtest86+ SMP crashes on test 7 (seems a known bug from their forums on UEFI boards)
> Passmark Memtest86 UEFI beta 5.0 passes multicore\SMP mode testing regieme.
> Prime95 runs for hours with no problems.
> 
> getting pretty close to RMA'ing the card...but ASUS RMA in Australia is a PITA.
> 
> Any ideas?


Increase memory voltage.


----------



## Tobiman

Quote:


> Originally Posted by *Devildog83*
> 
> I have a problem too, every time I enable X-Fire now the second card stops working. The fans stop and everything. I loaded the new driver which had some other issues so I went back to beta 9.2 and then this. I reloaded the 9.4 and still this. What the heck, how does X-Fire just stop working. The card will come back to life if I dissabe X-Fire.


Disable ULPS to get both cards working all the time. It's a power saving feature. You can also set your system to performance mode to see if that helps.


----------



## sonic2911

Quote:


> Originally Posted by *Roaches*
> 
> Seeing that only the PCB and shroud has changed on the 280X Rev 2. we'll have to wait for reviews and full PCB shots to tell how its better from the first revision, However the WF3 450W cooler (which have 6 heatpipes) are only found on Nvidia cards for the time being ranging from the GTX 760, 770 and 780. where as the 280X Rev2 is still using 3 heatpipes by looking carefully at the cooler at Gigabyte's product page.
> 
> It just looks better than the plastic shroud version, but I expect cooling not to be any different other than It might use Tahiti XTL chips which are said to be more efficient than Tahiti XT.


Ordered one to check =] I'm using a GTX 770 now with the WF3; it's really good!


----------



## rquinn19

Anyone having problems with their monitor waking up? I have to power-cycle the monitor, or unplug and replug the HDMI cord, to get it to wake from sleep. Tried changing the PCI-E power saving setting and also made sure PEG was set in the BIOS.

Using Windows 8.1. MSI 280X.

Bought the card open-box, got it for a steal, and it performs flawlessly in gaming and everything else. Just this monitor wake-up thing is bothering me.


----------



## Crowe98

Quote:


> Originally Posted by *Roaches*
> 
> Seeing that only the PCB and shroud has changed on the 280X Rev 2. we'll have to wait for reviews and full PCB shots to tell how its better from the first revision, However the WF3 450W cooler (which have 6 heatpipes) are only found on Nvidia cards for the time being ranging from the GTX 760, 770 and 780. where as the 280X Rev2 is still using 3 heatpipes by looking carefully at the cooler at Gigabyte's product page.
> 
> It just looks better than the plastic shroud version, but I expect cooling not to be any different other than It might use Tahiti XTL chips which are said to be more efficient than Tahiti XT.


Do you think they will still be using the Elpida memory chips? Hopefully not


----------



## sonic2911

Quote:


> Originally Posted by *Crowe98*
> 
> Do you think they will still be using the Elpida memory chips? Hopefully not


I will tell you when I get it


----------



## ShadowMAN280x

Reporting some findings on my Asus R9 280x DC2T-3GD5

All the stress-test software was useless (wasted 2+ hours) for finding my max overclock. Crysis 3 with visual inspection was my stress test.

Default clocks are GPU: 1070MHz / Memory: 6400MHz

The max clock I can get is GPU: 1175MHz at +0.05V, bringing it to 1.25V... The memory clock doesn't do too much for FPS, but I have it at 6700MHz.

+105MHz boost on the GPU and +300MHz boost on memory... looks good to me.

The temps I get are around 70°C after an hour of playing. I can probably leave the fan at an inaudible 30% and it will probably never reach 80°C... I read that the max safe temp is in the low 90s.

Bringing the voltage up higher doesn't help much with further overclocking.
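For anyone who wants the math on those offsets, a quick sketch using the stock and max stable clocks listed above:

```python
# Percentage gain from the stock clocks to the max stable clocks above.
stock_core, oc_core = 1070, 1175   # MHz, GPU core
stock_mem, oc_mem = 6400, 6700     # MHz, effective memory

core_gain = (oc_core - stock_core) / stock_core * 100
mem_gain = (oc_mem - stock_mem) / stock_mem * 100

# Roughly +9.8% on the core and +4.7% on the memory.
print("Core:   +%d MHz (%.1f%%)" % (oc_core - stock_core, core_gain))
print("Memory: +%d MHz (%.1f%%)" % (oc_mem - stock_mem, mem_gain))
```

Which lines up with the finding that the memory offset matters less for FPS than the core offset does.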


----------



## Offender_Mullet

Quote:


> Originally Posted by *sonic2911*
> 
> which is better?


I called Gigabyte earlier. Asked the rep what specifically has been improved or changed on the card and why the fans/shroud were different.

His answer: "Nothing. The V1 card will still have better cooling." I asked him twice more, wording my question differently, and even asked about the XTL chip, but got the same answer. His English was pretty bad and he wasn't very talkative, so I gave up.


----------



## Ashuiegi

Get an R9 290 and ditch the case... And anyway, if a reference R9 290 doesn't fit, the Toxic will never fit either...
This new Gigabyte cooler with 3 heatpipes is just like the Toxic's; it looks like 3x 10mm heatpipes.


----------



## JoeChamberlain

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Update: Got it pretty stable at 1150/1.25v.
> 
> This is really odd. Any advice?


You have the TOP or non-TOP card? I can't get past 1100/1700.


----------



## Devildog83

Quote:


> Originally Posted by *Tobiman*
> 
> Disable ULPS to get both cards working all the time. It's a power saving feature. You can also set your system to performance mode to see if that helps.


I had ULPS disabled; it appears it was a broken X-Fire bridge, although I won't know if that was the cause or the effect until I get a new one. The local shop here only had a short one for side-by-side slots, and I have to use slots 1 and 3. Thanks for the advice.


----------



## sonic2911

Quote:


> Originally Posted by *Offender_Mullet*
> 
> I called Gigabyte earlier. Asked the rep what specifically has been improved or changed on the card & why the fans/shroud were different.
> 
> His answer: "Nothing. The V1 card will still have better cooling". I asked him twice more, wording my question differently and even asked about the XTL chip but got the same answer. His English was pretty bad and he wasn't very talkative so I gave up.


=]] Gigabyte's CS is so bad. I will try it







2 more days
Hope Newegg won't give me another, higher 5%-off coupon


----------



## Warl0rdPT

Quote:


> Originally Posted by *Mestoth*
> 
> Any ideas?


Go check your LiveKernelReports folder (inside windows folder) and if you have anything there upload it here: http://www.osronline.com/page.cfm?name=analyze


----------



## darkelixa

R9 290s are not as tall as most of the R9 280Xs...

The Toxic will easily fit, as it is not as tall as most other R9 280Xs


----------



## Ashuiegi

Well, basing your GPU choice on the size of a Lian Li PCI-E holder is a bit crazy. It's like buying a HAF X based on its GPU shroud and holder, which basically limits you to reference or low-profile cards, when the case is actually huge and can fit any card on the market...


----------



## darkelixa

Oh ok, so the PCI-E toolless holder won't stop me from installing the card, and it won't hit the top of the holder? It's quite hard to line the card up under the toolless holders, as you have to hold the tabs and then put the card in.


----------



## darkelixa

You also can't just remove the holder, as there are no screws on the case; it's pins for the holder. So yes, I want to be sure I don't spend $400 on a card that won't fit.


----------



## NinjaToast

Quote:


> Originally Posted by *darkelixa*
> 
> You also can't just remove the holder; there are no screws on the case, just pins for the holder. So yes, I want to be sure I don't spend $400 on a card that won't fit.


Screw the holder; just break/mod the damn thing off, and if you want something to hold up the card, make one that fits the cards you buy. It shouldn't be too hard to do.


----------



## Roaches

It would be a lot more helpful for us if he told us what case he has, or showed us some pictures of his rig internals...


----------



## NinjaToast

Quote:


> Originally Posted by *Roaches*
> 
> It would be a lot more helpful for us if he told us *what case he has*, or showed us some pictures of his rig internals...


He did; he has a Lian Li PC-A71B.


----------



## darkelixa

http://www.lian-li.com/en/dt_portfolio/pc-a71f/

There is the case


----------



## Offender_Mullet

Quote:


> Originally Posted by *darkelixa*
> 
> http://www.lian-li.com/en/dt_portfolio/pc-a71f/
> 
> There is the case


I sold mine just a few weeks ago. I loved that case. I have a Gigabyte 280X and it fit inside without any issues.


----------



## NinjaToast

Yeah, by the looks of that case everything should fit fine. I also don't see a shroud attached to it anywhere, so I don't see the issue of it not being detachable.


----------



## darkelixa

Did you still use the tool-less PCI holder? If you removed it, there'd be no way to screw the card down to the case. Yes, the Gigabyte is a lot lower than, say, the XFX brand.


----------



## Offender_Mullet

Quote:


> Originally Posted by *darkelixa*
> 
> Did you still use the tool-less PCI holder? If you removed it, there'd be no way to screw the card down to the case. Yes, the Gigabyte is a lot lower than, say, the XFX brand.


Yes, I used the tool-less mounting system. Did you remove yours? That was the easiest GPU mounting I've ever seen in a case. If you removed it, you should just be able to use two case thumb screws to tighten the card down. I've done that before as well.


----------



## darkelixa

Nah man, I love the GPU holder and I'm not removing it any time soon!!! Hence why I was looking for a card that would fit the case easily.


----------



## Offender_Mullet

Quote:


> Originally Posted by *darkelixa*
> 
> Nah man, I love the GPU holder and I'm not removing it any time soon!!! Hence why I was looking for a card that would fit the case easily.


So, have you tried one of the new R9 cards and had problems mounting it? Or just wondering? Dude, that case will fit 30 cards and a 3-foot long sub sammich so no worries.


----------



## darkelixa

Lol, no, I haven't tried to fit a new card yet; I was just checking whether card dimensions would be an issue with the case before I purchase. I'm either buying an XFX R9 280X or the Sapphire Toxic R9 280X, if they ever come back in stock this year.


----------



## darkelixa

I really wanted to get the HIS-branded R9 280X but was afraid it wouldn't fit; so you're saying it will? The HIS is one of the tallest in the R9 range.


----------



## Offender_Mullet

It'll fit. The only thing taller cards do is cover up more downward space (other PCIe and PCI slots, etc.) on your motherboard.


----------



## darkelixa

Ah, downward space is not an issue, as I only use the GPU slot.

Would a HIS be better than a Gigabyte R9 280X or the Sapphire Toxic R9 280X? I love the backplate cover on the Sapphire.


----------



## Offender_Mullet

I've had 2 HIS cards and liked them, but I usually buy Asus or Gigabyte. No real reason, just personal preference; once I find brands I'm comfortable with, I stick with them. The 280X Gigabyte can get loud, but I believe there's a fan-curve BIOS coming out, so it'll get quieter. The Sapphire Toxic costs upwards of $60 more (and consumes more powahhh, if I'm not mistaken), so that's up to you.

I'm getting tired of the double and triple fan setups that just blow hot air around a case. AMD really needs to pull an Nvidia 780/Titan stock-cooler design overhaul on their next-gen cards. I am praying for that!

I'm surprised to see Club3D cards on NewEgg. When did this start happening? I thought they were only sold in Europe?


----------



## darkelixa

The last Asus card I had, a GTX 560, stopped working after 3 months, and I have a Gigabyte GTX 770 2GB which performs horribly, so those were two brands I was trying to steer clear of. I have a 750W PSU, so I hope that's good enough for either of the two cards. I do believe XFX still exhausts air out the back of the card. The Asus R9 280X is a lot taller than the Gigabyte version, which seems very low-profile.


----------



## sonic2911

Quote:


> Originally Posted by *Offender_Mullet*
> 
> I've had 2 HIS cards and liked them, but I usually buy Asus or Gigabyte. No real reason, just personal preference; once I find brands I'm comfortable with, I stick with them. The 280X Gigabyte can get loud, but I believe there's a fan-curve BIOS coming out, so it'll get quieter. The Sapphire Toxic costs upwards of $60 more (and consumes more powahhh, if I'm not mistaken), so that's up to you.
> 
> I'm getting tired of the double and triple fan setups that just blow hot air around a case. AMD really needs to pull an Nvidia 780/Titan stock-cooler design overhaul on their next-gen cards. I am praying for that!
> 
> I'm surprised to see Club3D cards on NewEgg. When did this start happening? I thought they were only sold in Europe?


The Club3D 280X is hot and loud -,- How is your Gigabyte rev1 for temps and noise?


----------



## Offender_Mullet

Quote:


> Originally Posted by *sonic2911*
> 
> The Club3D 280X is hot and loud -,- How is your Gigabyte rev1 for temps and noise?


My new case is a Corsair 330R, which has sound-deadening material inside. I assume my CPU/GPU temps will be higher than in a 'normal' case, but I don't monitor them. I know, I know, I should do what everyone else on here does, but things like that just don't interest me anymore.

It gets louder than I would like. However, compared to having it in my previous Lian Li PC-A71F case, the card is definitely more bearable.


----------



## gh0stfac3killa

Hi there all. Saw there was an owners club for the 280X and figured this was the place to be, since I'm a proud owner of two Gigabyte R9 280s. Slammed them into my Trooper build, replacing some aging Matrix GTX 580s. So far I love the jump in power and the big jump in FPS. If there are any tips for making these bad boys chew up games even better, please let me know; I'm new to the AMD line of cards, I usually use Nvidia. Here is a pic of my GPUs.

So far they run at low temps and are quiet compared to my 580s.


----------



## taem

What do you guys think of this?

http://forums.overclockers.co.uk/showpost.php?p=25195666&postcount=13

Says XTL will have less OC headroom than the current XT2s.

If that's true, that's a tough one for me. I was waiting on custom 290s and/or XTL 280Xs before buying, since I'm in no hurry. I want the lowest temps possible, but a card like the current Asus TOP at 70-75°C is acceptable, and if it has more OC potential then maybe that's the better call. Ehh, so hard to pick a new card.


----------



## kpo6969

Quote:


> Originally Posted by *Offender_Mullet*
> 
> I'm surprised to see Club3D cards on NewEgg. When did this start happening? I thought they were only sold in Europe?


http://www.techpowerup.com/194830/club-3d-launches-with-newegg-in-north-america.html


----------



## Lysergix710

Having some issues: I couldn't use CrossFire on the latest 13.11 beta; it would BSOD with a driver error code.

So I did a couple of reinstalls with no luck. Got on this morning, uninstalled everything again, and used AMD auto-detect to find drivers; it gave me 13.9, and it labels the cards as 79 series again. CrossFire works and no more BSODs. Strange.

Any reason why the drivers would start acting differently? The first thing I did was remove both cards, to install them one at a time for a proper CrossFire installation. I think it's definitely a driver issue, but I may have been using a version later than 13.9, because I don't remember ever seeing them labelled as 79 series before.

Anyone had a similar issue?


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Lysergix710*
> 
> Having some issues: I couldn't use CrossFire on the latest 13.11 beta; it would BSOD with a driver error code.
> 
> So I did a couple of reinstalls with no luck. Got on this morning, uninstalled everything again, and used AMD auto-detect to find drivers; it gave me 13.9, and it labels the cards as 79 series again. CrossFire works and no more BSODs. Strange.
> 
> Any reason why the drivers would start acting differently? The first thing I did was remove both cards, to install them one at a time for a proper CrossFire installation. I think it's definitely a driver issue, but I may have been using a version later than 13.9, because I don't remember ever seeing them labelled as 79 series before.
> 
> Anyone had a similar issue?


I haven't had the BSOD issues yet, but I did notice my FPS and overall performance have gone down a bit since I went from Windows 8 to 8.1 and installed the new beta drivers. Not sure why; the old driver seemed to work better. I installed the beta driver because it said it had a new CrossFire profile for Ghosts, hoping it would fix that game's issues, but yeah... not so much. I did a test with Tomb Raider: I was getting 185 max / 154 min FPS, which is amazing coming from my two GTX 580s that barely hit 60 FPS max. With the new beta driver it dropped to 156 max / 110 min with all the same settings, on Ultimate with TressFX. Still great, but slower, like I said. Like you, I'm thinking it must be something in the driver. I'm going to try a complete driver clean and reinstall, and see if the performance comes back; if not, I'll go back to the older driver.

On another note, does anyone here know when they're supposed to add DirectX 11.2 support for these cards? I have Windows 8.1 Pro, and everywhere I look it says it should have 11.2, and my GPUs are 11.2-ready, so where the heck is my 11.2? lol


----------



## gh0stfac3killa

Quote:


> Originally Posted by *taem*
> 
> What do you guys think of this?
> 
> http://forums.overclockers.co.uk/showpost.php?p=25195666&postcount=13
> 
> Says XTL will have less OC headroom than the current XT2s.
> 
> If that's true, that's a tough one for me. I was waiting on custom 290s and/or XTL 280Xs before buying, since I'm in no hurry. I want the lowest temps possible, but a card like the current Asus TOP at 70-75°C is acceptable, and if it has more OC potential then maybe that's the better call. Ehh, so hard to pick a new card.


After reading the back and forth, I think it's a bad idea on the manufacturer's part. I'm not as in tune as today's overclockers, who go to some pretty serious extremes sometimes, but taking away the ability to push the hardware is bad, I think. Not to mention it seems like they're trying to use a cheaper way of making these GPUs while still charging the same price. It all comes back to money, money, money: give people a cheaper product and charge the same price...


----------



## Warl0rdPT

Quote:


> Originally Posted by *Ashuiegi*
> 
> Well, basing your GPU choice on the size of the Lian Li PCIe holder is a bit crazy. It's like buying a HAF-X based on its GPU shroud and holder, which basically limits you to reference or low-profile cards, when the case is actually huge and can fit any card on the market.


I also couldn't use the tool-less adapters on my HAF 932 with the 280X TOP; they bump into the card's "backplate", so they stay open and I use the good old screws.


----------



## madorax

Need opinions about the 280X:

1. Is the Sapphire Toxic worth it over the Sapphire Vapor-X or the MSI Gaming?
2. If it is, which is the better buy: the Toxic or a PowerColor 290? I know the 290 runs hot, but the difference is only $30 between the Toxic and the 290.
3. Which one draws more power at full load, the Toxic or the 290?
4. If the Toxic isn't worth it, which 280X should I get instead? The Vapor-X or the MSI Gaming? Or even the cheaper HIS or PowerColor?

Thanks for the enlightenment. I was on a 760 before; already sold it this morning, and now I'm looking for a Radeon card.


----------



## Warl0rdPT

AFAIK the Toxic and Matrix are the best (and most expensive). The rest (those that have voltage control) should OC about the same, so it comes down to looks, noise/temperature, and bundled software.

I went with the Asus TOP because it was the quietest versus the MSI. The MSI has Afterburner, which is a good plus; I use it, but I can't control voltage on my Asus with it. I'd have to use the Asus software, which kinda sucks.


----------



## madorax

Quote:


> Originally Posted by *Warl0rdPT*
> 
> AFAIK the Toxic and Matrix are the best (and most expensive). The rest (those that have voltage control) should OC about the same, so it comes down to looks, noise/temperature, and bundled software.
> 
> I went with the Asus TOP because it was the quietest versus the MSI. The MSI has Afterburner, which is a good plus; I use it, but I can't control voltage on my Asus with it. I'd have to use the Asus software, which kinda sucks.


So you'd recommend the MSI Gaming over the Toxic? But what about that 290? The temptation is sooooo...


----------



## Warl0rdPT

I would only buy the 290 if I had the money for the card plus a third-party cooler (and didn't care about voiding the warranty).


----------



## ShadowMAN280x

Quote:


> Originally Posted by *JoeChamberlain*
> 
> You have the TOP or non-TOP card? I can't get past 1100/1700.


_I have the R9280X-DC2T-3GD5
ASUS Radeon R9 280X DirectCU II_

From Newegg... http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803 It came with the 1070MHz clock. There is also a version 2 card, which has slower 6000MHz memory instead of 6400MHz.


----------



## JCH979

Quote:


> Originally Posted by *ShadowMAN280x*
> 
> _I have the R9280X-DC2T-3GD5
> ASUS Radeon R9 280X DirectCU II_
> 
> From Newegg... http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803 It came with the 1070MHz clock. There is also a version 2 card, which has slower 6000MHz memory instead of 6400MHz.


Going to that link shows they have gone out of stock...


----------



## darkelixa

Yep, just as I thought: the Asus cards would be too big for the tool-less holders.


----------



## taem

Quote:


> Originally Posted by *Warl0rdPT*
> 
> I would only buy the 290 if I had the money for the card plus a third-party cooler (and didn't care about voiding the warranty).


Speaking of third-party coolers: if you want GPU water cooling but don't want to deal with the setup and maintenance of a custom loop, you can put a closed-loop CPU AIO on a video card: http://us.hardware.info/news/37936/nzxt-kraken-g10-makes-all-in-one-compatible-with-gpu

Basically a bracket with a built-in fan. It's cheap, I think, like $30 MSRP. Of course you have to buy the CPU cooler as well... but an Accelero Hybrid III would cost as much.

I'm tempted to do this with a reference 290, but I'd feel like a fool if the DCU2 and Vapor-X customs came out a week later.


----------



## raghu78

Quote:


> Originally Posted by *madorax*
> 
> need opinion about 280x:
> 
> 1. is it worth it for sapphire toxic over sapphire vapor-x or msi gaming?
> 2. if it is... which is even worth for toxic or powercolor 290? i know 290 is hot, but the difference is only $30 between toxic & 290.
> 3. which one draw more power in full load? toxic or 290?
> 4. if toxic not that worth, what 280x should i get instead? vapor-x or msi gaming? or even the cheaper HIS or powercolor?
> 
> thanks for the enlightment, before i'm using 760, already sold it this morning. looking for radeon card


Go with the Asus R9 280X DC2T; it's the best-reviewed and quietest R9 280X.

Noise videos:
http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-6.html

http://www.hardocp.com/article/2013/10/07/asus_r9_280x_directcu_ii_top_video_card_review/8

"The DirectCU II heatsink and hybrid fan are a real winner on the ASUS R9 280X DC2T video card. It idles the lowest temperature and at full-load hits exactly what ASUS claimed it would. We saw it hit 74c and never exceeded this. It did so with a 42% fan speed. Therefore, the GPU was running cooler than the GTX 770 and Radeon HD 7970 GHz Edition. *There was no fan noise, even during full-load.* The cooling configuration is a success on the ASUS R9 280X DC2T. "

http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx/20

"As impressive as the XFX card was at idle, *it doesn't begin to compare to the Asus card under load. We have a card that's channeling nearly 250W of heat out and away on a sustained basis, and yet for all of that work it generates just 41.5dB(A) of noise on our testbed. This is simply absurd in the most delightful fashion*. Most of the cards in our data collection idle at just 2dB lower than this, never mind noise under load"

http://techreport.com/review/25466/amd-radeon-r9-280x-and-270x-graphics-cards/10

"*Well, that massive cooler on the Asus R9 280X TOP card certainly pays off handsomely. The 280X produces the lowest noise levels under load-the same decibel level as several other cards at idle-and turns in the lowest GPU temperatures*. That's the card that drew the highest wattage under load, no less. Tons of credit to Asus and the big hunk of metal it strapped on the card for making that happen."


----------



## hoevito

So call me silly... but I returned my Asus DC2T and got a Toxic to replace it, lol. The Asus couldn't go over 1150/1650MHz without either crashing or artifacting pretty badly, and considering it had to be my top card when paired with the Matrix, I just couldn't take it anymore. The Matrix is totally stable up to 1220/1800MHz and can play quite a few games up to 1240 without artifacts, so I figured I'd go for something that hopefully clocks better. It also looked like crap in my case without a backplate, but I guess I'm just being picky now...









Oh yeah, if anyone cares...Sapphire Toxics are IN STOCK at Newegg


----------



## madorax

@raghu78

I can't get the Asus; the price is silly here. It's even more expensive than the 290: the difference between the Toxic and the DC2T is about $100 here, which is ridiculous.

How about this (forgive me for asking here): the Zotac GTX 770 AMP is the same price as the Sapphire Vapor-X, under $400. Should I get that instead? I've asked the distributor, and all the Sapphire cards are out of stock...

For the info, the Zotac GTX 770 AMP is even $15 cheaper than the MSI R9 280X Gaming, which is available here. So I'm thinking I'll get the 770 AMP, since people recommend the 280X over the 770 because of the price, right? But what if the 770 is cheaper than the 280X, like where I live? Pricing is very weird here in Indonesia...


----------



## Ashuiegi

Quote:


> Originally Posted by *hoevito*
> 
> So call me silly... but I returned my Asus DC2T and got a Toxic to replace it, lol. The Asus couldn't go over 1150/1650MHz without either crashing or artifacting pretty badly, and considering it had to be my top card when paired with the Matrix, I just couldn't take it anymore. The Matrix is totally stable up to 1220/1800MHz and can play quite a few games up to 1240 without artifacts, so I figured I'd go for something that hopefully clocks better. It also looked like crap in my case without a backplate, but I guess I'm just being picky now...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh yeah, if anyone cares...Sapphire Toxics are IN STOCK at Newegg


got the same combo of cards


----------



## Warl0rdPT

Quote:


> Originally Posted by *hoevito*
> 
> So call me silly... but I returned my Asus DC2T and got a Toxic to replace it, lol. The Asus couldn't go over 1150/1650MHz without either crashing or artifacting pretty badly, and considering it had to be my top card when paired with the Matrix, I just couldn't take it anymore. The Matrix is totally stable up to 1220/1800MHz and can play quite a few games up to 1240 without artifacts, so I figured I'd go for something that hopefully clocks better. It also looked like crap in my case without a backplate, but I guess I'm just being picky now...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh yeah, if anyone cares...Sapphire Toxics are IN STOCK at Newegg


That's all about luck; my DC2T is solid at 1175/1700 and I'm not even touching the voltage. I can even do Unigine runs at 1200/1800 without crashing or artifacting.


----------



## DiceAir

Quote:


> Originally Posted by *Warl0rdPT*
> 
> That's all about luck; my DC2T is solid at 1175/1700 and I'm not even touching the voltage. I can even do Unigine runs at 1200/1800 without crashing or artifacting.


My card doesn't even overclock over 1100, but then again, from 1100 to 1200 there's not much FPS gain at 2560x1440.


----------



## sugarhell

Quote:


> Originally Posted by *DiceAir*
> 
> My card doesn't even overclock over 1100, but then again, from 1100 to 1200 there's not much FPS gain at 2560x1440.


The scaling is almost linear until 1400-1450MHz; after that point you don't gain much. Personal experience with 7970s on DICE (dry ice). If you don't gain anything over 1100, it's either:

1) a bad/unstable OC
2) a CPU bottleneck
3) bad CrossFire scaling
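For a feel of what "almost linear" means in FPS terms, here's a toy model; the baseline numbers and the `scale` factor are made up for illustration, not measurements:

```python
def predicted_fps(base_fps, base_clock_mhz, new_clock_mhz, scale=1.0):
    """Toy linear clock-scaling model.

    scale=1.0 means FPS grows in proportion to core clock;
    scale<1.0 models diminishing returns (CPU bottleneck,
    poor CrossFire scaling, memory limits, etc.).
    """
    gain = new_clock_mhz / base_clock_mhz - 1
    return base_fps * (1 + scale * gain)

# Hypothetical card doing 60 fps at 1100 MHz, overclocked to 1200 MHz:
print(round(predicted_fps(60, 1100, 1200), 1))       # near-linear: 65.5
print(round(predicted_fps(60, 1100, 1200, 0.3), 1))  # bottlenecked: 61.6
```

If a ~9% clock bump yields nowhere near a ~9% FPS bump, one of the three causes above is the usual suspect.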


----------



## Warl0rdPT

In Valley I gain 4 FPS going from 1070 to 1200MHz at 1200p, everything maxed out. It's not much, but hey, it's free.


----------



## ignus1212

Quote:


> Originally Posted by *madorax*
> 
> @raghu78
> 
> I can't get the Asus; the price is silly here. It's even more expensive than the 290: the difference between the Toxic and the DC2T is about $100 here, which is ridiculous.
> 
> How about this (forgive me for asking here): the Zotac GTX 770 AMP is the same price as the Sapphire Vapor-X, under $400. Should I get that instead? I've asked the distributor, and all the Sapphire cards are out of stock...
> 
> For the info, the Zotac GTX 770 AMP is even $15 cheaper than the MSI R9 280X Gaming, which is available here. So I'm thinking I'll get the 770 AMP, since people recommend the 280X over the 770 because of the price, right? But what if the 770 is cheaper than the 280X, like where I live? Pricing is very weird here in Indonesia...


Maybe you can get a Sapphire Dual-X for a lot cheaper? The difference here is around $25 between the Dual-X and the Vapor-X, and $60 for the Toxic. Realistically speaking, you're only going to gain about 5 more FPS in stressful games for your extra $25/$60 on the Vapor-X/Toxic, and all of them are still acceptable for gaming. If you're overclocking, the results will probably be the same depending on your luck, just with different temps.
http://www.anandtech.com/show/7406/the-sapphire-r9-280x-toxic-review/3

If you're concerned about temps/noise/looks they are definitely worth it, though; the backplate is really sexy.

ASUS/EVGA/MSI are priced higher in the SEA region because the distributors say there's not much demand, so they can't import high volumes, and they can't do a thing about it (but then again, it's always out of stock, LOL).


----------



## Ashuiegi

No, we recommend the 280X over the 770 because the performance really is better: a larger memory bus, more memory, and better OC stability. The 770 has been a no-go since it came out; it was never worth it.


----------



## Rar4f

Plus, the R9 280X is better suited for mining Bitcoin/alternative coins. If you don't know what that is, just understand that if you have free or very cheap electricity, an R9 280X can earn you a nice chunk of money on a monthly basis.
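To put rough numbers on the electricity side of that claim, here's a back-of-the-envelope sketch; the hashrate, wattage, and price are ballpark 2013-era figures picked for illustration, not quotes:

```python
# Ballpark scrypt-mining electricity math for a 280X-class card.
hashrate_khs = 700          # roughly what a 280X-class card managed at scrypt (illustrative)
card_watts = 250            # approximate board power under mining load (illustrative)
price_per_kwh = 0.10        # hypothetical electricity price in USD

daily_kwh = card_watts / 1000 * 24
daily_power_cost = daily_kwh * price_per_kwh
print(f"Daily electricity cost: ${daily_power_cost:.2f}")  # → $0.60

# Earnings depend on coin price and network difficulty, which move daily,
# so they're deliberately left out; with free electricity the cost term
# drops to zero, which is exactly the point being made above.
```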


----------



## madorax

@ all above

Yes, I think the 280X is the better choice, since it has more VRAM, which I need for Skyrim ENB. So I'm holding off on buying today and waiting until Monday; hopefully I can get a Vapor-X or any other 280X the distributor has ready.

Thanks, all.


----------



## Rar4f

MSI seems to have an offer with their R9 280X, giving you Battlefield 4:
www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/1900#post_21279833
A nice offer, but for me it's more expensive than the Toxic, which runs at 1100MHz core without Boost, 100MHz more than the MSI. And I'm not a Battlefield fan, even though I would love to experience Mantle with it.

But if you're a BF4 fan and the card is priced well, it's not a bad bargain.


----------



## sonic2911

Quote:


> Originally Posted by *Rar4f*
> 
> MSI seems to have an offer with their R9 280X, giving you Battlefield 4:
> www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/1900#post_21279833
> A nice offer, but for me it's more expensive than the Toxic, which runs at 1100MHz core without Boost, 100MHz more than the MSI. And I'm not a Battlefield fan, even though I would love to experience Mantle with it.
> 
> But if you're a BF4 fan and the card is priced well, it's not a bad bargain.


Gigabyte released the rev2 with BF4 too.


----------



## Rar4f

Quote:


> Originally Posted by *sonic2911*
> 
> Gigabyte released the rev2 with BF4 too.


I hope I can get a Toxic with BF4 without paying more, then.


----------



## sonic2911

Quote:


> Originally Posted by *Rar4f*
> 
> I hope I can get a Toxic with BF4 without paying more, then.


For that price, I would get the 290.


----------



## Rar4f

Quote:


> Originally Posted by *sonic2911*
> 
> For that price, I would get the 290.


On Amazon, the price difference between a Toxic 280X and a reference R9 290 is around $25, BUT where I live the Toxic is $73 cheaper.
However, I might just get a reference 290 if a good sale comes along.


----------



## darkelixa

Nowhere in Australia has a Toxic 280X. Would it be worth waiting for that card, or just spending the extra 30 bucks on a Sapphire R9 290?


----------



## sonic2911

Quote:


> Originally Posted by *darkelixa*
> 
> Nowhere in Australia has a Toxic 280X. Would it be worth waiting for that card, or just spending the extra 30 bucks on a Sapphire R9 290?


Try the Gigabyte rev2 or the Asus, or get the 290.


----------



## Roaches

I'd wait for non reference 290s to be released within mid December, unless you want to deal with noise.


----------



## darkelixa

Pretty used to my 5850 making a ton of fan noise.


----------



## GRQ

So, I ran one Gigabyte R9 280X in FurMark on the 15-minute preset. Everything went just fine.

Then I CrossFired them, and after about 2 minutes my PC shut down, then restarted. Looks like I don't have enough power here? I'm using a Cougar A760 (760W). That's kinda strange; that amount of power should be more than enough.

Please advise!


----------



## Roaches

Quote:


> Originally Posted by *GRQ*
> 
> So, I ran one Gigabyte R9 280X in FurMark on the 15-minute preset. Everything went just fine.
> 
> Then I CrossFired them, and after about 2 minutes my PC shut down, then restarted. Looks like I don't have enough power here? I'm using a Cougar A760 (760W). That's kinda strange; that amount of power should be more than enough.
> 
> Please advise!




Seems like enough power to run both cards; maybe OCP triggered the shutdown? I did some googling, and some say 850W is recommended, whereas 750W is the minimum...

Keep in mind a CPU overclock also affects power consumption, if you're overclocked.

http://forums.anandtech.com/showthread.php?t=2216373

http://hardforum.com/showthread.php?t=1673037

http://forums.bit-tech.net/showthread.php?t=234644
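A quick power-budget sanity check along the lines of those threads; every wattage below is a ballpark estimate assumed for illustration, not a measurement:

```python
# Rough load estimate for a 2x R9 280X CrossFire rig under FurMark.
components = {
    "R9 280X #1 (stress load)": 250,
    "R9 280X #2 (stress load)": 250,
    "CPU (overclocked)": 150,
    "Board, RAM, fans, drives": 75,
}

total_watts = sum(components.values())
psu_watts = 760                      # Cougar A760 from the post above
print(f"Estimated load: {total_watts} W, headroom: {psu_watts - total_watts} W")
```

With only a few dozen watts of headroom, a power-virus load like FurMark spiking past these averages could plausibly trip the PSU's over-current protection, which would fit the sudden shutdown.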


----------



## madorax

So, an update: the 280X cards ready at the distributor here are the MSI Gaming, Gigabyte, and PowerColor. The cheapest is the PowerColor at only $306; the Gigabyte is $389 and the MSI Gaming $383.

I've searched this thread and haven't found a single PowerColor card mentioned, and on YouTube only Linus from NCIX previewed it. So nobody is using PowerColor? And between the MSI and the Gigabyte, which is the better choice? I read a couple of pages back in this thread that the Gigabyte has locked voltage.


----------



## Ashuiegi

FurMark is not a useful test; there's little to no chance you will ever reach those power levels during normal use. I don't like running close to the limit of my PSU, so for my CrossFire 280Xs I have a 1200W unit. How many fans, HDDs, and extras do you run? Those add up quickly.


----------



## Sgt Bilko

Hi there boys and girls,

Unfortunately my 290X has decided it doesn't want to work properly anymore, and I've started the RMA process.

So I'm just looking for people's opinions on the 280X: how are they performing, which cards have voltage unlocked, etc.

Thinking about picking up two DCU II cards, but the Vapor-X is also tempting.


----------



## Troublemak3r

Hi guys, I own an R9 280X Vapor-X and I was wondering which waterblocks I can use on this card, or whether you can at all.
Thanks!


----------



## raghu78

Quote:


> Originally Posted by *Troublemak3r*
> 
> Hi guys, I own an R9 280X Vapor-X and I was wondering which waterblocks I can use on this card, or whether you can at all. Thanks!


There are no full-cover waterblocks for the R9 280X Vapor-X; you have to go with a universal GPU cooler plus VRM heatsinks and a fan blowing air toward the VRMs.


----------



## Troublemak3r

Can't I use a 7970 waterblock?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Troublemak3r*
> 
> Can't I use a 7970 waterblock?


A 7970 waterblock is made for a different PCB than the Vapor-X's, so no, you can't.

The best option is, as raghu78 suggested, to get a universal block and air-cool the VRMs and VRAM with heatsinks.


----------



## darkelixa

Oh, Sgt, would you suggest not buying a 290? I'm on the borderline between the 280X and the 290 and I can't decide.


----------



## sonic2911

Quote:


> Originally Posted by *darkelixa*
> 
> Oh, Sgt, would you suggest not buying a 290? I'm on the borderline between the 280X and the 290 and I can't decide.


What games do you play, and what's your monitor?
Get an Asus/Gigabyte/Sapphire 280X, or wait for the 290 with a custom cooler.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Oh, Sgt, would you suggest not buying a 290? I'm on the borderline between the 280X and the 290 and I can't decide.


Tbh, my 290X was a damn awesome card until it gave up. I'd suggest you weigh up your options and decide what would suit you better; personally, I'm looking at either CrossFire 290s or 280Xs.

But with the recent news that non-reference 290/Xs are delayed until January, I'm just trying to get people's opinions on the 280Xs.


----------



## darkelixa

I currently play at 1920x1080.

I only play Final Fantasy XIV: A Realm Reborn and Crysis 3 at the moment, so should I just get a 280X?


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> I currently play at 1920x1080.
> 
> I only play Final Fantasy XIV: A Realm Reborn and Crysis 3 at the moment, so should I just get a 280X?


A good 280X would be best, I think. Unless you're planning to watercool, or you can't wait until January for non-reference 290s, I'd go for a 280X such as the DCU II or a Vapor-X.

But if you can wait until January, then get an R9 290.


----------



## darkelixa

Oh man, my 5850 just struggles in Crysis 3 and FFXIV, and the Vapor-X is nowhere to be found in Australia. It's so annoying.


----------



## sonic2911

Quote:


> Originally Posted by *darkelixa*
> 
> I currently play at 1920x1080.
> 
> I only play Final Fantasy XIV: A Realm Reborn and Crysis 3 at the moment, so should I just get a 280X?


A 280X is enough for you. Which brands can you buy?


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Oh man, my 5850 just struggles in Crysis 3 and FFXIV, and the Vapor-X is nowhere to be found in Australia. It's so annoying.


Mwave has the Toxic in stock and Scorptec has the Dual-X, but I can't seem to find the Vapor-X either.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Tbh, my 290X was a damn awesome card until it gave up. I'd suggest you weigh up your options and decide what would suit you better; personally, I'm looking at either CrossFire 290s or 280Xs.
> 
> But with the recent news that non-reference 290/Xs are delayed until January, I'm just trying to get people's opinions on the 280Xs.


I own two Gigabyte R9 280Xs and I love them so far. They tear through every game I throw at them on high to ultra. I play all the big games, from BF4 to any other new game you can think of, and I run them all at the highest settings possible. For the price, in my opinion, it was a great buy. I only had around $600 to spend, give or take a few bucks, so I can't roll with the big dogs on all the big-hitting GPUs. So for $600 I got two 280Xs instead of one 290X, or even a 780 for that matter. From everything I read before buying, two 280Xs in Crossfire were beating a 780 and a single 290X. Long story short, I'm very happy with these two 280Xs. I'm a pure gamer; I'm all about the game and getting the best gaming experience I can while keeping it in my budget range. So I give the R9 280X two thumbs up

and a third if I had one... no complaints at all.


----------



## Sgt Bilko

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> I own two Gigabyte R9 280Xs and I love them so far. They tear through every game I throw at them on high to ultra. I play all the big games, from BF4 to any other new game you can think of, and I run them all at the highest settings possible. For the price, in my opinion, it was a great buy. I only had around $600 to spend, give or take a few bucks, so I can't roll with the big dogs on all the big-hitting GPUs. So for $600 I got two 280Xs instead of one 290X, or even a 780 for that matter. From everything I read before buying, two 280Xs in Crossfire were beating a 780 and a single 290X. Long story short, I'm very happy with these two 280Xs. I'm a pure gamer; I'm all about the game and getting the best gaming experience I can while keeping it in my budget range. So I give the R9 280X two thumbs up
> 
> and a third if I had one... no complaints at all.


Well, tbh I'd expect Crossfire 280Xs to be able to stomp a 780 or 290X. I usually run single cards due to AMD's past run of stuttering issues, but since it seems they have fixed that, I'm back to looking at a Crossfire setup again.

Just undecided atm between 280Xs and 290s: 2 x DCU II 280Xs = $820 and 2 x reference 290s = $1k.


----------



## darkelixa

Have you ever bought from Mwave? I have never bought from them, just Umart really, and IJK. It would be so much easier to talk to you on Skype.

http://www.mwave.com.au/product/sapphire-amd-radeon-r9-280x-3gb-toxic-overclocked-video-card-ab52088

It says pre-order for the Toxic 280X.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, tbh I'd expect Crossfire 280Xs to be able to stomp a 780 or 290X. I usually run single cards due to AMD's past run of stuttering issues, but since it seems they have fixed that, I'm back to looking at a Crossfire setup again.
> 
> Just undecided atm between 280Xs and 290s: 2 x DCU II 280Xs = $820 and 2 x reference 290s = $1k.


Just a strange question: where are you getting the prices for your Asus GPUs, at $820 for the 280s? I know every country seems to have different prices. Newegg has the Asus 280Xs at $309.99, and the ones I bought were $299.99; the 290Xs are going for $549 to $589 apiece.

But I understand what you're saying about the Crossfire deal and their drivers; that's the main reason I stayed away from ATI/AMD GPUs for so long. Now that the drivers seem better and the prices are way better than Nvidia's, I jumped back into the AMD ring to give them a try. So far, like I said, it's been all smiles, so I don't think you can go wrong with either setup, 280 or 290. Best bang for the buck, though, is the dual 280Xs, but if you have the money then, as I always say, "GO BIG!!" lol. If I could afford two 290Xs I would be all over those in a heartbeat.


----------



## Sgt Bilko

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Just a strange question: where are you getting the prices for your Asus GPUs, at $820 for the 280s? I know every country seems to have different prices. Newegg has the Asus 280Xs at $309.99, and the ones I bought were $299.99; the 290Xs are going for $549 to $589 apiece.
> 
> But I understand what you're saying about the Crossfire deal and their drivers; that's the main reason I stayed away from ATI/AMD GPUs for so long. Now that the drivers seem better and the prices are way better than Nvidia's, I jumped back into the AMD ring to give them a try. So far, like I said, it's been all smiles, so I don't think you can go wrong with either setup, 280 or 290. Best bang for the buck, though, is the dual 280Xs, but if you have the money then, as I always say, "GO BIG!!" lol. If I could afford two 290Xs I would be all over those in a heartbeat.


I'm in Aus, so prices for us are usually pretty bad. http://www.pccasegear.com/index.php?main_page=index&cPath=193_1559

I usually buy through these guys because they have always been good to me.

My 290X was $700 for the BF4 edition, so if I get store credit back I can afford 2 more cards, and that's the dilemma I'm facing: non-reference 280Xs, or reference 290s and watercool them later (that's looking very pricey atm).

And I've only ever bought from PCCG and Megaware in Sydney. I stopped buying from Megaware when they priced things too high, but they seem to have decent prices now.


----------



## darkelixa

Another issue I have with my case is that if the card is too tall, like the Asus or the XFX/HIS, I won't be able to mount it because of the tool-less PCI holders. The 290s would fit really easily, as they are not as tall.


----------



## Ashuiegi

At any resolution under 1440p there is no reason to get a 290, I think; the 280X will be more than enough, and 280X Crossfire will perform far better. The 290/290X is for 4K, multi-monitor, or putting up big benchmark numbers for e-peen.


----------



## darkelixa

Ah, I will buy a 280X then! With my PC, when I took it to the store they installed new RAM alongside the existing 8GB I had, so now it totals 16GB. However, two of the 4GB sticks are Corsair XMP RAM at 1.65V in XMP, and the other two 4GB sticks are 1.5V in XMP; I'm not running it in XMP though. Would this affect my FPS, and would I have to buy new RAM again?


----------



## Ashuiegi

If you have no plan to go to a higher res, you're better off with a 280X than a 290.
If you like playing with 24xSS override settings, then yes, get a 290X; maybe some games with a lot of mods installed can use the 4GB of VRAM too.
I would wait for a 290 with a custom cooler, but it will be more expensive, and if you want to add a card later, that will be more expensive than the 280X too.


----------



## darkelixa

I don't plan on going any higher than 1920x1080, as I cannot afford a new monitor!

Looks like I should be getting a 280X.


----------



## Ashuiegi

For the RAM, I have no experience with mixing different kinds. If they are recognized, you don't get crashes, and it's working now, then it should be OK.
You'll be perfectly fine with 8GB for gaming anyway.


----------



## darkelixa

Yeah, it's all working fine; I've never had a crash or a memory error.

Just trying to find a GPU that will fit now.

I believe the Gigabyte R9 280X is really the only one that will fit. Is Gigabyte a good brand? My Gigabyte GTX 770 performs horribly in FFXIV.


----------



## Ashuiegi

I have no experience with Gigabyte either; I'm a HIS/Asus/Sapphire guy. It often comes down to luck more than anything, except for a few flawed models, which are rare. Look for some reviews and buyer comments on the web.
Some games can also have problems on AMD or Nvidia; you can check reviews for that too.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Yeah, it's all working fine; I've never had a crash or a memory error.
> 
> Just trying to find a GPU that will fit now.
> 
> I believe the Gigabyte R9 280X is really the only one that will fit. Is Gigabyte a good brand? My Gigabyte GTX 770 performs horribly in FFXIV.


I'm using a Gigabyte 7970 GHz atm and it's a great card: it keeps cool, it's fairly quiet, and it's only a 2-slot cooler.

I've heard some horror stories about Gigabyte's warranty department, but I have no experience with that.

I've had Sapphire, HIS and Gigabyte cards and they have all been great (apart from my dead 290X).


----------



## darkelixa

Just placed an order for a Sapphire R9 280X. Reckon I should get new RAM, or just see how the new video card goes? RAM is so expensive these days: 100 bucks for 8GB.


----------



## darkelixa

Or would an R9 270X Toxic edition be a better choice?


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Just placed an order for a Sapphire R9 280X. Reckon I should get new RAM, or just see how the new video card goes? RAM is so expensive these days: 100 bucks for 8GB.


I'd stick with the 280X, and 8GB of system RAM is enough for gaming purposes atm.

If you want 16GB though, you can get a good set for around $150-$200.

I'm planning on 2 x 280Xs now; just gotta wait for the RMA process to clear.


----------



## darkelixa

I currently have 16GB of DDR3, so I will see how it goes with the new card and whether I still get stuttering in games; if so, I might try replacing the RAM. It has a new PSU in it, so I know that's definitely not the issue.

The IJK website says 1-2 days to get the card in, so I hope to receive it sometime this week.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm in Aus, so prices for us are usually pretty bad. http://www.pccasegear.com/index.php?main_page=index&cPath=193_1559
> 
> I usually buy through these guys because they have always been good to me.
> 
> My 290X was $700 for the BF4 edition, so if I get store credit back I can afford 2 more cards, and that's the dilemma I'm facing: non-reference 280Xs, or reference 290s and watercool them later (that's looking very pricey atm).
> 
> And I've only ever bought from PCCG and Megaware in Sydney. I stopped buying from Megaware when they priced things too high, but they seem to have decent prices now.


Man, that's almost a $200 difference, yikes... Well, if you're going to use store credit at $700, I'd go for two 280Xs; that leaves you some left over for a game, or maybe even toward a third for Tri-Fire. Once I get a few hundred more bucks I'm going to throw another 280X into the mix.

What case are you running? I have two cases: my full tower, the Trooper, and a mid tower, the K380, both Cooler Master, and I can run three of these bad boys in both of them. With another board I could go 4-way Crossfire, in theory...


----------



## madorax

Quote:


> Originally Posted by *madorax*
> 
> So, a quick update: the 280X is in stock at the distros in my area: MSI Gaming, Gigabyte, and PowerColor. The cheapest is the PowerColor at only $306, the Gigabyte is $389, and the MSI Gaming is $383.
> 
> I've searched this thread and haven't found a single PowerColor card mentioned; I looked on YouTube and only Linus from NCIX has previewed it. So is nobody using PowerColor? And between the MSI and the Gigabyte, which is the better choice? I read a couple of pages back in this thread that the Gigabyte is voltage-locked.


Back to my previous questions: is it worth the extra $70 for the MSI or Gigabyte over the PowerColor 280X? Tomorrow (Monday) I plan to buy a 280X, and as of last Saturday only cards from those three brands were available here. Outside the 280X, there is a HIS 290 at $413 and a Zotac 770 AMP at $363.

I only play games on 1 display @ 1080p.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> I currenly have 16gb of ddr3 so I will see how it goes with the new card and see if i still get stuttering in games, if so I might try and replace the ram,. It has a new psu in it so i know that's defiantly not the issue.
> 
> It says on the IJK website 1-2 days to get the card in so I hope to recieve it sometime this week


If you have 16GB then you don't need any more.

The stuttering was mostly fixed in a driver update, iirc.
Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Man, that's almost a $200 difference, yikes... Well, if you're going to use store credit at $700, I'd go for two 280Xs; that leaves you some left over for a game, or maybe even toward a third for Tri-Fire. Once I get a few hundred more bucks I'm going to throw another 280X into the mix.
> 
> What case are you running? I have two cases: my full tower, the Trooper, and a mid tower, the K380, both Cooler Master, and I can run three of these bad boys in both of them. With another board I could go 4-way Crossfire, in theory...


I've got a CM Storm Trooper; my rig is in my sig (very helpful). I'll be going for 2 x 280Xs, so either Sapphire, Gigabyte or Asus; it all depends on which are volt-locked and which use Hynix memory (I'm going to have these for a while).

And I'll be mostly single 1080p for starters, then maybe triple 1080p down the line.


----------



## darkelixa

Ah ok. It's not an issue I have with the 5850, but an issue I have with the GTX 770. God damn waste of 500 bucks that was.


----------



## Ashuiegi

No, stick to a 280X; for 1080p it's perfect.


----------



## Sgt Bilko

So, does anyone know which 280Xs are volt-locked and which aren't?

I'm planning on picking 2 cards, and so far I'm stumped between Asus, Sapphire and Gigabyte.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If you have 16GB then you don't need any more.
> 
> The stuttering was mostly fixed in a driver update, iirc.
> I've got a CM Storm Trooper; my rig is in my sig (very helpful). I'll be going for 2 x 280Xs, so either Sapphire, Gigabyte or Asus; it all depends on which are volt-locked and which use Hynix memory (I'm going to have these for a while).
> 
> And I'll be mostly single 1080p for starters, then maybe triple 1080p down the line.


Guess I should have looked down before asking about your rig, lol. Well, you should have plenty of room for just about anything with your setup, especially since I originally had two Matrix GTX 580s smashed into mine. And just FYI, the Gigabyte 280X is voltage-locked, so if you're looking for something with more headroom and voltage adjustment, the Gigabyte ones are out of the picture. They do push further than the stock overclock, but you can't do anything with the voltage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Guess I should have looked down before asking about your rig, lol. Well, you should have plenty of room for just about anything with your setup, especially since I originally had two Matrix GTX 580s smashed into mine. And just FYI, the Gigabyte 280X is voltage-locked, so if you're looking for something with more headroom and voltage adjustment, the Gigabyte ones are out of the picture. They do push further than the stock overclock, but you can't do anything with the voltage.


Ah, that's a shame about Gigabyte: the price is good, the cooler is good, and they are quiet as well.

Hmm, gonna have to see if I can do Crossfire with the DCU IIs, considering their size.

If not, then it looks like I'm going for Sapphires..... which are pretty much out of stock here, lol.


----------



## darkelixa

I got a Sapphire on order.

Don't think I'd buy a Gigabyte again; even my GTX 770 was voltage-locked.


----------



## Sgt Bilko

Actually, after reading some reviews here and there, I've decided to go for a couple of HIS IceQ X²s.

At stock the performance is really all the same, and with 2 cards at only 1080p I'm not going to need to overclock them for a while.

Now the waiting game begins.......


----------



## darkelixa

I really wanted a HIS but they are too tall for my case.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> I really wanted a HIS but they are too tall for my case.


I have exactly 322mm of space, so I can get away with it with about 25mm to spare.


----------



## Warl0rdPT

Quote:


> Originally Posted by *madorax*
> 
> Back to my previous questions: is it worth the extra $70 for the MSI or Gigabyte over the PowerColor 280X? Tomorrow (Monday) I plan to buy a 280X, and as of last Saturday only cards from those three brands were available here. Outside the 280X, there is a HIS 290 at $413 and a Zotac 770 AMP at $363.
> 
> I only play games on 1 display @ 1080p.


I would go for the powercolor.


----------



## darkelixa

Tall as in the height, not the length; they are 150mm tall.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Tall as in the height, not the length; they are 150mm tall.


186mm there; I love my case.


----------



## darkelixa

Oh ok, so you just screw them down, obviously. I'd have heaps of room if I didn't have the lovely tool-less holder.


----------



## darkelixa

How good is the PowerColor brand? I've never heard of them before; I see they have a backplate.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Oh ok, so you just screw them down, obviously. I'd have heaps of room if I didn't have the lovely tool-less holder.


What case do you have?

I don't see it listed in your sig


----------



## darkelixa

I'll get you a link:

http://www.lian-li.com/en/dt_portfolio/pc-a71f/

Updated my sig


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> I'll get you a link:
> 
> http://www.lian-li.com/en/dt_portfolio/pc-a71f/
> 
> Updated my sig


Well, according to that you have good space: CPU cooler height: 175mm.

I usually base it off that, unless you want to get a tape measure out and check it yourself.

It's a very pretty case, btw; Lian-Li has always made good stuff.


----------



## DeadZeus

To anyone using Asus GPU Tweak: mine only shows 2000MB of memory for my R9 280X. Is this normal?


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ah thats a shame about Gigabyte, Price is good, cooler is good and they are quiet as well.
> 
> Hmm gonna have to see if i can do Crossfire with the DCU II's considering their size.
> 
> If not then it looks like im going for Sapphire's.....which are pretty much out of stock here lol.


Yeah, the R9 series as a whole seems to be going fast. As far as triple screens go, I did Eyefinity with mine just last night, and even at the higher res the GPUs in Crossfire did outstanding. So even with the voltage locked, the factory overclock seems to be really good, and they run really cool and quiet. As for the Asus triple-slot cards, you should be able to run two in GPU slots 1 and 3, if I'm not mistaken, as long as you have a Crossfire bridge long enough to reach. I'd have to look at the Crosshair board, but I have the Maximus V Formula, and with the spacing between GPU slots 1 and 2, two triple-slot GPUs fit; I ran two Matrix GTX 580s on this board for a while. And with my other board, the Rampage III Extreme, I use GPU slots 1 and 3 with the Matrix cards since I took them out of the Maximus build for the R9s; didn't want to see those Matrix cards go to waste, lol..


----------



## Devildog83

Well, tbh I'd expect Crossfire 280Xs to be able to stomp a 780 or 290X. I usually run single cards due to AMD's past run of stuttering issues, but since it seems they have fixed that, I'm back to looking at a Crossfire setup again.

I would hope so; 2 x 7870s are in 290X territory.

Edit: sorry, I hit reply instead of quote. The first paragraph above is a quote from Sgt Bilko.


----------



## Devildog83

Quote:


> Originally Posted by *darkelixa*
> 
> How good is the PowerColor brand? I've never heard of them before; I see they have a backplate.


I never trusted PowerColor before; I thought it was kind of a budget brand until I bought a Devil. I like them and have had no issues at all. I can't tell you about customer service, as I haven't had to deal with them yet, but the cards are very solid and work as advertised. The Elpida chips might be the only drawback, along with the 270X Devil being voltage-locked, but since the card is pretty much maxed out on the memory and core anyhow, it doesn't really matter much. If I were getting a new card, PowerColor would be one of my choices based on my experience with them so far.


----------



## Warl0rdPT

Quote:


> Originally Posted by *DeadZeus*
> 
> To anyone using Asus GPU Tweak: mine only shows 2000MB of memory for my R9 280X. Is this normal?


You can probably set the max/min values of each graph window; at least you can do that in Afterburner.


----------



## DeadZeus

Quote:


> Originally Posted by *Warl0rdPT*
> 
> You can probably set the max/min values of each graph window; at least you can do that in Afterburner.


There is no option for that in Asus GPU Tweak, as far as I could find :s


----------



## Milestailsprowe

Can I get some help with my card? For some reason it stays at 80°C when I play games. Every review I see says around 69°C, but I get 80°C. How do I fix this?


----------



## ignus1212

Quote:


> Originally Posted by *madorax*
> 
> Back to my previous questions: is it worth the extra $70 for the MSI or Gigabyte over the PowerColor 280X? Tomorrow (Monday) I plan to buy a 280X, and as of last Saturday only cards from those three brands were available here. Outside the 280X, there is a HIS 290 at $413 and a Zotac 770 AMP at $363.
> 
> I only play games on 1 display @ 1080p.


I read on a local (non-English) forum that cooler performance on the PowerColor is not so good (70°C playing BF4 at 90% fan speed, 1000MHz on the core).

If you don't mind the noise (or higher temps if you choose a lower fan speed), go for the PowerColor. Personally, I would just spend more and buy the MSI or Gigabyte if I had to choose tomorrow.


----------



## Durvelle27

*List Updated*


----------



## Devildog83

Quote:


> Originally Posted by *Milestailsprowe*
> 
> Can I get some help with my card? For some reason it stays at 80°C when I play games. Every review I see says around 69°C, but I get 80°C. How do I fix this?


That seems very high for that card. Have you tried new thermal paste? What clocks are you running?

I just ran Valley 1.0 at 1200/1400 with the fans on auto and got these temps. I don't usually run the fans on auto for benches, but I could not hear them running at all. At 50% fan in games I normally see low 60s.


----------



## Milestailsprowe

Quote:


> Originally Posted by *Devildog83*
> 
> That seems very high for that card. Have you tried new thermal paste? What clocks are you running?


Stock clocks of 1080/1400 that came with the cards. I'm gonna order some better paste.


----------



## Themisseble

Anyone overclocking an R9 270X?


----------



## psychok9

Today I ordered a Sapphire Toxic R9 280X!

Soon to be in the club.


----------



## Devildog83

Quote:


> Originally Posted by *Themisseble*
> 
> Anyone overclocking an R9 270X?


I did a 3DMark11 run at 1235/1550; the score wasn't great, but there was no instability.


----------



## darkelixa

And the place I ordered the Sapphire R9 280X from just got back to me and said there's no ETA on the card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> And the place I ordered the Sapphire R9 280X from just got back to me and said there's no ETA on the card.


Go for the HIS then.

On another note, it's starting to look like I won't get any cash back for my 290X going bad....... Sapphire doesn't like it when you remove the stock cooler to re-apply paste.

Here's hoping they are good to me.

EDIT: PCCG doesn't have any HIS 280Xs in stock.......


----------



## darkelixa

I don't think the HIS would fit; it's way too big!

IJK said the XFX R9-280X-TDFD might come into stock on Wednesday with free Battlefield 4, and it's 143mm tall. I think it might fit, but I'm not 100% sure.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> I don't think the HIS would fit; it's way too big!
> 
> IJK said the XFX R9-280X-TDFD might come into stock on Wednesday with free Battlefield 4, and it's 143mm tall. I think it might fit, but I'm not 100% sure.


Just pull out the measuring tape and see?

I'm kinda set on the HIS for myself now; it has great temps and clocks, plus I've always liked HIS cards, so why not?


----------



## darkelixa

Having a rough measure with the tape, I probably wouldn't get much more than 134mm. I have a GTX 770, which is 292 x 129 x 43mm, and a 7870, which is 280 x 134 x 42.5mm. Both of those cards just fit, so I think the XFX is definitely out of the picture.


----------



## darkelixa

So: the Asus is too big.
The Sapphires are out of stock everywhere (Umart has the Vapor-X on pre-order, I believe).
The Gigabyte will fit, but it's a bit iffy.
The HIS won't fit.
The XFX definitely won't fit.
The PowerColor will fit.
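For what it's worth, an elimination list like this is just each cooler's height checked against the case's usable clearance, so it can be scripted. A minimal Python sketch, assuming darkelixa's rough ~134 mm tape measurement and an arbitrary 10 mm margin for calling something a "tight fit" (both assumptions, not specs):

```python
# Sketch of the clearance check being done by hand in this thread.
# clearance_mm (~134) is a rough tape measurement and margin_mm is an
# arbitrary cutoff, so treat every verdict as approximate.
def verdict(height_mm, clearance_mm=134, margin_mm=10):
    """Classify a card's cooler height (mm) against the case's clearance."""
    if height_mm > clearance_mm:
        return "too tall"
    if clearance_mm - height_mm < margin_mm:
        return "tight fit"
    return "fits"

# Cooler heights from the Tom's Hardware round-up quoted later in the thread:
heights = {
    "Asus R9 280X DirectCU II OC": 139,
    "Gigabyte R9 280X WindForce 3X OC": 112,
    "HIS R9 280X IceQ X2": 130,
    "MSI R9 280X Gaming OC": 120,
    "Sapphire R9 280X Toxic OC": 112,
    "Sapphire R9 280X Vapor-X OC": 122,
}

for model, h in heights.items():
    print(f"{model}: {h} mm -> {verdict(h)}")
```

With those numbers the Asus comes out too tall, the HIS a tight fit, and the rest clear.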


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Having a rough measure with the tape, I probably wouldn't get much more than 134mm. I have a GTX 770, which is 292 x 129 x 43mm, and a 7870, which is 280 x 134 x 42.5mm. Both of those cards just fit, so I think the XFX is definitely out of the picture.


Well, that somewhat sucks; sorry to hear it.

Here, I just pulled these from Tom's:



| Model | Length | Height | Depth D1 | Depth D2 |
|---|---|---|---|---|
| Asus R9 280X DirectCU II OC | 287 mm | 139 mm | 38 mm | 3 mm |
| Club3D R9 280X royalKing OC | 265 mm | 112 mm | 36 mm | 5 mm |
| Gigabyte R9 280X WindForce 3X OC | 270 mm | 112 mm | 37 mm | 4 mm |
| HIS R9 280X IceQ X² | 297 mm | 130 mm | 34 mm | 3 mm |
| MSI R9 280X Gaming OC | 268 mm | 120 mm | 34 mm | 3 mm |
| Sapphire R9 280X Toxic OC | 301 mm | 112 mm | 34 mm | 5 mm |
| Sapphire R9 280X Vapor-X OC | 280 mm | 122 mm | 44 mm | 5 mm |

EDIT: Link here if you can't read that: http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-3.html

It looked much better before I posted it.....


----------



## darkelixa

Thanks a ton for that table!

By the looks of it, the ones that will fit are the Sapphires, the MSI and the Gigabyte, and the HIS would be a very tight fit.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Thanks a ton for that table!
> 
> By the looks of it, the ones that will fit are the Sapphires, the MSI and the Gigabyte, and the HIS would be a very tight fit.


No probs. Hit that rep button if you think I earned it.

I found this a little earlier and have been basing my own decisions off it.

That review is pretty comprehensive. I'm sold on the HIS due to the lower temps and the decent price, but in your case, yeah...... a tight fit. The MSI looks good as well, same for the Gigabyte if you aren't planning on overclocking, due to the locked voltage.


----------



## darkelixa

Gave ya some rep loving!

I'm very iffy on Gigabyte, as my GTX 770 is a Gigabyte, and urrrrggg, it gives me a huge headache just thinking about it.

So I'd be more inclined to either wait for the Sapphires to come into stock, which I don't really want to do (they aren't expected until at least the 12th of December, so getting one before Christmas might be a bit hit and miss), or get the less expensive MSI and go without a backplate.


----------



## darkelixa

MSI says 128mm on their website, and Tom's says 120mm. Hmm.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> MSI says 128mm on their website, and Tom's says 120mm. Hmm.


Tom's might be measuring from the board itself, whereas MSI might be measuring from the PCIe pins.......... Maybe; not sure, tbh.

Someone in here might be able to give you a definitive answer.


----------



## darkelixa

Either way, the MSI would definitely fit! I have to look and see if it's worth the money though.


----------



## rcoolb2002

Just got 4x 280X Toxics.

I believe they take 2x 8-pin PCIe each. My 1300 G2 only has six 8-pin connectors. Will I be all right with a molex-to-8-pin adapter?

Also wondering what OC levels to expect with these.


----------



## madorax

Finally, I ordered the MSI 280X Gaming today over the Gigabyte and PowerColor. Although its price is the highest of the three brands I mentioned, I think the MSI is more suitable for me. It should arrive tomorrow or the day after ^^


----------



## Cid

It's been 6 weeks and still nobody has any HIS R9 280Xs in stock.


----------



## neXen

Quote:


> Originally Posted by *rcoolb2002*
> 
> Just got 4x 280X Toxics.
> 
> I believe they take 2x 8-pin PCIe each. My 1300 G2 only has six 8-pin connectors. Will I be all right with a molex-to-8-pin adapter?
> 
> Also wondering what OC levels to expect with these.


The molex adapter will work fine as long as your PSU can handle the load.
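For what it's worth, "as long as your PSU can handle the load" is easy to sanity-check on paper. A rough Python sketch; the ~250 W per Toxic 280X under load and ~200 W for the rest of the system are assumptions rather than measured figures, and the 80% derating is just a conservative rule of thumb:

```python
def psu_headroom_w(psu_w, gpu_w, n_gpus, rest_w, derate=0.8):
    """Watts left after derating the PSU and subtracting estimated draw."""
    return psu_w * derate - (gpu_w * n_gpus + rest_w)

# Four Toxic 280Xs on a 1300 W unit, with ~200 W for CPU/board/drives
# (all assumed figures). Negative headroom means it's cutting it close.
print(psu_headroom_w(1300, 250, 4, 200))
```

By this (admittedly pessimistic) estimate, a quad-card setup already exceeds an 80% derating of a 1300 W unit, so overclocking headroom would be slim even though the raw label covers the draw.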


----------



## Mackem

I'm on 13.11 Beta 9.4 with my 280X, and every now and again I get this bug where, for about a second, whatever is on the display flickers and shifts downwards by about 20 pixels, then goes back to normal.


----------



## darkelixa

Either going to get the MSI R9 280X, or a Sapphire or XFX R9 290.


----------



## hoevito

Ever so slight update... Matrix and Toxic together as one, lol.


----------



## DeadZeus

I'm on my second R9 280X now and I still get artifacts, while I never had them with my good old HD 6870... Why is this happening?


----------



## taem

So Newegg has a listing for the Asus DC2T that prominently states a "12/05/13 release date". http://www.newegg.com/Product/ProductList.aspx?Submit=Property&Subcategory=48&N=100007709&IsNodeId=1&IsPowerSearch=1&SrchInDesc=&MinPrice=&MaxPrice=&OEMMark=N&PropertyCodeValue=679%3A446073&Tpk=R9%20280x

So is that an XLT for sure? What do you guys think?


----------



## jamor

Quote:


> Originally Posted by *taem*
> 
> So Newegg has a listing for Asus DC2T that prominently states "12/05/13 release date". http://www.newegg.com/Product/ProductList.aspx?Submit=Property&Subcategory=48&N=100007709&IsNodeId=1&IsPowerSearch=1&SrchInDesc=&MinPrice=&MaxPrice=&OEMMark=N&PropertyCodeValue=679%3A446073&Tpk=R9%20280x
> 
> So is that an XLT for sure? What do you guys think?


I just ordered the Asus 280X on Newegg for $300 total after my 5% discount.

It's a pre-order though, and the rep told me hundreds are coming in this week.


----------



## TempAccount007

Let us know which one you have once you get your DC2T.

I still don't know how to tell which chip my DC2T has.

Also, sharing your ASIC quality wouldn't hurt.


----------



## darkelixa

At this rate it won't be until next year that Australian retailers have the Sapphire Vapor-X/Toxic R9 280Xs in stock.

Or I could just jump the gun and get an R9 290.


----------



## Dimaggio1103

Pulled the trigger on a 270X Devil edition. Here's hoping my return to AMD is met with good performance. Any idea when Mantle should be out?

Also, will a Phenom II X4 at 4GHz have any trouble feeding this GPU?


----------



## sonic2911

I got the Gigabyte 280X rev. 2 today. So disappointed.

vs the Gigabyte WindForce GTX 770


----------



## sugarhell

FurMark as a benchmark? For real?


----------



## darkelixa

So Gigabyte isn't a good brand to buy?


----------



## sonic2911

I've just tested the temps at 100% fan, to compare the WindForce cooler performance of the two revisions.

The shroud of rev. 2 is PLASTIC too, not METAL. Crap -,-
Quote:


> Originally Posted by *darkelixa*
> 
> So Gigabyte isn't a good brand to buy?


Its GTX 770 is good, but I wouldn't choose Gigabyte's 280X.


----------



## sugarhell

Quote:


> Originally Posted by *sonic2911*
> 
> I've just tested the temps at 100% fan, to compare the WindForce cooler performance of the two revisions.


It's not a real load. FurMark just creates heat, and a higher-TDP GPU will obviously run hotter if both cards have the same cooler. Try looping Valley or Fire Strike and then compare temps.


----------



## sonic2911

Quote:


> Originally Posted by *sugarhell*
> 
> It's not a real load. FurMark just creates heat, and a higher-TDP GPU will obviously run hotter if both cards have the same cooler. Try looping Valley or Fire Strike and then compare temps.


I tried it. The 280X runs ~3°C warmer, but with lower noise.
What driver are you guys using? I'm on 13.10.


----------



## darkelixa

My GTX 770 just stutters like crazy; it's a Gigabyte.


----------



## sonic2911

Quote:


> Originally Posted by *darkelixa*
> 
> My 770 gtx just stutters like crazy, its gigabyte


I don't have any problems with mine.


----------



## darkelixa

Then I must have a bad GTX 770.


----------



## sonic2911

Quote:


> Originally Posted by *darkelixa*
> 
> Then I must have a bad 770gtx


I think so, but try reinstalling your Windows, software, and games...


----------



## darkelixa

I've reinstalled Windows 7, Windows 8, and Windows 8.1 on two different SSDs and one normal HDD, and tried every single NVIDIA driver since the card's release date, and it stutters every single time, so I'd assume it's an NVIDIA driver/card issue.


----------



## sonic2911

Quote:


> Originally Posted by *darkelixa*
> 
> Ive reinstalled windows 7, windows 8,windows 8.1 on two different ssds and 1 normal hdd and tried every single nvidia driver from the cards release date and it stutters every single time so I would assume its a nvidia driver/ card issue


Exchange it for another card?


----------



## darkelixa

I can't just exchange it; the store I got it from says it's not artifacting, so there's nothing wrong with it. It's also been 4 or so months, so no exchange. Just gonna swap to AMD, I think, since the issue doesn't happen with the 7870/5850.


----------



## sonic2911

Get the ASUS 280X if you can, the DCUII version.


----------



## darkelixa

Only the Sapphire/Gigabyte/MSI R9 280Xs will fit, and only the MSI is in stock.


----------



## Roaches

Stay away from Gigabyte until they make a version with the 450W WF3 cooler.
Quote:


> Originally Posted by *sonic2911*
> 
> i've just tested the temp at 100% fan, to compare the wf performance of 2 revs


So I was right: the Rev. 2 keeps the 3 heatpipes and aluminum fin array from Rev. 1.

Wow, they really went only as far as changing the shroud, and left it plastic instead of metal.
They're trying to improve the image rather than the product itself, which is pretty ironic since Tahiti XT runs hotter than GK104. It seriously screams for a better cooler.


----------



## sonic2911

Quote:


> Originally Posted by *Roaches*
> 
> Stay away from Gigabyte until they make a version with the 450W WF3 cooler.
> So I was right the Rev.2 does has 3 heatpipes and aluminum fin array from Rev.1
> 
> Wow, they really went that far to changing the shroud and leaving it plastic instead of metal.
> They're trying on improving image rather than the product itself, which is pretty ironic since Tahiti XT runs hotter than GK-104. It seriously screams for a better cooler.


Agreed! But if Giga used the 450W cooler on AMD cards, the price would go up.


----------



## Roaches

Quote:


> Originally Posted by *sonic2911*
> 
> Agree! but if giga use 450w cooler for amd cards, then price will go up


ASUS pulled it off with their DCUII by reusing the GTX 780 cooler, which is proven very effective, comes in at just 10 dollars over reference price, and is also the quietest of the 280Xs.

I'm not sure why Gigabyte is putting out a Rev 2, since it's virtually no different from Rev 1. They had the chance to use a new cooler during the Rev 2 development phase and shot themselves in the foot by ditching it.


----------



## sonic2911

Do you think they'll release a Rev 3 with a better cooler?


----------



## Roaches

Quote:


> Originally Posted by *sonic2911*
> 
> do u think they will release rev3 with better cooler ?


If they want to be competitive


----------



## TempAccount007

Why is Gigabyte a consideration for anyone?

Buying a voltage locked card doesn't make sense.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roaches*
> 
> ASUS pulled it off with their DCUII by reusing the GTX 780 cooler which is proven very effective and just 10 dollars over reference price also being the most quiet of the 280X.
> 
> I'm not sure why Gigabyte is putting out a Rev 2 since its virtually no different than Rev 1. It had the chance to use a new cooler during Rev 2 development phase and they shot themselves on the foot for ditching it.


I thought the HIS IceQ was the quietest 280x?

I'm using a Gigabyte Rev 1 7970 atm at 1100/1450 (locked voltage) and max temp is 69°C. Considering it was 32°C here today, I'm calling that not bad at all.

I still can't recommend anything from Gigabyte, but surely the 280X version can't be that different from mine?


----------



## Roaches

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I thought the HIS IceQ was the quietest 280x?
> 
> I'm using a Gigabyte Rev1 7970 atm 1100/1450 (locked voltage) and max temp is 69c. considering it was 32c here today i'm calling that not bad at all.
> 
> i still can't recommend anything from Gigabyte but surely the 280x version can't be that different from mine?


Sounds a bit louder IMO....


----------



## gh0stfac3killa

Quote:


> Originally Posted by *sonic2911*
> 
> I tried it. the 280x warmer ~3*C but lower noise
> /what driver r u using guys? Im using 13.10



I'm running two of the original non-Rev 2 Gigabyte R9 280Xs on the current 13.11 beta 2 driver, and they both run like champs, quiet and cool. Furmark, like they say, is just a heat producer, and with the factory overclock already in place the GPUs will run hotter than normal. As far as in-game, where it counts, I'm trashing through every game I throw at them aside from the garbage Ghosts, which is a whole other thread, lol. BF4, Tomb Raider, all the Batmans, the Metros: my FPS is through the roof compared to my two Matrix GTX 580s. Yeah, I didn't compare them to the 600- or 700-series GPUs from NVIDIA, but from all the forums I read, price for price, performance and FPS in game, these get it done.

Like I mentioned a few messages earlier, these are voltage locked; they do push a little further than the stock overclock, but not very far. But for the price, they throw down. I also just picked up two XFX Double D R9 280Xs to go in my little CM K-380 case. Newegg had them in stock and I pulled the trigger, so I'll post soon on how they work and whether they're voltage locked or not. They say they come with a free copy of BF4, so it looks like I should have two free copies. Any takers... lol.


----------



## gh0stfac3killa

OK, OC peeps, I just pulled the trigger on some new XFX R9 280Xs to throw in my little K-380 case, so once they get here I'll post some more info on them. I've read some good things and a few bad, but you know how that goes. So far the other two R9s I have are working great, once I get a handle on getting DirectX 11.2 and all the other features they're supposed to have, like the Mantle stuff. So if anyone has any info on whether DirectX 11.2 is available yet, or where and how I get it loaded, please let a brother know, because so far I've done all the updates and new drivers and all I see is DirectX 11 on my PC...


----------



## kpo6969

Quote:


> Originally Posted by *taem*
> 
> So Newegg has a listing for Asus DC2T that prominently states "12/05/13 release date". http://www.newegg.com/Product/ProductList.aspx?Submit=Property&Subcategory=48&N=100007709&IsNodeId=1&IsPowerSearch=1&SrchInDesc=&MinPrice=&MaxPrice=&OEMMark=N&PropertyCodeValue=679%3A446073&Tpk=R9%20280x
> 
> So is that an XLT for sure? What do you guys think?


Quote:


> Originally Posted by *jamor*
> 
> I just ordered the Asus 280x on newegg for $300 total after my 5% discount.
> 
> It's a pre-order though and the rep told me hundreds are coming in this week


Release date no longer listed, only for Powercolor.
The Asus looks like the best all-around 280X.


----------



## taem

Quote:


> Originally Posted by *kpo6969*
> 
> Release date no longer listed, only for Powercolor.
> The Asus looks like the best all-around 280X.


Newegg's catalog is refreshing quite frequently; the release date is back up for the DC2T. And yeah, I think it's the best 280X as well. I had one and returned it because it was too long. Unless your case is passively cooled you won't hear this card at full load, and temps don't go above 75°C. The 2 DL-DVI ports are a huge plus for me too, since I'm into the Korean 1440p screens that only do dual-link DVI, and active DisplayPort-to-DVI adapters are like $100. (Afaik only the Asus and Sapphire cards pack 2 DL-DVI ports.) Really wish it were 5mm shorter...


----------



## hoevito

Here's some size comparison pics between a Matrix platinum and sapphire toxic if anyone's interested.

I have a few observations and thoughts to share:

The Sapphire Toxic is MUCH quieter than people give it credit for!!! I sent back an Asus DC2T just a few days ago, and I don't agree with all the reviews stating how quiet it was. It may have been quieter than a Matrix Platinum, but the Matrix is noisy as hell anyway!!! The Toxic, by comparison, I feel is far less noisy than the DC2T, and even at full load is relatively pleasant and nowhere near as bad as the Matrix.

Also, the Toxic is long and narrow compared to both of the Asus cards. I stacked the Toxic on top of the Matrix in the third shot as you can see, and the Matrix is a good 3/4 of an inch wider, although the Toxic seems to be nearly a full inch longer overall.

Now for the bad. The ASIC quality of the Toxic I just received is only 66%, and it doesn't clock very well either. The highest stable frequency I've gotten out of it so far is 1200/1650MHz; anything higher and it starts artifacting pretty badly. It also seems to be stuck at 1.256 volts, and even when raised higher in Afterburner and Trixx, it stays put. The ASIC of my Matrix, by the way, is 68%, so it's not much better, but it can at least clock up to 1220/1800MHz stable.


----------



## madorax

My MSI has come ^^

I already flashed the BIOS with the newer version from the TechPowerUp site to fix the heat. Default is 1050/1500. If I try to OC it, what voltage should be safe to play with?


----------



## mAs81

Quote:


> Originally Posted by *madorax*
> 
> My MSI has come ^^
> 
> already flash the BIOS with newer version from techpowerup site to fix heat. default is 1050 / 1500. if i'm try to OC it, what volt should be safe to play?


Hey man, I got the same card and I was wondering how I can replace the old BIOS with the newer one..
I downloaded the TF Gaming BIOS (my default is 1020/1500).
EDIT:
Just checked, and my card has BIOS version 0015.040.000.000 while the one I downloaded is 015.039.000.001.003331..
Is it possible that I already have the newer BIOS?
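For what it's worth, those dotted BIOS strings can be compared numerically field by field, the same way tuple comparison works. A quick sketch (the helper names are mine, not anything from GPU-Z or MSI):

```python
# Hypothetical helpers for comparing dotted BIOS version strings.
# Leading zeros stop mattering once each field is parsed as an integer,
# so "0015.040.000.000" and "015.039.000.001.003331" compare cleanly.

def parse_version(s: str) -> tuple:
    """Split a dotted version string into a tuple of ints: "0015.040" -> (15, 40)."""
    return tuple(int(part) for part in s.split("."))

def newer(a: str, b: str) -> str:
    """Return whichever version string is higher, compared field by field."""
    return a if parse_version(a) >= parse_version(b) else b

print(newer("0015.040.000.000", "015.039.000.001.003331"))
# 0015.040.000.000 wins on the second field (40 > 39), so the card
# would already have the newer BIOS of the two.
```

Since tuples compare element by element, a difference in an early field decides the result regardless of how many trailing fields follow.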


----------



## Twentytwo2277

Hello, I'm having a lot of trouble with my new XFX Radeon Double D R9 280X. I can't get the card to be recognized by Windows 8.1 at all.

I have a 750W PSU, I AM getting power to the card, and there are no POST errors. I'm thinking my motherboard is the cause of the problem; it's an Asus P8Z68-V Pro (I'm pretty sure it's first generation). I just updated the BIOS to 3606 (the newest version), tested my old card, which works, and tested the onboard video, which also works. I even went as far as bringing the card to a buddy's, and it boots just fine on his rig. Does anyone have any suggestions or questions for me? Sorry for the wall-o-text, but I've been at it all night; really annoying.

Thanks!


----------



## madorax

Quote:


> Originally Posted by *mAs81*
> 
> Hey man ,I got the same card and I was wondering how I can replace the old Bios with the newer one..
> I downloaded the TF Gaming Bios(my default is 1020 / 1500)
> EDIT:
> Just checked and my card has the 0015.040.000.000 Bios version and the one I have downloaded is 015.039.000.001.003331..
> Is it possible that I have the newer Bios


I'm using the guide from THIS thread for flashing; beforehand I saved my original BIOS using GPU-Z.

The updated BIOS is official from MSI, HERE.

This is the screenshot of the new BIOS, btw:


This is the default clock/mem result:


Trying to OC a little bit, I got stable and got this score:


Well, considering my proc is not the OC version and only runs at a default 3.2GHz (3.6GHz Turbo), I think the score is pretty good ^^

Quote:


> Originally Posted by *Twentytwo2277*
> 
> Hello, I'm having a lot of trouble with my new XFX RADEON Double D R9 280X. I'm not getting the card to be recognized by Windows 8.1 at all.
> 
> I have 750w psu and I AM getting power to the card, and no POST errors. I'm thinking that my motherboard is the cause of the problems, its a Asus P8Z68-V Pro (I'm pretty sure its first generation.). I just updated the BIOS to 3606 (newest version), tested old card, which work, tested on board video, which also work. I even went as far as to bring the card to a buddies and it boots just fine on his rig. Does anyone have any suggestions for questions for me? Sorry for the wall-o-text but I've been at it all night, really annoying.
> 
> Thanks!


Have you tried switching the BIOS on the card? Other solutions you can try: put it in another PCIe slot, or choose NORMAL boot instead of FAST in the UEFI BIOS.


----------



## Twentytwo2277

Hello, yes, I've tried the other slots, but it's still the same no-signal BS. I couldn't seem to find the switch on my card; I'm wondering if it's somewhere under the black casing. I know the switches, and they're pretty big and hard to miss, so I'm not too sure. Do you recommend taking off the plastic that surrounds the fans?


----------



## madorax

Quote:


> Originally Posted by *Twentytwo2277*
> 
> Hello, yes I've tried the other slots but still the same no signal BS. I couldn't seem to find the switch on my card, I'm wondering if its somewhere under the black casing. I know the switches and they're pretty big and hard to miss, so I'm not too sure. Do you recommend taking off the plastic that surrounds the fans?


Sorry, my mistake. I don't know XFX cards well; after checking, it seems this one has only one BIOS and no switch... I thought all 280Xs had dual BIOS just like the 7970...

Maybe another member can help you out; I don't have any more solutions for your problem right now...

=====================

Anyway, stable 3 times on Firestrike, so I guess this setting is legit.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Twentytwo2277*
> 
> Hello, yes I've tried the other slots but still the same no signal BS. I couldn't seem to find the switch on my card, I'm wondering if its somewhere under the black casing. I know the switches and they're pretty big and hard to miss, so I'm not too sure. Do you recommend taking off the plastic that surrounds the fans?


The switch should be right next to the CrossFire connectors, same as every other 79xx/280X; I can't see why XFX would change it.

EDIT: This is the best high-res pic I can find: http://content.hwigroup.net/images/products/xl/200211/2/xfx_radeon_r9_280x_double_dissipation_black_oc_3gb.jpg

Next to the CrossFire connectors is a screw, and on either side there are the numbers 1 & 2. I'm assuming this is the BIOS switch; I can't see it, but you might be able to.

EDIT2: OK, I might be wrong about there being a BIOS switch altogether. Taken from Fudzilla: "XFX actually has two bios versions for the card. The second bios, which is practically the overclocking bios, will be available online and it will allow users to flash the bios from Windows or DOS."


----------



## Twentytwo2277

OK, cool. Yeah, I see the 1 & 2 in that pic, and that's what I was looking for all along, but nothing on XFX I guess. I found the drivers, so I'll be trying those shortly. Not sure how to flash the card's BIOS in Windows, but I'll see what I can do.


----------



## jamor

Quote:


> Originally Posted by *kpo6969*
> 
> Release date no longer listed, only for Powercolor.
> The Asus looks like the best all-around 280X.


Ya, the Asus is the only 280X I felt confident buying, so when I saw I could get it at $300 I jumped.

It should come in next week according to their timeline - what do I look for?


----------



## Sgt Bilko

Quote:


> Originally Posted by *jamor*
> 
> Ya the Asus is the only 280x I felt confident buying so when I saw I could get it at $300 I jumped.
> 
> It should come in next week according to their timeline - what do I look for>?


I'm tempted by the Asus, but I want two 280Xs, and while I can fit 2 DCUIIs in my rig, I'm not sure if the CrossFire connectors would be long enough.

It's a long way between the 1st and 3rd slot on a CVF.

EDIT: nvm, Asus have it all covered: http://us.estore.asus.com/index.php?l=product_detail&p=4812


----------



## darkelixa

Sup Sgt,
Decided on a card yet?

I'm close to buying a 290 and calling it a day.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Sup Sgt,
> Decide on a card yet?
> 
> Im close to buying a 290 and calling it a day


I have to wait until my RMA goes through before I can do anything; not sure how it's gonna go, tbh... Still hoping they'll give me a refund so I can grab 2 HIS 280Xs and call it, although I'm still hopeful the non-reference 290s will be out by then (last I heard was late Dec - early Jan) and I can grab 2 of them instead.

Ahh... decisions, decisions...


----------



## mAs81

Quote:


> Originally Posted by *madorax*
> 
> I'm using guide from THIS thread for flashing, before i save my original BIOS using GPU-Z.
> 
> and the update BIOS is official from MSI, HERE
> 
> This is the SS for the New BIOS btw:


The second link for the official MSI update also directs me to the tutorial..
Anyway, I'll check the info you gave me and see what I can do..
Cheers, mate.


----------



## darkelixa

Even the reference 290 coolers aren't that bad. My XFX 5850 is a ref card, it sits at 80 degrees, and I can't hear it over my case fans.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Even the non ref 290s are really good, they arent that bad of a cooler, my 5850 xfx is a ref card and it sits on 80 degress and i cant hear it over my case fans


Yeah, they are loud... if you want to keep the card off the thermal limit, anyway.

And with summer coming for us... I want a card that can cool down without sounding like an F-15 in my case.


----------



## madorax

Quote:


> Originally Posted by *mAs81*
> 
> The second link for the official MSI update directs me also to the tutorial..
> Anyway,I'll check the info you gave me and I'll see what I can do..
> Cheers,mate.


Sorry, my mistake. THIS is the right link for the BIOS.


----------



## Dimaggio1103

Since my question seems to have gotten skipped over:

Will a Phenom 965 at 4GHz be enough to feed a 270X, or will it be a massive bottleneck?

Also, any ETA on Mantle? I searched and found nothing.


----------



## sugarhell

The 965 will be fine. Mantle will come December 25 in BF4.


----------



## battleaxe

Quote:


> Originally Posted by *sugarhell*
> 
> 965 will be fine. Mantle will come 25 december on bf4


Agreed, you will be fine. Just no crossfire or you will bottleneck for sure.


----------



## darkelixa

Air con for the win. I doubt Australian retailers are going to have stock of any decent R9 280Xs any time soon, man.


----------



## darkelixa

Just had a look on PCCG; the HIS 280X has been removed..


----------



## Dimaggio1103

Quote:


> Originally Posted by *battleaxe*
> 
> Agreed, you will be fine. Just no crossfire or you will bottleneck for sure.


I will definitely CrossFire down the road, but I'll be sure to grab an FX 8-core when I do. Thanks, fellas; plus rep.


----------



## jamor

Quote:


> Originally Posted by *Sgt Bilko*
> 
> i am tempted by the Asus but i want two 280x's and i can fit 2 DCU II's in my rig but im not sure if the crossfire connectors would be long enough
> 
> Long way between the 1st and 3rd slot on a CVF
> 
> EDIT: nvm, Asus have it all covered: http://us.estore.asus.com/index.php?l=product_detail&p=4812


So the Asus x2 would work with that connector in your PC?

Still going HIS?


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Just had a look on pccg, His 280x has been removed..


Scorptech still has them in stock atm; worst-case scenario, I order them from the US.

Quote:


> Originally Posted by *jamor*
> 
> So the Asus x2 would work with that connector in your PC?
> 
> Still going hIS?


And yeah, I've seen some pics; a 100mm bridge will fit a triple-slot cooler.

Here is another if you are interested: http://www.ebay.com/itm/ATI-Long-CrossFire-Bridge-Connecctor-10cm-10-cm-100mm-/280510092319

I'll still be going HIS if all goes well, if only for the fact that CrossFired DCUIIs mean no gap between the PCB of the 2nd card and the fan of the first.

That, and I like HIS; I still have a pair of IceQ4 4850 Turbos that have been used day in, day out for over 5 years now.


----------



## jamor

Thanks I'll have to bookmark that.

Just got an auto-notify that Newegg has stocked HIS 280x!

http://www.newegg.com/Product/Product.aspx?Item=N82E16814161441


----------



## Sgt Bilko

Quote:


> Originally Posted by *jamor*
> 
> Thanks I'll have to bookmark that.
> 
> Just got an auto-notify that Newegg has stocked HIS 280x!
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814161441


No probs.

Nice!

Argh, I want this RMA to be over already.


----------



## Mackem

Still got this random issue with my Asus 280X where all of the pixels randomly shift down the screen by like 20 pixels for a second, then go back to normal. Not sure whether to try different drivers or not. I'm currently using 13.11 Beta 9.4. Which drivers are the most stable?


----------



## Faster_is_better

What's the general consensus for the best 280X? I'd like one with a good cooler and voltage control, but not necessarily the top-of-the-line model. $300-310 would be best.

It's going to be used for mining or gaming, or both. I see the latest mining craze has made them all go OOS, or maybe they have been OOS for a while due to popularity?


----------



## ImJJames

Quote:


> Originally Posted by *Faster_is_better*
> 
> What's the general consensus for best 280x ? I'd like one with good cooler and voltage control, but not necessarily the top of the line model. $300-310 would be best.
> 
> Going to be using for mining or gaming, or both. I see the latest mining craze has made them all go OOS, or maybe they have been OOS for a while due to popularity?


If you find a 280x in stock for $300 right now you better buy it...they are all going out of stock.


----------



## jamor

Quote:


> Originally Posted by *Faster_is_better*
> 
> What's the general consensus for best 280x ? I'd like one with good cooler and voltage control, but not necessarily the top of the line model. $300-310 would be best.
> 
> Going to be using for mining or gaming, or both. I see the latest mining craze has made them all go OOS, or maybe they have been OOS for a while due to popularity?


Some people can't choose the Asus because it's 11.2 inches long, so you should check to make sure you can fit it.

I looked around, and the Asus is my favorite 280X here: DirectCU II cooling (HIS has great cooling too), quality craftsmanship, and the best factory memory overclock at 1600MHz vs 1500, which is nice for those who don't want to overclock their cards themselves for stability and longevity reasons.

Just picked up the Asus 280X for $300 flat after the 5% coupon (expired).


----------



## Faster_is_better

Quote:


> Originally Posted by *jamor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> What's the general consensus for best 280x ? I'd like one with good cooler and voltage control, but not necessarily the top of the line model. $300-310 would be best.
> 
> Going to be using for mining or gaming, or both. I see the latest mining craze has made them all go OOS, or maybe they have been OOS for a while due to popularity?
> 
> Some people can't choose Asus because it's 11.2 inches so you should check to make sure you can fit it.
> 
> I looked around and Asus is my favorite 280x here.. Direct cu ii cooling (HIS has great cooling too), quality craftsmanship, and it has the best factory Memory overclock at 1600 mhz vs 1500 which is nice for those who don't want to overclock their cards for stability and longevity reasons
> 
> Just picked up the 280x asus for $300 flat after the 5% coupon (expired)
Click to expand...

Ok, thanks. I see Newegg has a PowerColor card that can be backordered; anyone know if it has voltage control? It could be a long wait to find one of the preferred cards back in stock...


----------



## jamor

Quote:


> Originally Posted by *Faster_is_better*
> 
> Ok, thanks. I see Newegg has a Powercolor card that can be backordered, anyone know if it has voltage control? It could be a long wait to find one of the preferred cards back in stock...


I like PowerColor, and their 280X looks no different. The 280X to stay away from in this case is the Gigabyte.

Asus was accepting hundreds of pre-orders all day yesterday at Newegg for a big shipment.

What I did to get the Asus 280X was sign up for auto-notification, and as soon as it came in stock yesterday they e-mailed me and I bought it right away with the 5% mobile code. You should set all the 280Xs to auto-notify and enter your email.

I also got an auto-notification today for the HIS, which you could have purchased today too, but you need to buy it the minute you get a notification.


----------



## Faster_is_better

Quote:


> Originally Posted by *jamor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> Ok, thanks. I see Newegg has a Powercolor card that can be backordered, anyone know if it has voltage control? It could be a long wait to find one of the preferred cards back in stock...
> 
> 
> 
> I like Powercolor and the 280x looks no different. The 280x you should stay away from is the Gigabyte in this case.
> 
> Asus was accepting hundreds of pre-orders all day yesterday at newegg for a big shipment.
> 
> What I did to get the Asus 280x was sign up for Auto-Notfication and as soon as it came in stock yesterday they e-mailed me and I bought it right away with the 5% mobile code. You should set all the 280x to auto-notify and enter your email.
> 
> I also got an auto-notfication today for the HIS which you could have purchased today too but you need to buy it the minute you get a notifcation.
Click to expand...

Yes, I set up notifications for all of those + RSS scanning + something else... it's a race to the purchase. Good to see they are still coming into stock semi-frequently, so I'll just wait a few days and see if I can get one of the better ones. AMD must be loving this: consoles using their hardware, mining using their hardware, plus they are just great price/perf. Wins on all accounts for them.
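The RSS-scanning idea above can be hacked together as a simple poller. This is just a sketch: the out-of-stock marker text is an assumption about what the product page shows, and the retailer's official e-mail notifications will always beat polling like this.

```python
# Rough sketch of a stock poller. The "OUT OF STOCK" marker string is an
# assumption about what the retailer's product page displays when an item
# is unavailable; adjust it to whatever the real page uses.
import urllib.request

def in_stock(html: str, marker: str = "OUT OF STOCK") -> bool:
    """True if the out-of-stock marker does not appear in the page HTML."""
    return marker.lower() not in html.lower()

def check(url: str) -> bool:
    """Fetch a product page and report whether it looks in stock."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return in_stock(resp.read().decode("utf-8", errors="replace"))

# Example: check("http://www.newegg.com/Product/Product.aspx?Item=N82E16814161441")
```

Run `check()` on a timer (a few minutes apart, to stay polite) and fire off whatever alert you like when it flips to True.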


----------



## taem

Quote:


> Originally Posted by *ImJJames*
> 
> If you find a 280x in stock for $300 right now you better buy it...they are all going out of stock.


Only because board partners are switching over to XLT, so it's the remaining XT2 stock that's not being replenished. If you're not in a hurry you might want to wait, depending on what your priority is: I hear the XT2 overclocks better, but the XLT runs cooler and draws less power.

But kudos to AMD for coming up with cards everyone wants, then failing to get them to market in time for Black Friday and Cyber Monday. Watch them miss Christmas altogether and get production in order right after NVIDIA comes up with a plausible riposte. Who's running this company, Microsoft?


----------



## jamor

Quote:


> Originally Posted by *taem*
> 
> Only because board partners are switching over to XLT so it's the remaining XT2 stock that's not being replenished. If you're not in a hurry you might want to wait, depending on what your priority is: I hear XT2 overclocks better, but XLT runs cooler and draws less power.
> 
> But, kudos to AMD for coming up with cards everyone wants, but failing to get it to market in time for Black Friday and cyber Monday. Watch them miss Christmas altogether and get production n order right after nvidia comes up with a plausible riposte. Who's running this company, microsoft?


What do I look for when I get my card to know if it's the new XLT or the old XT2?


----------



## battleaxe

Quote:


> Originally Posted by *taem*
> 
> Only because board partners are switching over to XLT so it's the remaining XT2 stock that's not being replenished. If you're not in a hurry you might want to wait, depending on what your priority is: I hear XT2 overclocks better, but XLT runs cooler and draws less power.
> 
> But, kudos to AMD for coming up with cards everyone wants, but failing to get it to market in time for Black Friday and cyber Monday. Watch them miss Christmas altogether and get production n order right after nvidia comes up with a plausible riposte. Who's running this company, microsoft?


LOL... so true. So true.

I'm hoping for a non-ref 290 by Christmas. They'll probably be sold out of those too.


----------



## Sgt Bilko

Quote:


> Originally Posted by *battleaxe*
> 
> LOL... so true. So true.
> 
> I'm hoping for an non ref 290 by Christmas. Probably be sold out of those too.


That's what I'm hoping for as well. If the non-ref 290s drop before January I'll grab 2; if not, I'm going with 280Xs.


----------



## TempAccount007

Quote:


> Originally Posted by *jamor*
> 
> What do I look for when I get my card to know if its new XLT or old XT2?


People keep talking about these different chips, but so far no one has said how to tell the difference.

I don't get it.


----------



## Durvelle27

*List Updated*


----------



## taem

Toxic in stock right now at Newegg. But if you're not reading this the second after I post it, it's probably too late.

Edit, 30 seconds after posting: gone, lol. Guess they got one unit.


----------



## Faster_is_better

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ImJJames*
> 
> If you find a 280x in stock for $300 right now you better buy it...they are all going out of stock.
> 
> 
> 
> Only because board partners are switching over to XLT so it's the remaining XT2 stock that's not being replenished. If you're not in a hurry you might want to wait, depending on what your priority is: I hear XT2 overclocks better, but XLT runs cooler and draws less power.
> 
> But, kudos to AMD for coming up with cards everyone wants, but failing to get it to market in time for Black Friday and cyber Monday. Watch them miss Christmas altogether and get production n order right after nvidia comes up with a plausible riposte. Who's running this company, microsoft?
Click to expand...

This is the first I've heard of this. Got a link to a write-up on the differences between them?


----------



## taem

Quote:


> Originally Posted by *Faster_is_better*
> 
> This is the first I've heard of this. Have a link to a read up on the differences between them?


http://www.overclock.net/t/1436949/cl-r9-280x-to-get-updated-gpu-tahiti-xtl-in-a-few-weeks/100_50


----------



## Miiksu

Hello everyone! With my new HIS R9 280X Turbo card I was able to get BF4 stable @ 1200/1760 with 1.293V VDDC. Base voltage is 1.256. Quite high? ASIC quality is 60.3%. Not really happy with the OC, but it seems to run quite cool: GPU max temp was 66-68°C at max OC, and VRM 64°C.


----------



## jamor

Quote:


> Originally Posted by *Miiksu*
> 
> Hello everyone! With my new HIS R9 280X Turbo card I was able to get BF4 stable @1200/1760 with 1.293 vddc. Base voltage is 1.256. Quite high? Asic quality is 60.3% Not really happy of the oc but it seems run quite cool. GPU max temp was 66-68 with max oc and VRM 64°C.


Cooling is the most important part! Cool is cool. Enjoy!


----------



## Devildog83

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Since my question seemed to have gotten skipped over.
> 
> Will a phenom 965 at 4ghz be enough to feed a 270x? Or will it be a massive bottlneck?
> 
> Also any eta on mantle I searched and found nothing.


I too think you will be fine. Just know this: I don't know what kind of motherboard you have, but if you get an FX 8-core I would make sure I had a 990 chipset. An 8+1+1 or 8+2 phase VRM is almost a requirement for the 8150 or 8350, IMHO. I love my 270X, but it's in RMA right now due to a defect in the silicon on the CrossFire connector that was causing weird pink screens in games and benches. Good thing I have a spare Devil.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Devildog83*
> 
> I too think you will be fine, just know this, I don't know what kind of motherboard you have but if you get a FX 8 core I would make sure I had a 990 chipset. An 8+1+1 or 8+2 phase VRM's are almost a requirement for the 8150 or 8350 IMHO. I love my 270x but it is in RMA right now due to a defect in the silicon on the X-Fire connector causing me weird pink screens in games and bench's. Good thing I have a spare Devil.


Ya, that's the card I ordered. Can't wait to get the Devil.

I have the Gigabyte 990FX-UD3 rev. 4; it has 8+2 phases, so I know I'll be OK. It's a strong board, and the last thing left to upgrade is the CPU, unless Mantle does wonders.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> I too think you will be fine, just know this, I don't know what kind of motherboard you have but if you get a FX 8 core I would make sure I had a 990 chipset. An 8+1+1 or 8+2 phase VRM's are almost a requirement for the 8150 or 8350 IMHO. I love my 270x but it is in RMA right now due to a defect in the silicon on the X-Fire connector causing me weird pink screens in games and bench's. Good thing I have a spare Devil.


That's simply not true. An 8-phase board isn't a requirement for FX 8-series chips; most people just recommend one. I use an M5A97 EVO without problems and it's rock solid.


----------



## jamor

Quote:


> Originally Posted by *Durvelle27*
> 
> *List Updated*


Thanks for the welcome.


----------



## darkelixa

OMG, PCCG are getting Sapphire R9 280X Toxics this week at $429.

Should I buy a Toxic for $429, or an R9 290 for $475?


----------



## Durvelle27

Quote:


> Originally Posted by *darkelixa*
> 
> OMG pccg are getting sapphire r9 280x toxics this week $429.
> 
> Should I buy a toxic for 429 or
> 
> r9 290 $475


Definitely R9 290


----------



## darkelixa

Ah OK. I've just seen a lot of users having issues with black screens and heat on the R9 290s; shouldn't that steer me clear of the higher-end R9 290?


----------



## madorax

Quote:


> Originally Posted by *darkelixa*
> 
> OMG pccg are getting sapphire r9 280x toxics this week $429.
> 
> Should I buy a toxic for 429 or
> 
> r9 290 $475


Wow, that's an even higher price than here in Indonesia. The Toxic is $389 but out of stock, which is why I picked the MSI for $384. The HIS 290 is $413 here.

I notice BF4 is good for stability and OC testing: my OC setting @ 1150/1820 runs fine in Firestrike and Unigine, but after 30 minutes of BF4 it starts showing misplaced textures. Now I run @ 1100/1700, and after 40 minutes in BF4 it seems fine.


----------



## stolojansky

I really don't recommend buying a reference R9 290, even at a nice, cheap price. Wait till Sapphire, Asus, and the others hit the market with shiny new, cooler-running custom designs; a reference R9 290 makes a nice heater in wintertime.
Anyway, if you want to play at 1920x1080, an R9 280X is overkill.

Peace


----------



## leongws

Quote:


> Originally Posted by *madorax*
> 
> My MSI has come ^^
> 
> already flash the BIOS with newer version from techpowerup site to fix heat. default is 1050 / 1500. if i'm try to OC it, what volt should be safe to play?


Need to check with those who have flashed the BIOS on this card: do we need to set a specific switch position before flashing, e.g. position 1? What is the difference between positions 1 and 2? Mine was already set to position 2 when I opened it after buying it last week. So far no issues with the card; I haven't switched the BIOS to position 1 though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Ah ok, I just see alot of users having issues with black screens and heat with the r9 290s, so that shouldnt steer me clear of the higher r9 290?


Black screens and other issues are pretty much dealt with now; it seems to have only affected people who bought them on launch day (me).
Quote:


> Originally Posted by *stolojansky*
> 
> I really dont recommend to buy R9 290 even with a nice and cheap price. Wait till Sapphire, Asus and others will strike the market with new shiny and cold custom builds. Reference R9 290 could be a nice heater in winter's time.
> Anyway, if u wanna play in 1920x1080, R9 280X is overkill.
> 
> Peace


Well, the heat can be controlled. The noise is no different from any other reference cooler IMO, and at 60% fan you can keep it at 70°C max and stop it from throttling.

And non-reference 290/290X cards are due out in January, last I heard...

And in other news, I finally got my 290X off in the post today, so hopefully my RMA will be done before the weekend if all goes well.


----------



## madorax

Quote:


> Originally Posted by *leongws*
> 
> Need to check with fellow bros who have flashed bios to this card. Do we need to select specific switch number before flashing? Example must be at position 1. What are the different between 1 & 2? Mine are already set to position #2 when I open it after buying last week. So far no issue with the card. Have not switch the bios to position #1 though.


I just followed the tutorial for flashing the BIOS with ATIFlash, using switch position 1 (the default on my card). Position 2 is said to be the backup BIOS and can't be flashed, so I didn't test whether it actually can be. Both BIOSes were identical on my card anyway; I checked before flashing the newer one.


----------



## darkelixa

Most cards in aus come at a premium


----------



## TempAccount007

Quote:


> Originally Posted by *Miiksu*
> 
> Hello everyone! With my new HIS R9 280X Turbo card I was able to get BF4 stable @1200/1760 with 1.293 vddc. Base voltage is 1.256. Quite high? Asic quality is 60.3% Not really happy of the oc but it seems run quite cool. GPU max temp was 66-68 with max oc and VRM 64°C.


That's the best I can do with my DC2T: 1200 MHz @ 1.3 V. I haven't touched the memory clocks; overclocking them won't improve performance. ASIC quality is 68%. What's the max voltage you can set on the HIS?


----------



## Miiksu

Quote:


> Originally Posted by *TempAccount007*
> 
> That's the best I can do with my DC2T. 1200mhz @ 1.3v. Haven't touched memory clocks... overclocking it won't improve performance. ASIC quality is 68%. What's the max voltage you can set on the HIS?


My card's max was 1.381 V via iTurbo. Pretty high... I did not test other overclocking software.


----------



## leongws

Quote:


> Originally Posted by *madorax*
> 
> i just follow the tutorial for flashing bios with ati tools and using swith 1 (which is the default position in my card), in switch 2 say it's for backup bios and can't be flashed, so i'm not checking it whether it can be flash or not. but all bios in switch 1 and 2 are the same in my card, i already checked it before flashing with newer bios.


Thanks for the valuable information. I just flashed the bios successfully.


----------



## madorax

Quote:


> Originally Posted by *leongws*
> 
> Thanks for the valuable information. I just flashed the bios successfully.


Time to OC and share the results ^^


----------



## jamor

Quote:


> Originally Posted by *Jaju123*
> 
> I ordered two 290s and they were shipped today. Should be in my excited hands by tomorrow (Netherlands).
> 
> Any idea how the performance in crossfire will be affected by PCI-e 2.0 x8?
> 
> I am coming from two GTX 680s, on a MSI Z68a-GD65(g3) but with a 2500k @ 4.7 ghz.
> 
> Hopefully my Corsair AX 850W will be enough! AMD said it was on twitter.


wrong club!


----------



## gnemelf

Okay, excuse me for sounding stupid, but what BIOS is everyone flashing their 280X with? I've got the PowerColor 280X but haven't been able to push it that far, only 1150/1600...


----------



## jamor

Quote:


> Originally Posted by *gnemelf*
> 
> okay so excuse me for sounding stupid but what bios is everyone flashing their 280x with?? I've got the powercolor 280x but I haven't been able to push it that far.. only at 1150/1600 ...


Why would you flash a bios to overclock your 280x?


----------



## gnemelf

So are they flashing their 7970s to 280Xs? That's all I'm trying to figure out...


----------



## Miiksu

Quote:


> Originally Posted by *jamor*
> 
> cooling is most important part! cool is cool. enjoy!


She is quiet and cool; that's why I bought the HIS, though I was hoping for slightly better OC results. But I can't really complain: a great cooler and unlocked voltages were exactly what I hoped for.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Miiksu*
> 
> She is quiet and cool. That's why I bought HIS but I was hoping little better oc results. But I can't really complain, great cooler was what I hoped for and unlocked voltages.


That's very good news, exactly what I wanted to hear.


----------



## jamor

My Asus 280X pre-order has been confirmed and is being packaged, woot.

Paired with my new Eizo monitor, it just got real.


----------



## NinjaToast

Quote:


> Originally Posted by *gnemelf*
> 
> okay so excuse me for sounding stupid but what bios is everyone flashing their 280x with?? I've got the powercolor 280x but I haven't been able to push it that far.. only at 1150/1600 ...


Quote:


> Originally Posted by *gnemelf*
> 
> so are they flashing their 7970's to 280x? thats all I'm trying to figure out...


Not everyone is flashing their BIOS. Only people with the MSI R9 280X are flashing to a newer version that was given to a TechPowerUp reviewer after he reviewed the card; he uploaded it to the site for everyone else because the launch MSI R9 280X had heat issues that the new BIOS corrects.


----------



## Lysergix710

I just experienced a blank red-screen crash while playing BF4, with no error message on reboot. A little googling found a post saying that updating the motherboard BIOS and GPU BIOS fixed it, so I'm going to update mine. The problem is I don't know which one to use; can anyone help? These are the candidates: http://www.gigabyte.com.au/products/product-page.aspx?pid=4793#bios and my current revision string is 015.038.000.003.003435 (113-C3865000-X01).


----------



## madorax

Quote:


> Originally Posted by *gnemelf*
> 
> okay so excuse me for sounding stupid but what bios is everyone flashing their 280x with?? I've got the powercolor 280x but I haven't been able to push it that far.. only at 1150/1600 ...


It's in the TechPowerUp BIOS section.

BTW, I flashed my card because of a known issue in the stock BIOS of the MSI 280X; that's why it's necessary to flash the newer BIOS provided by MSI itself, not because it OCs better.









Your card is the cheapest 280X on the market here; HIS is second. Sadly both are out of stock, so I had to go with MSI at a higher price ($55 more). If they were available I'd take the PowerColor or HIS for sure; they're great for OC and run cool. 1150/1600 is a great achievement anyway.









=============================

You know... sometimes you find out something that really sucks. I ordered this MSI 280X on Monday, it arrived on Tuesday, and I'd been happy since then... until now. My distributor just launched the MSI 280X BF4 edition AT THE SAME PRICE!!!


----------



## TempAccount007

Quote:


> Originally Posted by *Miiksu*
> 
> My card max was 1.381 via iTurbo. Bretty high.. I did not test other overclocking softwares.


So. Much. Voltage.

I want.

Does the HIS have VRM temp sensors?

All that voltage makes up for the low ASIC, as long as you keep VRM temps in check.

I think only the Matrix beats that voltage, and that's a 3-slot card.


----------



## gnemelf

Thanks for clearing that up; I wasn't sure if something had surfaced on all the 280Xs or if people were talking about flashing 7970s to 280X. I'm happy with what I have; I can pull 7031 in Firestrike, so that works for me. Also, to get it out there: voltage control is a little odd on the PowerColor. It jumps about 0.06 at a time, so you can't really pick an exact voltage and have it stick. I bumped the RAM up from 1.5 to 1.55 and it's a lot more stable IMO. Temps are around 73°C max on the vcore and 72°C max on the VRAM in FurMark; everything else is around 65°C vcore and 68°C VRAM max at stock with the fan curve adjusted.


----------



## mAs81

Quote:


> Originally Posted by *madorax*
> 
> its in techpowerup bios section.
> btw, I flashing my card because known issue in stock bios of MSI 280x card, that's why it's necessary to flash with newer bios provided by MSI itself, and not because it can oc better



So this is my current BIOS. It seems I already have the new one, right...?


----------



## Miiksu

Quote:


> Originally Posted by *TempAccount007*
> 
> So. Much. Voltage.
> 
> I want.
> 
> Does HIS have VRM temp sensors?
> 
> All That voltage makes up for the low ASIC as long as you keep VRM temps in check.
> 
> I think only matrix beats that voltage, and that's a 3 slot card.


Yes, two of them. My card's VRMs never went past 70°C.


----------



## Jaffi

I got the Gigabyte Radeon 280X rev. 2. When I run FurMark with a +20% power target, my system just shuts down and reboots after 3 seconds. I know this might be due to an insufficient power supply, but mine should handle it: it's a be quiet! Straight Power E9-CM 580W. Is it possible the card draws that much power? Below are my FurMark settings:

Also, shouldn't it be possible to read the card's VRM temperatures? They don't show up in the latest GPU-Z.


----------



## WiZARD7

Is there any way to control the voltage on 270(X) cards?
I have an Asus 270 DirectCU and a Sapphire 270X Vapor-X, and I can't control the GPU voltage on either.
I've tried Asus GPU Tweak, Sapphire TriXX, and MSI Afterburner, but whatever I set the voltage to, it won't change; it stays the same in GPU-Z.

Is there any software or BIOS mod that can change it?
I'd like to try undervolting to make the cards run cooler and quieter.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jaffi*
> 
> I got the Gigabyte Radeon 280X rev. 2. When I run Furmark with +20% powertarget, my system just shuts down and boots again after 3 sec. I know this might be due to a insufficient power supply, but mine should handle it well. It is a "be quiet! Straight Power E9-CM 580W". Is it possible, that the card draws so much power? Below are my furmark settings:
> 
> 
> 
> Also, isn't it possible to read the card's VRM temperatures? They don't show up in latest GPU-Z.


FurMark will pull more power than anything else you can run (even benchmarks).

I don't use it, but at 580 W it's possible that's what's happening: your PSU can't supply enough power for FurMark. Don't stress, though, as long as gaming is fine.


----------



## Devildog83

From what I have read, the minimum for a 280X is 600 W; most recommend 750 W, and the Toxic's minimum looks like 750 W. I would upgrade just to give yourself some headroom. The PSU is the heart of the system and you don't want to cheap out on it. Even a good CX750 doesn't cost much and would work great. This one is a good unit at a good price:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817139051

Edit: Get HWiNFO64 and it will read all the volts/amps and temps. Here: http://www.majorgeeks.com/files/details/hwinfo64.html
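Those wattage recommendations are really just headroom arithmetic: keep worst-case draw comfortably below the PSU's rating. A minimal sketch of that calculation; the 250 W GPU, 125 W CPU, and 75 W "rest of system" figures are ballpark assumptions, not measured numbers:

```python
# Rough PSU headroom check. TDP figures are ballpark assumptions, not measured draw.

def psu_headroom(psu_watts, gpu_tdp, cpu_tdp, rest=75, margin=0.8):
    """Return spare watts when loading the PSU to `margin` of its rating.
    `rest` is an assumed allowance for board, drives, and fans."""
    usable = psu_watts * margin          # stay below 80% of rated output
    load = gpu_tdp + cpu_tdp + rest      # worst-case simultaneous load
    return usable - load

# A 580 W unit with a ~250 W 280X and a ~125 W CPU leaves almost no margin:
print(psu_headroom(580, 250, 125))  # -> 14.0
print(psu_headroom(750, 250, 125))  # -> 150.0
```

With numbers like these, the 580 W unit is right on the edge under a FurMark-style load, which fits the shutdown behaviour described above.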


----------



## battleaxe

Quote:


> Originally Posted by *Jaffi*
> 
> I got the Gigabyte Radeon 280X rev. 2. When I run Furmark with +20% powertarget, my system just shuts down and boots again after 3 sec. I know this might be due to a insufficient power supply, but mine should handle it well. It is a "be quiet! Straight Power E9-CM 580W". Is it possible, that the card draws so much power? Below are my furmark settings:
> 
> 
> 
> Also, isn't it possible to read the card's VRM temperatures? They don't show up in latest GPU-Z.


FurMark has also been known to kill GPUs, so you might not want to use it anymore either; it's not really a valid test these days anyway. Try 3DMark11 instead, or Fire Strike for that matter.


----------



## jamor

Quote:


> Originally Posted by *battleaxe*
> 
> Furmark has also been killing GPU's, so you might not want to use that anymore either. Its not really valid anymore anyway. Try 3dMark11 instead. Or Firestrike for that matter.


Can it degrade a card's performance, or does it just fry it outright?


----------



## Jaffi

Quote:


> Originally Posted by *jamor*
> 
> Can it give a card poor performance or does it just fry it?


I think one would easily notice if a card is broken; correct me if I'm wrong.
Thanks for all your suggestions; I'll keep my hands off FurMark. If it fried my brand-new card I'd be in tears.


----------



## jamor

Quote:


> Originally Posted by *Jaffi*
> 
> I think one would easily notice if a card is broken, correct me if I am wrong.
> Thanks for all your suggestions, will keep my hands off furmark. If it would fry my brand new card, I would find myself in tears


Not necessarily. I used FurMark, and around the same time my 6850 started stuttering in Dota 2, which never lags, so I'm trying to verify what kind of problems it can cause. It could easily have been something else, but you never know, which is why I'm asking.


----------



## Jaffi

Quote:


> Originally Posted by *jamor*
> 
> not necessarily. I used furmark and around the same time my 6850 was stuttering in dota 2 which never lags so Im trying to verify what kind of problems it can cause. it could easily been something else but you never know which is why Im asking


Alright, then someone might shed some light on this.








Have you also tried comparing benchmark scores?


----------



## battleaxe

These things are sold out everywhere... HIS had one at Newegg for 10 minutes, now it's gone.


----------



## jamor

Quote:


> Originally Posted by *battleaxe*
> 
> These things are sold out everywhere.... HIS had one at Newegg for 10 minutes, now its gone.


They've been in and out of stock like mad; just keep that auto-notify on. Asus was accepting hundreds of orders earlier this week and I snatched one.
Quote:


> Originally Posted by *Jaffi*
> 
> Alright then someone might shed some light on this
> 
> 
> 
> 
> 
> 
> 
> 
> Have you also tried to compare benchmarks?


No, but I can do that tonight. I should make sure before I sell it.


----------



## Miiksu

Quote:


> Originally Posted by *battleaxe*
> 
> These things are sold out everywhere.... HIS had one at Newegg for 10 minutes, now its gone.


Yes. I waited 1.5 months for the HIS; it was worth it. I bought mine from Finland for 309 euros.


----------



## Durvelle27

*List Updated*


----------



## imranh101

Just picked up an R9 280X for $240 + tax. It came with none of the AMD Never Settle games, so they gave me an NVIDIA code for AC:BF and Splinter Cell instead. Anything cool to know about the 280X?


----------



## dayen666

My Sapphire 280X Toxic has arrived finally!











Is there any good guide for overclocking?
One question: does anyone know if the voltage on this card is locked?


----------



## Derpinheimer

Quote:


> Originally Posted by *imranh101*
> 
> Just picked up a R9 280x for $240 + tax. Had none of the AMD never settle cards, so gave me one for NVidia for AC:BF and Splinter cell. Anything cool to know about the 280x?


Where??? What store?


----------



## imranh101

Microcenter. It was open box but complete, meaning someone bought it, took it home, didn't like it for whatever reason, and brought it back. It still had all the packaging and accessories, most of which were never opened. There were two of them, so I think someone got them for CrossFire and then decided against it lol.


----------



## Amhro

Quote:


> Originally Posted by *dayen666*
> 
> My Sapphire 280X Toxic has arrived finally!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is there any good guide for overclocking?
> One question, does anyone know if the voltage of this card is blocked?


Toxic's voltage is unlocked as far as I know.


----------



## Miiksu

Keep your fingers crossed until you see how far it will go. The silicon lottery, fun or not, at least keeps people excited =)


----------



## Milestailsprowe

Does anyone know some stable overclocks for the 270X? Mine can do 1120/1430; beyond that it's just errors. How can I do better?


----------



## Devildog83

Quote:


> Originally Posted by *Milestailsprowe*
> 
> Anyone know some stable over clocks for the 270X. It can do 1120/1430 then its just errors. How can I do better


Mine is voltage locked, but I run all day at 1200/1400, though that's in CrossFire with my 7870. I have benched at 1225/1550, but anything much over 1400 on the memory yields no improvement in benchmarks. If I weren't running CrossFire I would most likely run 1200/1450. I'm not sure what the MSI is capable of, though. Try running a higher core and maybe 1375 on the memory.
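The "try a higher core, back off on errors" approach everyone here uses can be sketched as a simple step-and-test loop. This is only an illustration: `is_stable` is a hypothetical stand-in for an actual stress run (BF4, Heaven, Firestrike), not a real tuning API, and the clock numbers are just examples:

```python
# Sketch of the step-and-test approach to finding a max stable core clock.
# `is_stable` is a stand-in for a real stress test, not an actual API.

def find_max_stable(start_mhz, limit_mhz, is_stable, step=10):
    """Raise the clock in small steps; stop before the first unstable step."""
    clock = start_mhz
    while clock + step <= limit_mhz and is_stable(clock + step):
        clock += step
    return clock

# Illustration with a fake card that artifacts past 1180 MHz:
fake_card = lambda mhz: mhz <= 1180
print(find_max_stable(1120, 1300, fake_card))  # -> 1180
```

Small steps matter because, as posts here show, a card can pass benchmarks at one clock and still artifact 10-20 MHz higher in a long BF4 session.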


----------



## Devildog83

Did you get your card at New egg?


----------



## Mackem

Installed 13.11 Beta 9.5 and got a graphics glitch on screen during a game of League of Legends. On Beta 9.4 I got a BSOD earlier, too. I'm thinking there's something wrong with the drivers, or they're not playing nice with my 280X. What are the latest WHQL drivers that work with Windows 8.1 and the 280X?


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> Did you get your card at New egg?


Push that bad boy some more


----------



## Milestailsprowe

Quote:


> Originally Posted by *Devildog83*
> 
> Did you get your card at New egg?


Yep


----------



## Devildog83

Quote:


> Originally Posted by *Milestailsprowe*
> 
> Yep


For $20 extra you could send the MSI back and get a Devil.


----------



## taem

Well, Newegg got the Vapor-X 280X in stock, so I ordered one. I'm worried about the reviews saying it's insanely loud, but it's really the only card that meets my requirements: short enough to fit alongside the HDD cage (the HIS, XFX, Toxic, and Asus are all too long) and two dual-link DVI ports. Hopefully the noise won't be unbearable. It's an Iron Egg item so I'll be able to return it, though at this rate Newegg is going to tell me not to shop there anymore lol; I'm always returning stuff to those guys.

But I wanted to ask: how do you determine whether a card is XT2 or XTL? Someone asked earlier and there were no replies.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Devildog83*
> 
> For $20 extra you could send the MSI back and get a Devil.


It's worth it. Got mine today and love the power; it also overclocks like a beast.

My Phenom is holding its own like a champ so far...


----------



## Cmoney

So I picked up the XFX 280X DD Black Edition from Newegg on Black Friday and finally got it installed today. I've been monitoring load temps while gaming and I'm a little concerned. My idle temp is around 40°C, while load temps have reached as high as 93°C in both BF4 and COD: Ghosts, and that's with a custom fan profile that puts the fan at 100% at 60°C. I'm running the latest beta drivers, my 650D is very well ventilated, and the ambient temp in the room is around 23°C.

From what I've been reading my temps should be lower. Are load temps in the 90s safe? Could they be due to the factory overclock on the Black Edition? Any help would be greatly appreciated.


----------



## madorax

Quote:


> Originally Posted by *Cmoney*
> 
> So I picked up the XFX 280x DD Black Edition on Black Friday from Newegg and I just finally got it installed today. I have been monitoring the load temps while gaming and I am a little concerned. My idle temp is around 40C, while my load temps have reached as high as 93C during both BF4 and COD:Ghost.. and that is with a custom fan profile that puts the fan at 100% @ 60C. I am running the latest beta drivers , and my 650D is very well ventilated, with an ambient temp around 23C in the room.
> 
> From what I have been reading it seems like my temps should be lower. Are the load temps safe in the 90s? Are they possibly due to the fact that it is a factory overclocked Black Edition? Any help would be greatly appreciated.


That seems hot, man. Maybe XFX has the same heat problem in its default BIOS that MSI did?

For comparison, here's my MSI Gaming OC @ 1120/1700 after about an hour of AC4 with a custom fan setting; my room ambient is about 26°C.
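A custom fan profile like the ones discussed here is just piecewise-linear interpolation between (temperature, fan duty) points. A minimal sketch; the curve points are illustrative assumptions, not values from any vendor tool:

```python
# Minimal sketch of a custom fan profile as piecewise-linear interpolation.
# The (temp_C, fan_pct) points are illustrative; 100% at 60C mirrors the
# aggressive profile described above.

CURVE = [(30, 20), (50, 40), (60, 100)]

def fan_speed(temp_c):
    """Return fan duty (%) for a GPU temperature by linear interpolation."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]       # clamp below the first point
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]      # clamp above the last point
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(55))  # halfway between 40% and 100% -> 70.0
```

The point of the steep last segment is that the fan is already flat out by 60°C, so a card still hitting 93°C under that profile suggests a cooler-contact or BIOS problem rather than a fan-curve one.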


----------



## Stay Puft

Does anyone have a 270X Toxic? Does it have voltage control?


----------



## Devildog83

Quote:


> Originally Posted by *Dimaggio1103*
> 
> its worth it. Got mine today and love the power. It also Overclocks like a beast.
> 
> My phenom is holding its own like a champ so far....


Awesome !!


----------



## Devildog83

Quote:


> Originally Posted by *madorax*
> 
> that's seems hot man. maybe XFX got the same heat problem like MSI in default BIOS?
> 
> for comparison, here is my MSI gaming OC @ 1120 / 1700 playing for about 1 hour with AC4, with custom fan setting. my ambient room is about 26 C.


I see no VRM temps there; y'all should really get HWiNFO64. It shows everything.


----------



## Sgt Bilko

Just for a laugh I looked at eBay in the US... $400 for a USED 7970? That's what I paid for mine new.









Hell, get them from here if you can; I've seen a few-month-old 7950 for $150.


----------



## Devildog83

Quote:


> Originally Posted by *Cmoney*
> 
> So I picked up the XFX 280x DD Black Edition on Black Friday from Newegg and I just finally got it installed today. I have been monitoring the load temps while gaming and I am a little concerned. My idle temp is around 40C, while my load temps have reached as high as 93C during both BF4 and COD:Ghost.. and that is with a custom fan profile that puts the fan at 100% @ 60C. I am running the latest beta drivers , and my 650D is very well ventilated, with an ambient temp around 23C in the room.
> 
> From what I have been reading it seems like my temps should be lower. Are the load temps safe in the 90s? Are they possibly due to the fact that it is a factory overclocked Black Edition? Any help would be greatly appreciated.


IMHO, for a non-reference card, temps in the 90s are bad. I'm not an XFX fan since folks are having so many issues with heat, locked cards, poor performance, and more. I think XFX has lost it, TBH. Plus the looks are kind of blah to me. I would return it and get something else.


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Just for a laugh i looked at ebay in the US.....$400 for a 7970 USED? Thats what i paid for new
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hell, get them from here if you can, seen a few month old 7950 for $150.


eBay is not really a good place to buy video cards. The 7970s are starting to become extinct here; Superbiiz has some for far less than $400, though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> Ebay is not really a good place to buy Video cards. The 7970's are starting to become extinct here. Superbiiz has some for far less than $400 though.


Oh no, eBay is a terrible place to buy almost any PC gear (except maybe small cables and connectors). I was just browsing for an old PC game and thought I'd look the cards up because of the Litecoin boom, and wow... it just kind of stunned me.

I found my game though, so all is well.


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Oh no, Ebay is a terrible place to buy almost any PC gear (cept maybe small cables and connectors) I was just browsing for an old PC game and thought i'd look them up due to the Litecoin boom and wow......just kinda stunned me.
> 
> I found my game though so all is well










Even Newegg is out of them here, which is kind of strange; I've never seen that before. I can understand the new cards, but the 7970s don't make sense. As a matter of fact they're sold out of 7950s too. Is this the Twilight Zone?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> 
> 
> 
> 
> 
> 
> 
> Even Newegg is out of them here which is kinda strange, I have never seen that before. I can understand the new cards but 7970's don't make sense. As a matter of fact they are sold out of 7950's too. Is this the twilight zone?


280Xs are pretty much sold out here as well, though there are plenty of 270Xs... jeez, I could grab three PowerColor 270Xs for $20 more than what my 290X cost.


----------



## Durvelle27

Guys good news. Just bought a Sapphire R9 290 and Accelero Xtreme III. Very excited


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 280x's are pretty much sold out here as well, Plenty of 270x's though.......jeez I could grab 3 Powercolor 270x's for $20 more than what my 290x cost....


Devil 270Xs are $220 here; two of them get you around Titan and near-290X performance for a lot less, but you're still running CrossFire, which has its own issues. For now I'm happy. When the non-reference 290s come out I might get one; some are even pre-flashed with the 290X BIOS. I'm assuming they'll be below $450 here in the US. I just want my 270X back from Newegg; still waiting on the RMA.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Guys good news. Just bought a Sapphire R9 290 and Accelero Xtreme III. Very excited


You will be pleased









Best tip I can give is don't use the junk paste Arctic ships with the cooler. I used the pads off the stock cooler for the VRAM and some that came with the AX III for the VRMs.
Quote:


> Originally Posted by *Devildog83*
> 
> Devil 270x's are $220 here, 2 of them seem to be around Titan and near 290x performance for a lot cheaper but you are still running X-Fire and it has it's own issues but for now I am happy. When they come out I might get a 290 non-ref card, some are even pre-flashed to the 290x bios. I am assuming they will be below $450 here in the US. I just want my 270x back from NewEgg. Waiting for the RMA.


Same thing here, waiting on an RMA... going to take store credit and then pick up either a couple of HIS 280Xs or two non-reference 290s, if they release before January that is... the wife wants her 7970 back.


----------



## madorax

Quote:


> Originally Posted by *Devildog83*
> 
> I see no VRM temps there, ya'll should really get HWinfo64. It shows everything.


You only need to scroll down in GPU-Z to get the VRM temp reading ^^



But yes, HWiNFO64 does give more detailed information than GPU-Z.

Anyway... does anyone else's OC fail in BF4 but run fine in every other game or benchmark? My BF4 can only run @ 1100/1700; anything beyond that causes black screens, artifacts, CTDs, every type of error. But in Firestrike, Heaven, Valley, AC4, Arkham Origins, and other new games I can run settings like 1175/1850 fine for more than an hour. Is it because BF4 demands more stability, or something else?


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You will be pleased
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Best tip i can give is don't use the junk paste Arctic send with the cooler, i used the pads off the stock cooler for the vram and some that came with the AX III for the vrms
> Same thing here, waiting on RMA......going to get store credit then pick up either a couple of HIS 280x's or 2 x 290 non-ref cards, if they release before January that is.....wife wants her 7970 back


I won't; I'll be using the paste I've used on my CPU since I started watercooling. I'll also be putting some copper heatsinks on the VRMs and VRAM.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> I won't I'll be using the paste I use on my CPU when I started WC. Also will be using some cooper heatsinks on VRM and VRAM


Looks like you're all sorted then

Have fun with it, awesome card!!


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Looks like you're all sorted then
> 
> Have fun with it, awesome card!!


Thx I will. Gonna push it hard


----------



## Miiksu

Quote:


> Originally Posted by *madorax*
> 
> you only need to scroll down in GPU-Z to get the VRM temp reading ^^
> 
> but yes, HWINFO64 does give more detailed information than GPU-Z though.
> 
> anyway.... does anybody with an OC setting get failures in BF4 but run fine in every other game / benchmark? My BF4 can only run @ 1100 / 1700; anything beyond that causes a black screen, artifacts, CTD, every type of error. But in Firestrike, Heaven or Valley, AC4, Arkham Origins, and other new games I can run fine for more than an hour at settings like 1175 / 1850. Is this because BF4 demands more stability, or is it something else?


Does it have voltage control? BF4 seems good for testing OC results. It's harsh, but if you don't get artifacts or crashes your clocks are rock solid, probably


----------



## Luckael

my Asus R9 280X DCU2T V2


----------



## SultanOfWalmart

Quote:


> Originally Posted by *Luckael*
> 
> my Asus R9 280X DCU2T V2


Christ, that thing is massive...!


----------



## jamor

Quote:


> Originally Posted by *Luckael*
> 
> my Asus R9 280X DCU2T V2


awesome wow cool congrats!


----------



## madorax

Quote:


> Originally Posted by *Miiksu*
> 
> Does it have voltage control? BF4 seems good for testing OC results. It's harsh, but if you don't get artifacts or crashes your clocks are rock solid, probably


my card can go up to 1.3v. I already tried 1.3v with +20% power limit; even if I go with 1110 it will get misplaced textures after about 10 minutes of gaming in BF4. Now I'm at 1100 / 1700 with 1.25v, +20% power limit, all normal ^^


----------



## APC

Hello

First excuse my english.

Can anyone say which of these two cards blows more air out of the case?

Sapphire DUAL-X R9 270X 2GB GDDR5 OC WITH BOOST

MSI R9 270X Gaming 2G

I think the Sapphire blows more air out of the case, but I'm not at all sure.

Regards


----------



## Durvelle27

*List Updated*


----------



## Miiksu

Quote:


> Originally Posted by *madorax*
> 
> my card can go up to 1.3v. I already tried 1.3v with +20% power limit; even if I go with 1110 it will get misplaced textures after about 10 minutes of gaming in BF4. Now I'm at 1100 / 1700 with 1.25v, +20% power limit, all normal ^^


Not so good an OC =( Were your card temps OK? What's your ASIC quality?


----------



## GTR Mclaren

I just got a $198 Gigabyte R9 270X on Amazon, new.

Considering the horrible, disgusting prices and stock of these cards this month on Amazon... I'm kinda happy.


----------



## Jaffi

Quote:


> Originally Posted by *madorax*
> 
> you only need to scroll down in GPU-Z to get the VRM temp reading ^^
> 
> but yes, HWINFO64 does give more detailed information than GPU-Z though.
> 
> anyway.... does anybody with an OC setting get failures in BF4 but run fine in every other game / benchmark? My BF4 can only run @ 1100 / 1700; anything beyond that causes a black screen, artifacts, CTD, every type of error. But in Firestrike, Heaven or Valley, AC4, Arkham Origins, and other new games I can run fine for more than an hour at settings like 1175 / 1850. Is this because BF4 demands more stability, or is it something else?


I am a bit disappointed that Gigabyte used a cheap chip which cannot read any VRM temps at all on my 280X rev 2.


----------



## JoeDirt

As am I.


----------



## dayen666

Quote:


> Originally Posted by *madorax*
> 
> you only need to scroll down in GPU-Z to get the VRM temp reading ^^
> 
> but yes, HWINFO64 does give more detailed information than GPU-Z though.
> 
> anyway.... does anybody with an OC setting get failures in BF4 but run fine in every other game / benchmark? My BF4 can only run @ 1100 / 1700; anything beyond that causes a black screen, artifacts, CTD, every type of error. But in Firestrike, Heaven or Valley, AC4, Arkham Origins, and other new games I can run fine for more than an hour at settings like 1175 / 1850. Is this because BF4 demands more stability, or is it something else?


I have an R9 280 Toxic but GPU-Z shows 0°

Anyway, how high does the VRM temp usually get?


----------



## taem

Holy smokes, 280x prices at Newegg have skyrocketed. I just ordered the Vapor-X for $319; a day later it's $359. All the cards are $359+. The Club3D royalKing is $389!!! http://www.newegg.com/Product/ProductList.aspx?Submit=Property&Subcategory=48&N=100007709&IsNodeId=1&IsPowerSearch=1&SrchInDesc=&MinPrice=&MaxPrice=&OEMMark=N&PropertyCodeValue=679%3A446073&Tpk=R9%20280x


----------



## Devildog83

Quote:


> Originally Posted by *taem*
> 
> Holy smokes 280x prices at Newegg skyrocketed. I just ordered the Vapor X for $319, a day later it's $359. All the cards are $359+. Club3D royalking is $389!!! http://www.newegg.com/Product/ProductList.aspx?Submit=Property&Subcategory=48&N=100007709&IsNodeId=1&IsPowerSearch=1&SrchInDesc=&MinPrice=&MaxPrice=&OEMMark=N&PropertyCodeValue=679%3A446073&Tpk=R9%20280x


Supply and demand, it's capitalism 101. I just hope that the 290 non-refs are not $500, then I will just stay with my Dual Devils.


----------



## jamor

Quote:


> Originally Posted by *taem*
> 
> Holy smokes 280x prices at Newegg skyrocketed. I just ordered the Vapor X for $319, a day later it's $359. All the cards are $359+. Club3D royalking is $389!!! http://www.newegg.com/Product/ProductList.aspx?Submit=Property&Subcategory=48&N=100007709&IsNodeId=1&IsPowerSearch=1&SrchInDesc=&MinPrice=&MaxPrice=&OEMMark=N&PropertyCodeValue=679%3A446073&Tpk=R9%20280x


Wow!

I just picked my Asus up for $300, I feel good.


----------



## jamor

Quote:


> Originally Posted by *Devildog83*
> 
> Supply and demand, it's capitalism 101. I just hope that the 290 non-refs are not $500, then I will just stay with my Dual Devils.


Or is it the new chip pricing? XTL?


----------



## TempAccount007

Apparently Bitcoin miners are driving up the price of the cards.

Also, madorax, I'm pretty sure those memory clocks are unnecessarily high.

You could probably lower them by a couple hundred MHz at least and not affect performance.


----------



## Luckael

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Christ, that thing is massive...!


yes, it's triple slot

Quote:


> Originally Posted by *jamor*
> 
> awesome wow cool congrats!


Thank you


----------



## taem

Quote:


> Originally Posted by *Devildog83*
> 
> Supply and demand, it's capitalism 101. I just hope that the 290 non-refs are not $500, then I will just stay with my Dual Devils.


$500 for a 290 is too high; you can get a GTX 780 for that price. The desirability of the 290 is that it performs like the 780 for $100 less. At the same price I'd go with a cooler, quieter 780, unless Mantle and TrueAudio are mindblowing. At $360 the 280X begins to lose its luster too: you can get a 770 on sale for as low as $310ish, and the 760 is quickly approaching sub-$250 prices.


----------



## GTR Mclaren

QUESTION...

Is AMD giving away free games with the R9 line of cards??

I just got (yesterday) an AMD R9 270X from Amazon


----------



## madorax

Quote:


> Originally Posted by *Miiksu*
> 
> Not so good oc =( Was your card temps ok? What is ur asic?


it never reaches 70C in a real game; the VRM is about 10C lower usually. My ASIC is 70.7, it should OC better, but I guess I didn't win the lottery ^^

Quote:


> Originally Posted by *Jaffi*
> 
> I am a bit disappointed that gigabyte used a cheap chip which can not read any vrm temps at all on my 280x rev 2.


the reason I didn't pick Giga and chose MSI is because it's voltage locked. I didn't know Giga can't display VRM on GPU-Z; maybe it's not the card but the tool itself. Usually if you use HWINFO64 or AIDA64 the VRM always shows up, right? So I guess it's GPU-Z's fault for not showing the proper VRM temp on your card.








Quote:


> Originally Posted by *dayen666*
> 
> I have R9 280 Toxic but on GPU-Z its show 0°
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway how much is VRM temp?


in my card it's usually about 10C lower than the core temp on both VRMs
Quote:


> Originally Posted by *jamor*
> 
> Wow!
> 
> I just picked my Asus up for $300 i feel good.


you should. In my place ASUS is the no. 1 brand for computers, so prices have skyrocketed; the 280X DCU2 costs more than a Sapphire 290 ~__~
Quote:


> Originally Posted by *TempAccount007*
> 
> Apparently Bitcoin miners are driving up the price of the cards.
> 
> Also, madorax, I'm pretty sure those memory clocks are unnecessarily high.
> 
> You could probably lower them by a couple hundred MHz at least and not affect performance.


already lowered it to 1550 now, without raising the memory voltage, for daily use. Thanks


----------



## gh0stfac3killa

OK guys, going to try and pick some brains here. Since we are all proud owners of some type of R9 series card, does anyone have any info on the actual Mantle and DirectX 11.2? When are we supposed to be seeing these options on our GPUs and in game? All I read about is how awesome all this new tech will be, and I have all the requirements: Windows 8.1 Pro and R9 280X's. So where are my Mantle option and DirectX 11.2? Anyone have any insight on this? If they're available now, where do I go to get them? Gotta have the new new... since that's the real reason I bought these bad boys.


----------



## Sgt Bilko

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> OK guys, going to try and pick some brains here. Since we are all proud owners of some type of R9 series card, does anyone have any info on the actual Mantle and DirectX 11.2? When are we supposed to be seeing these options on our GPUs and in game? All I read about is how awesome all this new tech will be, and I have all the requirements: Windows 8.1 Pro and R9 280X's. So where are my Mantle option and DirectX 11.2? Anyone have any insight on this? If they're available now, where do I go to get them? Gotta have the new new... since that's the real reason I bought these bad boys...


AMD's overview on Mantle

http://community.amd.com/community/amd-blogs/amd-gaming/blog/2013/10/17/the-four-core-principles-of-amd-s-mantle

Chipp at the AMD Dev Summit

http://www.overclock.net/t/1442071/chipp-amd-developer-summit-2013/0_40

And an overview of Mantle by MyGaming

http://mygaming.co.za/news/hardware/60224-amd-mantle-api-poised-to-revolutionise-pc-gaming.html

Afaik the Mantle update for BF4 will launch sometime this month. DICE and AMD had a deal where DICE would be first to show off what Mantle can do, so that's why they are first... actual date though, no idea tbh.


----------



## taem

Could any and all Sapphire 280x Vapor X owners post screens of gpu-z? I'd so appreciate that.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> AMD's overview on Mantle
> 
> http://community.amd.com/community/amd-blogs/amd-gaming/blog/2013/10/17/the-four-core-principles-of-amd-s-mantle
> 
> Chipp at the AMD Dev Summit
> 
> http://www.overclock.net/t/1442071/chipp-amd-developer-summit-2013/0_40
> 
> And an overview of Mantle by MyGaming
> 
> http://mygaming.co.za/news/hardware/60224-amd-mantle-api-poised-to-revolutionise-pc-gaming.html
> 
> Afaik the Mantle update for BF4 will launch sometime this month. DICE and AMD had a deal where DICE would be first to show off what Mantle can do, so that's why they are first... actual date though, no idea tbh.


I'm really looking forward to seeing what all this new tech will do for gaming; no date though really sucks, lol. Thanks for the update though. I've been trolling the Microsoft site for information on DirectX 11.2; since I did the update to 8.1 I'm still not seeing 11.2. Is anyone actually showing 11.2 on their systems? I'm only showing 11. I've talked with a few other people who say their systems show 11.1, so I'm all types of lost on this deal. Did all the auto updates and the system doesn't show any new ones available, and I checked the site, which is all types of confusing when trying to figure out how to get current DirectX drivers. Anyone know where I can get a good download of the current DirectX for 8.1 and the R9 GPUs?


----------



## Sgt Bilko

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> I'm really looking forward to seeing what all this new tech will do for gaming; no date though really sucks, lol. Thanks for the update though. I've been trolling the Microsoft site for information on DirectX 11.2; since I did the update to 8.1 I'm still not seeing 11.2. Is anyone actually showing 11.2 on their systems? I'm only showing 11. I've talked with a few other people who say their systems show 11.1, so I'm all types of lost on this deal. Did all the auto updates and the system doesn't show any new ones available, and I checked the site, which is all types of confusing when trying to figure out how to get current DirectX drivers. Anyone know where I can get a good download of the current DirectX for 8.1 and the R9 GPUs?


Microsoft has pretty much given up on PC gaming; that's why AMD created Mantle.

And not many games are using DX11 yet, so 11.2 is a bit far off for now. I've heard a few different dates floating around (the 15th and 25th), but I wouldn't believe them without some sort of official confirmation.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Microsoft has pretty much given up on PC gaming; that's why AMD created Mantle.
> 
> And not many games are using DX11 yet, so 11.2 is a bit far off for now. I've heard a few different dates floating around (the 15th and 25th), but I wouldn't believe them without some sort of official confirmation.


Crazy to see them not supporting PC gaming more, but I guess it's to be expected since they jumped into the console field with the Xbox. Guess it's good to be prepared for the future, lol. Thanks for the links, that was great information on Mantle and AMD and the direction they want to take PC gaming. I hope all that actually takes hold and more developers use Mantle and all the other new tech. I hope it just doesn't die out like so many other once-hopeful technologies.


----------



## Sgt Bilko

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Crazy to see them not supporting PC gaming more, but I guess it's to be expected since they jumped into the console field with the Xbox. Guess it's good to be prepared for the future, lol. Thanks for the links, that was great information on Mantle and AMD and the direction they want to take PC gaming. I hope all that actually takes hold and more developers use Mantle and all the other new tech. I hope it just doesn't die out like so many other once-hopeful technologies.


Well, afaik Frostbite 3, CryEngine, Star Citizen (don't know what the engine is called), Unreal Engine 4, Nitrous and a few others are all on board with Mantle, so there shouldn't be any lack of support from the gaming side itself...

the only way I can see Mantle going under is if the devs find it too much work to code for Mantle, OpenGL and DirectX. It would be easier if Nvidia/Intel just swallowed their pride and jumped on board with it... then everybody wins


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, afaik Frostbite 3, CryEngine, Star Citizen (don't know what the engine is called), Unreal Engine 4, Nitrous and a few others are all on board with Mantle, so there shouldn't be any lack of support from the gaming side itself...
> 
> the only way I can see Mantle going under is if the devs find it too much work to code for Mantle, OpenGL and DirectX. It would be easier if Nvidia/Intel just swallowed their pride and jumped on board with it... then everybody wins


Yeah, no doubt Nvidia and Intel are out of control in my opinion; just look at the prices on stuff. My next build is going to be an AMD CPU and AMD GPUs. Not too savvy on AMD CPUs, but I hear a lot of good things about their FX line and the new APU line. With Intel, for the good stuff you have to pay a premium. I just hope we see more games on PC. In the past few years more games have been coming over, which is great, but I would still love to see more, especially games like Halo. I have Halo 1 and 2 on PC, but 3 and all the ones after never made the cut I guess. The rest of the Gears of War games as well; I have the first one, and they messed that up by making you turn the PC clock back to like 2009 to get the game to play, which annoyed me, but the game was still awesome. The extra levels and actually having to face the Brumak were great. Time will tell, I guess. I ditched consoles altogether after the first-edition 360's and PS3's; once they died I never went back. In the meantime these Gigabytes are kicking ass, and soon I'll have two XFX 280's for my other tower. Super excited to see what the XFX line brings to the table.


----------



## benjamen50

55°C max on a Gigabyte AMD R9 270X, pretty good for a GPU like this.


----------



## Camph

Quote:


> Originally Posted by *TempAccount007*
> 
> Apparently bitcoin miners are driving up the price of the cards.


It's not Bitcoin, it's Litecoin. It's like Bitcoin, but the difficulty is fresh, so they're getting these instead. Bitcoin is pretty much all giant rooms full of ASIC miners now.
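(For context on why GPUs still matter here: Litecoin's proof-of-work uses scrypt, a memory-hard hash, while Bitcoin uses double SHA-256, which ASICs crunch cheaply. A minimal sketch of the two hash styles using Python's standard library; the function names are just illustrative, and the scrypt parameters mirror Litecoin's published N=1024, r=1, p=1:)

```python
import hashlib

# scrypt needs N * r * 128 bytes of working memory (here 1024 * 1 * 128 = 128 KiB
# per hash), which is what kept early ASICs away and made GPUs competitive.
def scrypt_pow(header: bytes) -> bytes:
    return hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32)

# Bitcoin-style proof-of-work: plain double SHA-256, trivially ASIC-friendly.
def sha256d_pow(header: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

h = b"example block header"
print(scrypt_pow(h).hex())
print(sha256d_pow(h).hex())
```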


----------



## Sgt Bilko

Quote:


> Originally Posted by *benjamen50*
> 
> 55°C max on a Gigabyte AMD R9 270X, pretty good for a GPU like this.


How does the Gigabyte go in gaming?

Thinking about picking one up for my sis, and 55C max is pretty damn cool. My Gigabyte 7970 hits around 69C after 24 hr of folding.


----------



## benjamen50

Pretty great. The funny thing is that the cooler is slightly longer than the PCB of the board itself. I haven't fully tested the temperatures, but that's the highest I've been at: 1150 MHz on the core and 1500 MHz on the memory (7500 MHz effective), well, that's with my overclocked settings. The fans never seem to go past 50%; the wide spread-out heatsink with the 3 fans is what really keeps this GPU cool.


----------



## Sgt Bilko

Quote:


> Originally Posted by *benjamen50*
> 
> Pretty great. The funny thing is that the cooler is slightly longer than the PCB of the board itself. I haven't fully tested the temperatures, but that's the highest I've been at: 1150 MHz on the core and 1500 MHz on the memory (7500 MHz effective), well, that's with my overclocked settings. The fans never seem to go past 50%; the wide spread-out heatsink with the 3 fans is what really keeps this GPU cool.


That's good to know. Is it volt locked like most Gigabytes?

What sort of frame rate do you get in BL2 and such?


----------



## benjamen50

Most likely volt locked; I used CCC Overdrive to overclock it and didn't bother with MSI Afterburner. Unfortunately I don't have Borderlands 2. I'll try MSI Afterburner and see if it's voltage locked.


----------



## Sgt Bilko

Quote:


> Originally Posted by *benjamen50*
> 
> Most likely volt locked; I used CCC Overdrive to overclock it and didn't bother with MSI Afterburner. Unfortunately I don't have Borderlands 2. I'll try MSI Afterburner and see if it's voltage locked.


Well, volt locking isn't the end of the earth; so long as it handles gaming fine, it's all good.


----------



## benjamen50

Yeah, it's volt locked. I wouldn't go through the trouble of flashing a GPU BIOS anyway; it handles games exceptionally well.


----------



## Sgt Bilko

Quote:


> Originally Posted by *benjamen50*
> 
> Yeah, it's volt locked. I wouldn't go through the trouble of flashing a GPU BIOS anyway; it handles games exceptionally well.


That's all I'm after from it. Going to overclock her 8320 to 4.5 so it will never bottleneck anything, and the 270X's are a very good deal atm.

Thanks for the info


----------



## Wisher

I just bought a Sapphire R9 270 and I'm having the black screen issue like many other people. Is it possible that it hasn't been resolved yet?
I tried disabling the iGPU (Intel HD 4000), different driver versions, Windows updates. Can anyone help?


----------



## Devildog83

Quote:


> Originally Posted by *benjamen50*
> 
> Pretty great. The funny thing is that the cooler is slightly longer than the PCB of the board itself. I haven't fully tested the temperatures, but that's the highest I've been at: 1150 MHz on the core and 1500 MHz on the memory (7500 MHz effective), well, that's with my overclocked settings. The fans never seem to go past 50%; the wide spread-out heatsink with the 3 fans is what really keeps this GPU cool.


1500 would be 6000 MHz, or 6 GHz effective, on the memory.
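(For anyone double-checking the math: GDDR5 is quad-pumped, so the effective data rate is the memory clock GPU-Z reports times four. A quick sketch; `effective_mhz` is just an illustrative helper name:)

```python
# GDDR5 transfers four bits per pin per clock, so:
#   effective data rate = reported memory clock x 4
# GPU-Z and Afterburner show the real clock; spec sheets quote the effective one.
def effective_mhz(reported_mhz: int) -> int:
    return reported_mhz * 4

print(effective_mhz(1500))  # 1500 MHz reported -> prints 6000
print(effective_mhz(1375))  # stock 1375 MHz    -> prints 5500
```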


----------



## Jaffi

Quote:


> Originally Posted by *madorax*
> 
> it never reaches 70C in a real game; the VRM is about 10C lower usually. My ASIC is 70.7, it should OC better, but I guess I didn't win the lottery ^^
> the reason I didn't pick Giga and chose MSI is because it's voltage locked. I didn't know Giga can't display VRM on GPU-Z; maybe it's not the card but the tool itself. Usually if you use HWINFO64 or AIDA64 the VRM always shows up, right? So I guess it's GPU-Z's fault for not showing the proper VRM temp on your card.
> 
> in my card it's usually about 10C lower than the core temp on both VRMs
> you should. In my place ASUS is the no. 1 brand for computers, so prices have skyrocketed; the 280X DCU2 costs more than a Sapphire 290 ~__~
> already lowered it to 1550 now, without raising the memory voltage, for daily use. Thanks


No mate, the gigabyte does not show any vrm temps, even with the tools you mentioned!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jaffi*
> 
> No mate, the gigabyte does not show any vrm temps, even with the tools you mentioned!


Well, I can confirm it then: my Gigabyte 7970 doesn't show a VRM reading in GPU-Z or in HWinfo64.

Guess they don't have VRM sensors on them... might explain the voltage locking as well?


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, I can confirm it then: my Gigabyte 7970 doesn't show a VRM reading in GPU-Z or in HWinfo64.
> 
> Guess they don't have VRM sensors on them... might explain the voltage locking as well?


That's strange; must just be Gigabyte. My 270x is volt locked and it shows VRM temps.


----------



## benjamen50

Same, my Gigabyte R9 270X doesn't show VRM temps and is volt locked; maybe I should have gone with an ASUS model.


----------



## Devildog83

Quote:


> Originally Posted by *benjamen50*
> 
> Same, my Gigabyte R9 270X doesn't show VRM temps and is volt locked; maybe I should have gone with an ASUS model.


I don't know what PowerColor does TBH, but mine is locked at 1.265v or so, and the clocks are crazy high compared to the 7870 with the same PCB and memory chips. I get 1225/1550 easy, whereas my 7870 gets higher core clocks, 1275, but lower memory @ 1450 max stable. I have seen the VRM temps in the 70's; I honestly don't know if that's high or not. I know the core temps are always good. I get slightly better benches from the 270x at its high overclock. What is your max core voltage at full load?


----------



## madorax

Quote:


> Originally Posted by *Jaffi*
> 
> No mate, the gigabyte does not show any vrm temps, even with the tools you mentioned!


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well i can confirm it then, My Gigabyte 7970 doesn't show vrm reading in GPU-Z or in HWinfo64.
> 
> Guess they don't have vrm sensors in them........might explain the voltage locking as well?


can't see VRM temp in any tools? wow...

but maybe just for reference: the VRMs in my card are both 20C at idle (core temp 39C) and 52C ~ 56C at full load (core temp 69C), with a custom fan setting set to run 80% @ 70C.

my core voltage is strangely low though... it's 1.156; isn't it supposed to be 1.200 by default? CMIIW


----------



## neptunus

Any info on the Gigabyte 280X rev 2? It appears to be using a different cooling system and a new black PCB...


----------



## Sgt Bilko

Quote:


> Originally Posted by *madorax*
> 
> can't see VRM temp in any tools? wow...
> 
> but maybe just for reference: the VRMs in my card are both 20C at idle (core temp 39C) and 52C ~ 56C at full load (core temp 69C), with a custom fan setting set to run 80% @ 70C.
> 
> my core voltage is strangely low though... it's 1.156; isn't it supposed to be 1.200 by default? CMIIW


Just some figures here: since mine is volt locked, at 1000/1375 the voltage is 1.170.


----------



## Jaffi

Quote:


> Originally Posted by *neptunus*
> 
> Any info on Gigabyte 280X rev 2? It appears to be using different cooling system and new black PCB...


I think it is the same card as rev 1, with a different design, less voltage (1.2v locked instead of 1.256v) and less aggressive fans. Though I find them still too loud, so I went with a custom curve in Afterburner. The stock curve always keeps the card below 70 °C, hence it has to increase fan speed to around 60%; my setting mostly stays at 45% and around 75 °C, which is much quieter. Sadly I can't read the VRM temps, so I don't exactly know how they perform with my custom setting.
The cooler is exactly the same design; only the black part around the fans was a cosmetic change to match the entire line of Gigabyte cards, I guess. It is made of plastic though, no metal like on Nvidia cards.

Imho the best card around is the Asus one. Sadly it is not available as BF4 edition, neither is HIS (which I find 2nd best card).

By the way, is the MSI card still suffering from overheating VRMs or much-too-loud fans?
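A custom fan curve like the one described above is just linear interpolation between (temperature, fan%) points, the same idea as dragging points in Afterburner's curve editor. A minimal sketch; the point values here are illustrative, not Afterburner's defaults:

```python
# Linearly interpolate fan duty (%) from a sorted list of (temp_C, fan_pct) points.
def fan_pct(temp, curve):
    curve = sorted(curve)
    if temp <= curve[0][0]:          # below the first point: clamp low
        return curve[0][1]
    if temp >= curve[-1][0]:         # above the last point: clamp high
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp <= t1:         # interpolate inside this segment
            return f0 + (f1 - f0) * (temp - t0) / (t1 - t0)

curve = [(40, 20), (60, 35), (75, 45), (85, 80)]  # illustrative points
print(fan_pct(75, curve))  # prints 45.0, i.e. 45% around the 75 C target
```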


----------



## neptunus

Are you sure those parts are made of plastic? I've read a review of the 780 Ti that uses the same cooling system, and it was pretty good. The rev 2 uses a black PCB, so either they are changing the style or it has a different set of components compared to rev 1 (or both). That's the only 280X card available in my region.

*P.S.* New to AMD; what's the most common way to bypass the voltage lock? Flashing an edited BIOS?


----------



## TempAccount007

Pretty sure the voltage lock is hardware based.

You might be able to get a program to show a voltage slider when you flash a different BIOS, but changing the voltage in software won't do anything to the actual voltage.

That was the case when I flashed a Matrix BIOS (tried 280X & 7970) to my DC2T: I set the voltage slider to 1.4v, but the actual voltage was less than that of the normal BIOS (I think it was a safe mode or something, because the hardware didn't recognize the higher voltage).

It would be great if I'm wrong, though.


----------



## neptunus

The Matrix uses a different PCB design than the DC2T. Logic tells me you could use your own BIOS with a fixed voltage...


----------



## Dimaggio1103

What is the max safe temps for the GPU VRMs on the devil 270x?


----------



## Devildog83

Quote:


> Originally Posted by *Dimaggio1103*
> 
> What is the max safe temps for the GPU VRMs on the devil 270x?


Not sure about the max, but mine has never gotten out of the 70's and I am pretty sure that's safe.


----------



## Johny Boy

Hi, all AMD 280X users. I wanted to know which 280X cards are voltage locked besides Gigabyte.
Are Asus/MSI/Sapphire voltage locked too?
Plus I heard MSI is having VRM problems on the 280X Gaming cards; is that true?

Any kind of information will be highly appreciated.


----------



## TempAccount007

The XFX brand and the Sapphire Toxic are locked down, I think.


----------



## Stay Puft

Can anyone confirm or deny voltage control for the Gigabyte 270X? I'm looking at picking up something other than the Hawk to play with overclocking.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125476

But if it's locked I might just try the Toxic to break my old 7870 Hawk records.


----------



## benjamen50

The Gigabyte R9 270X is volt locked. Tried it myself.


----------



## Stay Puft

Quote:


> Originally Posted by *benjamen50*
> 
> Gigabyte r9 270x is voltlocked. Tried it myself.


Damn... thank you. What about the Toxic 270X, guys, and the Devil 270X? That card looks beastly.


----------



## madorax

Quote:


> Originally Posted by *Jaffi*
> 
> I think it is the same card as rev 1, with different design, less voltage (1.2v locked instead of 1.256v) and less agressive fans. Though I find them still too loud so I went with a custom curve in afterburner. Stock always keeps the card below 70 °C, hence has to increase fan speed to around 60%. My setting mostly stays at 45% and around 75 °C which is much quieter. Sadly I can't read the VRM's temps so I don't exactly know how they perform with my custom setting.
> The cooler is exactly the same design, only the black part around the fans was a cosmetic change to match the entire line of gigabyte cards I guess. It is made of plastic though, no metal like on Nvidia cards.
> 
> Imho the best card around is the Asus one. Sadly it is not available as BF4 edition, neither is HIS (which I find 2nd best card).
> 
> By the way, is the MSI card still suffering from overheat VRM or much too loud fans?


the problem exists on the default BIOS. MSI provides an updated BIOS on TechPowerUp, and after the update the heat is gone; even @ 100% I don't think it's loud... maybe because my case fans are already loud (Aerocool Shark 14cm x 2) ^^
Quote:


> Originally Posted by *Stay Puft*
> 
> Damn.. Thank you. What about the Toxic 270X guys and the Devil 270X? That card looks beastly


perhaps you want to try MSI instead? They come with the BF4 bundle at the same price. But for looks, the Toxic and Devil are still truly beasts ^^


----------



## Stay Puft

Quote:


> Originally Posted by *madorax*
> 
> the problem exists on the default BIOS. MSI provides an updated BIOS on TechPowerUp, and after the update the heat is gone; even @ 100% I don't think it's loud... maybe because my case fans are already loud (Aerocool Shark 14cm x 2) ^^
> perhaps you want to try MSI instead? They come with the BF4 bundle at the same price. But for looks, the Toxic and Devil are still truly beasts ^^


Already have BF4 and Premium so I don't need it. Looking for something that I can push a lot of voltage through and exceed 1300 core. The Hawk is probably the best card to go with thanks to the triple overvoltage, but the Devil and Toxic look sweet.


----------



## Miiksu

Quote:


> Originally Posted by *Johny Boy*
> 
> Hi, all AMD 280X users. I wanted to know which 280X cards are voltage locked besides Gigabyte.
> Are Asus/MSI/Sapphire voltage locked too?
> Plus I heard MSI is having VRM problems on the 280X Gaming cards; is that true?
> 
> Any kind of information will be highly appreciated.


XFX, Sapphire Toxic and Asus have unlocked voltage, and HIS has unlocked voltages too. Not sure about MSI anymore; I have heard it no longer has voltage control. I think the best cards atm are all Asus, all HIS and the Sapphire Toxic. You have plenty of choices.


----------



## Stay Puft

Quote:


> Originally Posted by *Miiksu*
> 
> XFX, Sapphire Toxic and Asus have unlocked voltage, and HIS has unlocked voltages too. Not sure about MSI anymore; I have heard it no longer has voltage control. I think the best cards atm are all Asus, all HIS and the Sapphire Toxic. You have plenty of choices.


Hawk has unlocked voltages and is probably the only one you can adjust memory voltage on


----------



## benjamen50

Would the Asus AMD R9 270X have VRM temp readings and no voltage lock?


----------



## taem

Quote:


> Originally Posted by *Miiksu*
> 
> XFX, Sapphire Toxic and Asus have unlocked voltage, and HIS has unlocked voltages too. Not sure about MSI anymore; I have heard it no longer has voltage control. I think the best cards atm are all Asus, all HIS and the Sapphire Toxic. You have plenty of choices.


I thought the Toxic was voltage locked. http://www.overclock.net/t/1433990/which-r9-280x-should-i-get/0_100#post_20980986 Isn't the point of the Toxic that OOTB it's as high-performance as this chip can go?


----------



## Miiksu

Quote:


> Originally Posted by *Stay Puft*
> 
> Hawk has unlocked voltages and is probably the only one you can adjust memory voltage on


Not really... I can adjust memory voltage too with my HIS card.


Quote:


> Originally Posted by *taem*
> 
> I thought the Toxic was voltage locked. http://www.overclock.net/t/1433990/which-r9-280x-should-i-get/0_100#post_20980986 Isn't the point of the Toxic that ootb it's as high performance as this chip can go?


The Sapphire Toxic has a CHiL voltage controller, same as Asus and HIS. I would say almost every card that has a good cooler and unlocked voltage can go as high as the Sapphire Toxic, or even higher if you are lucky.

http://www.overclockersclub.com/reviews/sapphire_r9_280x_toxic/2.htm


----------



## demoralized

Anywhere to get aftermarket backplates for the R9 280x?


----------



## TempAccount007

@Miiksu

Can you confirm with GPU-Z that you are getting that voltage when you set it to 1381 mV?

With vdroop under heavy load the voltage may be more like 1.3 V, but it's still better than the DC2T I have now.

If so, I might return my DC2T and get a HIS... if I can ever get my hands on one.


----------



## neptunus

So, is it possible to bypass the voltage lock by flashing an edited BIOS (with the voltage set)?


----------



## Stay Puft

Quote:


> Originally Posted by *TempAccount007*
> 
> @Miiksu
> 
> Can you confirm with gpu-z that you are getting that voltage when you set it to 1381mv?
> 
> With vdroop under heavy load the voltage may be more like 1.3v, but its still better than the DC2T I have now.
> 
> If so, I might return my DC2T and get a HIS... if I can ever get my hands on one


Wait till after Christmas. They're all marked up now


----------



## Jaffi

Is this any good http://www.hisdigital.com/un/product2-786.shtml ?


----------



## Stay Puft

Quote:


> Originally Posted by *Jaffi*
> 
> Is this any good http://www.hisdigital.com/un/product2-786.shtml ?


I personally wouldn't buy HIS, but maybe I'm living under a false assumption that they suck.


----------



## Johny Boy

Thanks, guys, for your replies.
I need a card that can be OC'd, so it needs to have unlocked voltage and a memory OC option...

But I have seen the MSI OC Gaming/Hawk having trouble with VRMs, as reported on some other forums. The Asus DC II seems a good option, but I have no idea whether voltage is locked on it; the same goes for the Sapphire Dual OC card, as the Toxic is way overpriced where I live.
At this point, if MSI has no VRM problems, then I would get the Gaming card and OC the hell out of it... or get the Asus one, as it has very good cooling.


----------



## Devildog83

I do not know about the 280Xs, but as far as the questions about the 270X go: the voltage is locked on the Devil, but at the max stock voltage I have run benches at as much as 1230/1575 and not crashed, so I think the 1.265 V is enough. It is clocked so high that I do not think unlocking the voltage is needed. What are the max volts and clocks on the MSI Gaming 270X, if anyone has one? As far as gaming goes, I think they are all going to be very close; as far as overclocking the heck out of them, there may be some small differences between the Toxic, Devil, and Gaming. If you have a Gaming or Toxic, can you do a Heaven run too?


----------



## Stay Puft

Quote:


> Originally Posted by *Devildog83*
> 
> I do not know about the 280x's but as far as the questions about the 270x, the voltage is locked on the Devil but on the max stock voltage I have run bench's at as much as 1230/1575 and not crashed so I think the 1.265v is enough. It is clocked so high I do not think unlocking the voltage is needed. What are the max volts and clocks on the MSI Gaming 270x if anyone has one? As far as gaming I think they are all going to be very close, as far as overclocking the heck out of them there may be some small differences between the Toxic, Devil and the Gaming. If you have a Gaming or Toxic can you do a Heaven run to?


1.265 V is not enough, especially for me. Have you tried the unlocker?

http://www.overclock.net/t/1398725/unlock-afterburner-limits-on-lots-of-cards-titan-to-gtx460-with-llc

Devil has a CHL8228 so it should work. I'm probably buying another Hawk. Has to be the best 270X for the money.


----------



## Miiksu

Quote:


> Originally Posted by *TempAccount007*
> 
> @Miiksu
> 
> Can you confirm with gpu-z that you are getting that voltage when you set it to 1381mv?
> 
> With vdroop under heavy load the voltage may be more like 1.3v, but its still better than the DC2T I have now.
> 
> If so, I might return my DC2T and get a HIS... if I can ever get my hands on one


That is scary voltage; I don't wanna burn my stuff. But at 1.33 V VDDC I was getting 50-70 mV less under load. At those voltages it would stay over 1.3.

Edit: I just tested with 1.356 V VDDC; the lowest reading was 1.275. BF4-stable at 1250 MHz, but I was getting artifacts.
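For reference, the droop in those readings works out as follows; a quick sketch using only the numbers quoted above:

```python
# Vdroop: the difference between the voltage you set and the lowest
# voltage the card actually reports under load.
def vdroop_mv(set_v: float, load_v: float) -> int:
    """Return the droop in millivolts between the set and load voltage."""
    return round((set_v - load_v) * 1000)

# 1.356 V set, 1.275 V lowest reading under BF4 load
print(vdroop_mv(1.356, 1.275))  # 81 mV of droop
```

So roughly 80 mV of droop at that setting, consistent with the "50-70 mV less" seen at the lower 1.33 V setpoint.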

Temps:


----------



## Devildog83

Quote:


> Originally Posted by *Stay Puft*
> 
> 1.265v is not enough
> 
> 
> 
> 
> 
> 
> 
> especially for me. Have you tried the unlocker?
> 
> http://www.overclock.net/t/1398725/unlock-afterburner-limits-on-lots-of-cards-titan-to-gtx460-with-llc
> 
> Devil has a CHL8228 so it should work. I'm probably buying another Hawk. Has to be the best 270X for the money.


Mine doesn't have any voltage adjustment at all, unlike the 7870 Devil. When I have them CrossFired I am sure I could get 1225/1450 on both if I wished, but I really have no need to with both cards. I am honestly not looking to burn up my cards just to get higher benches; I can't afford it. The Hawk looks nice and looks like it will overclock like crazy if you wish, but there is no way I could stick yellow in my case.


----------



## Stay Puft

Quote:


> Originally Posted by *Devildog83*
> 
> Mine doesn't have any voltage adjustment at all unlike the 7870 devil, when I have them X-Fired I am sure I could get 1225/1450 on both if I wished but I really have no need to with both cards. I am honestly not looking to burn up my cards just to get higher bench's. I can't afford it. The Hawk looks nice and looks like it will overclock like crazy if you wish but there is no way I could stick yellow in my case.


My motherboard has yellow on it so it would match fine


----------



## Dimaggio1103

Quote:


> Originally Posted by *Devildog83*
> 
> Not sure about the max but mine has never got out of the 70's and I am pretty sure that's safe.


I've seen my VRMs hit 77C in heavily modded Skyrim. A little worried.


----------



## Stay Puft

Quote:


> Originally Posted by *Dimaggio1103*
> 
> I've seen my VRM's hit 77c in heavily modded skyrim. A little worried


77C? That's nothing to be worried about.


----------



## Dimaggio1103

Quote:


> Originally Posted by *Stay Puft*
> 
> 77C? Thats nothing to be worried about


For VRMs? If so, then thanks; you had me a bit worried.

P.S. I like your avatar... he was a real S.O.B.


----------



## TempAccount007

Just curious, has anyone been tempted to sell their card(s) into this litecoin craze?


----------



## gh0stfac3killa

Quote:


> Originally Posted by *TempAccount007*
> 
> Just curious, has anyone been tempted to sell their card(s) into this litecoin craze?


Apparently this Litecoin and Bitcoin craze is the main cause of the GPUs not being available and the prices going up!!

Not sure how true that all is, but after logging into Newegg I saw my original Gigabyte GPUs go from $299.99 to $359.99, and my recently ordered XFX cards (not yet shipped due to none being in stock, which is strange since I was able to put two in my cart and pay for them) went from $309.99 to $359.99 as well. So my question here is: is Litecoin mining really that serious, and if so, what can you actually use it for? I never really read into the Bitcoin, and now Litecoin, deal. But for us gamers who just want GPUs for gaming, it's a little frustrating to see R-series cards sold out everywhere unless you want to pay double or triple the price...


----------



## ignus1212

Quote:


> Originally Posted by *taem*
> 
> Could any and all Sapphire 280x Vapor X owners post screens of gpu-z? I'd so appreciate that.


----------



## Jaffi

How do I find out which BIOS I should flash? http://www.gigabyte.com/products/product-page.aspx?pid=4847#bios I am using the Gigabyte 280X rev. 2.0.
Here is a GPU-Z screenshot showing the current BIOS. I cannot find anything that tells me whether it is F3 or F12.


----------



## neptunus

Can you please tell if its voltage locked or not? Temps/fan noise?


----------



## Jaffi

Locked at 1.2 V. Temps are very good, always <70°C with the stock cooler, BUT it is louder than it has to be (mostly around 65% fan speed), so I customized the fan curve with MSI Afterburner and now it sits mostly around 75°C and 40-45% fan speed. That is very quiet.
Besides the locked voltage, its controller does not report any VRM temps. Slight coil whine, which is not really audible most of the time. You have to sacrifice a bit, but I could get it for €249 as the BF4 edition here in Germany, so it was a good deal.
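For anyone curious, a custom Afterburner-style fan curve is just linear interpolation between (temperature, fan %) points. A minimal sketch; the control points below are illustrative, not the actual settings from the post:

```python
# Piecewise-linear fan curve: map GPU temperature (°C) to fan duty (%).
# The control points are illustrative examples only.
CURVE = [(30, 20), (60, 35), (75, 45), (90, 100)]

def fan_duty(temp_c: float) -> float:
    """Interpolate fan duty between the configured curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # clamp at the top of the curve

print(fan_duty(75))  # 45.0% duty at 75 °C, like the quiet profile described
```

The trade-off is exactly the one described above: a flatter curve in the 70-75°C range keeps the fan quiet at the cost of running a few degrees warmer.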


----------



## neptunus

Can you please check one thing for me? I assume the card has dual BIOS, so there should be no damage done if something goes wrong.

Modify the voltage settings in your BIOS using VBE7 and flash it...

Thanks


----------



## Jaffi

Seems impossible; it is already at the max of 1.2 V by default. The VRM is also listed as "unknown" in VBE7.


----------



## Stay Puft

Quote:


> Originally Posted by *Jaffi*
> 
> Seems impossible, alread is at max 1.2v as default. VRM is also stated as "unknown" in VBE7.


What card is that?


----------



## Jaffi

It is a Gigabyte 280X rev. 2.0.


----------



## Stay Puft

Quote:


> Originally Posted by *Jaffi*
> 
> It is a Gigabyte 280X rev. 2.0.


That sucks. Come on Gigabyte


----------



## neptunus

How about tuning it down?


----------



## taem

Quote:


> Originally Posted by *ignus1212*


So the Vapor-X is XTL. Or at least yours is. Mine arrives tonight.


----------



## phillyd

Anyone know why 280Xs are price-jacked and out of stock pretty much everywhere in the US?


----------



## Stay Puft

Quote:


> Originally Posted by *phillyd*
> 
> Anyone know why 280x's are price jacked and OOS like everywhere in the US?


Christmas time Phil


----------



## Jaffi

Also, Litecoin mining might be a factor. People are buying AMD cards like crazy for mining rigs.
Quote:


> Originally Posted by *neptunus*
> 
> How about tuning it down?


That works.


----------



## Stay Puft

Quote:


> Originally Posted by *Jaffi*
> 
> Also Litecoin mining might be a factor. People are buying amd cards like crazy for mining rigs.
> That works.










The only people that care about mining are kids who live with their parents and don't pay for electricity. The rest of us couldn't care less; it's not worth the cost.

Prices will come down after Xmas. It's funny that only the AMD cards are jacked up; Nvidia's are the same prices as before.


----------



## Jaffi

I know; however, I do suspect it is partially responsible for the skyrocketing prices and extremely bad supply. Look at Nvidia: you can easily get any card you want. Perhaps it also has to do with AMD supplying the next-gen consoles, but I doubt it.


----------



## tsm106

Quote:


> Originally Posted by *Jaffi*
> 
> I know, however I do suspect it being partially responsible for the skyrocketing prices and extremely bad supply. Look at Nvidia, you can easily get any card you want. Perhaps it also has to do with AMD's supply for next gen consoles, but I doubt it.


This happens every time there is a mining craze and there's a mining craze right now. There is a lot of Nv card stock because they are not desirable for mining, obviously.


----------



## jamor

Quote:


> Originally Posted by *Stay Puft*
> 
> Prices will come down after Xmas. Its funny to mention that only the amd cards are jacked up. Nvidias are the same prices as before


Nvidia raised prices too. Asus GTX 770 back up to $369 from $300ish.


----------



## Stay Puft

Quote:


> Originally Posted by *jamor*
> 
> Nvidia raised prices too. Asus GTX 770 back up to $369 from $300ish.


Nvidia and Amd didn't raise prices. Newegg is just gouging


----------



## jamor

Quote:


> Originally Posted by *Stay Puft*
> 
> Nvidia and Amd didn't raise prices. Newegg is just gouging


Right, but the point is that Nvidia's prices are going up too, not just AMD's.


----------



## Durvelle27

*List Updated*


----------



## VegetarianEater

*sigh* I was waiting to buy a second card until after Black Friday so I would have money to buy all the parts for my PC (I ended up spending more because of Black Friday sales and stuff), and now they're all sold out and being price gouged... I knew I should have bought two from Amazon when they were in stock instead of one... who knows when I'll be able to get another now (at a decent price, anyway).


----------



## Derpinheimer

Quote:


> Originally Posted by *Stay Puft*
> 
> 
> 
> 
> 
> 
> 
> 
> the only people that care about mining are kids who live with their parents and don't pay for electricity. The rest of us could care less. It's not worth the cost.
> 
> Prices will come down after Xmas. Its funny to mention that only the amd cards are jacked up. Nvidias are the same prices as before


Duh, because the AMD cards are good for mining; Nvidia's aren't.

You realize mining runs at about a 15:1 revenue-to-power-cost ratio?
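That ratio is easy to sanity-check with rough numbers. A quick sketch; the wattage and electricity rate below are illustrative assumptions, not measurements from any card in this thread:

```python
# Rough litecoin-mining economics for one card: electricity cost vs.
# revenue at the claimed 15:1 ratio. All inputs are assumed examples.
def daily_power_cost(watts: float, price_per_kwh: float) -> float:
    """Cost of running a card 24 h/day at the given wall power."""
    return watts / 1000 * 24 * price_per_kwh

cost = daily_power_cost(300, 0.12)   # assumed 300 W card, $0.12/kWh
revenue = cost * 15                  # the claimed 15:1 revenue/cost ratio
print(f"${cost:.2f} power -> ${revenue:.2f} revenue per day")
```

Under those assumptions a single card burns under a dollar of power a day, which is why the ratio, not the absolute dollar figure, is what drives the demand.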


----------



## Stay Puft

Quote:


> Originally Posted by *Derpinheimer*
> 
> Duh, because the AMD cards are good for mining; nvidia arent.
> 
> You realize mining works about $15:1 revenue/power cost?


"Yawn"

15 dollars is chump change, and mining is only for those who don't work for a living. Keep chasing those dreams, boys.


----------



## TempAccount007

Your opinion doesn't change the fact that no one can get their hands on AMD cards without paying stupid amounts of money.

A 280x DC2T I bought last month for $300 just sold on ebay for $500 (I haven't sold mine).

Most people here are gamers, and it sucks for those who just want to buy a card and play.

Another Edit:
Thinking about trading my 280x for a gtx 780.


----------



## taem

Ok so I just got my Vapor X 280x from Newegg. Here's the thing. This is clearly a return. Box seal was broken, the static bag wasn't sealed, the plastic covering over the shroud is all wrinkled, paper inserts bent, disc sleeve wrinkled, screw marks clearly visible on the tabs.

Is it Newegg's policy to sell returned items as new? I thought they were supposed to be sold as open box.

I have a call out to Newegg but it's an hour wait til callback. I suppose it's no biggie as long as the card works but I'm not thrilled about it. But if I return it who knows when I can get a new card, and at what price. I got this for the original $319, the card is $359 now. Can't decide what to do.


----------



## Cmoney

Quote:


> Originally Posted by *Cmoney*
> 
> So I picked up the XFX 280x DD Black Edition on Black Friday from Newegg and I just finally got it installed today. I have been monitoring the load temps while gaming and I am a little concerned. My idle temp is around 40C, while my load temps have reached as high as 93C during both BF4 and COD:Ghost.. and that is with a custom fan profile that puts the fan at 100% @ 60C. I am running the latest beta drivers , and my 650D is very well ventilated, with an ambient temp around 23C in the room.
> 
> From what I have been reading it seems like my temps should be lower. Are the load temps safe in the 90s? Are they possibly due to the fact that it is a factory overclocked Black Edition? Any help would be greatly appreciated.


I just wanted to report back that after a fresh install of windows, a cleaning of my case, and a little "burning in" time my temps have seemed to stabilize. I failed to mention before that I was running dual monitors, and that explains my high idle temps. My max temp doesn't seem to go much higher than 80C now with two monitors while gaming, and 70C when only running one. I am very happy with the performance of this card as well


----------



## VegetarianEater

Newegg seems to be doing this quite often lately. I got a PSU from them a month ago that was also clearly a return; all the seals were broken, the cables were used, and the wall plug was dirty... I immediately issued an RMA request and got it returned. It's a pain in the ass, but I'd rather get a new product (and I did get a new one a week or two later...).


----------



## jamor

Wow, I'd be so pissed off.

What else can you do besides exchange it and wait?


----------



## Cmoney

Quote:


> Originally Posted by *taem*
> 
> Ok so I just got my Vapor X 280x from Newegg. Here's the thing. This is clearly a return. Box seal was broken, the static bag wasn't sealed, the plastic covering over the shroud is all wrinkled, paper inserts bent, disc sleeve wrinkled, screw marks clearly visible on the tabs.
> 
> Is it Newegg's policy to sell returned items as new? I thought they were supposed to be sold as open box.
> 
> I have a call out to Newegg but it's an hour wait til callback. I suppose it's no biggie as long as the card works but I'm not thrilled about it. But if I return it who knows when I can get a new card, and at what price. I got this for the original $319, the card is $359 now. Can't decide what to do.


When you talk to them make sure you make a huge deal about it and ask if they can give it to you at an open box discount. Maybe they will be willing to work with you, who knows?!? I personally would be pretty angry if my new card came in obviously used!


----------



## taem

Quote:


> Originally Posted by *jamor*
> 
> wow id be so pissed off.
> 
> What else can you do besides exchange it and wait?


Yeah. I imagine Newegg would be like, sure son ship it back so we can sell it for $40 more. This is what I get for being one of those guys who's always sending stuff back to Newegg, I'm sure plenty of guys have received my returned goods, payback is a b*tch.


----------



## TempAccount007

Tell them you want a discount.

Then sell it and upgrade to a more powerful nvidia card for free.

Check ebay for how much your card is going for. You may not believe it.

If I were you, there would be no way I would return it... short of it simply not working.


----------



## jamor

Quote:


> Originally Posted by *TempAccount007*
> 
> Tell them you want a discount.
> 
> Then sell it and upgrade to a more powerful nvidia card for free.
> 
> Check ebay for how much your card is going for. You may not believe it.
> 
> If I were you, there would be no way I would return it... short of it simply not working.


hey not a bad idea.

Just put it up for $500 and see what happens. Go home with a 780.

But if they return it after the holidays, you're screwed.


----------



## JoeDirt

This Gigabyte 280X is a joke. Pull the cooler off and look at this 45° disaster. It's a damn mess under there. The heat pipes don't cover a good portion of the GPU and are not in direct contact; there is a second plate of copper between the GPU and the heat pipes. Hottest-running card I have ever had. I will not buy another AMD/Gigabyte product again if this is what they produce.


----------



## Modovich

Can the MSI 280X be undervolted? If yes, how?


----------



## VegetarianEater

Quote:


> Originally Posted by *taem*
> 
> Yeah. I imagine Newegg would be like, sure son ship it back so we can sell it for $40 more. This is what I get for being one of those guys who's always sending stuff back to Newegg, I'm sure plenty of guys have received my returned goods, payback is a b*tch.


Just RMA it and tell them it was opened. I did that a month ago for a PSU, like I said, and it doesn't cost you a penny.


----------



## GTR Mclaren

Quote:


> Originally Posted by *TempAccount007*
> 
> Your opinion doesn't change the fact that no one can get their hands on AMD cards without paying stupid amounts of money.
> 
> A 280x DC2T I bought last month for $300 just sold on ebay for $500 (I haven't sold mine).
> 
> Most people here are gamers, and it sucks for those who just want to buy a card and play.
> 
> Another Edit:
> Thinking about trading my 280x for a gtx 780.


I'm happy I could get a 270X at the "normal" price, $199.


----------



## taem

Quote:


> Originally Posted by *Cmoney*
> 
> When you talk to them make sure you make a huge deal about it and ask if they can give it to you at an open box discount. Maybe they will be willing to work with you, who knows?!? I personally would be pretty angry if my new card came in obviously used!


I sent an email to that effect, but the customer service guy I spoke with said that since it's an Iron Egg item my only options are refund or replacement. I would RMA it right now except who knows when I'd get a card. I have 55 days; I think I'll install it and see if it's a good performer.

Edit to add: I got the card at $319, it was $359 a day later, and now it's $379. Geez...


----------



## kpo6969

Quote:


> Originally Posted by *taem*
> 
> Ok so I just got my Vapor X 280x from Newegg. Here's the thing. This is clearly a return. Box seal was broken, the static bag wasn't sealed, the plastic covering over the shroud is all wrinkled, paper inserts bent, disc sleeve wrinkled, screw marks clearly visible on the tabs.
> 
> Is it Newegg's policy to sell returned items as new? I thought they were supposed to be sold as open box.
> 
> I have a call out to Newegg but it's an hour wait til callback. I suppose it's no biggie as long as the card works but I'm not thrilled about it. But if I return it who knows when I can get a new card, and at what price. I got this for the original $319, the card is $359 now. Can't decide what to do.


Quote:


> Originally Posted by *VegetarianEater*
> 
> newegg seems to be doing this quite often lately, i got a PSU from them a month ago that was also clearly a return, all the seals were broken and the cables were used, the plug to the wall was dirty... i immediately issued an RMA request, and got it returned. It's a pain in the ass, but i'd rather get a new product (and did get a new one a week or 2 later...)


Has anyone purchased an Asus 280X DC2T from Newegg recently? I have one arriving tomorrow or Wednesday, and judging by the recent reviews on Newegg there seem to be a lot of 1-egg ratings. Hopefully it's not a combination of quality-control issues and re-boxing returns to sell as new. It was hard to finally get hold of one, and this would suck big time.

Quote:


> Originally Posted by *taem*
> 
> I sent am email to that effect but the customer service guy I spoke with said since it's an Iron Egg item my only options are refund or replacement. I would RMA it right now except who knows when I'd get a card. I have 55 days, I think I'll install it and see if it's a good performer.
> 
> Edit to add, I got the card at $319, was $359 a day later, now it's $379. Geeze...


Same here, ordered last Wed for $309 w/ Iron Egg. 10 minutes after my order went through it was $319, now it's $379.


----------



## taem

Well, so I'm playing around with the used card Newegg sold me as new, the Vapor-X 280X. The reviews are bunk; this card is nowhere near as loud as they say. Idle is higher than I'd like (38-39C), but game load is low, 50-60C so far. Performance at stock is awesome: Tomb Raider with lowered anti-aliasing but TressFX on benches at 47 min / 57 avg @ 1440p. That's just stellar for a $320 card, IMHO. I think I'm going to return it, though; the fact that it's used just nags at me, lol. There are some oddities also: it's supposed to be 1050 MHz on BIOS 1 and 1070 MHz on BIOS 2, but it's 1070 MHz on both. I think maybe the former owner did some flashing, reset it to stock to RMA it, but flashed both BIOSes to the boost state? The boost button does nothing, though.

ASIC is 68.

The Vapor-X looks decent (though the XFX DD is the best-looking card I've ever seen) and I love the backplate, but the shroud is cheap, cheap plastic. I was holding it like a baby; it felt like if I squeezed a little it would crack. I don't get the decision to put a great backplate on a cheap shroud like that.
Quote:


> Originally Posted by *kpo6969*
> 
> Has anyone just purchased an Asus 280x DC2T from Newegg recently? I have one arriving tomorrow or Wed and by the recent reviews on Newegg there seems to be a lot of 1 eggs. Hopefully it's not a combination of quality control issues or re-boxing returns and selling as new. It was hard to finally get ahold of one and this would suck big time.
> Same here, ordered last Wed for $309 w/ Iron Egg. 10 minutes after my order went through it was $319, now it's $379.


The Asus DC2T is hands down the best 280X card, IMHO. I had one but had to return it due to sizing issues; oh, how I wish that card was 5 mm shorter. Rock-solid build, and completely silent. You picked the right card.


----------



## Warl0rdPT

Quote:


> Originally Posted by *Mackem*
> 
> Still got this random issue with my Asus 280X when all of the pixels randomly move down the screen by like 20 pixels for like a second then go back to normal. Not sure whether to try any different drivers or not. I'm currently using 13.11 Beta 9.4. Which drivers are the most stable?


Mine does that as well, but only in Firefox. I think it's the drivers.


----------



## Jaffi

Quote:


> Originally Posted by *JoeDirt*
> 
> This Gigabyte 280x is a joke. Pull the cooler off and look at this 45° disaster. It's a damn mess under it. Heat pipes don't a cover a good portion of the GPU and are not in direct contact. It has a second plate of copper between the GPU and the heat pipes. Hottest running card I have ever had. I will not buy another AMD/Gigabyte product again if this is what they produce.


What revision? My rev 2 stays pretty cool.


----------



## JoeDirt

Quote:


> Originally Posted by *Jaffi*
> 
> What revision? My rev 2 stays pretty cool.


Rev 2. I lowered the temps by a few degrees by cleaning out all of their crap compound and using better stuff.


----------



## Sgt Bilko

Quote:


> Originally Posted by *JoeDirt*
> 
> Rev 2. I lowered the temps by a few by cleaning out all of their crap compound and using better stuff.


So what were the max temps you were getting?

My Giga 7970 sits around 35C idle and maxes out around 65C (and that's with a 30C ambient temp).


----------



## Johny Boy

Ordered 3 Asus R9 280X DC IIs at the approximate price of $390 (launch price, and I know pricing sucks where I live), but somehow my order is now showing out of stock on all of them.

To hell with mining. Why can't the miners buy 290/290Xs and keep the 280X for us? Now I have to buy a 270X, and the MSI Gaming at that, at an inflated price.

So my query: is it worth buying the MSI with so many reported VRM heating issues and locked voltage?
Please reply fast, as it might also go out of stock soon.


----------



## uaedroid

Is the voltage too high for my Gigabyte card?


----------



## mAs81

How do my VRM temps on my MSI 280X look?

All this talk about overheating problems has me worried...








I'm pretty sure my card came with the newer BIOS..


Spoiler: Warning: Spoiler!


----------



## Sgt Bilko

Quote:


> Originally Posted by *mAs81*
> 
> How does my VRM temps on my msi 280X look?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> All this talk about overheating problems has me worried..
> 
> 
> 
> 
> 
> 
> 
> 
> I'm pretty sure my card came with the newer BIOS..
> 
> 
> Spoiler: Warning: Spoiler!


Try running a GPU-intensive program first; by the looks of it those are idle temps. If the VRMs get over 80C then I might start worrying, but I can't see that happening on anything bar Furmark (which is no good for anything except raising temps).


----------



## taem

OK, so I just ran Furmark on my "used new" Vapor-X 280X: 1920x1080 fullscreen, 15 minutes. Here's the result.

GPU-Z sensors, showing max values:

VRM 1 hit 110C; VRM 2 hit 70C. 110C is OK with Furmark, right? I still have to graph the gaming temps, which ought to be 30-40 degrees lower.


----------



## mAs81

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Try running a GPU intense program first, by the looks of it those are idle temps, and if the vrms get over 80c then i might start worrying, but i can't see that happening on anything bar Furmark (which is no good for anything cept raising temps).


So this is what it looks like after an hour of Far Cry 3 on Ultra settings:

It seems that everything is okay, right? (If you don't count the coil whine that my card has; not much, but there is some.)


----------



## Sgt Bilko

Quote:


> Originally Posted by *taem*
> 
> Ok so I just ran Furmark on my used new Vapor X 280x, 1920x1080 fullscreen 15 minutes. Here's the result
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> GpuZ sensors, showing max values
> 
> 
> 
> 
> 
> VRM 1 hit 110c. VRM 2 hit 70c. 110c is ok with Furmark right? Have to graph the gaming temps which ought to be 30-40 degrees lower.


That's fine for Furmark; IMO testing out temps is really the only thing it's good for. In gaming you would never see anything like that, so that's a pretty good result.








Quote:


> Originally Posted by *mAs81*
> 
> So this is what it looks like after an hour of Far Cry 3 on Ultra settings:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> It seems that everything is okay,right?(if you don't count the coil whine that my card has,not so much,but there is some
> 
> 
> 
> 
> 
> 
> 
> )


Yep, that is perfectly fine. As I said before, anything above 80C in gaming would have me a little worried, but 60C is great.


----------



## taem

What temps are folks getting on the 280X? I just ran the Metro LL bench on max settings for 5 runs; the core hit 70C and VRM 1 hit 89C. OK numbers, but I'm not thrilled, especially about the VRM. I know that's in the safe range, but it's the high end of acceptable.

The fan curve on this card maxes out at 42%, so it would appear my "used new" Newegg card has an edited BIOS... I know for a fact the stock Vapor-X has a 72% fan-curve cap. Defo gonna RMA this.


----------



## Miiksu

Yeah... people should start looking more closely at how card cooling is made, and at some reviews too. Look at the VRM area: HIS has a lot of surface area and good heat conduction, while MSI has a tiny piece of metal covering the VRM and bad heat conduction.

Good heat plate:

Bad heat plate:


----------



## neurotix

My 7970 Vapor-X died on me and I'm in the process of RMAing it to Sapphire. Luckily, I have a backup card, a brand new R9 270X that was in my brother's computer. He fried my 6870 by not switching the fan profile before gaming. It overheated and died. He saved up and gave me the money to get this 270X.

Since I run Eyefinity I need the R9 270X more than he does until I get my 7970 back.

Anyway, I've been playing with this card and it seems like it might be a golden chip. It does 1280 MHz @ 1.3 V and seems stable so far in gaming. I'll do more tests later and play games for a longer period of time. It passes Valley on Extreme HD at 1280 MHz. I'm just running the RAM at 1500 MHz (stock is 1450 MHz) because the card crashed with the RAM at 1600 MHz. It made it through Valley at 1550 MHz RAM, but I figure I don't gain much from another 100 MHz of memory overclock and it only adds instability. The bandwidth at 1500 MHz is 192 GB/s, and at 1550 MHz it's 198 GB/s... not much of a difference.
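Those bandwidth figures check out for the 270X's 256-bit GDDR5 bus (GDDR5 moves 4 transfers per memory clock). A quick verification:

```python
# Memory bandwidth for a GDDR5 card:
#   clock (MHz) x 4 transfers/clock x bus width in bytes.
# The R9 270X (Pitcairn) has a 256-bit memory bus.
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 256) -> float:
    """Peak memory bandwidth in GB/s."""
    return mem_clock_mhz * 4 * (bus_width_bits // 8) / 1000

print(gddr5_bandwidth_gbs(1500))  # 192.0 GB/s, as quoted above
print(gddr5_bandwidth_gbs(1550))  # 198.4 GB/s
```

So the 50 MHz memory bump buys about 6 GB/s, roughly 3%, which supports the decision above to skip it.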



The pixel fillrate is actually a bit higher than my 7970 overclocked; the texture fillrate is a lot less, though.
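That checks out: pixel fillrate is just core clock times ROP count, and both Pitcairn (270X) and Tahiti (7970) have 32 ROPs, so a heavily overclocked 270X can edge out a 7970 there. A quick sketch (the 7970 clock below is an assumed example, not a figure from this thread):

```python
# Pixel fillrate = core clock x number of ROPs.
ROPS = 32  # both the R9 270X (Pitcairn) and HD 7970 (Tahiti) have 32 ROPs

def pixel_fillrate_gps(core_clock_mhz: float, rops: int = ROPS) -> float:
    """Peak pixel fillrate in GPixels/s at the given core clock."""
    return core_clock_mhz / 1000 * rops

print(pixel_fillrate_gps(1280))  # ~40.96 GP/s for the 270X at 1280 MHz
print(pixel_fillrate_gps(1100))  # ~35.2 GP/s for a 7970 at an assumed 1100 MHz
```

The texture fillrate gap goes the other way because Tahiti has far more texture units than Pitcairn, and no core overclock makes up for the unit count.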

Wondering if I have a golden card here.


----------



## Miiksu

@neurotix

That is a very nice chip. I would say golden... more like diamond, lol. =) I wish my R9 280X would OC that high.


----------



## Sgt Bilko

Yeah, that's a very nice core clock there; I'd have to say golden at least. My Giga 7970 is volt-locked, and for me to hit that pixel fillrate I'd need 1290 MHz on the core... yeah, that 270X you've got there is something special.


----------



## psychok9

Quote:


> Originally Posted by *hoevito*
> 
> Here's some size comparison pics between a Matrix platinum and sapphire toxic if anyone's interested.
> 
> I have a few observations and thoughts to share:
> 
> The Sapphire toxic is MUCH quieter than people give it credit for!!! I just sent back an Asus DC2T just a few days ago, and I don't agree with all the reviews stating how quiet it was. It may have been quieter than a matrix platinum, but the matrix is noisy as hell anyway!!! The toxic by comparison I feel is far less noisy than the DC2T, and even at full load is relatively pleasant and nowhere near as bad as the matrix.
> 
> Also, the toxic is long and narrow compared to both of the Asus cards. I stacked the toxic on top of the matrix in the third shot as you can see, and the matrix is about a good 3/4 of an inch wider although the toxic seems to be nearly a full inch longer overall.
> 
> Now for the bad. The ASIC quality of this Toxic I just received was only 66%, and it doesn't clock very well either. The highest stable frequency I've gotten out of it so far is 1200/1650MHz; anything higher than that and it starts artifacting pretty badly. It also seems to be stuck at 1.256 volts, and even when raised higher in Afterburner and Trixx, it stays put. The ASIC of my Matrix, by the way, is 68%, so it's not much better, but it can at least clock up to 1220/1800MHz stable.


The best version is the *Asus TOP CU*... (not the Matrix version), with its special "quiet" fan.
I sent my Toxic back because the noise (and heat) is very disturbing under heavy use...
ASIC quality: 63.4%.


----------



## neurotix

I forgot about ASIC quality.

My R9 270X ASIC quality is 83.6%. Again, the Sapphire Vapor-X edition.

Also, I was able to complete Unigine Valley at 1300/1500MHz 1.3v and scored 38.4 fps and 1565 pts. However, the card isn't stable in games at 1300MHz, so I dropped it to 1280MHz.


----------



## kpo6969

My card arrived today:


----------



## benjamen50

What's the ASIC quality for? 69.8% on my Gigabyte AMD R9 270X.


----------



## omekone

Are the Sapphire R9 280X Vapor-X cards voltage unlocked? Found a few in stock locally, but they won't accept returns on PC hardware. Didn't want to tempt fate.


----------



## [CyGnus]

Quick run on the Asus Top R9 280X 1230/1800

http://www.3dmark.com/3dm11/7643361


----------



## rquinn19

Quote:


> Originally Posted by *[CyGnus]*
> 
> Quick run on the Asus Top R9 280X 1230/1800
> 
> http://www.3dmark.com/3dm11/7643361


What voltages on the clock/mem?


----------



## [CyGnus]

I use Asus GPU Tweak, so only core voltage there: 1.3v (under load it's 1.252v). MSI AB didn't allow changing the voltages.


----------



## DiceAir

I've been thinking of upgrading my CPU and motherboard to Ivy Bridge. I'm running an i5-2500K @ 4.8GHz now. Would I be wasting my money doing that? I was thinking of waiting for Mantle to see if that will solve some of the crap we have, like microstutter etc.


----------



## jamor

Quote:


> Originally Posted by *DiceAir*
> 
> I've been thinking of upgrading my CPU and motherboard to Ivy Bridge. I'm running an i5-2500K @ 4.8GHz now. Would I be wasting my money doing that? I was thinking of waiting for Mantle to see if that will solve some of the crap we have, like microstutter etc.


Upgrading from Sandy Bridge to Ivy Bridge would essentially be the same as flushing your money down the toilet.

Your processor isn't the issue here.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> I forgot about ASIC quality.
> 
> My R9 270X ASIC Quality is 83.6% Again, Sapphire Vapor-X edition.
> 
> Also, I was able to complete Unigine Valley at 1300/1500mhz 1.3v and scored 38.4 fps and 1565 pts. However, the card isn't stable in games at 1300mhz, so I dropped it to 1280mhz.


That is a sweet chip, and the card is unlocked. Mine for some reason was locked; I am hoping the new one will not be. I did manage 1230/1580 and know I could have gone much further on the core at 1.3v.

This is a Valley 1.0 run at 1265/1400 with my 7870. Not bad for a 7870 on Extreme HD.


Spoiler: Warning: Spoiler!


----------



## Archea47

They finally showed up from NewEgg










(Two R9 280x from Gigabyte)


----------



## Durvelle27

*List Updated*


----------



## jamor

Mine is in a truck touring America all week.


----------



## taem

Quote:


> Originally Posted by *omekone*
> 
> Are the Sapphire R9 280X Vapor-X cards voltage unlocked? Found a few in stock locally, but they won't accept returns on PC hardware. Didn't want to tempt fate.


Unlocked. I undervolted mine to 1.112v and now I can run at 38% max fan, which is inaudible; the core maxes at 60°C and the VRM maxes at 74°C. ASIC of 68.

I have a pretty quiet case; the fan can be heard over my Noctua fans at 39% and becomes obnoxious to me at 45%.


----------



## rquinn19

Quote:


> Originally Posted by *[CyGnus]*
> 
> I use Asus GPU Tweak, so only core voltage there: 1.3v (under load it's 1.252v). MSI AB didn't allow changing the voltages.


Thanks


----------



## Brian18741

So which 280x's are turning out to be the best overclockers guys? Looking for one that is voltage unlocked, overclocks nicely and I can put under water, preferably accepting an XSPC waterblock.


----------



## dspacek

Hello guys, I'm sorry if this was covered already, but I don't have much time to read the whole forum. I have one Gigabyte HD 7950 on water set to 1180/1700MHz. Yesterday I got a Gigabyte R9 280X and put it into CrossFire with it, but the performance is like an HD 7950 at stock clocks.
The HD 7950 shows 99% usage in Furmark and the R9 280X shows 4% usage. Does anyone know where the problem is?
Thanks a lot.


----------



## jamor

Quote:


> Originally Posted by *dspacek*
> 
> Hello guys, I'm sorry if this was covered already, but I don't have much time to read the whole forum. I have one Gigabyte HD 7950 on water set to 1180/1700MHz. Yesterday I got a Gigabyte R9 280X and put it into CrossFire with it, but the performance is like an HD 7950 at stock clocks.
> The HD 7950 shows 99% usage in Furmark and the R9 280X shows 4% usage. Does anyone know where the problem is?
> Thanks a lot.


No, but the first thing I'll tell you is to stop using Furmark and try a program that isn't known to fry cards.


----------



## dspacek

It's no problem with frying.
I've now installed the latest full version of the drivers.
Now they show as an HD 7950 and HD 7970, with 99% usage on each, but performance like only one card. CF is enabled. So if anyone wants a new R9 280X Gigabyte OC edition, I will let them have it for $320 or less. I'm from the Czech Republic.


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> That is a sweet chip and the card is unlocked. Mine for some reason was locked. I am hoping the new one will not be. I did manage 1230/1580 and know I could have gone much further on the core at 1.3v.
> 
> This is a Valley 1.0 run at 1265/1400 with my 7870. Not bad for a 7870 on extremeHD.
> 
> 
> Spoiler: Warning: Spoiler!




1280/1500MHz 1.3v

Best thing is the temps top out at 55°C with the Vapor-X cooler. I think they've improved the Vapor-X cooler, because the one on this 270X seems better than the one my 7970 came with. Heatpipes are thicker, fins are closer together, it's thinner, and it feels more solid.

I'm pretty sure this card could do 1300MHz stable with more than 1.3v, but to get clock and voltage control to work properly at all I have to use the Trixx 4.6.3 Beta. It also lacks memory voltage control. I can pass Valley at 1300MHz no problem and the card only warms up to 55°C. However, it crashes in games within a few seconds of gameplay. If I could just give it a notch over 1.3v it would probably be good.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> 
> 
> 1280/1500mhz 1.3v
> 
> Best thing is the temps top out at 55C with the Vapor-X cooler. I think they've improved the Vapor-X cooler because the one on this 270X seems better than the one my 7970 came with. Heatpipes are thicker, fins are closer together, it's thinner, and it feels more solid.
> 
> I'm pretty sure this card could do 1300mhz stable with more than 1.3v but to get clock and voltage control to work properly at all I have to use the Trixx 4.6.3 Beta. It also lacks memory voltage control. I can pass Valley at 1300mhz no problem and the card only warms up to 55C. However, it crashes in games within a few seconds of gameplay. If I could just give it a notch over 1.3v it would probably be good.


Funny thing is I get close to 1300 with my 7870 if I lower the memory to a bit under 1400. I noticed I get more out of the core on the R9 270x if I lower the memory too and since I don't get much of a performance boost from overclocking the memory it seems ok to me. I thought about trying the voltage unlocking tool but am a bit leery of going there. I don't understand why the Devil is locked. It doesn't make sense. If the Toxic wasn't yellow I would tell Newegg to just send me a Toxic instead.


----------



## neurotix

My system is red and black, I wish the Toxic were red and black.

At least the Vapor-X has a black pcb. It has blue on the shroud but that's upside down anyway. If I want a red and black card I'd have to get an ASUS and we all know how bad those are (well, the Matrix 7970s anyway).


----------



## [CyGnus]

Quote:


> Originally Posted by *neurotix*
> 
> My system is red and black, I wish the Toxic were red and black.
> 
> At least the Vapor-X has a black pcb. It has blue on the shroud but that's upside down anyway. If I want a red and black card I'd have to get an ASUS and we all know how bad those are (well, the Matrix 7970s anyway).


All brands have good and bad cards. For instance, I am very happy with my Asus Top R9 280X; it does 1200 core for 24/7 use, and I consider that good.
Go with the Asus Matrix Platinum; normally they are good.


----------



## jamor

Quote:


> Originally Posted by *[CyGnus]*
> 
> All brands have good and bad cards. For instance, I am very happy with my Asus Top R9 280X; it does 1200 core for 24/7 use, and I consider that good.
> Go with the Asus Matrix Platinum; normally they are good.


Mine comes Saturday, I'm licking my chops.


----------



## mAs81

OK, so I ran FurMark on my MSI 280X at 1920x1080, with 8x MSAA and the 1080p benchmark preset, for 15 minutes, and here are the results:

Could be worse I guess, but then again, they could be much better.


----------



## dspacek

Please, can someone help with that non-working CrossFire?
I flashed the BIOS on the R9 280X to an HD 7970, but the result is the same.
At idle it shows the GPU in the second slot working at 99%, and benchmark performance depends on which card is in the primary slot.
But the performance is that of only a single GPU.


----------



## JCH979

Finally received my 280X today from NCIX after ordering it 2 weeks ago (gotta love holiday season delays..).

 

All's well that ends well I suppose.


----------



## Archea47

Looking awesome, JCH979!

I finally put mine in today too ..


----------



## JCH979

Quote:


> Originally Posted by *Archea47*
> 
> Looking awesome, JCH979!
> 
> I finally put mine in today too ..
> 
> 
> Spoiler: Warning: Spoiler!


Right back at ya!







Though my rig's nothing special lol, just got some LEDs making everything look shiny.


----------



## DiceAir

Quote:


> Originally Posted by *jamor*
> 
> Upgrading from Sandy Bridge to Ivy Bridge would essentially be the same as flushing your money down the toilet.
> 
> Your processor isn't the issue here.


Ok lol, silly me. What I meant to say is Haswell.


----------



## diggiddi

Quote:


> Originally Posted by *dspacek*
> 
> It's no problem with frying.
> I've now installed the latest full version of the drivers.
> Now they show as an HD 7950 and HD 7970, with 99% usage on each, but performance like only one card. CF is enabled. So if anyone wants a new R9 280X Gigabyte OC edition, I will let them have it for $320 or less. I'm from the Czech Republic.


Post your rig specs; I'm sure someone will help you. But first, disable ULPS.


----------



## Lisjak

Hey guys, I just got my new Asus 280X DC2T today. It's really a beast of a card. The one thing I would like to ask you guys is why my GPU usage is not at 100% all the time? It dips down to 60% in BF4, and the fps don't stay above 60 when that happens. Anyone know what I could do?

Also some pics of the beast












Spoiler: Warning: Spoiler!


----------



## [CyGnus]

Lisjak, great card, I love mine







though the fan can get loud (I keep it at 45%)

Little update on my score: P12631, 280X @ 1235/1825

http://www.3dmark.com/3dm11/7648005


----------



## uaedroid

Any feedback from an XFX 280X user? What can you guys say about the XFX 280X?


----------



## jamor

Quote:


> Originally Posted by *uaedroid*
> 
> Any feedback from an XFX 280X user? What can you guys say about the XFX 280X?


Word is that it gets hot under load but is cool when idle.


----------



## jamor

Quote:


> Originally Posted by *DiceAir*
> 
> Ok lol, silly me. What I meant to say is Haswell.


Lol. Still won't help. Ur sandy is fine atm


----------



## Brian18741

Quote:


> Originally Posted by *Brian18741*
> 
> So which 280x's are turning out to be the best overclockers guys? Looking for one that is voltage unlocked, overclocks nicely and I can put under water, preferably accepting an XSPC waterblock.


Anyone?


----------



## Jaffi

Do you think it is a good idea to lower the fan RPM a bit on the Gigabyte 280X rev 2? I am asking because it cannot read the VRM temperature, and I don't know how hot the VRMs get. The GPU itself is at max 75°C with the fan @45-50%.
At stock it never gets hotter than 70°C, but the fans are at 65%, which is too loud for my ears.
How would I recognize the VRMs getting too hot? Would the card throttle down?


----------



## Devildog83

Quote:


> Originally Posted by *Jaffi*
> 
> Do you think it is a good idea to lower the fan RPM a bit on the Gigabyte 280X rev 2? I am asking because it cannot read the VRM temperature, and I don't know how hot the VRMs get. The GPU itself is at max 75°C with the fan @45-50%.
> At stock it never gets hotter than 70°C, but the fans are at 65%, which is too loud for my ears.
> How would I recognize the VRMs getting too hot? Would the card throttle down?


I think it's a bit strange that you can't read the VRM temps at all. What program are you using to read the vitals, just GPUZ?


----------



## taem

IIRC, every ASIC score reported on the 280X here and elsewhere is 60-70. Mine is 68 and does not overclock well; I can only get to 1140 at 1.2v. OTOH these are 850MHz chips, so that's probably a good overclock lol.
Quote:


> Originally Posted by *Jaffi*
> 
> Do you think it is a good idea to lower the fan RPM a bit on the Gigabyte 280X rev 2? I am asking because it cannot read the VRM temperature, and I don't know how hot the VRMs get. The GPU itself is at max 75°C with the fan @45-50%.
> At stock it never gets hotter than 70°C, but the fans are at 65%, which is too loud for my ears.
> How would I recognize the VRMs getting too hot? Would the card throttle down?


Does high fan RPM hurt VRM cooling? I've never heard that. On my Vapor-X 280X, higher fan RPM results in lower VRM temps.

Edit: nm, I forgot the Gigabyte is voltage locked. Have you called to make sure the card isn't supposed to have a VRM sensor? Maybe it's busted on yours.


----------



## Lisjak

What does ASIC quality tell you anyway? Does a higher number mean a better OC? Mine has an ASIC quality of 74.7%.


----------



## [CyGnus]

Mine is only 56.8%, and it does 1235/1825 @ 1.3v.


----------



## Derpinheimer

That's a very high voltage for that speed, which is the trend ASIC quality sets.

High ASIC = less voltage for a given clock speed. The opposite for low ASIC.

It DOES NOT imply overall max OC.


----------



## Archea47

Quote:


> Originally Posted by *JoeDirt*
> 
> This Gigabyte 280x is a joke. Pull the cooler off and look at this 45° disaster. It's a damn mess under it. Heat pipes don't cover a good portion of the GPU and are not in direct contact. It has a second plate of copper between the GPU and the heat pipes. Hottest running card I have ever had. I will not buy another AMD/Gigabyte product again if this is what they produce.


Hey JoeDirt, did you happen to take any pictures?

What's the problem with a copper heatplate?

When I installed my Rev2 Gigs last night, one of the cards was bugged out and I realized it had been running at 99% utilization sitting on the desktop (according to MSI Afterburner). Temperature on that card was 53°C and on the other card 40°C, though I keep an aggressive fan profile in MSI Afterburner, so at 53°C it was about 65% fan (nothing can be louder than my H100i in "performance" mode).


----------



## taem

Quote:


> Originally Posted by *Derpinheimer*
> 
> That's a very high voltage for that speed; which is the trend ASIC quality sets.
> 
> High ASIC = less voltage for given clockspeed. Opposite for low ASIC.
> 
> DOES NOT imply overall max OC.


Well, I'm a total novice at overclocking and that's what I hear. But what I find is that to raise clocks you need more voltage, which in turn raises temps. So in that sense ASIC sort of does imply a certain level of overclockability, because temps are always my limiter; I never get to the point where I can determine max clocks, because I shy away from the raised temps. I don't WC though.

Even if you WC or LN2 and can tolerate high voltages, high voltages wear out chips faster, with or without high temps. If you WC or LN2 you probably don't care, since you're going to upgrade before any effects become evident. But for air-cooled slow upgraders like me, I need to stress long-term usage. I've had cards where I needed to lower clocks to maintain stability two years into ownership.

In short, I want the highest ASIC possible. I'm so envious of the guys who get 110-rated cards; I always seem to end up in the 60s.

Here's a case in point: results for the Metro LL bench, 20 loops, same settings except for voltage:

At 1.200v:


At 1.1193v:


And I ran the 1.200v bench early morning when ambient was much lower. Even a tiny bit of voltage makes a big difference.
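That result lines up with the usual rule of thumb: dynamic power scales roughly with clock × voltage², so even a small voltage bump is noticeable. A quick sketch (the formula is the standard CMOS approximation, not something measured on these cards):

```python
def relative_dynamic_power(v_new, v_old, f_new=1.0, f_old=1.0):
    """Approximate dynamic-power ratio using P ~ f * V^2.

    Ignores static/leakage power, so treat the result as a rough
    estimate of how much extra heat a voltage bump produces."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# The two voltages from the Metro LL runs above, at the same clock:
ratio = relative_dynamic_power(1.200, 1.1193)
print(f"~{(ratio - 1) * 100:.0f}% more dynamic power at 1.200v")  # ~15%
```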


----------



## Derpinheimer

Not totally sure on longevity.

I think the reason people equate low ASIC -> watercooling and high ASIC -> air is that a lot of people simply feel safer putting larger amounts of voltage into a card when it's on water. So a low ASIC on water won't be limited; a low ASIC on air will be, psychologically.

I've had a very high ASIC card and found that it drew much more power and ran much hotter than a lower ASIC card of the same exact model at the same voltages. However, it took less voltage to be stable at a given clock. So the only real difference ended up being that my card took less voltage for the same net result. Power draw and temps were basically the same.

Maybe it's not always that way? I don't know. No group of people has ever teamed up to solve the ASIC confusion once and for all, as far as I know at least.

Of course, higher voltage always means more power draw on the same card. It's when comparing to other cards that it's not so clear.


----------



## JoeDirt

Quote:


> Originally Posted by *Archea47*
> 
> Hey JoeDirt, did you happen to take any pictures?
> 
> What's the problem with a copper heatplate?
> 
> When I installed my Rev2 Gigs last night, one of the cards was bugged out and I realized it had been running at 99% utilization sitting on the desktop (according to MSI Afterburner). Temperature on that card was 53°C and on the other card 40°C, though I keep an aggressive fan profile in MSI Afterburner, so at 53°C it was about 65% fan (nothing can be louder than my H100i in "performance" mode).


Sadly I didn't take any pictures. No problem with the pipes; it's how the GPU is at a 45° angle and not a 90° angle, so about 15-25% of the die is not in direct contact with the heat pipes. The thermal compound was thick, hard, and over-applied. New paste got it down to the low 70s instead of the high 70s to low 80s, and this is with fans at 100% (the noise does not bother me). I also mine Litecoin with this. To get to 700KH/s you have to underclock it to 1020MHz; anything over that and it will sit in the 400KH/s range. This seems to be a big issue with a lot of these Gigabyte 280Xs. Most others don't seem to have this problem. Maybe they just put an updated 7970 BIOS in them and said **** it? Maybe a BIOS update will be the solution, but as of this time none of the ones released by Gigabyte have done any good. I will say the construction quality is A+ and it looks cool as hell. Going to end up getting a 780 Ti to game and watercool/OC.


----------



## Jaffi

Why would the KH/s drop with higher clocks? I don't understand.


----------



## JoeDirt

Quote:


> Originally Posted by *Jaffi*
> 
> Why would the KH/s drop with higher clocks? I don't understand.


Nobody knows yet. It just does. It also only works with intensity set to 13; any other setting and the rate will sit around 20 KH/s. I'm going to see if I can flash another brand's BIOS on it and see if that helps.
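For anyone trying to reproduce those numbers, scrypt mining settings like the intensity and core clock discussed above are normally passed on the cgminer command line. A rough sketch of the sort of invocation meant here (the pool URL and worker credentials are placeholders, and the thread-concurrency value varies by card):

```shell
# Hypothetical cgminer invocation for scrypt (Litecoin) mining on a 280X.
# Pool URL and credentials are placeholders; -I 13 (intensity) and the
# 1020MHz core clock follow the numbers reported in this thread.
cgminer --scrypt \
    -o stratum+tcp://pool.example.com:3333 -u worker -p password \
    -I 13 --gpu-engine 1020 --thread-concurrency 8192
```

Treat this as a config fragment, not a tested recipe; the optimal intensity and thread-concurrency are card- and driver-specific.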


----------



## gh0stfac3killa

Hey everyone, I read someone asking for info from an XFX user. I ordered two XFX 280Xs and they should be here any time now; they've shipped from Newegg, so once I get them in I will post some information on the GPUs. In the meantime, my two Gigabyte 280Xs are on point. I keep reading all this craziness about the card and the cooler, but both of mine in my Trooper case throw down, and run super cool and quiet. These are the first Gigabyte cards I've used and so far I'm very happy with them, apart from not being able to control the voltage since they're voltage locked. Other than that they are solid.









Again, once the XFX cards get here I will post up some numbers and let you guys know how I feel about them.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *jamor*
> 
> Word is that it gets hot under load but is cool when idle.


Once I get mine in (any time now) I will post some info on them. My little brother finally got his and he's saying they run cool and quiet; he's not breaking past 55 degrees at full load. But I will verify this once I get mine in.


----------



## taem

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Hey everyone, I read someone asking for info from an XFX user. I ordered two XFX 280Xs and they should be here any time now; they've shipped from Newegg, so once I get them in I will post some information on the GPUs. In the meantime, my two Gigabyte 280Xs are on point. I keep reading all this craziness about the card and the cooler, but both of mine in my Trooper case throw down, and run super cool and quiet. These are the first Gigabyte cards I've used and so far I'm very happy with them, apart from not being able to control the voltage since they're voltage locked. Other than that they are solid.


Congrats on the XFX cards; IMHO those are the best-looking cards ever. I would have picked one, except it's too long for my setup. I'm not into "gamer" aesthetics, so I love the clean, minimalistic design of that card.

But about the Gigabytes: do you get VRM temps in any monitoring software? Because that guy above doesn't, and it sounds crazy to me that, in this day, an enthusiast board would lack VRM sensors.


----------



## jamor

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Once I get mine in (any time now) I will post some info on them. My little brother finally got his and he's saying they run cool and quiet; he's not breaking past 55 degrees at full load. But I will verify this once I get mine in.


I guess that is a cool card then


----------



## uaedroid

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Hey everyone, I read someone asking for info from an XFX user. I ordered two XFX 280Xs and they should be here any time now; they've shipped from Newegg, so once I get them in I will post some information on the GPUs. In the meantime, my two Gigabyte 280Xs are on point. I keep reading all this craziness about the card and the cooler, but both of mine in my Trooper case throw down, and run super cool and quiet. These are the first Gigabyte cards I've used and so far I'm very happy with them, apart from not being able to control the voltage since they're voltage locked. Other than that they are solid.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Again, once the XFX cards get here I will post up some numbers and let you guys know how I feel about them.


Congratz on your Gigabyte 280Xs, I also have one. I will wait for your review of the XFX 280X.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *jamor*
> 
> I guess that is a cool card then


I'll drag them through the trenches once mine get here!!


----------



## gh0stfac3killa

Quote:


> Originally Posted by *uaedroid*
> 
> Congratz on your Gigabyte 280Xs, I also have one. I will wait for your review of the XFX 280X.


Very nice. Pair that bad boy up with another and the combo screams. I did see the price on these GPUs went way up on Newegg; I bought mine for $299.99 a piece and now they're $399.99. Crazy how fast they jumped up in price. On the XFX cards I lucked out as well: bought them at $309.99 and now they're hitting $419.99. So far though I'm loving my Gigabytes. Once the XFXs get here I'm going to throw the Gigabytes into my little Cooler Master K380 case, since the XFX GPUs are a little too long for that case lol.


----------



## neurotix

Just did this...

http://hwbot.org/submission/2465134_

Pretty sure that's stock 7970 non-GHz performance. My 7970 GHz did about 9500 at 1050/1500.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *taem*
> 
> Congrats on the xfx cards, IMHO those are the best looking cards ever. I would have picked it except it's too long for my setup. I'm not into "gamer" aesthetics so I love the clean minimalistic design of that card.
> 
> But about the Gigabytes: do you get VRM temps in any monitoring software? Because that guy above doesn't, and it sounds crazy to me that, in this day, an enthusiast board would lack VRM sensors.


I'll check tonight when I get home and let you know. I thought I did see VRM temps in the Gigabyte overclocking and monitoring tool, but I could be wrong; been too busy getting my head smashed in in BF4, lol.. but I will most def give it a look and let you know ASAP...


----------



## uaedroid

@gh0stfac3killa, what is your hashrate on mining on your dual 280Xs?


----------



## gh0stfac3killa

I know the only real gripe I may have with my XFX GPUs will be that CrossFire finger setup they use on the cards. My lil bro already warned me about it, but again, once they are here I'll see how bad it really is. Not sure if this was one of those things where XFX thought these GPUs would CrossFire without a bridge like the 290X line of cards, found out at the last minute they wouldn't, and just threw some fingers on there. I've seen people online who have them already having to bend the heck out of their bridges to get them to CrossFire. But I will fight that battle once they are here....


----------



## gh0stfac3killa

Quote:


> Originally Posted by *uaedroid*
> 
> @gh0stfac3killa, what is your hashrate on mining on your dual 280Xs?


I don't mine, brother, strictly gaming. But how would I look that info up, as far as the rate you're looking for? I'm willing to take a look at it.


----------



## uaedroid

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> I don't mine, brother, strictly gaming. But how would I look that info up, as far as the rate you're looking for? I'm willing to take a look at it.


Here is a tutorial from our fellow OCN...http://www.overclock.net/t/1398250/official-tutorial-how-to-start-mining-litecoins


----------



## Archea47

Quote:


> Originally Posted by *taem*
> 
> But about the Gigabytes: do you get VRM temps in any monitoring software? Because that guy above doesn't, and it sounds crazy to me that, in this day, an enthusiast board would lack VRM sensors.


No VRM temps here on my Gigabyte R9 280X Rev2's either. At least it has a heatplate for the RAM and a finned cooler bolted to the larger set of VRMs. I would really like to be able to monitor them. I hope it's a BIOS flash away.

FWIW my ASIC scores are 69.0% and 64.9%.

*Can someone give me ideas on what in the world is happening here with my GPU utilization?*


This happened the first time I installed the cards (noticed it after I had updated to the new beta drivers from 12/03/2013 on AMD's website), and now has happened again. Last time I wasn't gaming, just updating drivers and looking around in CPU-Z etc. This is about 20 minutes after finishing playing BF4. I didn't notice any problems while playing

I have CrossFire turned off for applications that don't have profiles, since troubleshooting this problem last time.

Ideas?


----------



## gh0stfac3killa

Quote:


> Originally Posted by *uaedroid*
> 
> Here is a tutorial from our fellow OCN...http://www.overclock.net/t/1398250/official-tutorial-how-to-start-mining-litecoins


Right, I'll check it out once I'm off work and get back to you on that rate info. The sad thing is I would be willing to drag that big ol' Trooper in here to work if they would let me lol.. it's a little too big to be a LAN box.... ohhh but if I could I sooooo would....


----------



## uaedroid

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Right, I'll check it out once I'm off work and get back to you on that rate info. The sad thing is I would be willing to drag that big ol' Trooper in here to work if they would let me lol.. it's a little too big to be a LAN box.... ohhh but if I could I sooooo would....


That's great! Thanks bro!


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Archea47*
> 
> No VRM temps here on my Gigabyte R9 280X Rev2's either. At least it has a heatplate for the RAM and a finned cooler bolted to the larger set of VRMs. I would really like to be able to monitor them. I hope it's a BIOS flash away.
> 
> FWIW my ASIC scores are 69.0% and 64.9%.
> 
> *Can someone give me ideas on what in the world is happening here with my GPU utilization?*
> 
> 
> This happened the first time I installed the cards (noticed it after I had updated to the new beta drivers from 12/03/2013 on AMD's website), and now has happened again. Last time I wasn't gaming, just updating drivers and looking around in CPU-Z etc. This is about 20 minutes after finishing playing BF4. I didn't notice any problems while playing
> 
> I have Crossfire for applications that don't have profiles turned off since troubleshooting why I had this problem last time.
> 
> Ideas?


I had an issue like this with my old GPUs. What I did was uninstall the GPU monitoring program and reinstall it after I updated my drivers; for some reason the new driver didn't take to the monitoring program and it wasn't tracking. Could be the same thing; it solved my problem, especially after a clean driver update.


----------



## taem

Well, I suppose since the Gigabytes are voltage locked the VRMs will remain within tolerance, so you don't need the sensor; I guess that's Gigabyte's logic. Those things are rated for 120°C anyway, right? You're not exceeding that unless your core also hits a point where it will throttle and reduce the VRM load.
Quote:


> Originally Posted by *uaedroid*
> 
> Congratz on your Gigabyte 280Xs, I also have one. I will wait for your review of the XFX 280X.


79% fan speed??? On a 3x-fan Windforce cooler?? Your noise tolerance must be through the roof. I have my Vapor-X set to 38% lol.


----------



## benjamen50

Yep, I knew it: the Gigabyte 280X has the same sensors as the Gigabyte 270X, with no VRM temperature sensor.


----------



## uaedroid

Quote:


> Originally Posted by *taem*
> 
> Well I suppose since the Gigabytes are voltage locked the vrms will remain within tolerance ranges so you don't need the sensor, I guess that's Gigabytes logic. Those things are rated for 120c anyway right, you're not exceeding that unless your core also hits a point where it will throttle and reduce the vrm load.
> 79% fan speed??? On a 3x fan windforce cooler?? Your noise tolerance must be through the roof. I have my Vapor X set to 38% lol.


Actually, the Gigabyte 3X Fan Windforce at 79% is NOT that noisy. I am sensitive to sound but believe me, the 3X Windforce cooler is quieter than expected.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *uaedroid*
> 
> Actually, the Gigabyte 3X Fan Windforce at 79% is NOT that noisy. I am sensitive to sound but believe me, the 3X Windforce cooler is quieter than expected.


No doubt. With my Crossfire setup under load and the side panel off, I hear the case fans more than anything. About four hours into nonstop BF4 and the GPUs are super quiet.


----------



## taem

Quote:


> Originally Posted by *uaedroid*
> 
> Actually, the Gigabyte 3X Fan Windforce at 79% is NOT that noisy. I am sensitive to sound but believe me, the 3X Windforce cooler is quieter than expected.


http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-6.html

The Gigabyte is rated as the third-quietest 280X behind the DC2T and HIS, but it seems noisy to me based on this. The Vapor-X is loud as hell at high RPM, which is why I dialed mine down. I mean, check out the vid in that link; that would be like gaming on a factory floor. Stupid loud. Cooling on the Vapor-X is excellent though: at 38% mine doesn't go over 65°C. I posted my FurMark earlier, hit 70°C on the auto fan curve. That's stellar IMHO. No reason to run fans at high RPM with that kind of cooling. The DC2T caps at 42% fan speed, and that's the only card that gets consistently praised as quiet. But at load the Asus runs a lot hotter than the Vapor-X, though 75°C isn't a bad temp.

Anyway, 79% seems absolutely bonkers to me lol. I don't think I've ever run a card at 79% fan speed.


----------



## uaedroid

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> No doubt. With my Crossfire setup under load and the side panel off, I hear the case fans more than anything. About four hours into nonstop BF4 and the GPUs are super quiet.


Same experience, bro; I can hear the case fans more than the Windforce fans. I have the GPU fan set to auto since it's running 24/7. I didn't touch the GPU settings; I just fire it up to run the miner around the clock.


----------



## Jaffi

Quote:


> Originally Posted by *uaedroid*
> 
> Congratz on your Gigabyte 280Xs, I also have one. I will wait for your review of the XFX 280X.


Yeah, it looks exactly the same on the rev. 2.0 card, but with voltage @ 1.2 V. I turned down my fans and now the card sits at 75°C/45% most of the time, which is very quiet. Since the VRMs cannot be monitored, I just hope they will be fine, but I guess they will: my case has good overall airflow, and the low voltage should help.


----------



## delusion87

Man, the prices have gone up by like 30 euros, WTH? I've been watching the prices of the 280X and I can't believe it lol.

I'm hoping to get a 280X by Xmas; my Vapor-X 4890 can't handle BF4 and other new games.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *delusion87*
> 
> Man, the prices have gone up by like 30 euros WTH?
> I've been watching the prices of 280x and i can't believe it lol.
> 
> I'm hoping to get a 280x by Xmas.
> My Vapor X 4890 is not able to play BF 4 and other new games


Yeah, they're skyrocketing fast; every time I go on Newegg they're higher and higher. I read a report somewhere (can't remember where) saying the cost is going up because Bitcoin and Litecoin miners are buying them up three or four GPUs at a time, sometimes more, making them super scarce. Not sure how true that is, but the more I read around the forums the more I'm starting to think it is. I'm hoping they go back down so gamers can get their hands on some. I was lucky to get the ones I did at the prices they were at, and when I say lucky, I mean that as soon as I hit the pay button for my last two, the page refreshed and they were sold out.

Like I said before, the two Gigabyte ones were $299.99 apiece and the XFX ones I have coming in were $309.99. Now they're $399.99 and $419.99. It's crazy; they're getting back up into Nvidia price ranges.


----------



## gh0stfac3killa

Just an FYI, not sure if your heart is set on Sapphire, but the XFX GPUs are back in stock on Newegg, even though the price has gone way up on them...


----------



## Archea47

Quote:


> Originally Posted by *taem*
> 
> http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-6.html


For what it's worth, that review was done using the Rev 1 Gigabyte. The ones you buy now should be Rev 2, which uses the 780-style Windforce cooler that is supposed to be quieter.


----------



## Archea47

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> I had an issue like this with my old GPUs. What I did was uninstall the GPU monitoring program and reinstall it after updating my drivers; for some reason the new driver didn't take to the monitoring program, so it wasn't tracking. A clean driver update plus reinstalling the monitor solved my problem.


Thanks for the input

I reinstalled MSI Afterburner and will report back if I get the 99% utilization again

Two problems since reinstalling Afterburner:
1.) I no longer have visibility into GPU2's temperature (through AB)
2.) GPU2's fan speed is reported as 20%, which doesn't follow my fan profile (I've selected GPU2 in settings and it shows the same profile as GPU1, which is at 45%)
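For anyone wanting to verify this kind of mismatch systematically: if you export a hardware-monitor log to CSV (Afterburner and GPU-Z can both log to file), you can diff each GPU's reported fan speed against the fan curve. A minimal Python sketch; the column names and the example fan-curve points here are hypothetical, not taken from any particular tool:

```python
import csv
import io

# Hypothetical fan curve as (temp_C, fan_percent) points, linear in between.
PROFILE = [(40, 20), (60, 45), (80, 80)]

def expected_fan(temp):
    """Linear interpolation over the profile points, clamped at both ends."""
    if temp <= PROFILE[0][0]:
        return PROFILE[0][1]
    for (t0, f0), (t1, f1) in zip(PROFILE, PROFILE[1:]):
        if temp <= t1:
            return f0 + (f1 - f0) * (temp - t0) / (t1 - t0)
    return PROFILE[-1][1]

def check_log(text, tolerance=5):
    """Return (time, gpu, temp, fan) rows where a GPU's reported fan %
    strays from the profile by more than `tolerance` points."""
    bad = []
    for row in csv.DictReader(io.StringIO(text)):
        for gpu in ("gpu1", "gpu2"):
            temp = float(row[f"{gpu}_temp_c"])
            fan = float(row[f"{gpu}_fan_pct"])
            if abs(fan - expected_fan(temp)) > tolerance:
                bad.append((row["time"], gpu, temp, fan))
    return bad

# Inline sample log: GPU2 stuck at 20% despite a 60 C reading.
sample = """time,gpu1_temp_c,gpu1_fan_pct,gpu2_temp_c,gpu2_fan_pct
12:00,60,45,60,20
12:01,41,22,40,20
"""
print(check_log(sample))  # -> [('12:00', 'gpu2', 60.0, 20.0)]
```

If GPU2 shows up in the flagged rows while GPU1 never does, that points at the card/software not applying the profile rather than the profile itself being wrong.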


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Archea47*
> 
> Thanks for the input
> 
> I reinstalled MSI Afterburner and will report back if I get the 99% utilization again
> 
> Two problems since re-installing Afterburner:
> 1.) I no longer have visibility into GPU2 temperature (through AB)
> 2.) GPU2 fan speed is reported as 20%, which doesn't follow my fan profile (I've selected GPU2 in settings and it shows same profile as GPU1, which is at 45%)


Hmm, strange. Are you getting the same issue using the Gigabyte overclock and monitoring tool, or is it just MSI Afterburner? Might be a software compatibility issue with the new beta drivers. I'll download MSI Afterburner and see if mine do the same thing; I'll get back to you in a few...


----------



## Archea47

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Are you getting the same issue using the Gigabyte overclock and monitoring tool, or is it just MSI Afterburner? Might be a software compatibility issue with the new beta drivers


I haven't tried the Gigabyte tool. The software I have installed (fresh Win8 build) with visibility into the GPUs is MSI Afterburner (beta 17, the latest), CPU-Z (latest), GPU-Z (latest), and Catalyst Control Center. So nothing should be controlling the fans apart from MSI AB, I believe.


----------



## Devildog83

Quote:


> Originally Posted by *Archea47*
> 
> I haven't tried the gigabyte tool. The software I have installed (fresh Win8 build) that has visibility into the GPUs are MSI Afterburner (beta 17 - the latest), CPU-Z (latest), GPU-Z (latest) and Catalyst Control Center. So no one should be controlling the fans apart from MSI AB I believe


Make sure you have AMD OverDrive disabled in CCC; I have heard there can be a conflict.


----------



## delusion87

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Yeah, they're skyrocketing fast; every time I go on Newegg they're higher and higher. I read a report somewhere (can't remember where) saying the cost is going up because Bitcoin and Litecoin miners are buying them up three or four GPUs at a time, sometimes more, making them super scarce. Not sure how true that is, but the more I read around the forums the more I'm starting to think it is. I'm hoping they go back down so gamers can get their hands on some. I was lucky to get the ones I did at the prices they were at, and when I say lucky, I mean that as soon as I hit the pay button for my last two, the page refreshed and they were sold out.
> 
> Like I said before, the two Gigabyte ones were $299.99 apiece and the XFX ones I have coming in were $309.99. Now they're $399.99 and $419.99. It's crazy; they're getting back up into Nvidia price ranges.


Yeah, not sure which I'm gonna get though.
I was thinking Vapor-X again, or the Toxic (black/orange will look fugly on a blue mobo haha). I found the Vapor-X for 276 euros here including taxes & shipping, so it's very cheap.







But they are all out of stock.
Hopefully they'll arrive soon enough.


----------



## battleaxe

I'm outta here. Bought a 290 yesterday.


----------



## jamor

Quote:


> Originally Posted by *battleaxe*
> 
> I'm outta here. Bought a 290 yesterday.


Should have waited for non-reference =/


----------



## Archea47

Quote:


> Originally Posted by *Devildog83*
> 
> Make sure you have AMD overdrive disabled in CCC, I have heard there can be some conflict


Is it something that's enabled by default? I certainly haven't touched it. Unfortunately I won't be home till Sunday to check.

Also, this morning I tried setting the fan speed through MSI AB to something other than 20% on GPU2 and it wouldn't take. GPU1 does as it's told.

And FWIW, I haven't changed any clocks etc. on the GPUs, just the MSI AB fan profile.


----------



## battleaxe

Quote:


> Originally Posted by *jamor*
> 
> Should have waited for non-reference =/


Going under water. So I couldn't care less. Otherwise, you'd be right.


----------



## GTR Mclaren

Man my 270x is still in the states


----------



## jamor

Quote:


> Originally Posted by *GTR Mclaren*
> 
> Man my 270x is still in the states


You'd think it would fly directly from Cali to El Salvador...


----------



## Twirlz

I'm keen to get my hands on a 280X but am really stuck on what brand. I've been trying to decide for weeks.









At the moment the two I am most interested in are the Gigabyte 280X Rev 2.0 and the MSI 280X Gaming Edition OC. They are the same price and the only ones I can get hold of at the moment.

Which one would be better for noise, temperatures, warranty service, and reliability? I noticed the Gigabyte Rev 2.0 does not have any reviews on the internet.









Thanks!


----------



## taem

Quote:


> Originally Posted by *Twirlz*
> 
> I'm keen to get my hands on a 280X but am really stuck on what brand. I've been trying to decide for weeks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At the moment the two I am most interested in are a Gigabyte 280X R2.0 or a MSI 280X Gaming Edition OC. They are both the same price and the only ones I can get ahold of at the moment.
> 
> Which one would be better for noise, temperatures, warranty service and reliability? I noticed the Gigabyte Rev 2.0 does not have any reviews on the internet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!


The 280X is just a rebranded 7970, often using coolers from previous card models, so we don't really need reviews. Here's a decent roundup comparing seven cards, including the ones you mention: http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655.html

I've had the Asus DC2T and now a Vapor-X. I would say the DC2T is the best 280X card: built like a tank, silent, and performance the same as the rest, other than the Toxic, which is the fastest.


----------



## Twirlz

Quote:


> Originally Posted by *taem*
> 
> 280x is just rebranded 7970, often using coolers from previous card models, so we don't really need reviews. Here's a decent one comparing 7 cards, including the ones you mention. http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655.html
> 
> I've had the Asus DC2T and now a Vapor X. I would say the DC2T is the best 280x card. Built like a tank, silent, performance same as the rest other than Toxic which is the fastest.


Hi









Thanks for the link; however, the Gigabyte in that review uses the Rev 1.0 cooler. I'm not 100% sure what improvements Rev 2.0 brings, but upon Googling I can only find reviews for it on a 780.

I would happily lean towards the Asus; however, the place I was looking at won't get stock for a while.


----------



## [CyGnus]

What kind of scores are you guys getting with a 280x on Valley?


----------



## JoeDirt

Can anyone please upload their stock BIOS for the Sapphire R9 280X Toxic OC? I can't find it on techpowerup.com or the Sapphire site.


----------



## NinjaToast

Well, I'm having a bit of an issue with my Asus card: it likes to downclock the core to 970 MHz all by itself, and to be honest I think it's causing the artifacting problems I've been experiencing. It happens while playing games for about an hour (or a little over), the same thing happens when I watch a stream for about an hour, and while browsing it's random when the artifacting starts. To say the least, I'm stumped and a tad frustrated.

Thought I might gather opinions from you guys on what I should do. I could try flashing a newer Asus BIOS onto the card (from TechPowerUp) and see if it's the BIOS, or I could RMA the card when it gets back in stock on Newegg. I'd try older AMD beta drivers (I had 9.4 and am currently on 9.5, with this issue on both), but I cannot find them anywhere, so yeah.

Edit: I might add that once it downclocks itself to 970 MHz there is no changing clocks back to default; it's stuck there till I restart my PC.

*Edit 2: never mind, figured out the 970 MHz downclocking issue; hardware acceleration was my problem. 'Tis weird, but I'll see if this is the ultimate fix for my artifacting in browser and games.*


----------



## fastestlouigie

Quote:


> Originally Posted by *Twirlz*
> 
> Hi
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the link however, the Gigabyte in the review is using the Rev 1.0 cooler. I'm not 100% sure what improvements Rev 2.0 brings but upon Googling I can only find reviews for it on a 780.
> 
> I would happily lean towards the Asus however, the place I was looking at wont get stock for a while


http://www.overclockers.co.uk/showproduct.php?prodid=GX-122-GI&groupid=701&catid=56&subcat=1842 or
http://www.overclockers.co.uk/showproduct.php?prodid=GX-225-MS
£285 for an OC'd 280X with 5 of the latest games? I got an Asus R9 280X DCUII Top for £265 last weekend from Maplins, but it's out of stock now (plus it increased in price to £280 on pre-order!). However, I didn't get any free games, and this promotion adds a lot to the value of the deal. I probably would've waited had I known that I could get BF4 for free...


----------



## benjamen50

Did you know the graphics card downclocks itself when idling due to its power-saving features? It's perfectly normal. This issue can also be caused by the GPU not being supplied sufficient power.


----------



## NinjaToast

Quote:


> Originally Posted by *benjamen50*
> 
> Did you know the graphics card downclocks itself when idling due to the power saving features it has? Its perfectly normal, this issue can also be because of the GPU not being supplied sufficient power.


Yes, because downclocking the GPU to 970 MHz is "idling and power saving"... I've never encountered this issue with any past cards, though I've heard hardware acceleration has issues with cards sometimes. The downclocking issue (I always had my browser open while playing games) was solved by turning off hardware acceleration in my browser and in Flash, which, funny enough, a friend found out was the issue on the Battlefield forums he linked me to. Again, though, I'm not sure if it fixed my artifacting issue, as I don't have the patience to test any specific game for an hour or more at the moment. xD Also, I'm fairly certain the GPU is getting more than enough power.


----------



## jamor

My 280X arrived from Newegg ($295)! Rock-solid GPU. This is one heavy piece of electronics.

It dwarfs my Gigabyte 6850, which no longer has a home.

My case is rated for 11.2 inches (the size of this card) and they were right. Perfect fit; any longer and I would have been in trouble.

It's time to game.


----------



## NinjaToast

^Good lord the Asus 280x looks massive in comparison to your old card. xD


----------



## taem

Quote:


> Originally Posted by *[CyGnus]*
> 
> What kind of scores are you guys getting with a 280x on Valley?


I ran Valley at the same settings as the 270X guys who posted their scores, just because I was curious about the performance difference. Here is what I got:



Quote:


> Originally Posted by *benjamen50*
> 
> Did you know the graphics card downclocks itself when idling due to the power saving features it has? Its perfectly normal, this issue can also be because of the GPU not being supplied sufficient power.


ULPS takes the card to 300 MHz core and 150 MHz mem. 975 is something else.
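That distinction is easy to spot in a clock log: true power saving sits at the ULPS floor, not at some intermediate clock. A rough Python sketch of the idea; the "full 3D" threshold is an assumption standing in for whatever the specific card's rated boost clock is:

```python
# Rough classifier for logged 280X core clocks, using the states discussed
# here: ULPS idle (300 MHz), full 3D (rated boost, ~1000+ MHz on a 280X),
# and the suspicious intermediate state (e.g. the 970/975 MHz people report).
IDLE_MHZ = 300
FULL_3D_MHZ = 1000  # assumption: stock boost clock of the card in question

def classify(core_mhz):
    if core_mhz <= IDLE_MHZ:
        return "idle (ULPS/power saving)"
    if core_mhz >= FULL_3D_MHZ:
        return "full 3D"
    return "intermediate -- not a normal idle state, worth investigating"

for mhz in (300, 975, 1070):
    print(mhz, "->", classify(mhz))
```

A card logging long stretches in the intermediate bucket under load is throttling or stuck, not power saving.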


----------



## benjamen50

Looks like a defective graphics card then.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Devildog83*
> 
> Make sure you have AMD overdrive disabled in CCC, I have heard there can be some conflict


Good point, I forgot about CCC. Most definitely check on that, because that is the default controller for the GPUs. It's just strange that Afterburner is acting like this after a driver update; something else has to be controlling the GPUs. Hmmmm...


----------



## gh0stfac3killa

Quote:


> Originally Posted by *jamor*
> 
> 
> 
> 
> 
> 
> 
> 
> 280x arrived from newegg ($295)! Rock solid gpu. This is one heavy piece of electronic.
> 
> Dwarfs my Gigabyte 6850 who no longer has a home.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My case is rated for 11.2 inches (size of this card) and they were right. Perfect fit. Any longer and I would have been in trouble.
> 
> It's time to game.


Congrats on the new card; those Asus GPUs look awesome!!!


----------



## VegetarianEater

So I have a Sapphire Toxic 280X, and with the recent craze it's going to be impossible to get my hands on a reasonably priced 280X for a while. I wanted to Crossfire Toxics, but since that's no longer really a viable plan, should I just sell mine and buy a GTX 780 ($469 for the MSI one)? Also, the Toxic is a bit loud, despite being inside a Corsair 550D (which isn't as soundproofed as I had hoped...).


----------



## [CyGnus]

taem, thanks. I scored 2250, and I find it kind of low for 1200/1850.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *VegetarianEater*
> 
> So I have a Sapphire Toxic 280X, and with the recent craze it's going to be impossible to get my hands on a reasonably priced 280X for a while. I wanted to Crossfire Toxics, but since that's no longer really a viable plan, should I just sell mine and buy a GTX 780 ($469 for the MSI one)? Also, the Toxic is a bit loud, despite being inside a Corsair 550D (which isn't as soundproofed as I had hoped...).


This is truly one of those judgment calls about what you really want. The 780 is no doubt a great GPU, but the prices of these cards will fall again: once we get past the holidays and the mining craze, and more stock is available, they'll go back down to normal. On the other side, you have the upcoming Mantle, and hopefully soon DirectX 11.2. From everything I've read, Mantle will be a true game changer, and more and more companies are moving to it. So the real questions are: one, how impatient are you? And two, do you want to be future-proof for upcoming software? I would stick with the Toxic; that's a beast of a GPU. As far as it being loud: headphones, lol. Works for me. Once the prices drop you'll be able to pick up another Toxic, throw them in Crossfire, and destroy a 780 all day long. Last time I checked, SLI'ing two 780s still costs a pretty good mint, and those prices aren't coming down anytime soon, in true Nvidia fashion. So in my opinion I would wait, because the Toxic won't lose its value even if you have to sell it on eBay.

But if you really get to the point where you want to sell it, let me know; I have a coworker who is looking for that GPU and is waiting for the price to drop. PM me if that's what you want to do.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *Twirlz*
> 
> Hi
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the link however, the Gigabyte in the review is using the Rev 1.0 cooler. I'm not 100% sure what improvements Rev 2.0 brings but upon Googling I can only find reviews for it on a 780.
> 
> I would happily lean towards the Asus however, the place I was looking at wont get stock for a while


Hi there. The MSIs are throwing down very nicely; my buddy has two in Crossfire, and I'm running the Gigabyte 280Xs in Crossfire. Mine are the Rev 1, I guess you could say, but it's pretty much the same card, just a new shroud over the GPU and fans, still the triple-fan setup. So far mine are very quiet under load and don't break a sweat. Side by side between his MSIs and my Gigabytes there isn't much difference in heat or FPS, but my Windforces are quieter; as I said before, I hear my case fans more than my GPU fans even with the panel off. Performance-wise you can't miss with either one. It comes down to looks and your style: the Gigabytes aren't exactly eye-appealing, very bland and straightforward (GPU, fans, plastic, go), while the MSIs have more style and a more aggressive look, which I do like.

You can't go wrong with either GPU brand...


----------



## Jaffi

Quote:


> Originally Posted by *[CyGnus]*
> 
> taem thanks i score 2250 and i find it kind of low for 1200/1850


I think your score is very good. Did you change something in the driver or in the benchmark's settings? This is what I can pull off with my Gigabyte card @ 1130/1650: https://dl.dropboxusercontent.com/u/37488657/Unigine_Valley_Benchmark_1.0_20131214_1231.html


----------



## [CyGnus]

I just set the CCC to Performance and Surface Format Optimization to OFF, then in Asus GPU Tweak gave it 1.3 V and OC'd the hell out of the card.


----------



## Jaffi

Strange, I tried the same but am still missing 250 points at 1150/1650. What do you mean by CCC to Performance?







I think it might be the CPU then. I can't believe that 50 MHz on core and 200 MHz on RAM would make that big of a difference. Could you bench once with my clocks?







Then we know.


----------



## Majentrix

Who makes the best 270x?
Do the HAWK and Toxic cards use higher-binned GPUs than the rest?


----------



## taem

Quote:


> Originally Posted by *Jaffi*
> 
> Strange, I tried the same but still I am missing 250 points on 1150/1650. What do you mean with CCC to performance?
> 
> 
> 
> 
> 
> 
> 
> I think it might be the CPU then. Can't believe that 50 MHz on core and 200 MHz on Ram would make that big of a difference? Could you bench once with my clocks?
> 
> 
> 
> 
> 
> 
> 
> Then we know.


Your score is a bit lower than it should be; I ran mine at 1070/1550. Maybe CPU? I'm running a 4670K @ 4.6. Unigine benches seem to love memory overclocks though; just jam those high if you want to post impressive results.


----------



## [CyGnus]

Jaffi, in Catalyst Control Center (CCC), change Texture Quality from the default 'Standard' to Performance, and below that set Surface Format Optimization to OFF, then re-bench.


----------



## GoLDii3

Damn, can someone please help me mod my R9 280X Dual-X BIOS? I need to delete boost; it's annoying as hell when it downclocks. I already suffered this with a 7950, but thankfully Sapphire included a non-boost BIOS for that one.


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> Jaffi, in Catalyst Control Center (CCC), change Texture Quality from the default 'Standard' to Performance, and below that set Surface Format Optimization to OFF, then re-bench


I did this on my 7870 and it did help.


Spoiler: Warning: Spoiler!



Before change -



After change -


----------



## Jaffi

Quote:


> Originally Posted by *[CyGnus]*
> 
> Jaffi, in Catalyst Control Center (CCC), change Texture Quality from the default 'Standard' to Performance, and below that set Surface Format Optimization to OFF, then re-bench


That is what I got with 1170/1700: https://dl.dropboxusercontent.com/u/37488657/Unigine_Valley_Benchmark_1.0_20131214_1748.html
Looks better now hehe.

By the way, is it normal that my Gigabyte card starts to smell weird after some time of gaming/benching? I notice it when I re-enter my room. It smells like plastic; that's the best way I can describe it. I don't think it is a burned smell though.


----------



## Lubed Up Slug

Where can you even buy a 280x now?


----------



## Twirlz

Quote:


> Originally Posted by *Jaffi*
> 
> That is what I got with 1170/1700: https://dl.dropboxusercontent.com/u/37488657/Unigine_Valley_Benchmark_1.0_20131214_1748.html
> Looks better now hehe.
> 
> By the way, is it normal that my Gigabyte card starts to smell weird after some time of gaming/benching? I notice it when I re-enter my room. It does smell like plastic, thats the best how I could describe it. Don't think it is a burned smell though.


I think it's normal for electronics.







I noticed this with a few of my recent motherboards and the smell faded after a few days.


----------



## Cmoney

Quote:


> Originally Posted by *Lubed Up Slug*
> 
> Where can you even buy a 280x now?


Even if you can find one you will be paying upwards of $400 now.... no bueno :/


----------



## [CyGnus]

Glad I bought mine 4 days ago for 308€; now they are 349€, LOL. Lucky me.


----------



## GoLDii3

Quote:


> Originally Posted by *[CyGnus]*
> 
> Glad that i bought mine 4 days ago for 308€ now they are 349€ LOL lucky me


Meh, the Asus is way overpriced. I bought my Sapphire R9 280X at 270 EUR with BF4.


----------



## [CyGnus]

I had a choice between a Gigabyte + BF4 and an Asus Top for 8€ more, and I went with the Asus. Guess what, I don't regret it: I did 14311 marks with it @ 1235/1850.


----------



## Jaffi

My Gigabyte + BF4 was 240€. I could get an Asus for 275€ without BF4, and it is not in stock, so I think I will keep the Gigabyte. In the end I would have to pay about 60€ more if I bought BF4 anyway. That is too much IMHO, although I like the Asus card a lot too.

I am asking myself when Asus will ship with BF4. I haven't seen it anywhere yet.


----------



## [CyGnus]

Those prices are very good but here in Portugal everything is overpriced


----------



## JoeDirt

Quote:


> Originally Posted by *GoLDii3*
> 
> Damn please someone help me mod my R9 280X Dual-X bios. I need to delete boost. It's annoying as hell when it downclocks. Already suffered this from a 7950 but thankfully Sapphire did include a non-boost bios.


Send me your bios and I can do that for you.


----------



## JoeDirt

My Gigabyte Rev 2 is now $414.99 (USD). Glad I only paid $299 for it. I know the world works by supply and demand and all, but come on with this price jump. Bitcoin/Litecoin mining was the big reason for it, thanks to the media coverage and a bunch more people getting gold fever. We just have to wait a while for them to figure out they have no idea how to mine and end up looking to sell the cards. I'll admit I switched to AMD so I could mine at a better kH/s, but I also game and do a lot of other digital editing, so this was my best option. If the price keeps going up, though, I'll pull this pain in the ass out, sell it for a 780 Ti, and just pick up a couple of older used HD 7xxx cards for mining.


----------



## jamor

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> congrats on the new new, those asus gpus look awsome!!!


The honeymoon was extremely short: I got a card that artifacts on every available AMD driver.

Using my trusty 6850 now..


----------



## gh0stfac3killa

Quote:


> Originally Posted by *jamor*
> 
> The honeymoon was extremely short. Got an artifact card on every available AMD driver.
> 
> Using my trusty 6850 now..


Man, that sucks, brother; sorry to hear that. Not sure what is up with these cards; I'm reading about more and more people getting DOA cards. Are you going to RMA the card and get a new one?


----------



## taem

Quote:


> Originally Posted by *jamor*
> 
> The honeymoon was extremely short. Got an artifact card on every available AMD driver.
> 
> Using my trusty 6850 now..


At stock? The DC2T performed flawlessly for me at stock. My Vapor-X is also flawless at stock, but it's artifacts galore if I try to go over 1150 core or 1600 mem.
Quote:


> Originally Posted by *[CyGnus]*
> 
> Jaffi, in Catalyst Control Center (CCC), change Texture Quality from the default 'Standard' to Performance, and below that set Surface Format Optimization to OFF, then re-bench


This lowers my bench by a few frames and takes about 50 off the score. Reproducible at various clocks.


----------



## jamor

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Man, that sucks, brother; sorry to hear that. Not sure what is up with these cards; I'm reading about more and more people getting DOA cards. Are you going to RMA the card and get a new one?


I know, those DOA reviews are adding up. I'll RMA once and see what happens, as long as shipping is covered. Huge bummer; the two-week wait especially made it anticlimactic. But at the same time, I don't want to own a card known for issues.
Quote:


> Originally Posted by *taem*
> 
> At stock? DC2T performed flawlessly for me at stock. My Vapor X is also flawless at stock, but artifacts galore if I try to go over 1150 core or 1600 mem.
> This lowers my bench by a few frames and takes about 50 off the score. Reproducible at various clocks.


Stock. I never touched the clocks.


----------



## [CyGnus]

taem, something is wrong with your system, because those tweaks should give a few points more.
Are you on Win 8?


----------



## GoLDii3

Quote:


> Originally Posted by *JoeDirt*
> 
> Send me your bios and I can do that for you.


Done.

Does anyone know if this looks fine? http://www.3dmark.com/3dm11/7665610

12k GPU score with a 280X @ 1200/1500, 1.275 V.


----------



## taem

Quote:


> Originally Posted by *[CyGnus]*
> 
> taem well something is wrong with your system because those tweaks should give a few pts more.
> Are you on win 8?


I'm on 8.1, and yeah, I think something is wrong; I'm seeing weird things. Like, I open GPU-Z and the entry for BIOS version is a bunch of symbols. And overclocks return significantly lower scores on Valley: 47 fps at 1070, 35 fps at 1140. GPU-Z is not recording any throttling, and I can game at that clock just fine with improved performance.

I've been sniffing around, and the only oddity I've found so far is an entry named "program" in Startup. Trying to find the file's location or properties does nothing.

But I can't use CCC with TriXX anyway; does CCC work with Afterburner? And can non-beta Afterburner tweak voltage?


----------



## [CyGnus]

I would do a clean install; I think that OS is corrupted somewhere. Maybe use Win 7, it's a lot better. And yes, you can use MSI AB for voltage.


----------



## JoeDirt

I was able to flash the Sapphire 280X Dual-X BIOS onto my Gigabyte Rev 2 280X. The Sapphire BIOS allows for 1.3 V, while the Gigabyte is limited to 1.256 V max. I was able to flash with atiwinflash by changing the Vendor ID; it will reinstall the driver after reboot. I did not play around with it, I only wanted to see if it could be done. If anyone wants this to mess around with, let me know and I'll send the modded BIOS. Remember: if you screw it up, it's not my fault, and you can just flip the switch on the card to go to the other BIOS and be fine.
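Before flashing a cross-vendor ROM like this, it's worth sanity-checking the image file. A minimal Python sketch based on the standard PCI expansion ROM layout (0x55AA signature, image size in 512-byte units at offset 2, PCIR data structure pointer at offset 0x18); it builds a tiny synthetic image to demonstrate, not a real VBIOS, and real images (especially hybrid legacy+UEFI ones) carry more structure than this checks:

```python
import struct

def parse_rom(data: bytes):
    """Parse a legacy PCI expansion ROM header and verify its checksum.

    Returns (vendor_id, device_id, checksum_ok). All bytes of a legacy
    image must sum to 0 mod 256.
    """
    if data[0:2] != b"\x55\xaa":
        raise ValueError("not a PCI expansion ROM (missing 55AA signature)")
    size = data[2] * 512                      # image size in 512-byte blocks
    pcir = struct.unpack_from("<H", data, 0x18)[0]  # pointer to PCIR struct
    if data[pcir:pcir + 4] != b"PCIR":
        raise ValueError("PCIR structure not found")
    vendor_id, device_id = struct.unpack_from("<HH", data, pcir + 4)
    checksum_ok = sum(data[:size]) % 256 == 0
    return vendor_id, device_id, checksum_ok

# Build a tiny synthetic image to demonstrate (NOT a real VBIOS).
rom = bytearray(512)
rom[0:2] = b"\x55\xaa"
rom[2] = 1                                    # 1 x 512-byte block
struct.pack_into("<H", rom, 0x18, 0x40)       # PCIR structure at 0x40
rom[0x40:0x44] = b"PCIR"
struct.pack_into("<HH", rom, 0x44, 0x1002, 0x6798)  # AMD vendor, Tahiti XT
rom[-1] = (256 - sum(rom[:-1]) % 256) % 256   # fix up the checksum byte
print(parse_rom(bytes(rom)))                  # -> (4098, 26520, True)
```

Running this over a saved dump before and after an edit at least confirms the signature and checksum survived, and shows which vendor/device IDs the image claims; it doesn't prove the BIOS is safe for your card.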


----------



## benjamen50

What is trixx?


----------



## JoeDirt

Quote:


> Originally Posted by *benjamen50*
> 
> What is trixx?


A cereal that a certain rabbit really wants, but is always confiscated at the last second by kids who discriminate against age and species. It's also a program made by Sapphire to control your card. https://www.sapphireselectclub.com/ssc/TriXX/


----------



## benjamen50

Would this work on my gigabyte r9-270x with overclocking and voltage?


----------



## Sgt Bilko

Quote:


> Originally Posted by *benjamen50*
> 
> Would this work on my gigabyte r9-270x with overclocking and voltage?


AFAIK there isn't a program that will allow you to unlock voltage.

Your only course would be a BIOS flash if you really wanted to overclock it, but that obviously has some risks.


----------



## neptunus

Quote:


> Originally Posted by *JoeDirt*
> 
> I was able to flash the Sapphire 280x Dual-X BIOS onto my Gigabyte Rev2 280x. The Sapphire BIOS allows for 1.3v and the Gigabyte is limited to 1.256v max. Was able to flash with atiwinflash by changing the Vendor ID. It will reinstall driver after reboot. I did not play around with it, I only wanted to see if it could be done. If anyone would want this to mess around with then let me know and I'll send the modded BIOS. Remember, if you screw it up, not my fault and you can just flip the switch on the card to go to the other BIOS and be just fine.


As a Rev 2.0 owner screwed by Gigabyte, this sounds really interesting. Are you sure that it provides 1.3V and not 1.256V with a wrong reading? Another interesting thing: the stock BIOS on my Rev 2 was limited to 1.2V. How did you manage to get 1.256V on it?

*P.S* I read some pages before that you changed the thermal grease on your card. How bad is the stock one? Difference?


----------



## diggiddi

Quote:


> Originally Posted by *JoeDirt*
> 
> *A cereal that a certain rabbit really wants, but is always confiscated at the last second by kids who discriminate against age and species.* It's also a program made by Sapphire to control your card. https://www.sapphireselectclub.com/ssc/TriXX/


lol


----------



## TempAccount007

I have read threads that said Gigabyte 7970s could not be unlocked by flashing a different BIOS.


----------



## taem

Incidentally, earlier in this thread a couple of guys said I was delusional for wanting a 280X that ran in the 60s (temp) at gaming load.

Well, the Vapor-X stock peaks at 65c at 38% fan. VRMs peak at 75c/65c. Also inaudible at this fan speed. I think other guys in this thread are getting even better temps. This Vapor-X is an XTL btw.

This is a very nice card. My only gripes are the cheap plastic shroud and that it's too thick for many ITX cases. But I think I'm going to pass on the 290 and just pick up another one of these; 280X crossfire ought to be good for a while. Though it's disappointing how poorly the Metro series runs on the 280X: dialed down to high with minimal AA, Metro 2033 only manages like 43 fps and dips to the low 30s in intense scenes. Gorgeous game though, much more impressive visually than Crysis 3 IMHO. Also an awesome game, just loving this. Hard though.


----------



## [CyGnus]

Quote:


> Originally Posted by *neptunus*
> 
> *P.S* I read some pages before that you changed the thermal grease on your card. How bad is the stock one? Difference?


I also changed the thermal paste on my Asus; the stock application just had too much. Now I have my card at 1200 core, 45% fan, folding at 55°C.


----------



## JoeDirt

Quote:


> Originally Posted by *neptunus*
> 
> As a Rev 2.0 owner screwed by Gigabyte this sounds really interesting. Are you sure that it provides 1.3V and not 1.256V with wrong reading? Another interesting thing is that the stock BIOS on my REV 2 was limited to 1.2V, how did you manage to get 1.256V on it?
> 
> *P.S* I read some pages before that you changed the thermal grease on your card. How bad is the stock one? Difference?


I cannot be sure if it truly gets the 1.3v; I did not test that far on it. I just wanted to see which BIOS could be swapped. Maybe later today I'll flash it again and see if GPU-Z shows it working or not under load.
The stock BIOS was limited to 1.2v, but they have an update on their site, the F12 BIOS, that brought it up to 1.256v.
The stock TIM was very thick, over-applied, and had scoring/burn-like marks on it. I replaced it with some MX4. Go ahead and pull your cooler off and take a look. The design is total crap.
Something is seriously wrong with these Gigabyte cards on the hardware end. Everyone keeps having the heating issues and poor performance in benchmarks and mining. I feel ripped off by it, but I know this is something Gigabyte will fix in time. I paid for an OC card and found I had to underclock it to make it perform.

My current settings:


----------



## Voliminal_8

Today I need to get a new GPU; I've pretty much decided on getting the Asus R9 280X.

I've always had Nvidia, and I wanted to ask if the drivers are working properly with this card. Have any other problems occurred?


----------



## Jaffi

So I just sent back my Gigabyte 280X. It was a hard decision because it was so cheap, but yesterday I touched the backplate of the VRM cooler and it was soooo hot; I think the VRM cooling on this card is just insufficient. Combined with nonexistent VRM temp monitoring, I was hesitant to use it with my reduced fan speed any longer.
In my opinion, Gigabyte HAS to run the stock fan speed so high because they know that the tiny heatsink, which is not connected to the main cooler at all, would not sufficiently cool the VRMs.

I will try to get my hands on the Asus card now. It will cost me like 50 € more, but I think it is the only worthy 280X when you want a silent card. Also it has no voltage lock and has VRM monitoring. Let's see when I can get my hands on that thing!


----------



## Algy

Hi, I have a Gigabyte Rev 1 280X.
I was trying different BIOS versions to see if I could get a better OC.
My BIOS is an F31, which gives me this score (1100/1500 @ 1.2v stock settings):

It always ran at full speed and GPU load was 100/97%.

Here comes the best part: if I bench any other BIOS version, like F3, F12, or the modified F12 that JoeDirt made for me (ty mate), the bench runs worse than on my original BIOS.
The clocks ramp down to 1050/1500 at moments, and the GPU load spikes down to 60% in some cases.

I thought it would be a good idea to share these findings with you people.


----------



## [CyGnus]

The Asus is a great card here is a pic of mine the temps are great in full load (folding):


----------



## neptunus

Quote:


> Originally Posted by *JoeDirt*
> 
> I can not be sure if it truly gets the 1.3v, I did not test that far on it. Just wanted to see what BIOS could be swapped. Maybe later today I'll flash it again and see if GPU-Z shows it working or not under load.
> The stock BIOS was limited to 1.2v but they have an update on their site, the F12 BIOS, that brought it up to 1.256v.
> The stock TIM was very thick, over applied and had scoring/burn like marks on it. I replaced it with some MX4. Go ahead and pull you cooler off and take a look. The design is total crap.
> Something is seriously wrong with these Gigabyte cards on the hardware end. Everyone keeps having the heating issues and poor performance in benchmarks and mining bitcoins. I feel ripped off by it but know this is something Gigabyte will fix in time. I paid for an OC card and found I had to under clock it to make it perform.


Can you please send me the modified BIOS so that I can test it too? As for the new BIOS, I believe I tried to flash it but it was a no-go for some reason; I can't remember the exact error code I received in atiflash, but I came to the conclusion that it was designed for a different revision.

As for the TIM, I have some CF III that came with my SB-E; I believe that should do. What was the difference compared to stock in your case? I tried running Witcher 2 with ubersampling (or whatever it's called) and got around 80 degrees with the fan at around 45%... can't call it wonderful, but it's not the worst case. As for the settings, I am running [email protected] (see rig details). One thing I am 100% happy with is the memory overclock; to my surprise I had no trouble going from the stock 6GHz to 7GHz, and that's with Elpida chips. I wasn't able to pass 3DMark with anything above 1180 core clock, which is quite disappointing. ASIC is 64.2%.

Quote:


> Originally Posted by *Jaffi*
> 
> vrm


Those cards can run --scrypt 24/7 with constant 100+ degrees on VRM with no troubles, you should be fine. But anyway, I believe it was a good decision.


----------



## Warl0rdPT

Quote:


> Originally Posted by *Voliminal_8*
> 
> Today I need to get a new gpu, pretty much decided getting Asus r9 280x.
> 
> Always had nvidia and I wanted to ask if the drivers are working properly with this card. Any other problems occured?


I have minor issues while browsing in Firefox: the screen gets corrupted for a fraction of a second every few minutes, then quickly returns to normal. No problems anywhere else.


----------



## TempAccount007

I have a DC2T 280x and use Firefox as my default browser. I haven't had that issue.

Maybe try using Afterburner and disabling PowerPlay support to prevent it from downclocking... see if that fixes your problem.


----------



## JoeDirt

Quote:


> Originally Posted by *neptunus*
> 
> Can you please send me modified BIOS so that I can test it too? As for the new BIOS I believe I tried to flash it but it was a no go, for some reason, can't remember the exact error code I received in atiflash but I came to conclusion that it was designed for different revision.
> 
> As for the TIM, I have some CF III that came with my SB-E, I believe that should do. What was the difference compared to stock in your case? I tried running witcher 2 with uber-sampling (or whatever its called) and got around 80 degrees with fan at around 45%.... can't call it wonderful, but its not the worst case. As for the settings I am running [email protected] (see rig details), one thing that I am 100% happy with - is memory overclock, to my surprise I had no troubles going from stock 6Ghz to 7Ghz and thats with Elpida chips. I wasn't able to pass 3Dmark with anything above 1180 core clock which is quite disappointing. ASIC is 64.2%
> Those cards can run --scrypt 24/7 with constant 100+ degrees on VRM with no troubles, you should be fine. But anyway, I believe it was a good decision.


Sent. Hope it works out for you. Going to play with it some more myself.


----------



## Jaffi

If there were a good supply of 290(X) custom cards, oh boy. I can get a reference 290 for 330 € with BF4. It feels a bit stupid to pay 275 € for an Asus 280X without BF4 then.


----------



## taem

Quote:


> Originally Posted by *[CyGnus]*
> 
> The Asus is a great card here is a pic of mine the temps are great in full load (folding):


Those are stupid temps. Just, ridiculous. Does folding generate less heat than gaming? I had a DC2T, idled at those temps. You should overclock that b*stard to Denver.


----------



## Devildog83

I finally received my Kill-A-Watt. My system draws 90w to 100w at idle and 330w to 350w running Valley 1.0, overclocked to 1250/1400 on the 7870 and 4.7 GHz on the 8350. I am going to assume that when I get the R9 270x back from RMA it will add 200w or so to the load, so my 660w Platinum PSU should be fine. Maybe a max draw of 550w loading everything up. We will see when it gets back to me, but it's good to know for sure I don't need a new PSU.

Edit: Newegg is out of my 270x Devil so I am not sure what will happen from here. Might get a Toxic (yellow - yuck) or a Hawk if they can be found.
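Devildog's headroom estimate works out like this — the numbers are his wall readings plus his own ~200w guess for the 270X, so treat them as rough:

```python
# Rough PSU headroom math from Devildog83's Kill-A-Watt readings.
idle_w = 100          # worst-case idle draw measured at the wall
valley_load_w = 350   # peak draw running Valley (7870 @ 1250/1400, 8350 @ 4.7 GHz)
extra_270x_w = 200    # his rough estimate for adding the R9 270X back

est_peak_w = valley_load_w + extra_270x_w   # estimated max system draw
psu_headroom_w = 660 - est_peak_w           # vs. the 660w PSU rating

print(est_peak_w, psu_headroom_w)
```

Strictly speaking, the Kill-A-Watt measures AC draw at the wall, which includes PSU efficiency losses, so the DC-side load the PSU actually delivers is lower than the reading — meaning this estimate is conservative and the real headroom is a bit larger.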


----------



## neurotix

Devildog, the Sapphire Vapor-X 270X is excellent. The cooler is great and with 1.3v and 1280mhz I don't pass 55C in any games, including Crysis 3.

Folding at those clocks it doesn't even pass 50C.


----------



## [CyGnus]

taem, I did 1235/1850 for P14311: http://www.3dmark.com/3dm11/7656094


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Devildog, the Sapphire Vapor-X 270X is excellent. The cooler is great and with 1.3v and 1280mhz I don't pass 55C in any games, including Crysis 3.
> 
> Folding at those clocks it doesn't even pass 50C.


Cool, I would have some $ left over for fans unless the prices went up.


----------



## neptunus

Well the BIOS flash didn't go well for me, when I tried to run a 3DMark with "Sapphire 1.3V" it just crashed, then I flashed back my old BIOS and was getting artifacts in the 3DMark at stock speeds... flashed it back to original, then to modified again - it worked. So I'm back with my [email protected]


----------



## madorax

Just noticed Win 8.1 gives a lower CPU physics score compared to Win 7; even though the graphics score is better, the final score is still lower than on Win 7.









http://www.3dmark.com/3dm11/7672148

compare to someone with the same spec but with win 7

http://www.3dmark.com/3dm11/7589807

He/she even only used the standard boost to 3.4GHz, compared with mine at 3.6GHz...


----------



## Archea47

Quote:


> Originally Posted by *Archea47*
> 
> Thanks for the input
> 
> I reinstalled MSI Afterburner and will report back if I get the 99% utilization again
> 
> Two problems since re-installing Afterburner:
> 1.) I no longer have visibility into GPU2 temperature (through AB)
> 2.) GPU2 fan speed is reported as 20%, which doesn't follow my fan profile (I've selected GPU2 in settings and it shows same profile as GPU1, which is at 45%)


Hey guys & gals,

Just wanted to report back - I seem to have fixed the problem

I read elsewhere on the net about this problem occurring way back to the 6xxx line in crossfire

The fix was to search regedit for ULPS and change the entries (EnableUlps and EnableUlpsNa) from 1 to 0

Not only is it not down-fanning my GPU2, but I also got GPU2 temperatures back

We'll see if it sticks. It's nice to see my GPU2 temperatures again and have my fan profile back
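The registry fix described above can be captured in a .reg file. This is only a sketch: the display-adapter class GUID below is the standard Windows one, but the instance subkey (`0000`, `0001`, ...) varies per system and per GPU, so search regedit for `EnableUlps` to confirm the exact key on your machine before importing anything.

```reg
Windows Registry Editor Version 5.00

; Illustrative only -- replace \0000 with the instance key for your card.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
"EnableUlpsNa"=dword:00000000
```

Reboot (or reload the display driver) afterwards so the change takes effect.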


----------



## Frogeye

Hey all, I recently traded in my two 5850s for an ASUS R9 280X. Sweet upgrade!
At 1.2 volts it will do 1100/6450. Even with the fancy cooler, temps get up there.

So.....

My goal is to add a closed-loop H2O cooler along with heatsinks. I've seen impressive results bringing load temps down by up to 40°C!

Has anyone done this yet in here? I could use any extra info and advice.


----------



## JoeDirt

Quote:


> Originally Posted by *neptunus*
> 
> Well the BIOS flash didn't go well for me, when I tried to run a 3DMark with "Sapphire 1.3V" it just crashed, then I flashed back my old BIOS and was getting artifacts in the 3DMark at stock speeds... flashed it back to original, then to modified again - it worked. So I'm back with my [email protected]


Welp, now we know it didn't work right. Working on something else now.


----------



## Buxty

Hey guys!

Thinking of replacing my outgoing Tahiti LE card with either a GTX 760 or a 270X. If I were to go the 270X route, would there be any you guys would recommend? I'm not too bothered about Battlefield 4, but free stuff is a bonus, and I would want to overclock the card.

I was looking at the MSI HAWK as it's around £170 right now in the UK and it sounds like a solid overclocker with a good cooler.

Any personal experiences or opinions I should know of?


----------



## [CyGnus]

neptunus, 1100 with 1.34v is too much; mine does 1235 with 1.256v. Something is not right there...

Buxty, the 270X is a 7870 GHz Edition and your Tahiti LE is a 7870 XT, which is better. If you want to change cards for whatever reason, go with a 280X or upwards.


----------



## Buxty

Quote:


> Originally Posted by *[CyGnus]*
> 
> neptunus 1100 with 1.34v is too much mine does 1235 with 1.256v something is not right there...
> 
> Buxty the 270X is a 7870GHz so your tahiti LE is a 7870XT and its better if you want to change card for what ever reason go with a 280X or upwards


Yeah, I was aware; I haven't got the funds right now for a 280X. The reason for losing the XT was that it was noisy, hot (like hitting 90°C), and had started coil whining too. So yeah, it isn't staying.


----------



## [CyGnus]

Good enough reason. I ditched my 760 SLI for similar reasons, and the power they used was too much.


----------



## fatmario

Hey guys, I recently bought an MSI R9 280X GAMING 3G and installed it. Check the GPU-Z screen: it says the default clock is 1020MHz, but my graphics card is supposed to have a 1000MHz GPU clock and a 1050MHz boost clock. Any idea why it's stuck on 1020MHz?


----------



## gnemelf

Now another possibly stupid question here... when I am looking at GPU-Z, my VDDC is at 1.201v on the desktop, but when I start to play any games or benchmark it drops to 1.14v-1.16v. Is that normal or have I done something wrong? I have a PowerColor 280X at 1150/1600... any thoughts would help... I feel like I'm running in circles.


----------



## taem

Quote:


> Originally Posted by *gnemelf*
> 
> now another possibly stupid question here.... when I am looking at gpuz my vddc is @ 1.201v on desktop but when I start to play any games or benchmark it drops to 1.14v-1.16v is that normal or have I done something wrong... I have a powercolor 280x 1150/1600... any thoughts would help.... I feel like I'm running in circles..


It's normal; it's like adaptive voltage. The voltage will fluctuate according to how much the card needs. The 1.2v you see is the baseline set in the BIOS.


----------



## gnemelf

Okies, thanks for clearing that up... that's kinda what I was thinking.


----------



## -Droid-

Guys, tomorrow I get my Asus 280X, my first AMD card here. Which is the best stable driver I should get?


----------



## [CyGnus]

Install the latest one, no issues at all.


----------



## JoeDirt

Quote:


> Originally Posted by *fatmario*
> 
> Hey guys I recently bought MSI R9 280X GAMING 3G Radeon R9 280X 3GB and installed check the gpuz screen it says default clock 1020mhz but my graphic card supposed to have gpu clock 1000mhz and boost clock 1050mhz. any idea why is it stuck on 1020mhz?


Put it under load and see what it boosts to.


----------



## JoeDirt

Here it is: the Gigabyte 280x Rev2 F60 BIOS unlocked to 1.300v.
Test at will.

Gigabyte-280x-REV2-F60-13v.zip 104k .zip file


----------



## JoeDirt

My new stable voltage for LTC mining on my Gigabyte 280x Rev2 is 1.025v at 1020mhz


----------



## fatmario

It doesn't go above 1020MHz even when playing games like BF4 or running Kombustor.


----------



## taem

Quote:


> Originally Posted by *JoeDirt*
> 
> My new stable voltage for LTC mining on my Gigabyte 280x Rev2 is 1.025v at 1020mhz


Are you forcing constant voltage?


----------



## JoeDirt

Quote:


> Originally Posted by *fatmario*
> 
> It doesn't go above 1020MHz even when playing games like BF4 or running Kombustor.


Looks like you're in gaming mode and not OC mode. There should be a little switch (VERY SMALL!) next to your CrossFire bridge. Move that to the other position and try again. You can do this without powering off.


----------



## JoeDirt

Quote:


> Are you forcing constant voltage?


Nope. When not under load it will idle back down to 0.850v at 300mhz.


----------



## JoeDirt

Quote:


> Originally Posted by *fatmario*
> 
> It doesn't go above 1020MHz even when playing games like BF4 or running Kombustor.


http://us.msi.com/service/download/utility-5375.html

The MSI update utility to make sure your BIOS is up to date as well.


----------



## taem

Quote:


> Originally Posted by *JoeDirt*
> 
> Nope. When not under load it will idle back down to 0.850v at 300mhz.


It's odd that you'd hit your set voltage on the money like that. Usually a GPU hovers at 99.1-99.9% usage at full load and the voltage fluctuates a bit.


----------



## JoeDirt

Quote:


> Originally Posted by *taem*
> 
> It's odd that you'd hit your set voltage on the money like that. Usually a gpu hovers at 99.1-99.9 usage at full load and voltage fluctuates a bit.


I don't know for sure, but I think once the GPU hits a certain % of load it will go to the next performance state and stay there until the load lifts. If your different load % states are too close, I could see them jumping the voltage around to hold the core speed.
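That description of discrete performance states can be sketched as a toy lookup. To be clear, the states and thresholds below are invented for illustration (only the 300MHz/0.850v idle state echoes numbers mentioned in this thread) — this is not AMD's actual PowerPlay logic:

```python
# Hypothetical (core MHz, voltage) performance states, loosely shaped like
# the DPM table a GCN BIOS exposes -- the values are made up for illustration.
STATES = [(300, 0.850), (501, 1.000), (1020, 1.025)]

def pick_state(load_pct):
    """Jump to a higher state once load crosses a threshold and stay
    there until the load lifts, as described above."""
    if load_pct < 10:
        return STATES[0]   # idle: low clock, low voltage
    if load_pct < 60:
        return STATES[1]   # light load: intermediate state
    return STATES[2]       # full load: top state, steady voltage
```

In this simple model a steady 100% load pins the card at one state, which would explain seeing a single unwavering voltage in GPU-Z rather than the fluctuation taem describes.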


----------



## taem

Quote:


> Originally Posted by *JoeDirt*
> 
> I don't know for sure or not but I think once the GPU hits a certain % of load it will go to the next state of performance and stay there until the load lifts. If your different load % states are to close I could see that they would jump the voltage around to make the core speed.


Well here's the Vapor X power step settings in bios:



But these steps will fluctuate as you adjust clocks and voltage etc. And there's variance from card to card. For example my ULPS voltage isn't 0.850, it's 0.848. Are you running stock?


----------



## fatmario

Quote:


> Originally Posted by *JoeDirt*
> 
> Looks like your in gaming mode and not OC mode. Should be a little switch (VERY SMALL!) next to your crossfire bridge. Move that to the other position and try again. You can do this without powering off.


I pressed the BIOS switch next to the CrossFire bridge; still the same issue, stuck at 1020MHz, and my BIOS is already updated to the latest one..


----------



## madorax

Quote:


> Originally Posted by *fatmario*
> 
> I pressed the BIOS switch next to the CrossFire bridge; still the same issue, stuck at 1020MHz, and my BIOS is already updated to the latest one..


You need to update the vBIOS officially from MSI itself, from this link: http://www.techpowerup.com/vgabios/146769/msi-r9280x-3072-131009.html

Flash it with atiwinflash.

Otherwise you could go to the MSI forum and they will give you the same vBIOS. I have the MSI 280X Gaming and I have already tried both the TechPowerUp vBIOS and the one from the MSI forum; they're the same.

Anyway, mind sharing what your ASIC and default voltage are? Mine is 1.156 and ASIC 70.7.


----------



## neptunus

Quote:


> Originally Posted by *JoeDirt*
> 
> Here it is, the Gigabyte 280x Rev F60 BIOS unlocked to 1.300v.


How did you do this?

Also, I checked F12 BIOS and it is for REV 1.0 card.


----------



## gnemelf

Now, is 75-80°C on both my 280X VRMs while running FurMark bad? They idle around 45°C. I'm thinking I may need better thermal tape than stock if those temps are too high.


----------



## JoeDirt

Quote:


> Originally Posted by *taem*
> 
> Well here's the Vapor X power step settings in bios:
> 
> 
> 
> But these steps will fluctuate as you adjust clocks and voltage etc. And there's variance from card to card. For example my ULPS voltage isn't 0.850, it's 0.848. Are you running stock?


I am running stock ULPS. I also don't watch GPU-Z all the time, so if it does fluctuate, I have yet to see it. Now, some numbers it does not like, and it will go up or down a few from what you set: say you call for 1.053v and it may show 1.055v. That I have seen it do.


----------



## JoeDirt

Quote:


> Originally Posted by *fatmario*
> 
> I pressed the BIOS switch next to the CrossFire bridge; still the same issue, stuck at 1020MHz, and my BIOS is already updated to the latest one..


I'd contact the manufacturer at this point.


----------



## JoeDirt

Quote:


> Originally Posted by *neptunus*
> 
> How did you do this?
> 
> Also, I checked F12 BIOS and it is for REV 1.0 card.


Hex editing the BIOS. I'm going to do the F31 BIOS next.


----------



## smoicol

Hello everyone, I own an R9 Matrix 280X. The maximum voltage that can be set with GPU Tweak is 1.40V, but with GPU-Z I read that 1.256V is the actual value. Can I somehow bypass this limit, and if so, how? Right now the most I can get is 1130/1725 before artifacts and freezes.

thanks to everyone for the help


----------



## Joeking78

Quote:


> Originally Posted by *smoicol*
> 
> Hello everyone, I own an R9 Matrix 280X. The maximum voltage that can be set with GPU Tweak is 1.40V, but with GPU-Z I read that 1.256V is the actual value. Can I somehow bypass this limit, and if so, how? Right now the most I can get is 1130/1725 before artifacts and freezes.
> 
> thanks to everyone for the help


It's called vdroop; you would have to hardware volt-mod to eliminate it, but you would void your warranty too.


----------



## Deicidium

hi guys, i'm new here









I just got my brand new sapphire r9 270x Toxic edition









I hope it's worth it


----------



## smoicol

hi,
mod software?
msi ab eula?


----------



## Devildog83

Quote:


> Originally Posted by *Deicidium*
> 
> hi guys, i'm new here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just got my brand new sapphire r9 270x Toxic edition
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope it's worth it


It should be, let us know how she does. I am glad they have the Devils back in, I was afraid that it was limited edition like the 7870.


----------



## Devildog83

Quote:


> Originally Posted by *smoicol*
> 
> Hello everyone, I own an R9 Matrix 280X. The maximum voltage that can be set with GPU Tweak is 1.40V, but with GPU-Z I read that 1.256V is the actual value. Can I somehow bypass this limit, and if so, how? Right now the most I can get is 1130/1725 before artifacts and freezes.
> 
> thanks to everyone for the help


I don't know about GPU tweak but if you use the latest Beta of Afterburner you can force constant voltage so there is no droop.


----------



## mAs81

Quote:


> Originally Posted by *fatmario*
> 
> I pressed the BIOS switch next to the CrossFire bridge; still the same issue, stuck at 1020MHz, and my BIOS is already updated to the latest one..


I have the same card with the same BIOS (the new one) and with 1020MHz..
I believe there are 2 versions of the card: Gaming and TF Gaming..
http://www.techpowerup.com/vgabios/index.php?manufacturer=MSI&model=R9+280X
Our card is the TF Gaming with 1020MHz stock, with the newest BIOS (I think). Did you install the MSI Gaming App too?
You can easily have 1050MHz when gaming if you wish...


----------



## smoicol

Quote:


> Originally Posted by *Devildog83*
> 
> I don't know about GPU tweak but if you use the latest Beta of Afterburner you can force constant voltage so there is no droop.


Hello and thanks for the reply. The problem is not the droop but the voltage limit set at 1.256V: even if I set 1.36 or 1.40, it goes no higher than 1.256. I also changed the switch. It seems strange, because Valley at 1150 gives some artifacts at 1.256V, and if I go up to 1.30 nothing changes, it remains the same. The latest version of GPU-Z also reads the voltage as 1.256.


----------



## Deicidium

Quote:


> Originally Posted by *Devildog83*
> 
> It should be, let us know how she does. I am glad they have the Devils back in, I was afraid that it was limited edition like the 7870.


will do when I get home









Btw, right now I'm a little confused between this Toxic and MSI's Hawk.. spec-wise, are they both the same? Actually, I didn't check the MSI Hawk thoroughly because I was so amazed at how the Sapphire Toxic looks..


----------



## Devildog83

Quote:


> Originally Posted by *smoicol*
> 
> Hello and thanks for the reply. The problem is not the droop but the voltage limit set at 1.256V: even if I set 1.36 or 1.40, it goes no higher than 1.256. I also changed the switch. It seems strange, because Valley at 1150 gives some artifacts at 1.256V, and if I go up to 1.30 nothing changes, it remains the same. The latest version of GPU-Z also reads the voltage as 1.256.


Do you have HWiNFO64? If you don't, try it. I have 2 runs in 3DMark11, one at stock and one overclocked. As you can see, the max VDDC is 1.219, but the actual voltage output, shown below GPU VRM temp 2, went up depending on load and clocks, i.e. the needs of the card. I hope this helps some, but I may be a bit confused as to what you are trying to say.

Stock


Overclock


----------



## Mackem

I have a random issue with the 13.11 Beta 9.5 drivers on my 280X where the contents of my displays will flicker and move down vertically like 30 pixels then go back to normal. It's hard to explain. Anyone know what the issue might be?


----------



## Cid

Okay, so my card finally arrived (HIS 280X Turbo Boost). I'm playing Skyrim, and I notice something is off. It's like I'm shaking or something. Like, clear stuttering of some sort but FRAPS says my fps is a solid 60-59. I found a spot where it happens super noticeably, I hit record so I've got some proof and... it's gone. Smooth as butter, not a problem in sight as long as I'm recording. Stop recording, that weird ass stutter is back.

Anyone experience something like this?

Vsync is on in this case, of course. Skyrim forces it on and if you turn it off everything from physics to quests goes haywire, don't ask me why they tied everything to framerate but whatever. I'm about to try BF4 and I'm hoping it doesn't happen there or I'll have wasted €300.


----------



## Fonne

Hi

Can all 270Xs raise the voltage to the same level (maybe with a BIOS mod), or can a card like the 270X Hawk go higher?

I'm looking at getting 2x 270X instead of a 760 and saving some money...


----------



## taem

Having an odd issue with my Vapor X 280x. Valley used to bench at 47.x. Now it's dropped to 45.x and I have no idea why.

If I overclock, benchmarks go down. I have multiple monitors up, and none of them report drops in clock, voltage, or usage. And temps are well below any possible throttling. But, for example, at 1070/1550 I'll get 45.x. If I overclock to 1150/1550, it'll drop to 35.x. But at 1200/1600, it goes to 40.x.

Is it a vdroop issue? But again, the monitors report no drops. What might be causing this? I'm so fed up I'm going to fresh-install Win 8.1 and start over. But I'm mystified. Maybe the PSU? But that would show up in the monitors. And the PSU is an XFX Pro Black Series 850w, a Seasonic rebrand that's a good unit.


----------



## Themisseble

i have R9 270X windforce OC ...

anyone overclocking same card?


----------



## signex

I have exchanged my 760 for the R9 270X Toxic, arriving tomorrow.


----------



## GoLDii3

Quote:


> Originally Posted by *signex*
> 
> I have exchanged my 760 for the R9 270X Toxic, arriving tomorrow.


Not so clever.


----------



## JoeDirt

Quote:


> Originally Posted by *signex*
> 
> I have exchanged my 760 for the R9 270X Toxic, arriving tomorrow.


Ordering a 780 Ti next year for my next build. These ATI/AMD cards have the same low quality they had 15 years ago. FOREVER 3DFX!!!! I remember my first Voodoo 3 3000 and then the hoverboard itself, the Voodoo 5 5500.


----------



## Archea47

Quote:


> Originally Posted by *JoeDirt*
> 
> Ordering a 780 Ti next year for my next build. These ATI/AMD cards have the same low quality they had 15 years ago. FOREVER 3DFX!!!! I remember my first Voodoo 3 3000 and then the hover board itself the Voodoo 5 5500.


That's a serious VOODOO card right there. I grew up with the VOODOO and VOODOO2

The GPU2 temperature problem I had does indeed still seem to be a ULPS issue. I reinstalled my video card drivers, which reset the registry values to enable ULPS, and I lost fan control and temperature monitoring on GPU2. Set EnableULPS in the registry back to 0 and voila - there's my GPU2 fan & temp again


----------



## JoeDirt

Gigabyte-280x-F31-13v.zip 99k .zip file


Gigabyte 280x F31 BIOS unlocked to 1.3v. Test it and let me know how it works for you.


----------



## JoeDirt

Quote:


> Originally Posted by *Archea47*
> 
> That's a serious VOODOO card right there. I grew up with the VOODOO and VOODOO2
> 
> The GPU2 temperature problem I had does indeed still seem to be a ULPS issue. I reinstalled my video card drivers, which reset the registry values to enable ULPS, and I lost fan control and temperature monitoring on GPU2. Set EnableULPS in the registry back to 0 and voila - there's my GPU2 fan & temp again


I never even ran a GFX card until the Quake series came out. Finally convinced my parents to let me get a GFX card when the Voodoo 2/3 series came out. Still have the Voodoo 5 in storage and as far as I know it still works.


----------



## jamor

Quote:


> Originally Posted by *JoeDirt*
> 
> I never even ran a GFX card until the Quake series came out. Finally convinced my parents to let me get a GFX card when the Voodoo 2/3 series came out. Still have the Voodoo 5 in storage and as far as I know it still works.


Yeah, back in the day you didn't need one. When I was in high school I played Counter-Strike on some old Dell and I was CAL-Main!


----------



## signex

Quote:


> Originally Posted by *GoLDii3*
> 
> Not so clever.


For me it was.


----------



## Deicidium

quick question

Can my 2-3 year old HEC Raptor R500W handle a Sapphire Radeon R9 270X Toxic Edition?


----------



## taem

So I got an email alert saying Catalyst 13.12 was out and installed it, but the system info says Catalyst version 13.11. Is this normal? I'm not used to AMD cards; this is my first except for the Mobility 5850 in my HP that I can't update the drivers on.


----------



## seloop

I do suspect it's partially responsible for the skyrocketing prices and extremely bad supply. Look at Nvidia: you can easily get any card you want. Perhaps it also has to do with AMD's supply commitments for the next-gen consoles, but I doubt it.


----------



## jamor

Quote:


> Originally Posted by *seloop*
> 
> I do suspect it's partially responsible for the skyrocketing prices and extremely bad supply. Look at Nvidia: you can easily get any card you want. Perhaps it also has to do with AMD's supply commitments for the next-gen consoles, but I doubt it.


That's not entirely true. The GTX 770 is on par with the 280X (except for the 1 GB RAM deficit); yet the 770 went from $310ish to $380ish in the same period that 280X prices rose to about the equivalent.


----------



## Jaffi

So I finally managed to get my hands on an Asus 280X DC2T. Paid 280 € without BF4. That is 30 € more than I paid for the Gigabyte card that came with BF4, but finally the journey for the perfect card might come to an end. Should arrive tomorrow, will report ASAP.


----------



## Warl0rdPT

Quote:


> Originally Posted by *Mackem*
> 
> I have a random issue with the 13.11 Beta 9.5 drivers on my 280X where the contents of my displays will flicker and move down vertically like 30 pixels then go back to normal. It's hard to explain. Anyone know what the issue might be?


same issue here, but I only notice it on firefox


----------



## CptAsian

Quote:


> Originally Posted by *Mackem*
> 
> I have a random issue with the 13.11 Beta 9.5 drivers on my 280X where the contents of my displays will flicker and move down vertically like 30 pixels then go back to normal. It's hard to explain. Anyone know what the issue might be?


Quote:


> Originally Posted by *Warl0rdPT*
> 
> same issue here, but I only notice it on firefox


I get the same thing with an HD 7990 in Chrome, so I think a lot of people are getting it for some reason.


----------



## Devildog83

Quote:


> Originally Posted by *signex*
> 
> For me it was.


I would take that in a heartbeat.


----------



## Devildog83

Quote:


> Originally Posted by *Deicidium*
> 
> will do when I get home
>
> btw, right now I was a little confused over this Toxic and MSI's Hawk.. spec wise, are they both the same? Actually I didn't check the MSI Hawk thoroughly because I was so amazed by how the Sapphire Toxic looks..


I know. To be honest, if I was going to have just one card it would be either the Toxic or the Hawk; the Toxic looks better and might run cooler, but both of them having yellow gives me the willies. Yuck. I am CrossFired with a 7870 Devil so I don't need the super high clocks, nor do I want to run them. A Toxic or a Hawk would be overkill because the 7870 will not keep up with those clocks.


----------



## TempAccount007

Quote:


> Originally Posted by *Devildog83*
> 
> I don't know about GPU tweak but if you use the latest Beta of Afterburner you can force constant voltage so there is no droop.


That doesn't work.


----------



## taem

How are you guys running CCC and a tweak app? I've tried both Trixx and Afterburner; both will default to stock settings if CCC is active. AMD OverDrive is disabled. So I can have either voltage control or the global 3D settings and all the other stuff in CCC, not both, which sucks.


----------



## benjamen50

I just ended up using AMD OverDrive since MSI Afterburner does not apply my overclock settings at startup.


----------



## Devildog83

Quote:


> Originally Posted by *benjamen50*
> 
> I just ended up using AMD OverDrive since MSI Afterburner does not apply my overclock settings at startup.


In Afterburner there is a button at the bottom that you can select to apply your overclocks at startup.


----------



## spungyplunger

Maybe you guys can help me with this. I'm debating whether I need to RMA my ASUS 280X DC2T; I'd love not to have to, seeing as the ASUS RMA process doesn't seem to be that great...
I'm getting graphical glitches (not really sure what to call them) in WoW and on my desktop as well...


----------



## Jaffi

Could also be a driver issue, try to do a clean reinstall.

Tapatalked from my Nexus 5


----------



## benjamen50

Tried that option a million times, does not work.


----------



## spungyplunger

I have done a driver reinstall 3 times now, using the latest drivers from AMD, not Asus. I used the method in the driver subforum to uninstall the drivers.


----------



## JoeDirt

If stock and getting glitches like that? Send it back.


----------



## spungyplunger

Quote:


> Originally Posted by *JoeDirt*
> 
> If stock and getting glitches like that? Send it back.


Was afraid of that.. I asked for a cross ship so just waiting on the credit card form now. Hopefully everything goes smoothly with ASUS.


----------



## taem

So can we compare some benchmarks? How about we do Unigine Valley 1.0, at ExtremeHD setting? Someone else asked earlier too but we got few benches posted, hoping to see more. I cannot tweak CCC to improve bench numbers because it won't run alongside AB for me.

At stock clocks of 1070 core and 1550 mem I get 45.x fps and a score of around 1900. This seems in line with posted reviews.

At my own settings of 1150 core and 1600 mem (Toxic settings) I get:



This is about as high as I want to go to keep temps and fan speed low. At these clocks I can undervolt to 1.144 and run cool and quiet.

Quote:


> Originally Posted by *spungyplunger*
> 
> Maybe you guys can help me with this. I'm debating whether I need to RMA my ASUS 280X DC2T; I'd love not to have to, seeing as the ASUS RMA process doesn't seem to be that great...
> I'm getting graphical glitches (not really sure what to call them) in WoW and on my desktop as well...


I would rma for those glitches but only after making sure it's not a driver issue. I would go so far as to do a clean install of OS and everything just to make 100% sure. That only takes a few hours, rma process can take weeks and right now is not a good time to be shipping anyway.


----------



## JoeDirt

Quote:


> Originally Posted by *spungyplunger*
> 
> Was afraid of that.. I asked for a cross ship so just waiting on the credit card form now. Hopefully everything goes smoothly with ASUS.


I hope they do you right my friend. It still makes me question the quality of the GPU itself with every brand having so many different problems with it.

I'll sell mine to anyone who wants it for $500


----------



## madorax

Quote:


> Originally Posted by *taem*
> 
> So can we compare some benchmarks? How about we do Unigine Valley 1.0, at ExtremeHD setting? Someone else asked earlier too but we got few benches posted, hoping to see more. I cannot tweak CCC to improve bench numbers because it won't run alongside AB for me.
> 
> At stock clocks of 1070 core and 1550 mem I get 45.x fps and a score of around 1900. This seems in line with posted reviews.
> 
> At my own settings of 1150 core and 1600 mem (Toxic settings) I get:
> 
> 
> 
> This is about as high as I want to go to keep temps and fan speed low. At these clocks I can undervolt to 1.144 and run cool and quiet.
> I would rma for those glitches but only after making sure it's not a driver issue. I would go so far as to do a clean install of OS and everything just to make 100% sure. That only takes a few hours, rma process can take weeks and right now is not a good time to be shipping anyway.


This is mine, MSI 280X Gaming @ 1160/1650 with the voltage at 1212 mV.


----------



## [CyGnus]

Keep pushing it guys

My Asus Top R9 280X (Air) 1200/1850


----------



## JoeDirt

The 13.12 drivers suck, completely. A 20 kH/s drop and driver instability.


----------



## [CyGnus]

Well, for me they are better in BF4 and COD: Ghosts. I don't mine, so no problems there.


----------



## madorax

Quote:


> Originally Posted by *[CyGnus]*
> 
> Keep pushing it guys
>
> My Asus Top R9 280X (Air) 1200/1850


What a score!

I think this is my max. Look at the temp in the right corner of HWMonitor... 75°C is pretty hot though...

Although I can get my mem to 1900 with the mem voltage at 1700, it doesn't make any difference in the score haha


----------



## taem

Quote:


> Originally Posted by *madorax*
> 
> what a score


Seriously though! Baseline fps for 280x at stock overclocks seems to be around 43-45 fps so that's a pretty huge increase in performance.
Quote:


> I think this is my max. Look at the temp in the right corner of HWMonitor... 75°C is pretty hot though...


I personally like to run lower than that but I don't think that's considered high at all, especially at those clocks.
Quote:


> Although I can get my mem to 1900 with the mem voltage at 1700, it doesn't make any difference in the score haha


Mem overclocks usually don't, but in my experience they help stability. I can't push my Vapor-X past 1140 unless I also raise the mem clock and voltage.


----------



## neptunus

For Gigabyte R9 280X users, I found out (sorry if it was mentioned before) that you CAN change the voltage with ASUS GPU Tweak 2.4.9.2...


----------



## G2O415

Just got this bad boy today, a much needed upgrade from integrated graphics! 4 GB of VRAM is overkill I know, but I got it mainly because I plan on playing The Elder Scrolls franchise with some heavy graphics mods... BUT STILL SO STOKED!



Also, I installed the latest AMD driver (13.12). Was this the right choice or should I have gone with the beta drivers? The latest one was released yesterday..


----------



## gh0stfac3killa

Quote:


> Originally Posted by *G2O415*
> 
> Just got this bad boy today, a much needed upgrade from integrated graphics! 4 GB of VRAM is overkill I know, but I got it mainly because I plan on playing The Elder Scrolls franchise with some heavy graphics mods... BUT STILL SO STOKED!
> 
> 
> 
> Also, I installed the latest AMD driver (13.12). Was this the right choice or should I have gone with the beta drivers? The latest one was released yesterday..


Congrats on the new GPU, she will serve you proper no doubt. Here's the deal: I've been trolling around on OCN and have posted in a few other threads about the AMD drivers for these 200 series cards. Not sure if anyone else is having the issue I'm having, but here it is.

I did the latest update to 13.12, and in Battlefield 4, which is supposed to be AMD built and optimized, I'm getting really horrible screen flickering on the maps. And I mean horrible; the map, including the actual little map on the deploy screen, is just flickering away, but only in CrossFire. I turn CrossFire off and all is well. This has been happening since the first beta driver. On the first one, only certain background things would flicker: mountains, snow, mostly white things. Then I went to the 13.11 beta 9.5 and the whole map started flickering, and then they released these new 13.12 drivers and it's worse.

I got tired of jumping out of games and trying to tweak settings to get the flickering to go away, because honestly I bought two of these bad boys and wasn't going to drop down to a single card. Hard headed I guess, lol. But hey, my thing is: if it's in my system, use it! So I finally did the noob thing which I know we all here at OCN never do anymore, and that's uninstalled all the drivers and used the install disc, yuck... But guess what? I run BF4 in CrossFire with no flickering at all, at 120+ fps on ultra, on the disc drivers we usually just toss out. Luckily I kept one of the discs...

So in short: try out the new drivers and see how they act for you, and if all else fails just use the disc the card came with until AMD fixes all this mess with their drivers. Again, that's just what I'm experiencing at the moment with the new drivers and the last three betas: 9.3, 9.4 and 9.5.


----------



## G2O415

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Congrats on the new GPU, she will serve you proper no doubt. Here's the deal: I've been trolling around on OCN and have posted in a few other threads about the AMD drivers for these 200 series cards. Not sure if anyone else is having the issue I'm having, but here it is.
>
> I did the latest update to 13.12, and in Battlefield 4, which is supposed to be AMD built and optimized, I'm getting really horrible screen flickering on the maps. And I mean horrible; the map, including the actual little map on the deploy screen, is just flickering away, but only in CrossFire. I turn CrossFire off and all is well. This has been happening since the first beta driver. On the first one, only certain background things would flicker: mountains, snow, mostly white things. Then I went to the 13.11 beta 9.5 and the whole map started flickering, and then they released these new 13.12 drivers and it's worse.
>
> I got tired of jumping out of games and trying to tweak settings to get the flickering to go away, because honestly I bought two of these bad boys and wasn't going to drop down to a single card. Hard headed I guess, lol. But hey, my thing is: if it's in my system, use it! So I finally did the noob thing which I know we all here at OCN never do anymore, and that's uninstalled all the drivers and used the install disc, yuck... But guess what? I run BF4 in CrossFire with no flickering at all, at 120+ fps on ultra, on the disc drivers we usually just toss out. Luckily I kept one of the discs...
>
> So in short: try out the new drivers and see how they act for you, and if all else fails just use the disc the card came with until AMD fixes all this mess with their drivers. Again, that's just what I'm experiencing at the moment with the new drivers and the last three betas: 9.3, 9.4 and 9.5.


Gotcha! I will try out the most recent one that came out and see how that plays out. Finally the games I couldn't play on Steam are now within my reach!! I'm currently debating which game to play at the moment haha


----------



## gh0stfac3killa

Quote:


> Originally Posted by *G2O415*
> 
> Gotcha! I will try out the most recent one that came out and see how that plays out. Finally the games I couldn't play on Steam are now within my reach!! I'm currently debating which game to play at the moment haha


Sounds good. Don't choose one, lol, play them all!!! For real though: between Steam, Origin, and Uplay I have a boatload of games new and old, and to tell you the truth the only game that really plays horribly with my AMD R9s (and my Nvidias for that matter) is Call of Duty: Ghosts. The game is broken, they say, and even with all the patches, hotfixes, updates and everything else going on with that game, it still plays horribly for me unless I use only one GPU and really dumb down the graphics to straight up Nintendo 64 / Dreamcast graphics. That's going a bit extreme I guess, but it's just really annoying paying for a game and not really being able to play it. All the other games, though, silky smooth... So enjoy, and these R9s are looking to be some good overclockers as well; I've been watching everyone's posts. I'm still waiting on my XFXs to come in. For the love of god Newegg, I want my GPUs; they showed shipped almost two weeks ago, grrrrrrrrr..... lol..


----------



## Durvelle27

Would anyone here like to take over ownership of the thread and keep it updated regularly, as I'm getting too busy to keep it updated?


----------



## Jaffi

So today my Asus 280X DC2T arrived to take the place of the Gigabyte rev. 2.0 that I sent back a few days ago.
First of all I have to say that it was well worth the extra money. The build quality is just phenomenal. It feels like a tank compared to the wobbly plastic that is attached to the Gigabyte card.
The Gigabyte card bent down by a lot after a few days sitting in the case; the Asus is far more rigid overall.

After installing the card, I checked the ASIC, which is 60%. Not very good, but those numbers don't matter that much on AMD cards anyway. Next was GPU Tweak. It allows me to ramp up the voltage to 1.3 V from the stock 1.2 V. Great!

On to the first run of Valley benchmarking! The first thing I noticed was the silence! The cooler is very capable; it was running at 20-25% for quite a while and the GPU was still only 50-60°C after a minute.
I managed to do a full run at 1220/1850 with 1.3 V; result below.



That of course is a fair way from game stable; I still have to run those tests.

The bottom line is I would highly recommend this card, although it is not cheap. But you definitely get much better build quality, easy voltage control, VRM monitoring (they never got hotter than 75-80°C), a much quieter cooler, and absolutely NO coil whine. All of those points were the opposite with the Gigabyte card.

Took some pics, check em out!


----------



## AlphaC

http://www.techpowerup.com/reviews/MSI/R9_270X_Gaming/3.html

Seems the MSI R9 270X Gaming uses the HAWK PCB (with DirectFETs).

The main differences appear to be the voltage readouts, 5 heatpipes on the HAWK, the voltage controller (a uP1610 on the Gaming), 2 more DirectFET phases, and chip binning (if that is the case).

If the HAWK is $230 and the Gaming is $220, I'd think the HAWK is the better value.


----------



## [CyGnus]

Jaffi, that score is pretty good. I have the same card and did 2250 with 1200/1850.


----------



## Devildog83

Quote:


> Originally Posted by *AlphaC*
> 
> http://www.techpowerup.com/reviews/MSI/R9_270X_Gaming/3.html
> 
> Seems the MSI R9 270X Gaming uses the HAWK PCB (with DirectFETs).
>
> The main differences appear to be the voltage readouts, 5 heatpipes on the HAWK, the voltage controller (a uP1610 on the Gaming), 2 more DirectFET phases, and chip binning (if that is the case).
>
> If the HAWK is $230 and the Gaming is $220, I'd think the HAWK is the better value.


Absolutely, the Hawks are known for overclocking like crazy.


----------



## GoLDii3

Quote:


> Originally Posted by *Devildog83*
> 
> Absolutely, the Hawks are known for overclocking like crazy.


Even better than my old 7870? 1230 mhz at stock voltage.


----------



## Devildog83

Quote:


> Originally Posted by *GoLDii3*
> 
> Even better than my old 7870? 1230 mhz at stock voltage.


I meant Hawks in general; I have seen very high overclocks from Hawks across the spectrum, so I assume it would hold for the 270X also. For $10 more I would get the Hawk.


----------



## Fonne

I've got a GTX 670 now, and would really like to try something new that doesn't cost a fortune but would still be fun...

I'd like to try some SLI / CrossFire, the cards will be getting watercooled, and I LOVE to overclock... I was looking at prices (two cards):

2x HIS R9 270X IceQ Boost Clock = 320 Euro
2x MSI R9 270X Hawk = 362 Euro
2x 280X = around 480 Euro...

Do you think I can go higher with the Hawk than the IceQ? The 280X is pushing my budget, but it was just to compare.


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> Jaffi, that score is pretty good. I have the same card and did 2250 with 1200/1850.


The 280's and 290's are so expensive now. I am glad I have my X-Fire set up for about the same.


----------



## [CyGnus]

Got mine for 308 €; now it costs 349 €. Glad I bought it when I did.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> The 280's and 290's are so expensive now. I am glad I have my X-Fire set up for about the same.


Yea prices suck really hard but at least you can make a profit off them right now


----------



## Devildog83

Quote:


> Originally Posted by *Fonne*
> 
> I've got a GTX 670 now, and would really like to try something new that doesn't cost a fortune but would still be fun...
>
> I'd like to try some SLI / CrossFire, the cards will be getting watercooled, and I LOVE to overclock... I was looking at prices (two cards):
>
> 2x HIS R9 270X IceQ Boost Clock = 320 Euro
> 2x MSI R9 270X Hawk = 362 Euro
> 2x 280X = around 480 Euro...
>
> Do you think I can go higher with the Hawk than the IceQ? The 280X is pushing my budget, but it was just to compare.


2 Hawks all the way, or Toxic's unless you want to spend the extra and get 2 280x's. I wanted to go for a R9 290 but I would wait until they came out with non-ref cards. They are so expensive now I will stay with what I have and wait until later.

I believe the Hawk will go higher by far.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> 2 Hawks all the way, or Toxic's unless you want to spend the extra and get 2 280x's. I wanted to go for a R9 290 but I would wait until they came out with non-ref cards. They are so expensive now I will stay with what I have and wait until later.
> 
> I believe the Hawk will go higher by far.


I can tell you once you get a 290 you won't be disappointed


----------



## Fonne

Quote:


> Originally Posted by *Devildog83*
> 
> 2 Hawks all the way, or Toxic's unless you want to spend the extra and get 2 280x's. I wanted to go for a R9 290 but I would wait until they came out with non-ref cards. They are so expensive now I will stay with what I have and wait until later.
> 
> I believe the Hawk will go higher by far.


Thanks:thumb:

ASUS R9270X-DC2T-2GD5 DirectCU II TOP
PowerColor Radeon R9 270X Devil
Sapphire Toxic Radeon R9 270X
MSI R9 270X Hawk

- The prices on all these cards are within 180-195 Euro, and the Hawk has the lowest price...


----------



## Devildog83

I can't read the graphics score.


----------



## Devildog83

Quote:


> Originally Posted by *Fonne*
> 
> Thanks:thumb:
> 
> ASUS R9270X-DC2T-2GD5 DirectCU II TOP
> PowerColor Radeon R9 270X Devil
> Sapphire Toxic Radeon R9 270X
> MSI R9 270X Hawk
> 
> - The prices on all these cards are within 180-195 Euro, and the Hawk has the lowest price...


I have the Devil and it is volt locked. The DCII is good, but the Toxic and Hawk will most likely overclock better. I will do some research on the Hawk and be right back.


----------



## Fonne

Quote:


> Originally Posted by *Devildog83*
> 
> I have the Devil and it is volt locked. The DCII is good, but the Toxic and Hawk will most likely overclock better.


Is the Devil really volt locked? Looks like the 270X Hawk is the best buy then. How high can the voltage be raised?


----------



## G2O415

Does anyone get a little bit of sag on their 270X or higher? My card seems to be sagging a bit and it kind of bothers me...


----------



## Devildog83

Quote:


> Originally Posted by *Fonne*
> 
> Is the Devil really volt locked? Looks like the 270X Hawk is the best buy then. How high can the voltage be raised?


Here is some review info -

Mega clocks


Thoughts


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> I can't read the graphics score.


My graphics score is 17133


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> My graphics score is 17133


Nice, I could get higher on the Physics but I just left the CPU at normal everyday clocks.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> Nice, I could get higher on the Physics but I just left the CPU at normal everyday clocks.


I could go higher to about 5.3GHz but I used my everyday clock of 5GHz


----------



## Arizonian

Hi everyone, real quick again:









*Any R9 280X / 280 & 270X Owners like to become thread starter to this club?*

Please let me know. Durvelle27 has stated he's no longer able to dedicate time to managing the OP due to other responsibilities and looking for replacement.

If we cannot find someone, the club will no longer have a members list but will instead become solely an owners discussion thread. Which is actually fine, as it still provides a valuable place for 280X / 280 & 270X owners to congregate and share their experiences, tweaks, overclocks etc....

Again let me know via PM and I'll be helping with the transition.


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> I could go higher to about 5.3GHz but I used my everyday clock of 5GHz


4.7 here. 5.0 is not possible for me without a full loop. I can bench at 4.9 but it's not really stable.


----------



## Devildog83

Quote:


> Originally Posted by *G2O415*
> 
> Does anyone get a little bit of sag on their 270X or higher? My card seems to be sagging a bit and it kind of bothers me...


No, but a backplate helps.


----------



## Jaffi

Quote:


> Originally Posted by *[CyGnus]*
> 
> Jaffi, that score is pretty good. I have the same card and did 2250 with 1200/1850.


Thanks mate. I did a few tests now and so far the card is running fine in Crysis 3 with 1150/1700 @1.2v stock voltage. Pretty nice! Let's see if it can do BF4


----------



## [CyGnus]

I use mine at 1150/1625 (to get 6500 MHz effective memory) at stock voltage with no problems at all; fan at 46% and it tops out around 53-54°C. Insane card.
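Side note in case anyone wonders where the "effective" figure comes from: GDDR5 is quad-pumped, so the effective data rate is simply the real memory clock times four. A trivial sanity check, illustrative only:

```python
def effective_mhz(mem_clock_mhz: int, multiplier: int = 4) -> int:
    """GDDR5 transfers four bits per pin per clock: effective rate = clock x 4."""
    return mem_clock_mhz * multiplier

print(effective_mhz(1625))  # 6500 MHz, the figure above
print(effective_mhz(1500))  # 6000 MHz, the reference 280X memory spec
```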


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> 4.7 here. 5.0 is not possible for me without a full loop. I can bench at 4.9 but it's not really stable.


You probably could with my chip


----------



## G2O415

I'm a bit confused about whether or not I'm overclocking correctly. I'm using AMD OverDrive to bump up the frequency and Unigine Heaven to benchmark the modifications on my R9 270X. I bump up the core clock by 10 MHz after every "successful" benchmark in Heaven, which is set to DirectX 11, Ultra Quality, Extreme Tessellation, Stereoscopic 3D disabled, Multi-monitor disabled, Anti-Aliasing 8x, Fullscreen on at 1920x1080.

I'm assuming a "successful" overclock results in no artifacts, lockups, or shutdowns? Of course, my card ain't the strongest out there, so a slow, choppy run is to be expected at max settings in Heaven, right?
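The stepping approach described above is basically a linear search for the highest clock that still passes. A sketch of the idea, illustrative only; `passes_benchmark` is a toy stand-in, since a real check means launching Heaven and watching for artifacts or crashes:

```python
def find_max_core_clock(base_mhz, step_mhz, passes_benchmark):
    """Walk the core clock up one step per successful benchmark run,
    stopping at the last clock that still passed."""
    clock = base_mhz
    while passes_benchmark(clock + step_mhz):
        clock += step_mhz
    return clock

# Toy stand-in: pretend the card starts artifacting above 1170 MHz.
print(find_max_core_clock(1100, 10, lambda mhz: mhz <= 1170))  # 1170
```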


----------



## neurotix

Quote:


> Originally Posted by *G2O415*
> 
> I'm a bit confused whether or not I'm overclocking correctly, I'm using AMD Overdrive to bump up the frequency and Unigine Heaven to benchmark the modifications on my R9 270x. I bump up the core clock by 10 MHz after every "successful" benchmark in Heaven; which is set to DirectX 11, Ultra Quality, Extreme Tessellation, Stereoscopic 3D Disabled, Multi-monitor Disabled, Anti-Aliasing 8x, Fullscreen On at 1920x1080.
> 
> I'm assuming a "successful" overclock results with no artifacts, lock ups, or shutdowns? Of course, my card ain't the strongest out there so a slow, choppy run is to be expected at max settings in Heaven right?


I have a 270X, I might be able to help.

Unigine Heaven and Valley are bad benchmarks to test stability with. I can do 1300mhz in Valley and not crash or see artifacts for as long as I leave it running, but the card crashes almost immediately in any game.

Try OCCT GPU test with max ram (however much your card has), 7 for shader complexity and error checking on at 1080p in a window. You will have to set these options manually. If you see artifacts or it says there's any errors then it's not stable, lower clocks or increase voltage.

The best way to test if your card is stable is to play modern games on it for a few hours, but using OCCT will help you find your max clocks quickly. Once it's stable in OCCT (or alternatively Furmark) for a half hour it's probably good and you can run Valley or Heaven on it to make sure.

So 1) Run OCCT 2) Then run Valley 3) Test it in games.

The only way to know for sure that you're stable is to play games with it. If you don't see any artifacts and don't crash after a few hours of gaming, your card is probably set.


----------



## G2O415

Quote:


> Originally Posted by *neurotix*
> 
> I have a 270X, I might be able to help.
> 
> Unigine Heaven and Valley are bad benchmarks to test stability with. I can do 1300mhz in Valley and not crash or see artifacts for as long as I leave it running, but the card crashes almost immediately in any game.
> 
> Try OCCT GPU test with max ram (however much your card has), 7 for shader complexity and error checking on at 1080p in a window. You will have to set these options manually. If you see artifacts or it says there's any errors then it's not stable, lower clocks or increase voltage.
> 
> The best way to test if your card is stable is to play modern games on it for a few hours, but using OCCT will help you find your max clocks quickly. Once it's stable in OCCT (or alternatively Furmark) for a half hour it's probably good and you can run Valley or Heaven on it to make sure.
> 
> So 1) Run OCCT 2) Then run Valley 3) Test it in games.
> 
> The only way to know for sure that you're stable is to play games with it. If you don't see any artifacts and don't crash after a few hours of gaming, your card is probably set.


Ah I see. When I tried OCCT it wouldn't work properly when I turned on error checking; without it, it would run fine. But I will give it another shot and check if my current settings are good. I managed to get it to 1170/1500 but will retest to make sure.

Thanks!

EDIT:
Never mind, read the FAQ at the site.


----------



## Jaffi

I started the BF4 campaign yesterday. Is it possible that there are still some graphical glitches in the game itself? When they appeared, I tried downclocking my card, which didn't help; only reloading the checkpoint made them disappear. If so, BF4 is still a bad game for validating an OC.

Tapatalked from my Nexus 5


----------



## madorax

Quote:


> Originally Posted by *Jaffi*
> 
> I started the BF4 campaign yesterday. Is it possible that there are still some graphical glitches in the game itself? When they appeared, I tried downclocking my card, which didn't help; only reloading the checkpoint made them disappear. If so, BF4 is still a bad game for validating an OC.
> 
> Tapatalked from my Nexus 5


My higher OC clocks (say 1200/1750) will get artifacts or even a black screen in BF4 even though they pass every other game, plus Fire Strike, Valley, 3DMark 11 and Heaven with no problem whatsoever. So I think it's BF4 itself.


----------



## ./Cy4n1d3\.

Just placed an order for an Asus R9280X-DC2T-3GD5 to replace my DCUII GTX 570. Managed to get it on TigerDirect at $330, when Newegg was charging $420 for the same card.

Btw... anyone know the difference between the Asus R9280X-DC2T-3GD5 and the Asus R9280X-DC2T-3GD5*-V2*?


----------



## Devildog83

Quote:


> Originally Posted by *./Cy4n1d3\.*
> 
> Just placed an order for an Asus R9280X-DC2T-3GD5 to replace my DCUII GTX 570. Managed to get it on TigerDirect at $330, when Newegg was charging $420 for the same card.
>
> Btw... anyone know the difference between the Asus R9280X-DC2T-3GD5 and the Asus R9280X-DC2T-3GD5*-V2*?


I think the V2 is a 3-slot card and the V1 is a 2-slot, if I am not mistaken.

This is v2 - http://www.amazon.com/Asus-Radeon-4DisplayPort-PCI-Express-R9280X-DC2T-3GD5-V2/dp/B00FT2KSVK/ref=sr_1_2?s=electronics&ie=UTF8&qid=1387632736&sr=1-2&keywords=asus+r9+280x

and this is V1

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803

The V1 appears to have a memory clock of 6.4 GHz effective versus 6.0 GHz on the V2, and obviously the V2's heatsink is thicker and should cool better.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *./Cy4n1d3\.*
> 
> Just placed an order for an Asus R9280X-DC2T-3GD5 to replace my DCUII GTX 570. Managed to get it on TigerDirect at $330, when Newegg was charging $420 for the same card.
>
> Btw... anyone know the difference between the Asus R9280X-DC2T-3GD5 and the Asus R9280X-DC2T-3GD5*-V2*?


Rev 2 is a three-slotter while rev 1 is a two-slotter, and rev 2 has 4 DisplayPorts and two DVI versus the 2 DisplayPorts and 2 DVI on rev 1. So you'll get more screens if you're going to do Eyefinity with the rev 2...


----------



## Arizonian

*Announcement*
Ok everyone I've got good news


First I'd like to thank *Durvelle27* for taking initiative and starting this owners club when the AMD R9 280X / 280 & 270X cards released. I was happy to work with him on starting the club and get this ball rolling.









We've found a new thread starter and I'd like to now welcome *Devildog83*, who has stepped up to the plate and graciously taken on the responsibility of keeping this going, so that it doesn't turn into an owners thread for just discussion but can remain an updated list of owners. I've moved and reworked some of the info in posts 1, 2, & 3, edited and arranged it, and made the first post Devildog83's new home. Let's give Devildog83 a little time to acquaint himself with the first post and welcome him aboard as your new thread starter.


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> *Announcement*
> Ok everyone I've got good news
> 
> 
> First I'd like to thank *Durvelle27* for taking initiative and starting this owners club when the AMD R9 280X / 280 & 270X cards released. I was happy to work with him on starting the club and get this ball rolling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We've found a new thread starter and I'd like to now welcome *Devildog83*, who has stepped up to the plate and graciously taken on the responsibility of keeping this going, so that it doesn't turn into an owners thread for just discussion but can remain an updated list of owners. I've moved and reworked some of the info in posts 1, 2, & 3, edited and arranged it, and made the first post Devildog83's new home. Let's give Devildog83 a little time to acquaint himself with the first post and welcome him aboard as your new thread starter.


That's great







and welcome DevilDog


----------



## robster84

XFX Radeon R9 270X Boost 2048MB Double D here


----------



## Durvelle27

@DevilDog some members i missed that need to be added

OCN: lillebj0rn
Dec 19, 2013 at 9:42 am
ASUS Radeon R9 280X 3GB GDDR5 "DirectCU II TOP"

OCN: eclipsedude
Dec 19, 2013 at 4:52 pm
Sapphire Toxic R9 280X - 1200/1800

OCN: jamponget9
Dec 19, 2013 at 9:46 pm
Sapphire R9 270X Toxic


----------



## Devildog83

Thanks, I am on it.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> Thanks, I am on it.


----------



## Devildog83

Quote:


> Originally Posted by *robster84*
> 
> XFX Radeon R9 270X Boost 2048MB Double D here


Got a pic?


----------



## Lisjak

Hey devildog, I seem to have been "missed in the move"









Here is the gpu-z validation and some pics. http://www.techpowerup.com/gpuz/n35xm/






----------



## Devildog83

The list has been updated.

Lisjak you are added too.

Thanks to Durvelle and Arizonian for letting me oversee the thread.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> The list has been updated.
> 
> Lisjak, if you would like me to post your clocks too just let me know.
> 
> Thanks to Durvelle and Arizonian for letting me oversee the thread.


No problem bud


----------



## robster84

http://www.techpowerup.com/gpuz/hmxdu/

http://s227.photobucket.com/user/robster84_2007/media/photo_zps9efc9de3.jpg.html


----------



## madorax

@devildog

Don't forget to add me to the list ^^



My daily 1160 / 1650

Max OC 1200 / 1700


----------



## Devildog83

robster84 and madorax have been added. Welcome to the club.

I will be off to a X-Mas party for most of the day. I will update list if needed tonight.


----------



## Archea47

Quote:


> Originally Posted by *Devildog83*
> 
> Thanks, I am on it.


Awesome Devildog83!

Now all we need is sig code in the OP


----------



## ./Cy4n1d3\.

Quote:


> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *./Cy4n1d3\.*
> 
> Just placed an order for an Asus R9280X-DC2T-3GD5 to replace my DCUII GTX 570. Managed to get it on TigerDirect at $330, when Newegg was charging $420 for the same card.
> 
> Btw... anyone know the difference between the Asus R9280X-DC2T-3GD5 and the Asus R9280X-DC2T-3GD5*-V2*?
> 
> 
> 
> I think the V2 is a 3-slot card and the V1 is a 2-slot, if I am not mistaken.
> 
> This is V2 - http://www.amazon.com/Asus-Radeon-4DisplayPort-PCI-Express-R9280X-DC2T-3GD5-V2/dp/B00FT2KSVK/ref=sr_1_2?s=electronics&ie=UTF8&qid=1387632736&sr=1-2&keywords=asus+r9+280x
> 
> and this is V1
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803
> 
> V1 appears to have a memory clock of 6.4 GHz and the V2 6.0 GHz, and obviously the heatsink is thicker and should cool better on the V2.

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *./Cy4n1d3\.*
> 
> Just placed an order for an Asus R9280X-DC2T-3GD5 to replace my DCUII GTX 570. Managed to get it on TigerDirect at $330, when Newegg was charging $420 for the same card.
> 
> Btw... anyone know the difference between the Asus R9280X-DC2T-3GD5 and the Asus R9280X-DC2T-3GD5*-V2*?
> 
> 
> 
> Rev 2 is a three-slotter while rev 1 is a two-slotter, and rev 2 has 4 DisplayPorts and two DVI versus the 2 DisplayPorts and 2 DVI on rev 1. So you'll get more screens if you're going to do Eyefinity with the rev 2...

Great! If true, that's better than I thought. My Asus GTX 570 is a triple slot monster, with lots of card sag. If this is indeed a thinner design, then yee-haw!

But it does look like V1 does not have a backplate...


----------



## GTR Mclaren

Any tips to use PhysX with the CPU ??


----------



## dabhuis

Just got her in yesterday, haven't overclocked or anything yet.


----------



## Devildog83

Quote:


> Originally Posted by *Archea47*
> 
> Awesome Devildog83!
> 
> Now all we need is sig code in the OP


I will work on that.


----------



## Devildog83

But it does look like V1 does not have a backplate...

You could try ordering an aftermarket one. I think EK makes one for the 7870 that would fit it, for about $28.


----------



## ./Cy4n1d3\.

Well... Appears my R9-280x is on hold until I can get things sorted out.

When I went to go track my order, it said this
"Please Call - This order is on hold, pending customer response. Please call 1.800.888.4437 to expedite your order processing."

Hmm... well, this is dumb. They gave me the number for the verification department, which then said it was not open at this time, and didn't tell me anything about when it would be open.

GAH... They better upgrade me to faster shipping to replace the... *counts days on calendar*.. additional 2 days that I had to wait to get verified.

Oh.. and the $330 did post on my Capital One account, so it's not like they haven't taken their money yet.


----------



## Devildog83

Quote:


> Originally Posted by *dabhuis*
> 
> Just got her in yesterday, haven't overclocked or anything yet.


Go ahead and make a sig rig from your profile or the rig builder at the top right of this page. If you let me know which Gigabyte that is and the clocks I will add you. Should be the 280x right?


----------



## Durvelle27

Quote:


> Originally Posted by *GTR Mclaren*
> 
> Any tips to use PhysX with the CPU ??


Not worth it as you will have a performance hit


----------



## hoevito

Can anyone else with a Toxic 280x tell me what their experiences with theirs have been like? I just had to RMA mine because it produces screen artifacts and nasty flickering in windows totally stock and untouched/tweaked, and using 4 different driver sets. It also couldn't clock above 1200mhz core without running into stability issues.


----------



## VegetarianEater

Quote:


> Originally Posted by *hoevito*
> 
> Can anyone else with a Toxic 280x tell me what their experiences with theirs have been like? I just had to RMA mine because it produces screen artifacts and nasty flickering in windows totally stock and untouched/tweaked, and using 4 different driver sets. It also couldn't clock above 1200mhz core without running into stability issues.


I've only had flickering issues in BF4 and Dota 2, and it has happened only 3 times total in about 20+ hours of gameplay, so it's probably driver-related. Haven't tried to overclock mine, but due to the recent price increase I'm trying to sell mine currently. I would keep it and crossfire it with another 280x, but they're too expensive right now. Plus, when gaming in my case (550D), the fans spin up to 65-70 percent during BF4, which for the Toxic means 3200 RPM, which is really freakin' loud. So hopefully I can get a decent profit on it and buy a 780, or I might wait for prices to drop and get a 290 or 290x, or wait for prices to drop and get 2x Asus 280x...


----------



## kpo6969

I was never added to the list:

http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/2300#post_21367101

Asus 280X DC2T

ASIC 74.3%


----------



## gh0stfac3killa

Awwwww, I never put mine on the site... well, here are my first two R9 280x's..




Two XFX 280's are still on the way. I really hate waiting on mail.... no matter how old you get, you are still super impatient!!! lol..


----------



## PerspexPC

@VegetarianEater
I know exactly what you mean about the noise, but the performance is amazing even with all settings left at standard. I'm pretty new to PC gaming; I have the R9 280x Toxic and I've tried a few games. The best I've seen is Sleeping Dogs with the graphics and texture add-on: cranked up to max settings, the results in the game are amazing, however it's very loud. Can anybody give me some tips on overclocking the card: power, memory and clock settings I could try?
Also, sorry if this is a dumb question, but does it have its own BIOS that can be accessed at all?
Thanks, and please tag me so I don't miss anyone that responds to me.


----------



## PerspexPC

Sorry forgot to add pictures so I can be added
Thanks


----------



## hoevito

Just for giggles...here's the highest single card score I could massage out of my Matrix without crashing.









http://www.3dmark.com/fs/1354611


----------



## gnemelf

Quote:


> Originally Posted by *hoevito*
> 
> Just for giggles...here's the highest single card score I could massage out of my Matrix without crashing.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/1354611


nice.

Okay, now to take apart my 280x again... and retry putting more thermal paste on it. I replaced the stock paste, which was piled on there, but I think I put too little on... so I am going to try again. I need to get some new thermal tape for the VRMs and some small VRM sinks for the other VRM that has no heatsinks. I'll take pictures this time...


----------



## [CyGnus]

hoevito, did you play with the onboard voltage buttons? I was expecting more from a Matrix, since my Asus TOP does 1230/1850 with only 1.256v


----------



## PerspexPC

@gnemelf
I considered doing this too, just to see if I can reduce heat and noise. Let me know how you get on.


----------



## gnemelf

Quote:


> Originally Posted by *PerspexPC*
> 
> @gnemelf
> I considered doing this too, just to see if I can reduce heat and noise. Let me know how you get on.


will do.


----------



## DiceAir

Ok so I have my 280x @ 1150 core and 1750 memory. Core voltage is 1.263v and memory at 1.6v. Is that ok, or will it harm my card? Temps are about 80C. BTW running 2x in crossfire at 2560x1440 @ 96Hz.


----------



## Devildog83

Quote:


> Originally Posted by *DiceAir*
> 
> Ok so I have my 280x @ 1150 core and 1750 memory. Core voltage is 1.263v and memory at 1.6v. Is that ok, or will it harm my card? Temps are about 80C. BTW running 2x in crossfire at 2560x1440 @ 96Hz.


Would you like to be added?


----------



## Devildog83

kpo6969 - gh0stfac3killa - PerspexPC have all been added. neurotix's OC has been updated.

If anyone has an updated OC for their card and wishes it to be updated, post the GPU-Z and I will do so.

I have a long-range wireless modem I am reviewing right now and will not need it after I am done. If anyone has an idea for a contest, maybe I can give this away. What do you guys think? Is that legal?


----------



## DiceAir

Quote:


> Originally Posted by *Devildog83*
> 
> Would you like to be added?


I'm a noob. Added to where?


----------



## Devildog83

Quote:


> Originally Posted by *DiceAir*
> 
> I'm a noob. Added to where?


I can add you to the club on the 1st page.


----------



## Devildog83

The signature for the club has been added to the 1st post. Wear it proudly.









"If you've submitted an entry before I became thread starter and have been missed either please re-post or provide a link to original post and I will get you added "


----------



## DiceAir

Quote:


> Originally Posted by *Devildog83*
> 
> The signature for the club has been added to the 1st post. Wear it proudly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> "If you've submitted an entry before I became thread starter and have been missed either please re-post or provide a link to original post and I will get you added "


Oh ok. My overclock is not stable after all. I think I will just stick to 1100MHz core / 1500MHz memory.


----------



## cr4p

Can I join?


*MSI R9 280X Gaming Edition*


----------



## gnemelf

Well, I put new thermal compound on... just have to give it a bit to cure before I know my true temp readings. Gotta find the cord for my phone and then I will upload some photos....

I might have to try and re-apply one more time... thinking I didn't get enough paste on there. Temps are seeming too high... it used to idle at 33c-34c, now it's like 39c-40c, and the heatsink is seated right.


----------



## danilon62

I'd like to join!!!!

Just got my new Gigabyte 280x!

Will post pics when it arrives


----------



## Devildog83

Quote:


> Originally Posted by *cr4p*
> 
> Can I join?
> 
> 
> *MSI R9 280X Gaming Edition*


Added, feel free to copy the club sig to your sig.


----------



## [CyGnus]

gnemelf, if you are running Android, get an app called WebSharing; no need for a cable, it transfers files wirelessly.


----------



## G2O415

Quote:


> Originally Posted by *Devildog83*
> 
> The signature for the club has been added to the 1st post. Wear it proudly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> "If you've submitted an entry before I became thread starter and have been missed either please re-post or provide a link to original post and I will get you added "


Does this image count as an entry to the club?

http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/2540#post_21427213


----------



## cr4p

Quote:


> Originally Posted by *Devildog83*
> 
> Added, feel free to copy the club sig to your sig.


Thanks!

BTW, does someone here plays AC4? What are your settings with R9 280x? Thanks!


----------



## Devildog83

Quote:


> Originally Posted by *G2O415*
> 
> Does this image count as an entry to the club?
> 
> http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/2540#post_21427213


Sure, what clocks are you at?


----------



## G2O415

Quote:


> Originally Posted by *Devildog83*
> 
> Sure, what clocks are you at?


Stock at the moment (1070/1400), I was able to get it up to 1170/1500 but haven't tried anything higher.


----------



## Devildog83

Danilon62 and G20415 have been added.


----------



## G2O415

Quote:


> Originally Posted by *Devildog83*
> 
> Danilon62 and G20415 have been added.


Thanks OP


----------



## uaedroid

Here is my 280X for mining...


----------



## NinjaToast

Quote:


> Originally Posted by *Devildog83*
> 
> The signature for the club has been added to the 1st post. Wear it proudly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> "If you've submitted an entry before I became thread starter and have been missed either please re-post or provide a link to original post and I will get you added "


'Tis not a linked signature, but I took the liberty of adding the link. Here it is if you want to save time.









Code:





[CENTER] [B]:devil:  [URL=http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club][Official] AMD R9 280X / 280 & 270X Owners Club[/URL] :devil:[/B] [/CENTER]


----------



## [CyGnus]

*uaedroid* does the Gigabyte overclock well? Let me know, my cousin wants one of those, but I am curious about the overclock. I was recommending him the Asus TOP, but that one comes with BF4 and is cheaper.


----------



## PerspexPC

Managed a stable overclock on my Toxic card today, and it gave me a decent increase in score on PCMark and 3DMark.







Power slider up to 40, GPU 1200, memory 1800.


As you can see in this picture, the top result is overclocked and the 2nd result is standard.


Apologies for the photos , I use safari to access OC.net on my phone


----------



## PerspexPC

Edited Above


----------



## [CyGnus]

PerspexPC, you can edit the post, no need to post twice, and print screen works way better.


----------



## Devildog83

uaedroid added, thread updated.


----------



## [CyGnus]

Devildog83, just wanted to say thanks for taking over the thread (huge responsibility), and keep up the good work.


----------



## TempAccount007

Quote:


> Originally Posted by *[CyGnus]*
> 
> hoevito, did you play with the onboard voltage buttons? I was expecting more from a Matrix, since my Asus TOP does 1230/1850 with only 1.256v


My DC2T doesn't. I would say most do not. I question how hard people run their cards at those clocks. If they have v-sync on then it doesn't count. BF4 on ultra at 120% scaling reveals all weaknesses... it's often sustained 100% load.


----------



## taem

Any XFX 280x Black Edition users here? How are the VRM temps? XFX has had problems with this in the past, so I wonder if they actually put a heatsink on the VRMs this time.
Quote:


> Originally Posted by *TempAccount007*
> 
> My DC2T doesn't. I would say most do not. I question how hard people run their cards at those clocks. If they have v-sync on then it doesn't count. BF4 on ultra at 120% scaling reveals all weaknesses... it's often sustained 100% load.


I'm finding Far Cry 3 to be unforgiving of overclocks on my Vapor X. Clocks that remain stable thru unigine, 3dmark, other games, will cause FC3 to lock up. Even mild overclocks.


----------



## hoevito

Quote:


> Originally Posted by *[CyGnus]*
> 
> hoevito, did you play with the onboard voltage buttons? I was expecting more from a Matrix, since my Asus TOP does 1230/1850 with only 1.256v


I only did it for the bench with that run just to see how high of a score I could get, but yes... I've played with the on-board voltage buttons AND in GPU Tweak as well. The highest absolutely stable clock I can get that works with everything is 1220/1750 no matter what I do. Sure, it will play some games at 1240-1250MHz core (like Battlefield 4 for example), but if I throw a REALLY graphically intensive game like Witcher 2 at it, it starts artifacting within 5 minutes if I go any higher. It also cannot pass repeated runs of Heaven higher than 1230MHz without showing artifacts around the 2nd run. I've thrown 1.28 volts/1260MHz and higher occasionally at it, but at those volts, if it gets any hotter than 63 deg it starts showing instability really quickly. I've got one of those low-ASIC cards (67%), and the card just screams to be water cooled. With better cooling this card in particular could probably top out around 1280 core, methinks...

I also HAD a DC2t, but I sent it back because it was pretty weak. It topped out at 1150/1700 at 1.26 volts, and anytime I tried to take it higher it just couldn't hang. Also considering I'm running crossfire, I wanted a top card that could at least clock similarly to my Matrix so I returned the DC2 and exchanged it for a Toxic. The Toxic turned out to just be a bad card, and would literally artifact in windows and show some nasty screen flickering shortly after booting the computer, so heat/voltage wasn't the issue. 4 different driver sets with clean installs didn't help either, and my setup never had the issue with the Matrix or the DC2 so the Toxic had to go back. I'm waiting on a replacement Toxic right now...but if I can't get this crossfire setup to my liking then I'm strongly considering selling them both and going with a single 780 ti. Getting crossfire to work perfectly, let alone reasonably well, has been an exercise in frustration to say the least...


----------



## neurotix

lol Devildog you updated my OC.

*My 270X can pass Valley and Heaven at 1300mhz but that's about it*







It crashes in games instantly. I can do 1280mhz and pass 3dmark11 (sometimes), 3dmark Vantage and 3dmark06 but it locks up in Firestrike. It's unfortunate because even at 1300mhz it never passes 55C in Valley. If someone would release a modded Trixx 4.6.3 or higher, I could give the card more than 1.3v and it would probably stabilize at 1300mhz, with much more heat. The newer Beta Trixx is the only one that supports this card, older versions don't read the clocks or voltage control correctly. So I can't use the modded Trixx 4.4.0 with memory voltage control.

The highest stable OC I can run is 1250mhz. I'll get lockups and random reboots after gaming for a while at anything higher. Firestrike will freeze up randomly at anything higher. This is with voltage at 1.3v of course. If only I could give it more...


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> lol Devildog you updated my OC.
> 
> *My 270X can pass Valley and Heaven at 1300mhz but that's about it*
> 
> 
> 
> 
> 
> 
> 
> It crashes in games instantly. I can do 1280mhz and pass 3dmark11 (sometimes), 3dmark Vantage and 3dmark06 but it locks up in Firestrike. It's unfortunate because even at 1300mhz it never passes 55C in Valley. If someone would release a modded Trixx 4.6.3 or higher, I could give the card more than 1.3v and it would probably stabilize at 1300mhz, with much more heat. The newer Beta Trixx is the only one that supports this card, older versions don't read the clocks or voltage control correctly. So I can't use the modded Trixx 4.4.0 with memory voltage control.
> 
> The highest stable OC I can run is 1250mhz. I'll get lockups and random reboots after gaming for a while at anything higher. Firestrike will freeze up randomly at anything higher. This is with voltage at 1.3v of course. If only I could give it more...


I would not be too disappointed, 1250 is still a very high clock for a 270x; I don't know anyone else who can take their 270x there and actually be stable. I have had mine as high as 1310/1552, but it ain't stable. By itself I can run my 7870 at 1250/1450 all day in games and benches, and I think that is high. If you can get that close to 1300, try clocking the memory down to 1400 and see if that works.

EDIT: 1310/1552 was the 7870, not the 270x. The 270x Devil's max stable overclock is 1225/1575.


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> I would not be too disappointed, 1250 is still a very high clock for a 270x; I don't know anyone else who can take their 270x there and actually be stable. I have had mine as high as 1310/1552, but it ain't stable. By itself I can run my 7870 at 1250/1450 all day in games and benches, and I think that is high. If you can get that close to 1300, try clocking the memory down to 1400 and see if that works.
> 
> EDIT: 1310/1552 was the 7870, not the 270x. The 270x Devil's max stable overclock is 1225/1575.


I see, I'll have to give it a try. I really think it just needs a little more voltage than 1.3v to be stable at higher clocks though.

What brand of card are you waiting on shipping for btw? You sent the Powercolor Devil back right?


----------



## cremelo

Sup guys, I got my new Asus R9 280X DCU2 TOP...




And at the moment it's the second-best investment in my PC......
For now I'll keep the processor and video card at stock until after the summer heat (here in Brazil), doing some tests to see if anything struggles with this heat.... Eventually I hope to be doing CrossFireX and overclocking, hopefully with a 70% increase in the performance of the video cards.








I'd been watching the forum before getting my video card.


----------



## uaedroid

Quote:


> Originally Posted by *[CyGnus]*
> 
> *uaedroid* the gigabyte overclocks good? Let me know my cosin wants one of those but i am curious on the overclock i was recomending him the Asus Top but that one comes with BF4 and its cheaper


Hello there CyGnus, in my card, I can overclock up to 1195MHz Core Clock and 7200 MHz Memory Clock. Unfortunately, the voltage is locked.


----------



## [CyGnus]

uaedroid thank you very much for that info


----------



## Jaffi

Can it do any harm to a card to do benchmark runs with artifacts? As long as it stays cool, it should be fine, right? I mean, there is a big difference between game stable and bench stable, isn't there?


----------



## gnemelf

I know if I'm artifacting in benchmarks, most of my games will either crash the PC or the video driver... but that's just me.


----------



## Devildog83

cremelo has been added ---- Welcome!


----------



## cremelo

Quote:


> Originally Posted by *Devildog83*
> 
> cremelo has been added ---- Welcome!


Thanks!
OCing sooner than expected :D


----------



## Devildog83

Again, welcome. Newegg has seen fit to replace my 270x Devil, so the new one is on its way, and soon my dual Devils will be blowing the doors off of everyone's single 280x's again.


----------



## [CyGnus]

Quote:


> Originally Posted by *Jaffi*
> 
> Can it do any harm to a card to do benchmark runs with artifacts? As long as it stays cool, it should be fine, right? I mean, there is a big difference between game stable and bench stable, isn't there?


Artifacts are normally the memory clock being too high, but for the score it's ok; though if you do that a lot of times, it will mess up the card eventually.
Bottom line: for putting up a few scores it's ok, but for 24/7 use you must be way lower.


----------



## Devildog83

CyGnus,

How far is Lisboa from Madrid? I was born there.


----------



## [CyGnus]

*Devildog83* about 500Km


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> *Devildog83* about 500Km


That would be a heck of a drive.


----------



## [CyGnus]

Yup it would


----------



## Jaffi

Quote:


> Originally Posted by *[CyGnus]*
> 
> Artifacts are normally the memory clock being too high, but for the score it's ok; though if you do that a lot of times, it will mess up the card eventually.
> Bottom line: for putting up a few scores it's ok, but for 24/7 use you must be way lower.


Yes, that's what I mean. Just get a nice score while accepting some artifacts, but don't use those clocks as a daily driver, of course.


----------



## gnemelf

Well, this is the mess of thermal paste they had on stock, mid-clean. 
The new paste gets an even spread...




card back in pc.



Still think I might have applied the paste wrong somehow... the temps don't make sense to me. I used Prolimatech PK-3, and my idle temps went from 33c-34c to 39c-40c, and full FurMark load went from 72c to 75c.... I just don't see how the stock paste could be that much better. Now, I didn't put it all over those little chips on the GPU, just the main chip.... could that be my problem? I've made sure the HS is seated correctly... I'm just scratching my head over this...


----------



## GoLDii3

Quote:


> Originally Posted by *gnemelf*
> 
> Well, this is the mess of thermal paste they had on stock, mid-clean.
> The new paste gets an even spread...
> 
> 
> 
> 
> card back in pc.
> 
> 
> 
> Still think I might have applied the paste wrong somehow... the temps don't make sense to me. I used Prolimatech PK-3, and my idle temps went from 33c-34c to 39c-40c, and full FurMark load went from 72c to 75c.... I just don't see how the stock paste could be that much better. Now, I didn't put it all over those little chips on the GPU, just the main chip.... could that be my problem? I've made sure the HS is seated correctly... I'm just scratching my head over this...


Where did you get that backplate? Your PCB is reference right?


----------



## gnemelf

The backplate came stock; it's a PowerColor R9 280x from when they were still $299 on Newegg. The backplate was actually the main reason I got this one over the others that were the same price... and I was under the impression that it was a custom PCB, but I could be wrong... if it's the reference PCB, I'll be watercooling this one soon..


----------



## [CyGnus]

gnemelf re-apply the paste (use a bit less) and make sure the screws are well tight


----------



## ILLmatik94

Info for list





1180/1650/1.25V is my max stable OC in BF4 without additional voltage. Temps never exceed 65C @ 50% fan speed although VRM1 reaches 90-95C after a few hours of gaming. Not sure if that's considered safe or not for this particular card but other than that I'm very pleased


----------



## Devildog83

ILLmatik94 has been added, Welcome to the club.

I got an early present - My 270x Devil showed up and X-Fire is working again.


----------



## PerspexPC

@Devildog83
Nice score. I can only get just over 2000 on my Toxic card at normal clock settings.
Overclocked slightly I can get 2200, but games will freeze after about half an hour.


----------



## Devildog83

My FPS is 666, is that a bad Omen?







I am on par with a lot of 290's. I can live with that.

Wait, I would think that Devils should get 666.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> My FPS is 666, is that a bad Omen?
> 
> 
> 
> 
> 
> 
> 
> I am on par with a lot of 290's. I can live with that.


Let's see if you can match my ole 290


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Let's see if you can match my ole 290


It's on like Donkey Kong. LOL

Just got a 19000 + graphics score on 3DMark11

http://www.3dmark.com/3dm11/7709163


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> It's on like Donkey Kong. LOL
> 
> Just got a 19000 + graphics score on 3DMark11
> 
> http://www.3dmark.com/3dm11/7709163


Boom

http://www.3dmark.com/3dm11/7626237


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Boom
> 
> http://www.3dmark.com/3dm11/7626237


I told you it was on par with the 290? Nice score.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> I told you it was on par with the 290? Nice score.


Yea those 270X are pretty good. I kinda miss my ole 7870 Crossfire now lol


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Yea those 270X are pretty good. I kinda miss my ole 7870 Crossfire now lol


If I could get my CPU up to 5.0 Ghz stable that would be nice.


----------



## avesdude

Hey so in case this can help anyone:

I have 2x Sapphire 280x: SAPPHIRE DUAL-X R9 280X 3GB GDDR5 OC (UEFI) BATTLEFIELD 4 EDITION (the 900/1050 core listed on their website).

I had major problems with VRM temps, with VRM 1 hitting 115C and then throttling the card.

I was able to successfully flash to the MSI Bios designed to fix the VRM overheating problems on their cards found here: http://www.techpowerup.com/vgabios/146769/msi-r9280x-3072-131009.html

VRM 1 temp is now 90C at full load, with me setting the fan to 75%. GPU core is 72C. Ambient of 25ish. No throttling observed.
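For anyone tempted to try the same cross-flash, the usual ATIFlash command-line procedure looks roughly like this. This is only a sketch: the adapter index and the filenames are placeholders for your own setup, and flashing a BIOS from another vendor always carries some risk of bricking the card, so back up the stock BIOS first.

```shell
# List detected adapters so you know the index of the card to flash
atiflash -i

# ALWAYS save the current (stock) BIOS first so you can flash back
# if the new BIOS misbehaves
atiflash -s 0 stock_280x_backup.rom

# Program adapter 0 with the downloaded BIOS image.
# -f forces the flash when the subsystem/vendor IDs don't match,
# which is the case for a cross-vendor flash like Sapphire -> MSI.
atiflash -p 0 msi_280x_vrm_fix.rom -f

# Reboot, then sanity-check clocks and voltages with GPU-Z
# before running anything stressful.
```

If the card stops posting afterwards, the saved `stock_280x_backup.rom` can be flashed back from a second GPU or an iGPU using the same `-p` command.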


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> If I could get my CPU up to 5.0 Ghz stable that would be nice.


What's your cooling


----------



## VegetarianEater

Well, you can delete me from the owners list; I just sold mine on eBay for $520 (though I'm still awaiting payment, hopefully that comes soon), and I probably could have gotten more for it... still not a bad profit, over $100 after eBay takes their cut. Going to either hold out for another price shift and hope the 290s come down in price, or just buy a 780 Lightning; or, if the price of 280s ever goes down to $300 again, I'd consider crossfiring the Asus 280x's (my original plan was to crossfire Toxic editions; wish I had bought a second Toxic, I'd have made even more money).


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> What's your cooling


H100i, for some reason when I set the CPU to 1.5v or higher 3DMark11 will not run. I just get the error page. I have had the CPU to 4.9 and 5.0 Ghz before. It's too warm for my taste during benching though. I run 4.7 everyday and stay mid 50's and below.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> H100i, for some reason when I set the CPU to 1.5v or higher 3DMark11 will not run. I just get the error page. I have had the CPU to 4.9 and 5.0 Ghz before. It's too warm for my taste during benching though. I run 4.7 everyday and stay mid 50's and below.


Ahhhh. I'm running a custom loop with 3x240mm rads


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Ahhhh. I'm running a custom loop with 3x240mm rads


That would be nice, with 3DMark11 I think it's voltage/stability issue because it rarely heats up the CPU even at 4.9 Ghz.


----------



## GoLDii3

Quote:


> Originally Posted by *Devildog83*
> 
> That would be nice, with 3DMark11 I think it's voltage/stability issue because it rarely heats up the CPU even at 4.9 Ghz.


You're probably unstable due to the VRMs cooking.


----------



## Devildog83

Quote:


> Originally Posted by *GoLDii3*
> 
> Probably you are unstable due to VRM's cooking.


Nope, I have a 60mm fan on them and they stay in the upper 40's, at least according to HWinfo64. It's not heat because they don't even get time for anything to heat up. It has to be NB volts or CPU/NB volts or something. Durvelle do you multi overclock or FSB?


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> Nope, I have a 60mm fan on them and they stay in the upper 40's, at least according to HWinfo64. It's not heat because they don't even get time for anything to heat up. It has to be NB volts or CPU/NB volts or something. Durvelle do you multi overclock or FSB?


When benching I use a mixture of both, but for daily use I do:

25 multi
200 FSB
2133MHz on RAM
2600MHz NB
2400MHz HT
1.5v VCore


----------



## [email protected]

Hello guys, want to join the club;


----------



## Fonne

Has anybody here with a R9 270X HAWK tried to use the MSI Afterburner Extreme ? - Would like to see how high it unlock the voltage.


----------



## Devildog83

[email protected] has been added. Welcome!


----------



## Frogeye

Add me to your R9 280X list: ASUS DirectCU II @ 1100 / 1600 for now


I'll be adding the NZXT G10


Kuhler 620


and some heatsinks for all of the ram and little vrm things.

Then I'll start pushing this beast. TBC....


----------



## Devildog83

Frogeye, you have been added. Welcome! I look forward to some nice overclocks and pics of the card inside the rig too.


----------



## Butternut101

Woohoo, opened up an early Xmas present and it was an R9 280X. Still got to install it, can't wait!!!


----------



## Devildog83

Well what kind is it. Pics?


----------



## GuMossad

Well, I have an even worse problem with my MSI R9 270x Gaming 2GB!

At the beginning everything was OK. I was using Windows 7 64-bit, installed 13.12 from the AMD site and managed to play CS:GO on a

CPU: AMD FX8120
MB: Asus M5A97 EVO 2.0 (it's recent)
PSU: Tacens 700W Valeo Modular
8GB Corsair 1600Mhz Vengeance
Windows 8.1
MSI R9 270x Twin (BIOS 015.041) SN: 602-V303-03SB1310017043

although with some strange fps drops from 300 to 200 and then 150... But playable and all OK!

I then wanted to try out BF4, installed it, started the game with everything on ULTRA on my super 4:3 1280x1024 screen. Game starts, all cool, fast, people dropping, then suddenly a huge freeze and the system goes all black. Reboot, and as soon as I reached the login screen, dozens of artifacts (coloured lines: green, rose, blue), the image all crazy, and then after login black and white stripes, like Atari 2600 games. Rebooting did nothing and it kept doing it.

Decided to enter Safe Mode: all cool, no artifacts, no problems, nothing at all. Tried to go back to normal mode, artifacts and crazy stuff going on.

After this I decided to install Windows 8.1 and try my luck. Well, as I guessed, the drivers are doing something messy, so fresh clean install, no Catalyst, and everything runs smooth in the system. First thing was to make a FreeDOS pen and update the card's BIOS as soon as I arrived in this forum. All OK, BIOS updated, time to install Catalyst 13.12!

So, nice part: when installing the drivers, in the middle of the setup the screen goes all black, the system dies, nothing happens, and there it is. I can't do anything with it. Nothing. Is the card dead? I think so. I switched PCI Express slots and did almost everything I could remember! I also found some other people complaining like me:

https://forum-en.msi.com/index.php?topic=176012.0

http://www.tomshardware.co.uk/answer...tput-270x.html

I didn't overclock anything, did nothing at all. Just installed Battlefield 4, started the game on Ultra and the computer screen goes all black. It then returns with plenty of colourful lines, and went grey and black... In Safe Mode it works FINE! No problem at all...

HELP?! Return that damn MSI piece of rubbish and get an ASUS R9 270x? This is just damn crazy; I'm even thinking of going to an HD7870 or GTX660 (same price range here in Portugal)!


----------



## Butternut101

VisionTek R9 280X, pics and screenshots are to come!! GF got it for me and she doesn't really know better







so I can't complain


----------



## radier

Quote:


> Originally Posted by *GuMossad*
> 
> Well i have an even worse problem with my MSI R9 270x Gaming 2GB!
> 
> At the beginning everything was OK. I was using Windows 7 64-bit, installed 13.12 from the AMD site and managed to play CS:GO on a
> 
> CPU: AMD FX8120
> MB: Asus M5A97 EVO 2.0 (it's recent)
> PSU: Tacens 700W Valeo Modular
> 8GB Corsair 1600Mhz Vengeance
> Windows 8.1
> MSI R9 270x Twin (BIOS 015.041) SN: 602-V303-03SB1310017043
> 
> although with some strange fps drops from 300 to 200 and then 150... But playable and all ok!


Those fps drops in CS:GO are because of your poor Faildozer CPU. The Source engine has always demanded the best single-core performance (see: Intel).

No problem at all with my MSI 270x Gaming.

Try updating your MB BIOS.


----------



## jamponget9

At last! I now officially own a Toxic R9 270x... thanks for the add, thread master. ^_^

My Specs:

CPU: AMD FX4130 3.8 GHz Black Edition
CPU COOLER: DEEPCOOL Ice Blade
GPU: SAPPHIRE TOXIC OC R9 270x 2GB GDDR5
MOBO: ASROCK 960GM-VGS3 FX
PSU: Zalman 625W
MEMORY: 8GB Patriot DDR3 1333
OS: Windows 7 Ultimate
STORAGE: Seagate 500GB, 1TB WD Caviar Black, 60GB & 80GB Intel X25 SSDs
CASE: Trendsonic Mid-Tower


----------



## Devildog83

Quote:


> Originally Posted by *GuMossad*
> 
> Well i have an even worse problem with my MSI R9 270x Gaming 2GB!
> 
> At the beginning everything was OK. I was using Windows 7 64-bit, installed 13.12 from the AMD site and managed to play CS:GO on a
> 
> CPU: AMD FX8120
> MB: Asus M5A97 EVO 2.0 (it's recent)
> PSU: Tacens 700W Valeo Modular
> 8GB Corsair 1600Mhz Vengeance
> Windows 8.1
> MSI R9 270x Twin (BIOS 015.041) SN: 602-V303-03SB1310017043
> 
> although with some strange fps drops from 300 to 200 and then 150... But playable and all ok!
> 

I had issues with the new 13.12 drivers on my Devil too. I uninstalled that and installed the 13.11 Beta 9.5 again, and all is good. Check voltages if you can, and make sure the PSU is doing its thing, ya know what I mean.


----------



## Devildog83

Butternut101 and jamponget9 have been added. Guys feel free to post you clocks at everyday and high OC if you wish and I will add them to the 1st page. Welcome!


----------



## skitz9417

Hi guys, I'm just wondering: will an R9 280X handle 4800x900 res gaming with no AA?


----------



## GTR Mclaren

Finally my card is here, please add me to the club !!!

Gigabyte R9 270XOC


----------



## JCH979

I don't think I was added when I posted before the TS change so here's this. Oh and thanks Devildog83 for taking the reins







Also Happy Merry Holidays everyone!











Spoiler: My Asus DCUII Top R9 280X



I've been running it at stock clocks and using the 13.11 9.5 beta drivers since I got it 2 weeks ago. Haven't had any problems whatsoever playing a heavily modded Skyrim, Dead Space 3 co-op, Borderlands 2 (got the GOTY for $15 thanks to Valve







), and some other games at the highest settings. This thing is virtually silent and it looks great, I love it. I may play with OC'ing it at some point but haven't had the need to do so with the games I currently play.






I'm still using the 13.11 9.5 beta; has anyone had any issues with the 13.12 drivers? Sorry if this has been asked/answered already, but I haven't been on the forums lately.


----------



## jamponget9

Regarding drivers on the R9, I have found the new Catalyst Software Suite 13.12 to be more stable for my card. Unfortunately, mine is a 270x, so I can't say if it will be the same for the 280x. Why not give it a try: do some stress tests, then if you find it unstable, just roll back to your old driver.


----------



## [email protected]

I have been using 13.12 WHQL since a fresh Windows 8.1 install, about 2 days. Playing NFS: Rivals and BF4 with 1100/1650 clocks, no problems here, and this card is a beast. You can give it a try; I'm sure you will be satisfied.


----------



## Devildog83

Got a couple of X-Mas presents - custom PCIE cables for my Devils and 2 more Corsair red LED fans. As you can see way better.

OLD -


New -



Still need to get some straps so they look uniform but gotta wait till stores are open.


----------



## Devildog83

GTR Mclaren and JCH979 have been added. Welcome. Got clocks? Everyday and OC if you would like them added.


----------



## JCH979

Quote:


> Originally Posted by *Devildog83*
> 
> GTR Mclaren and JCH979 have been added. Welcome. Got clocks? Everyday and OC if you would like them added.


Sure, I'm at stock clocks atm 1070/1600. Very nice looking rig btw Devil











I did post a screen of my desktop with GPU-Z open within the spoiler in my previous post.
Quote:


> Originally Posted by *JCH979*
> 
> 
> 
> Spoiler: post #2702
> 
> 
> 
> I don't think I was added when I posted before the TS change so here's this. Oh and thanks Devildog83 for taking the reins
> 
> 
> 
> 
> 
> 
> 
> Also Happy Merry Holidays everyone!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: My Asus DCUII Top R9 280X
> 
> 
> 
> I've been running it at stock clocks and using the 13.11 9.5 beta drivers since I got it 2 weeks ago. Haven't had any problems whatsoever playing a heavily modded Skyrim, Dead Space 3 co-op, Borderlands 2 (got the GOTY for $15 thanks to Valve
> 
> 
> 
> 
> 
> 
> 
> ), and some other games at the highest settings. This thing is virtually silent and it looks great, I love it. I may play with OC'ing it at some point but haven't had the need to do so with the games I currently play.
> 
> 
> 
> 
> 
> 
> I'm still using the 13.11 9.5 beta, has any one had any issues with the 13.12 drivers? Sorry if this has been asked/answered already but I haven't been on the forums lately.


But there ya go anyways.

Hope y'all are having a nice day







Fixin to have a lil'shindig over here.


----------



## GuMossad

Well Devil, I've tried everything from 13.12 to 13.11 beta 9.5, 9.4, 9.2 and nothing at all, always the same stuff.

I've also updated my mobo BIOS to the latest one. The PSU is running OK, everything fine... just a faulty MSI, or perhaps the drivers suck. A lot of people are saying that the MSI R9 270x dies fast, the VRM is nuts, very incompatible with high, mid and low end mobos... Just crazy. I think I'm going for the Asus R9 270x; at least I can't find anybody saying they died, black screens, systems hanging...


----------



## GTR Mclaren

Quote:


> Originally Posted by *Devildog83*
> 
> GTR Mclaren and JCH979 have been added. Welcome. Got clocks? Everyday and OC if you would like them added.


I won't install the card until Sunday







currently painting and cleaning this mess of a room xD


----------



## jamponget9

My Toxic GPU-Z



This card already comes factory overclocked @ 1150MHz.
I don't plan to OC yet as this is still a new card, but I do hope to get some nice headroom when I do.


----------



## neurotix

What's the ASIC on that Toxic 270X?


----------



## Devildog83

Quote:


> Originally Posted by *GuMossad*
> 
> Well Devil, I've tried everything from 13.12 to 13.11 beta 9.5, 9.4, 9.2 and nothing at all, always the same stuff.
> 
> I've also updated my mobo BIOS to the latest one. The PSU is running OK, everything fine... just a faulty MSI, or perhaps the drivers suck. A lot of people are saying that the MSI R9 270x dies fast, the VRM is nuts, very incompatible with high, mid and low end mobos... Just crazy. I think I'm going for the Asus R9 270x; at least I can't find anybody saying they died, black screens, systems hanging...


I would say RMA it. It sounds bricked. The Tops are very popular, but the Hawk, Toxic and Devil are good choices too; all are great overclockers, but the Devil is voltage locked. You can get over 1550 on the memory easily, but the core clock is limited. I haven't heard bad things about MSI here, but it sounds like they are heard elsewhere.


----------



## Devildog83

Clocks updated. I found the ASIC: my 270x is 78.2 and my 7870 is 78.8. I am also curious what it might be on the Toxic, and the Hawk for that matter, as they seem to clock the highest.


----------



## danilon62

Finally arrived and got it installed


----------



## Devildog83

Awesome - Just curious how much did it cost?


----------



## danilon62

Quote:


> Originally Posted by *Devildog83*
> 
> Awesome - Just curious how much did it cost?


My 280x? 280 euros with shipping and battlefield 4 bundled


----------



## neurotix

ASIC is in GPU-Z. Click the little pci card illustration in the top left and select "Read ASIC Quality".


----------



## Devildog83

Quote:


> Originally Posted by *danilon62*
> 
> My 280x? 280 euros with shipping and battlefield 4 bundled


That is not bad; I see them here for $410 to $450. I don't see why some don't go for 2 x 270x's: at about the same price you get a big performance boost. I know some can't do X-Fire for whatever reason and some just won't, but I would not go back unless it was to a 290 non-ref for close to the same price as what I have into my cards. Those 280x's were $300 here not long ago.


----------



## danilon62

Quote:


> Originally Posted by *Devildog83*
> 
> That is not bad, I see them here for $410 to $450. I don't see why some don't go for 2 x 270x's. At about the same price you get a big performance boost. I know some can't do X-Fire for what ever reason and some just won't but I would not go back unless it was to a 290 non-ref for close to the same price as what I have into my cards. Those 280x's were $300 here not long ago.


And 290s are going for 350 euros here in Spain. I don't know why the prices have increased that much in the USA; it's weird. 7970s are going for 400 dollars on eBay.


----------



## Greg121986

Two MSI 280x. Got them last week for $299 each through NewEgg. I wasn't going to do it, but I had been mining litecoin with my Nvidia GTX 760 and I could not resist. They run UNGODLY hot so I had to remove the sides off my cases and dig the "big boy" 200mm fan out from my stash of crap. I knew the 200mm fan would come in handy _someday_ after I threw out my Antec 1200 case.







I rigged it to a rake holder that I had previously used for a headphone stand. It dropped temps on both cards by 6C.


----------



## Devildog83

Quote:


> Originally Posted by *danilon62*
> 
> And 290s are going for 350 euros here in Spain. I dont know why the prices have increased that much in the USA, Its weird. 7970s are going for 400 dollars on ebay.


Sounds like another good reason to go back to Spain.







I'd love to see what it's like now.


----------



## jamponget9

Quote:


> Originally Posted by *Devildog83*
> 
> Clocks updated. I found Asic, my 270x is 78.2 and 7870 is 78.8. I am also curious what it might be on the Toxic and the Hawk for that matter as they seem to clock the highest.


Before I got this card to replace my old GTX 550 Ti, I'd read somewhere that this card (Toxic R9 270x) was already overclocked so high that there was barely any headroom left to OC. I was hoping to get either the Devil variant or the Toxic 280x, but since this card was the only one available in my area, out of curiosity I grabbed it. I'm a happy owner of this beast nonetheless.


----------



## jamponget9

Quote:


> Originally Posted by *neurotix*
> 
> ASIC is in GPU-Z. Click the little pci card illustration in the top left and select "Read ASIC Quality".


Thanks for the info... might go check that one after work at the office. Just re-formatted my system last night, as I was getting performance hiccups with this new card. I suspected some leftover drivers from my old card might be the culprit; turns out I was right. Having restored all my games and test-driven them this morning, the hiccups were all but gone and I gained 10 fps.


----------



## neurotix

Curious what the ASIC and overclocking capabilities are of the Toxic 270X compared to my Vapor-X 270X. My Vapor-X is a beast and I didn't expect to get such high clocks with it. I'm wondering if they bin chips for the Toxic 270X or not.

I'd think the Toxic should probably do 1250mhz at 1.3v if it has decent ASIC. Maybe more.


----------



## danilon62

Quote:


> Originally Posted by *Devildog83*
> 
> Sounds like another good reason to go back to Spain.
> 
> 
> 
> 
> 
> 
> 
> I'd love to see what it's like now.


Hahahaha, whenever you want, youre welcome


----------



## Devildog83

Quote:


> Originally Posted by *jamponget9*
> 
> Before I got this card to replace my old GTX 550 Ti, I'd read somewhere that this card (Toxic R9 270x) was already overclocked so high that there was barely any headroom left to OC. I was hoping to get either the Devil variant or the Toxic 280x, but since this card was the only one available in my area, out of curiosity I grabbed it. I'm a happy owner of this beast nonetheless.


You should be, it will overclock some I am sure.


----------



## jamponget9

Quote:


> Originally Posted by *neurotix*
> 
> Curious what the ASIC and overclocking capabilities are of the Toxic 270X compared to my Vapor-X 270X. My Vapor-X is a beast and I didn't expect to get such high clocks with it. I'm wondering if they bin chips for the Toxic 270X or not.
> 
> I'd think the Toxic should probably do 1250mhz at 1.3v if it has decent ASIC. Maybe more.


Hmmm... very interesting. One review site on the Toxic 270x claims they were able to OC it up to 1235MHz; that's about 7% higher than the factory-OCed 1150MHz. Going beyond that, according to them, resulted in artifacts etc. Might as well try it, but my current GPU temps idle at 40 deg Celsius and max load at 65 deg Celsius. I live in the tropics, so the current temps are holding me back from OCing this baby.

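As a quick sanity check of the headroom figure quoted above (a minimal sketch, using only the 1150MHz factory and 1235MHz max-OC clocks from this post; the "7%" the review quoted rounds down slightly):

```python
# Overclock headroom of the Toxic 270x, per the clocks quoted above.
factory_mhz = 1150   # factory OC clock
max_oc_mhz = 1235    # max stable clock the review reached

headroom_pct = (max_oc_mhz - factory_mhz) / factory_mhz * 100
print(f"{headroom_pct:.1f}% headroom")  # prints: 7.4% headroom
```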

----------



## Devildog83

Quote:


> Originally Posted by *jamponget9*
> 
> Hmmm... very interesting. One review site on the Toxic 270x claims they were able to OC it up to 1235MHz; that's about 7% higher than the factory-OCed 1150MHz. Going beyond that, according to them, resulted in artifacts etc. Might as well try it, but my current GPU temps idle at 40 deg Celsius and max load at 65 deg Celsius. I live in the tropics, so the current temps are holding me back from OCing this baby.


Ya, my 270x goes to 1235 and almost 1600 on mem. That should get close at least.

It's weird though; my 7870 will bench at over 1280 but only 1450 on the memory. Now that I think about it, I could get to 1.3v on the core, where the 270x is stuck at 1.265 to 1.287 max.


----------



## neurotix

Quote:


> Originally Posted by *jamponget9*
> 
> Hmmm... very interesting. One review site on the Toxic 270x claims they were able to OC it up to 1235MHz; that's about 7% higher than the factory-OCed 1150MHz. Going beyond that, according to them, resulted in artifacts etc. Might as well try it, but my current GPU temps idle at 40 deg Celsius and max load at 65 deg Celsius. I live in the tropics, so the current temps are holding me back from OCing this baby.


Yeah, you're going to be temp limited. You could probably still run it at 1200mhz, possibly with a voltage bump. If your card passes 70C you're probably going to have issues such as display driver crashes and artifacts, so as long as you can keep temps below 70C you should be okay.


----------



## GuMossad

Quote:


> Originally Posted by *Devildog83*
> 
> I would say RMA it. It sounds bricked. The Tops are very popular but the Hawk, Toxic and Devil are good choices too, all are great overclockers but the Devil is Voltage locked. You can get over 1550 on the memory easy but the core clock is limited. I haven't heard bad things about MSI here but it sounds like they are heard elsewhere.


Yeah... you know, I'm a bit afraid of changing to another R9 270x... I'm reading so much bad stuff about them... I'm thinking of getting an Asus 7870 Top instead; is it a bad choice? At least I haven't found anybody talking about black screens or drivers failing to install with Asus! Just the flickering and some fps drops where they needed to restart the computer... (The Toxic is nice and has Hynix memory instead of Elpida, but it's far more expensive!) What would you do in my case? Stick with the R9 270x or go to the 7870 Top from Asus?


----------



## Devildog83

Quote:


> Originally Posted by *GuMossad*
> 
> Yeah... you know, I'm a bit afraid of changing to another R9 270x... I'm reading so much bad stuff about them... I'm thinking of getting an Asus 7870 Top instead; is it a bad choice? At least I haven't found anybody talking about black screens or drivers failing to install with Asus! Just the flickering and some fps drops where they needed to restart the computer... (The Toxic is nice and has Hynix memory instead of Elpida, but it's far more expensive!) What would you do in my case? Stick with the R9 270x or go to the 7870 Top from Asus?


The 270's are Mantle ready and Eyefinity compatible, so I would not go backwards. The flickering and black screens can be solved with drivers; I have not had those issues with the newer beta drivers. Unlike for others, the 13.12 driver did not like my X-Fire setup. If you're just looking for a good gaming experience the Asus 270 seems to be a good card; if you want higher clocks I would go with a Toxic, Hawk or Devil.


----------



## GuMossad

Quote:


> Originally Posted by *Devildog83*
> 
> The 270's are mantle ready and eyefinity compatable so I would not go backwards. The flickering and blackscreens can be solved with drivers, I have not had those issues with newer beta drivers. Unlike others the 13.12 driver did not like my X-Fire set-up.


Yeah, I'm going to the shop tomorrow and will ask them, before exchanging cards, to try out the card and see if everything is OK with the Asus R9 270X! I think that going backwards is also stupid... but well, I prefer to play it safe! Let's see... I bet an Asus GPU on an Asus mobo has gotta work! The flickering can be solved with drivers and with the vBIOS update!

If you Google "MSI R9 270x black screen, problems" you will find loads of threads. If you search for the Asus R9 270x you will only find stuff talking about MSI LOL!

I'm just sad because I bought an FX8120... they didn't have the 8320 in stock







and they don't exchange me CPU's... Maybe the best way is to burn it up







?


----------



## jamponget9

Quote:


> Originally Posted by *GuMossad*
> 
> Yeah... you know, I'm a bit afraid of changing to another R9 270x... I'm reading so much bad stuff about them... I'm thinking of getting an Asus 7870 Top instead; is it a bad choice? At least I haven't found anybody talking about black screens or drivers failing to install with Asus! Just the flickering and some fps drops where they needed to restart the computer... (The Toxic is nice and has Hynix memory instead of Elpida, but it's far more expensive!) What would you do in my case? Stick with the R9 270x or go to the 7870 Top from Asus?


Here are my thoughts on the 270x's:

Sapphire Toxic - solid reviews and performance, short warranty (2 yrs) from Sapphire and no game bundle, quite pricey.

Sapphire Vapor-X - a friend of mine got this card because it's cheaper. Decent performance, short warranty, no game bundle; this card is far cheaper than the Toxic.

PowerColor Devil - only read some reviews, and like the Toxic this card is also a solid performer, also a bit pricey.

Asus 270x - my cousin owns one and it's also a solid performer. Maybe by chance you just got a bad card; I suggest you RMA it.

Edit: I might add, you should update your mobo to the latest BIOS. New GPU drivers might also alleviate your issues. Also check your PSU to make sure your card is getting enough juice.


----------



## jamponget9

Quote:


> Originally Posted by *Devildog83*
> 
> The 270's are Mantle ready and Eyefinity compatible, so I would not go backwards. The flickering and black screens can be solved with drivers; I have not had those issues with the newer beta drivers. Unlike for others, the 13.12 driver did not like my X-Fire setup. If you're just looking for a good gaming experience the Asus 270 seems to be a good card; if you want higher clocks I would go with a Toxic, Hawk or Devil.


I totally agree... the 270x performs better than the 7870, so there's no reason to go back.


----------



## Devildog83

I don't know where you buy from but at Newegg the prices are -

Asus Top - $210
Devil - $220
Toxic - $230
Hawk - $230
All others $200 to $220. Not a lot of difference there, but it might be elsewhere.


----------



## Devildog83

Quote:


> Originally Posted by *GuMossad*
> 
> Yeah, I'm going to the shop tomorrow and ask them before exchanging cards, to try out the card and see if everything is OK with Asus R9 270X! I think that going backwards is also stupid... But well i prefer to play secure! But let's see... I bet Asus GPU on an Asus Mobo gotta work it all! The flickering can be solved with drivers and with the vbios update!
> 
> If you write on google MSI R9 270x black screen, problems u will find loads of threads. If you write about Asus R9 270x u will only find stuff talking about MSI LOL!
> 
> I'm just sad because I brought a FX8120... they didn't had in stock 8320
> 
> 
> 
> 
> 
> 
> 
> and they don't exchange me CPU's... Maybe the best way is to burn it up
> 
> 
> 
> 
> 
> 
> 
> ?


That's funny. That 8120 should handle a 270x, but what do I know; I went from the 4100 to an 8350 and from an M5A99X Evo to a CHVFZ. Skipped right by the 8120.

You should do a sig rig. Go to the upper right-hand corner of this page, click Rig Builder and list what you have so we can see.


----------



## GuMossad

Quote:


> Originally Posted by *jamponget9*
> 
> here are my thoughts on 270x's
> 
> sapphire toxic - solid reviews and performance, short warranty (2 yrs) from sapphire and no game bundle, quite pricey.
> 
> sapphire vaporx - a friend of mine got this card because its cheaper. decent performance, short warranty, no game bundle, this card is far cheaper than toxic.
> 
> powercolor devil - only read some reviews and like the toxic, this card is also a solid performer, also a bit pricey.
> 
> Asus 270x - my cousin owns one and also a solid performer. maybe by chance you only got a bad card. i suggest you need to RMA it.
> 
> Edit: i might add, you might upgrade your mobo to the latest BIOS. new gpu drivers might also alleviate your issues. also check your psu if your card is getting enough juice.


Yeah, I think it's a bad card... The retailer will have to exchange it. I'm going for Asus for sure; I've read good things about it! I'm on the latest BIOS on the Asus M5A97 Evo 2.0 mobo! Well, the 700W PSU is good. GPU-Z will tell me the power, right? How can I check it?

I will post GPU-Z screens in a while


----------



## GuMossad

This is the GPU-Z info without any ATI drivers installed, since I can't install them thanks to the black screen in the middle of installation!

Any more info?


----------



## Devildog83

Good luck on the new card my friend.


----------



## Falconx50

Gigabyte GA-990XA-UD3
FX 6100
16GB G.Skill Sniper series memory (1600) DDR3
Win 7 Pro
Cooler Master 750W Bronze
PowerColor Devil R9 270x


----------



## Devildog83

Quote:


> Originally Posted by *Falconx50*
> 
> How do I post a pic into this thread?


When you click post there is a bunch of stuff at the top, the one that has mountains and a sun in a square box is for posting pics.


----------



## jamponget9

Quote:


> Originally Posted by *GuMossad*
> 
> Yeah I think it's a bad card... Retailer will have to exchange it. I'm going for Asus for sure, I read good thing about it! I'm on the latest BIOS on Asus M5A97 Evo 2.0 mobo! Well the 700W PSU is good, GPU-Z will tell me the power right? How can I check it?
> 
> I will post GPU-Z screens in a while


Err... I didn't mean to say MSI is a "bad card"; maybe it's just a factory defect or something. 700W is more than enough to power up your card. Have you considered doing a fresh install of Windows with your new card? My issues with my 270x went away after doing a fresh install.


----------



## GuMossad

Quote:


> Originally Posted by *Devildog83*
> 
> Good luck on the new card my friend.


Thank you my friend! Will keep you guys updated so I can really join the club of R9 270x owners


----------



## GuMossad

Quote:


> Originally Posted by *jamponget9*
> 
> err.. didnt mean to say msi is a "bad card" maybe just a factory defect or something. 700w is more than enough to power-up your card. did you try considering on doing a fresh install of windows with your new card? My issues with my 270x went away after doing a fresh install.


I did two Windows 7 installs and two 8.1 installs! Always the same! On Windows 7 it's worse! I think the GPU went bananas and burnt when I first launched BF4... after that, black screen, music playing, had to reset. When it rebooted I had colourful line artifacts, and then after login the screen went black and white stripes all over. The funny thing is that in Safe Mode nothing happens. Drivers are also a problem, I'm aware of that, but the vBIOS is the most recent one (015.041) and at the time I was using 13.12.

MSI is a nice brand, I agree; that's why I chose them. But this card has some factory defect or something! Going to ask the retailer to try out the Asus Top on my machine and see how it goes...


----------



## Devildog83

You have been added Falconx50. Nice card.







What kind of clocks do you get?


----------



## jamponget9

Yep, maybe it's really defective. Oh well, good luck with it...


----------



## jamponget9

Quote:


> Originally Posted by *neurotix*
> 
> Curious what the ASIC and overclocking capabilities are of the Toxic 270X compared to my Vapor-X 270X. My Vapor-X is a beast and I didn't expect to get such high clocks with it. I'm wondering if they bin chips for the Toxic 270X or not.
> 
> I'd think the Toxic should probably do 1250mhz at 1.3v if it has decent ASIC. Maybe more.


Hello, with regards to the ASIC of Sapphire Toxic 270x, here it is.


----------



## DiceAir

The ASIC quality on one of my cards is 62% and on the other 55%. Looks really bad; no wonder I can't get my overclocking stable, and I have a stock voltage of 1.26V.


----------



## [CyGnus]

Quote:


> Originally Posted by *DiceAir*
> 
> My asic quality on one card is 62% and the other 55%. looks really bad no wonder I can't get overclocking stable and have stock voltage of 1.26V


ASIC does not mean anything really; my 280x is 56.4% and it does 1235/1850

http://www.3dmark.com/3dm11/7656094


----------



## DiceAir

Quote:


> Originally Posted by *[CyGnus]*
> 
> Asic does not mean anything really my 280 x is 56.4% and it does 1235/1850
> 
> http://www.3dmark.com/3dm11/7656094


So, any suggestions on how I can get my overclock stable? What is the max voltage for the memory and what is the max for the core? I played BF4 @ 1150/1750 with the memory at 1.6v, and suddenly I got black and grey stripes across my screen; then when I restarted my PC my memory was stuck @ 700MHz.


----------



## TempAccount007

Quote:


> Originally Posted by *[CyGnus]*
> 
> Asic does not mean anything really my 280 x is 56.4% and it does 1235/1850
> 
> http://www.3dmark.com/3dm11/7656094


You can run that at sustained 100% load with no artifacts? That's amazing.


----------



## [CyGnus]

Quote:


> Originally Posted by *TempAccount007*
> 
> You can run that at sustained 100% load with no artifacts? That's amazing.


No, those are my max clocks for benching; for 24/7 I run 1200/1625 at 1.256v.


----------



## DiceAir

Quote:


> Originally Posted by *[CyGnus]*
> 
> No that is my max clocks for benching for 24/7 i run 1200/1625 with 1.256v


Wow, amazing. Maybe my cards are screwed, as my idle voltage is 1.260.


----------



## Jaffi

Quote:


> Originally Posted by *[CyGnus]*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DiceAir*
> 
> My asic quality on one card is 62% and the other 55%. looks really bad no wonder I can't get overclocking stable and have stock voltage of 1.26V
> 
> 
> 
> Asic does not mean anything really my 280 x is 56.4% and it does 1235/1850
> 
> http://www.3dmark.com/3dm11/7656094
Click to expand...

I second that. All the newer 280X cards I have seen and owned had an ASIC around 60%. I guess that has to do with the revised XTL chip.

Tapatalked from my Nexus 5


----------



## DiceAir

So the question is: I'm running a Qnix QX2710 at 96Hz. Should I overclock when I can get pretty close to 96fps in some games without it? I overclocked my GTX 570 SLI and a single GTX 280 with great success, but that was on an old 1080p screen, not this one. I only have experience with MSI cards, and now that I'm running a Club3D card (got it for a bargain), it has no heatsinks on the memory; the only thing cooling it is the heatsink covering the whole card, so I'm a bit worried about temps there. I also tried 1200MHz, but from what I remember I had some issues at 1.263v (stock voltage). Maybe these cards are just bad in general. I wish I had bought the Sapphire Toxic or similar instead.


----------



## jamponget9

Quote:


> Originally Posted by *DiceAir*
> 
> so any suggestions how i can get my overclock stable. What is the max voltage for memory and what is the max for core voltage. I played bf4 @ 1150/1750 and I was @ 1.6v memory and suddenly I got black and grey stripes across my screen and then when I restarted my pc my memory was stuck @ 700mhz


There are several factors why OCing your card may not be successful. First, your board may be limiting how far you can OC. Second, insufficient power to the card (you might want to increase voltage, but be careful of the card temps). Third, the card may be throttling, or your OC setting is too high: start at stock and go up bit by bit from there until you find your sweet spot. Lastly, the card might already be factory-OCed so high that your OC headroom is very small (like my Toxic). Most R9s are already stock-OCed, so expect a bit of a challenge making a stable OC. And remember, cards of the same brand and label don't always overclock as high as each other. Artifacting and black screens are warning signs that your card is running at its limits; take it down a notch so you don't damage your card further.
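The "start at stock and go up bit by bit" advice above can be sketched as a simple search loop. This is a hypothetical illustration only: `is_stable` stands in for whatever stability test you actually run (a benchmark pass plus an artifact check), and the clock numbers are made up.

```python
# Hypothetical sketch of the "go up bit by bit" approach described above.
# is_stable is a stand-in for a real stability test, NOT an actual API.

def find_sweet_spot(stock_mhz, max_mhz, step_mhz, is_stable):
    """Walk the core clock up from stock until the stability test fails,
    then keep the last stable step (the 'take a notch down' advice)."""
    best = stock_mhz
    clock = stock_mhz + step_mhz
    while clock <= max_mhz:
        if not is_stable(clock):
            break          # artifact/black-screen territory: stop here
        best = clock
        clock += step_mhz
    return best

# Example: pretend the card starts artifacting at 1160 MHz and above.
sweet_spot = find_sweet_spot(1070, 1300, 10, lambda mhz: mhz < 1160)
print(sweet_spot)  # 1150
```

The point is only the shape of the process: small steps, test each one, and settle one notch below the first failure.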


----------



## jamponget9

Quote:


> Originally Posted by *DiceAir*
> 
> So the question is I'm running a Qnix Qx2710 on 96Hz. Should i overclock as I can get pretty close to 96fps in some games without the overclock? I overclocked my GTX570 sli, GTX280 single card with great success but not on this screen on an old 1080p screen. I only have experience with MSI cards and now that I'm running CLUB3d (got it for bargain) has no heatsinks on the memory only thing cooling them is the heatsink covering the whole card so a bit worry about temps there. I also tried 1200MHz but what I could remember is having some issues on 1.263v (stock voltage). Maybe these cards are just bad in general. Wish I bought the Sapphire toxic or so instead of these cards.


My personal opinion is that anything beyond 60fps is more than enough for me, so there is no point in pushing my card further unless I'm benchmarking my system. If your card is a low-profile design with no fans whatsoever, leave it at stock; if you do plan to OC it, you might want an aftermarket cooling solution for the card. Again, make sure to monitor your GPU temps.


----------



## [CyGnus]

Quote:


> Originally Posted by *Jaffi*
> 
> I second that. All the newer 280x cards I have seen and owned had an asic around 60%. I guess that has to do with the revised xtl chip.
> 
> Tapatalked from my Nexus 5


Could be; my card is an XTL and the stock voltage is 1.15v.


----------



## Devildog83

I have read extensively about ASIC quality as it relates to performance, and what I've gathered is that it really only tells you what VID is needed to achieve stock clocks, not how far a chip can overclock; it also only concerns the GPU, not the memory chips. A card that leaks voltage at stock may not stay that way when overclocked. GPU-Z says a lower ASIC requires a bit more volts and a higher ASIC requires less, but time after time, forum posts and responses from reviewers have proven that wrong, e.g. the card CyGnus uses. My cards are both over 78%, but I don't feel they overclock better than, say, a Hawk or Toxic; maybe I will be proven wrong there, as I have too small a data set to confirm it, but we will see.

My conclusion is that heat might be the only thing affected, and even that might not be by much, due to the advances in non-reference cooling and W/C options.

This is just my opinion, you can take it for what it is. I am no engineer.


----------



## Devildog83

New 3DMark11 high score for me - I love this set-up.









http://www.3dmark.com/3dm11/7721509


----------



## Jaffi

Today I did some OCing and benchmarking, and after some time I checked the temp readings in GPU-Z. It said VRM1 had reached a max of 200°C and VRM2 a min of 15°C. Both are very unlikely; might they have been misreads or such? I tested again afterwards and everything was fine. The GPU never went above 77°C.
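Those 200°C/15°C readings look like classic sensor misreads. A rough way to flag them in a sensor log, with a plausibility window that is purely my own assumption (not anything from GPU-Z):

```python
# Rough sanity filter for GPU-Z-style VRM temperature logs. The 20-150 C
# window is an assumed plausibility range for an air-cooled card, not an
# official spec.

PLAUSIBLE_C = (20.0, 150.0)

def suspect_readings(samples, lo=PLAUSIBLE_C[0], hi=PLAUSIBLE_C[1]):
    """Return the readings that fall outside the plausible window."""
    return [t for t in samples if t < lo or t > hi]

vrm1 = [78.0, 81.5, 200.0]   # the 200 C max from the post
vrm2 = [15.0, 72.0, 74.5]    # the 15 C min from the post
print(suspect_readings(vrm1))  # [200.0]
print(suspect_readings(vrm2))  # [15.0]
```

Anything the filter flags is worth re-checking with a second tool (HWiNFO64, as suggested below in the thread) before worrying about it.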


----------



## Devildog83

Quote:


> Originally Posted by *Jaffi*
> 
> Today I did some OCing and benchmarking and after some time I checked temp readings in GPU Z. It said that VRM1 had reached a max of 200° C and VRM2 a min of 15° C. That is both very unlikely, might have been a wrong reading or such? I tested again after that and everything was fine again. GPU never was above 77° C.


Strange; try using HWiNFO64 instead to monitor your GPUs.


----------



## [CyGnus]

Quote:


> Originally Posted by *Devildog83*
> 
> New 3DMark11 high score for me - I love this set-up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7721509


Not bad, but for CFX 7870s I was expecting more; I get 143xx with a single 280X...
Are you sure they are both being used 100% when running 3D?


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> Not bad but CFX of 7870's i was expecting more i have 143xx with a 280x...
> Are you sure they are both being used 100% when running 3D?


What's your graphics score? My CPU score holds my P score back. Does the 280X get near 20,000 on graphics?


----------



## TempAccount007

Quote:


> Originally Posted by *[CyGnus]*
> 
> Could be, my card is XTL and the stock voltage is 1.15v


How can you tell you have an XTL? I have a DC2T as well.


----------



## [CyGnus]

Check in hardware info (HWiNFO).


----------



## TempAccount007

BIOS : 015.039.000.000
GUID : VEN_1002&DEV_6798&SUBSYS_30061043&REV_00&BUS_1&DEV_0&FN_0

Do either of those tell me which version of the chip I have?


----------



## [CyGnus]

My Asus TOP has that same BIOS, so it might be an XTL, but just install HWiNFO64 and check there.

Check the pic: the first ones only had Tahiti XT, not XT/XTL.


----------



## TempAccount007

OK mine shows the same. Thanks for the help


----------



## [CyGnus]

Also, the XTL runs way cooler than the XT; my card averages 55/56°C at full load (45% fan speed, 13°C ambient).


----------



## [email protected]

Has anyone noticed there is a new BIOS update for the ASUS 280X DCU II TOP?


----------



## ILLmatik94

Quote:


> Originally Posted by *[email protected]*
> 
> Is anyone noticed there is a new bios update for ASUS 280x DCU II TOP?


I tried updating and this is what happened



My bios is still the same as before (15.39.0.0.AS04)


----------



## [email protected]

I tried too, and the flash failed the same as yours.


----------



## [CyGnus]

same boat here


----------



## DizzlePro

I'm planning to get an R9 280X within the next two weeks.

If all cards were the same price, which would you recommend?


----------



## [CyGnus]

DizzlePro: the Asus 280X DC2 TOP. Very quiet, voltage unlocked, and low temps.


----------



## Devildog83

Quote:


> Originally Posted by *DizzlePro*
> 
> im planning to get a R9 280X within the next too weeks.
> 
> if all cards where the same price then which would u recommend ?


Which one would look better in your case? Just kidding. There are enough guys here with the 280X to get a good answer; the Asus DCII TOP seems to be the most popular.


----------



## Archea47

Thanks for showing me where to look with HWiNFO64; my Gigabyte 280X Rev 2s come up as Tahiti XT/XTL too.


----------



## GuMossad

Well guys, after all, MSI does suck... I've counted at least 20 people complaining about their cards being dead like mine!

The retailer gave me an Asus DCUII TOP as I requested, and everything is smooth. No flickering, no nothing! Updated to the latest BIOS (015.041) and running 13.12, everything is smooth! BF4 at a constant 61 fps on my FX-8120 (no OC yet), with the GPU at 72°C and 100% fan (that's my fault, as my current case sucks and I need fans).

So if anyone here runs into problems with cards going all black, hanging the system and so on, from MSI (also PowerColor, Gigabyte, and some Sapphire entry-level), all you gotta do is exchange it at your retailer for an Asus TOP, Sapphire Toxic, or PowerColor Devil... good manufacturers.

As soon as I get my new BitFenix Merc Alpha, be quiet! SilentWings 2 120mm, and my Hydro H60, things will be very smooth!

Thanks to all the people who helped me and were patient!


----------



## neptunus

Is there any nvinspector-like tool for AMD?

*Update:* RadeonPro...


----------



## jamponget9

Quote:


> Originally Posted by *Archea47*
> 
> Thanks for showing me where to look with HWInfo64 - my Gigabyte 280x' Rev2s come in as Tahiti XT/XTL too


I also just checked HWiNFO64; my Toxic 270X is a Curacao XT. So, what does that mean?


----------



## [CyGnus]

jamponget9, that is only for the 280X; the XTs are 7970 rebrands, and the XTL is the new chip with lower power consumption.

Check this article:
http://wccftech.com/amd-preparing-tahiti-xtl-revision-radeon-r9-280x-graphic-card/

''The new Tahiti XTL chip would bring lower power consumption, possibly new features such as AMD TrueAudio and full DirectX 11.2 support and even higher clocks. There would still be a large quantity of Tahiti XT2 chips in the market but those looking for the XTL based cards would have to search through their purchases since their isn't going to be any differentiating factor between the two variants. Pricing of the Radeon R9 280X will remain unaffected and supply won't be an issue since AMD always has those left-overs Tahiti XT2 chips to rebrand as the R9 280X graphic card.''


----------



## [email protected]

Quote:


> Originally Posted by *GuMossad*
> 
> Retailer gave me an Asus DCUII Top as I requested and everything is smooth. No flickering, no nothing! Updated with the latest bios (015.041) and running 13.12 everything is smooth! BF4 constant 61 fps on my FX8120 (no OC yet) and GPU going at 72º C 100% Fan (this is my fault as my actual case sucks balls and i need fans
> 
> 
> 
> 
> 
> 
> 
> )


How did you update your BIOS? I tried so many times via GPU Tweak; the flash always fails.


----------



## Devildog83

Quote:


> Originally Posted by *GuMossad*
> 
> Well guys after all MSI do sucks... I've counted at least 20 people complaining about their cards being dead like mine!
> 
> Retailer gave me an Asus DCUII Top as I requested and everything is smooth. No flickering, no nothing! Updated with the latest bios (015.041) and running 13.12 everything is smooth! BF4 constant 61 fps on my FX8120 (no OC yet) and GPU going at 72º C 100% Fan (this is my fault as my actual case sucks balls and i need fans
> 
> 
> 
> 
> 
> 
> 
> )
> 
> So If anyone here runs into problems with cards going all black, hanging system and all from MSI (also Powercolor, Gigabyte and some Sapphire entry level) all u gotta do is exchange in your retailer for an Asus Top, Sapphire Toxic, Powercolor Devil... good brand manufactuers.
> 
> As soon as I get my new Bitfenix Merc Alpha, BeQuiet Silentwings 2 120MM and my Hydro H60 things will be very smooth
> 
> 
> 
> 
> 
> 
> 
> !
> 
> Thanks for all the people who helped me and been patient!


Awesome


----------



## Devildog83

Quote:


> Originally Posted by *[email protected]*
> 
> How did you update your bios? I tried so many times via GPU Tweak, always flash fail.


Use a flash drive, just like you would for your motherboard BIOS; you will have better success.


----------



## [CyGnus]

Devildog83, where is that new BIOS though... 15.41?!?


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> Devildog83 were is that new bios though... 15.41?!?


It appears there isn't really a BIOS update, from what I have read. I cannot explain why it shows in GPU Tweak, but nobody can find one. The Asus forum says that if it's not on the website, there is none.


----------



## [CyGnus]

Thanks


----------



## GuMossad

Quote:


> Originally Posted by *[email protected]*
> 
> How did you update your bios? I tried so many times via GPU Tweak, always flash fail.


Well mate, I went to the Asus.com website, to the support area, picked the card, chose BIOS, and got an exe with WinFlash and so on. Just execute it and it's done.


----------



## Devildog83

Quote:


> Originally Posted by *GuMossad*
> 
> Well mate I went to Asus.com website, to the support area, picked up the card, chose BIOS and got an exe from winflash bla bla
> 
> 
> 
> 
> 
> 
> 
> just execute it and it's done


OK, I stand corrected. I could not find it anywhere.


----------



## GuMossad

Quote:


> Originally Posted by *Devildog83*
> 
> OK, I stand corrected. I could not find it anywhere.


http://www.asus.com/es/Graphics_Cards/R9270XDC2T2GD5/#support

There it is.

Go to Drivers and Downloads, then pick BIOS!


----------



## [CyGnus]

That one is for the 270X; I thought the update was for the 280X.


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> that one is for 270X i thought the update was for 280X


Quote:


> Originally Posted by *GuMossad*
> 
> http://www.asus.com/es/Graphics_Cards/R9270XDC2T2GD5/#support
> 
> There it is
> 
> 
> 
> 
> 
> 
> 
> Go to Drivers and Downloads, then pick up BIOS!


I could be wrong, but they are looking for the 280X, which has no update.

Oops, beaten to the punch.

The updated BIOS is for the flickering on the 270X only.


----------



## GuMossad

Quote:


> Originally Posted by *Devildog83*
> 
> I could be wrong but they are looking for the 280x which has no update.
> 
> Opps beat to the punch.
> 
> The updated bios is for flickering on the 270x only.


Yeah, LOL, a little misunderstanding on all sides! But yeah, 270X owners, just do it.


----------



## [CyGnus]

Weird how the BIOS shows up as an update in Asus GPU Tweak for 280X owners, though...


----------



## Jaffi

I am very confused


----------



## Devildog83

I am always confused.


----------



## hyp3rtraxx

Will my PSU be able to handle 2x "XFX Radeon R9 280X 3GB (Tahiti XTL)" in CrossFire, not overclocked? I will use it for light gaming (Dota 2) and mining.


----------



## Devildog83

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> Will my psu be able to handle 2*"XFX Radeon R9 280X 3GB (Tahiti XTL)" crossfire not overclocked? Will use for light gaming (dota 2) and mining?


Yes, it will do fine. I would be surprised if you use much more than 650W at full load, maybe 700W max, and even that only if you overclock the cards.
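For a rough sanity check of that estimate: AMD's typical board power for the R9 280X is around 250W, so a back-of-the-envelope budget (the CPU and "rest of system" numbers below are placeholder assumptions, not measurements) lands right around the 700W figure:

```python
# Back-of-the-envelope power budget for the 2x 280X CrossFire question.
# The per-GPU figure is roughly the R9 280X typical board power; the
# CPU and rest-of-system numbers are assumed placeholders.

gpu_w = 250          # per R9 280X, roughly, at stock
cpu_w = 125          # assumed CPU TDP
rest_w = 75          # board, drives, fans: a generous guess

total_w = 2 * gpu_w + cpu_w + rest_w
print(total_w)  # 700
```

Overclocking with extra voltage pushes the GPU figure up, which is why the 700W ceiling only really comes into play with an OC.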


----------



## 2JZ

Is there any reason why it may be a bad idea to use MSI Afterburner with an ASUS R9 280X?

In GPU Tweak (ASUS's software) I am unable to touch memory voltage; however, in MSI Afterburner I can. Is this a bad idea?


----------



## Devildog83

Quote:


> Originally Posted by *2JZ*
> 
> Is there any reason why it may be a bad idea to use MSI Afterburner for an ASUS R9 280x?
> 
> In GPU tweak (ASUS software) I am unable to touch memory voltage however, in MSI afterburner I can. Is this a bad idea?


No, a lot of people use Afterburner for exactly that reason. I have PowerColor cards, and the only tool that works for me is Afterburner. Just remember to turn off graphics OverDrive in CCC; it seems to cause stability issues when another overclock tool is in use.


----------



## 2JZ

Quote:


> Originally Posted by *Devildog83*
> 
> NO, a lot of people use Afterburner for that reason. I have Powercolor cards and the only one that works for me is Afterburner. Just remember to turn off Graphics overdrive in CCC. It seems to cause stability issues with another overclock tool in use.


Pardon my ignorance but what is CCC?


----------



## Devildog83

Quote:


> Originally Posted by *2JZ*
> 
> Pardon my ignorance but what is CCC?


AMD's Catalyst Control Center; it took me a minute to figure that out the first time I saw it too.


----------



## 2JZ

Quote:


> Originally Posted by *Devildog83*
> 
> AMD's Catalyst Control Center, took me a minute to figure that out the 1st time I saw it too.


Thanks!

Any idea what safe voltages for memory are in Afterburner? I always monitor temps via my G15 LCD, FYI.


----------



## Devildog83

Quote:


> Originally Posted by *2JZ*
> 
> Thanks!
> 
> Any idea on what save voltages for memory are in afterburner? I always monitor temps via my g15 LCD fyi


For my card, Afterburner does not show memory voltage, just the clock, so I can't answer that.


----------



## Joeking78

Quote:


> Originally Posted by *2JZ*
> 
> Thanks!
> 
> Any idea on what save voltages for memory are in afterburner? I always monitor temps via my g15 LCD fyi


There is no voltage adjustment for memory on the 280X, AFAIK.


----------



## Takla

Quote:


> Originally Posted by *2JZ*
> 
> Is there any reason why it may be a bad idea to use MSI Afterburner for an ASUS R9 280x?
> 
> In GPU tweak (ASUS software) I am unable to touch memory voltage however, in MSI afterburner I can. Is this a bad idea?


Memory voltage might look adjustable, but it won't actually change; it will stay at 1.6v. Also, core voltage on the Asus GPUs can only be changed with GPU Tweak, since Asus didn't allow it for any other tool.


----------



## 2JZ

Quote:


> Originally Posted by *Joeking78*
> 
> There is no voltage adjustment on memory for 280x afaik.


In Afterburner there is a little arrow to the right of the voltage; hit that to open a menu for memory voltage.


----------



## 2JZ

Quote:


> Originally Posted by *Takla*
> 
> memory voltage might look like to be adjustable but it wont actually change. it will stay on 1.6v. also core voltage for the asus gpu"s might only be changed with gpu tweak since asus didn't allow it for any other tool.


That absolutely makes sense. I would push the clocks on stock voltage until it became unstable, and then adding voltage all the way to 1.3v did not help.


----------



## 2JZ

My ASUS R9 280X DC2T v2 does have memory voltage adjustment in GPU Tweak as well; it was hidden within the settings.


----------



## Jaffi

I personally don't mess with memory voltages; the gain is too small, if there is any at all, and the risk is too big IMHO.
Hey Devildog, could you add me to the club? I posted a couple of pics a while back: http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-owners-club/2540#post_21428898
My standard clocks are 1150/1700 @ 1.2v, which I am running most of the time!
By the way, I have been running the card for over a week now and have nothing bad to say about it. Highly recommended!

Thanks


----------



## Devildog83

Jaffi has been added; it was just a couple of days before I took over, and I missed it. You're in now.


----------



## Devildog83

I don't do Firestrike much, but is this a decent score?



I did break 20,000 graphics on 3DMark11


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> I don't do Firestrike much bit is this a decent score?
> 
> 
> 
> I did break 20,000 graphics on 3DMark11


That's a decent score for Fire Strike; anything over 10k would probably be considered a very high-spec system, AFAIK.

Try running Fire Strike Extreme if you can, that one is pretty much the most demanding modern graphics card test you can run.

2x 3dmark Fire Strike Extreme ranking: http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/2+gpu


----------



## Devildog83

I only have the free version. You have to pay for Extreme, right?


----------



## neurotix

Yeah.

Here's 2x Fire Strike rankings for you to compare. http://hwbot.org/benchmark/3dmark_-_fire_strike/rankings?cores=2#start=0#interval=20

Looks like your score is close to 7870s, 7850s, 660 tis, 760s, 6990 and 580 SLI. Not bad.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Yeah.
> 
> Here's 2x Fire Strike rankings for you to compare. http://hwbot.org/benchmark/3dmark_-_fire_strike/rankings?cores=2#start=0#interval=20
> 
> Looks like your score is close to 7870s, 7850s, 660 tis, 760s, 6990 and 580 SLI. Not bad.


I only look at the graphics score because AMD doesn't like physics.


----------



## Jonsu

Hello OCN, is a *non-TOP* DirectCU II 280X the same as its overclocked counterpart (the TOP)? If so, is the overclocking headroom the same for both?


----------



## gnemelf

I think I'm going to get one of the NZXT Kraken G10 GPU brackets for my 280X when they drop... looks pretty legit.


----------



## Buehlar

Quote:


> Originally Posted by *Jonsu*
> 
> Hello OCN, is a *non-TOP* directCU II 280x the same as its overclocked counterpart (TOP)? If so, is the overclocking headroom the same for both then?


The components used are the same. The only difference is that the TOP versions use pre-binned components that pass a predetermined quality/performance threshold; they cherry-pick the best for the TOP. This guarantees the consumer a certain degree of OCing performance from the TOP version.
This doesn't necessarily mean that a non-TOP version won't perform equal to or surpass a TOP version; there is always a chance it will be just as good or even better.
Just the luck of the draw.

@Devildog
Nice to see you firing those twin demons again. Keep up the good work!


----------



## DizzlePro

Quote:


> Originally Posted by *Buehlar*
> 
> The components used are the same. The only difference is that the TOP versions use pre-binned components that pass a pre determined quality/performance threshold and cherry pick the best for the TOP. This guarentees the consumer a certian degree of OCing performance provided by the TOP version.
> This doesn't necessarilly mean that a non-TOP version won't perform equal to or surpass that of a TOP version, there is always the chance that it will be just as good or even better.
> Just the luck of the draw
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Devildog
> Nice to see you firing those twin demons again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Keep up the good work


Is the TOP worth £30 more than the non-TOP?

Nowhere in the UK has the non-TOP in stock.


----------



## Nirvana91

I have 2x Asus 280X TOP in crossfire.

With Asus GPU Tweak I can only max the voltage at 1.263v... Why?

I see everyone else reaching 1.3v; why can't I?


----------



## Buehlar

Quote:


> Originally Posted by *DizzlePro*
> 
> is the top worth £30 more tthan the non top?
> 
> no where in the Uk has the non "Top" in stock


IMO... no.
The gains in performance that you *may* get from the TOP version over an OCed non-TOP would hardly be enough to justify the extra cost.
You probably won't notice any difference in real-world performance, but if you're after the highest synthetic benchmark scores, then you might consider the TOP.
Quote:


> Originally Posted by *Nirvana91*
> 
> I have 2x Asus 280X TOP in crossfire.
> 
> With Asus GPU Tweak I can only max the voltage at 1.263v...? Why..?
> 
> I see everyone capable of reaching 1.3v, why can´t I?


I have HD 7870s, and depending on which version of GPU Tweak I use, the max voltage is different.
E.g., the latest version allows me 1.3v max.
I'm currently using version 2.4.7, which gives me 1.4v max.


----------



## TempAccount007

That version shows a 1.4v max, but it doesn't actually give your card that; the card actually decreases its voltage.

If you don't believe me, set it to 1.4 and then look at the voltage in the GPU Tweak monitor or GPU-Z. You will also notice that clocks previously possible at 1.3v are no longer stable.

All of this is true for at least the ASUS DC2T 280X.


----------



## Buehlar

Quote:


> Originally Posted by *TempAccount007*
> 
> That version shows 1.4v max, but it doesn't actually give your card that. Your card actually decreases its voltage.
> 
> If you don't believe me, set it to 1.4 then look at the voltage in the gpu tweak monitor or gpu-z. Also you will notice that clocks previously possible at 1.3v are no longer stable.
> 
> All of this is true with at least the ASUS DC2T 280x.


The slider goes up to 1.4; however, I've never pushed it above 1.35v, and the GPU Tweak monitor does report the set value, as does HWiNFO.

HD7870 DC2 ver.2 here.


----------



## [CyGnus]

Quote:


> Originally Posted by *Nirvana91*
> 
> I have 2x Asus 280X TOP in crossfire.
> 
> With Asus GPU Tweak I can only max the voltage at 1.263v...? Why..?
> 
> I see everyone capable of reaching 1.3v, why can´t I?


They are hardware-capped at 1.263v. Even if others say they have 1.3v, GPU-Z reads a max of 1.263; the only card that does more is the Matrix Platinum, which goes to 1.4v (1.382v).


----------



## TempAccount007

My DC2T 280X shows a max voltage of 1.279 in GPU-Z when I set it to 1.3v. That's before up to 80mv of vdroop under load.


----------



## [CyGnus]

TempAccount007, that is maybe a spike, not a constant voltage...


----------



## Gabkicks

I just ordered 2 Sapphire Dual-X 280Xs. How is the microstutter? I could cancel my order and get 2 Asus 280X DC2Ts instead for $30 more each; I'm not sure if they're worth it.


----------



## [CyGnus]

Gabkicks, very worth it: the cooler on the Asus TOP is better and quieter, and they run 1070/6400 stock vs. the Sapphires' 1000/6000.


----------



## Buehlar

Quote:


> Originally Posted by *[CyGnus]*
> 
> They are hardware capped at 1.263v even if others say they have 1.3v in GPU-Z it reads max 1.263 the only card that does more is the Matrix Platinum it goes o 1.4v (1.382v)


Interesting that they did this with the 280X. I'm wondering if the 270X is capped also... anyone?

The HD 7870 DCIIs are wide open. Set at 1.35v (1337mv).


Quote:


> Originally Posted by *TempAccount007*
> 
> My DC2T 280x shows max voltage of 1.279 in gpu-z when I set it to 1.3v. That's before up to 80mv vdroop under load.


----------



## [CyGnus]

Buehlar, just open GPU-Z, click on the *?*, and see the sensors.


----------



## AlphaC

http://www.hkepc.com/10576

The Gigabyte R9 280X review over at HKEPC shows the new cooler has the same lesser-quality low-RDS(on) MOSFETs in a 6+2 configuration, with three 8mm heatpipes.


----------



## Roaches

I expected that from Rev 2; there's really no difference from Rev 1, since the only things that changed were the look of the shroud and the black PCB... Really lame on Gigabyte's part.


----------



## Buehlar

Quote:


> Originally Posted by *[CyGnus]*
> 
> Buehlar well just open GPU-Z click on the *?* and see the sensors


Thanks, I did as suggested. A max of 1.199v is reported via GPU-Z, which differs from the GPU Tweak monitor reporting 1.33v.
This is with a 245MHz OC set at 1.35v via GPU Tweak; I cannot achieve these clocks without upping the voltage beyond 1.3v.
I find it difficult to trust that GPU-Z is reporting these values correctly.
Very interesting.

Any thoughts as to why voltage manipulation drastically improves these cards' (HD 7870s') overclocking abilities, while only GPU-Z tells me that my voltage settings are irrelevant, basically suggesting my cards are voltage-capped?


----------



## bokchoi

Just a quick question... I can't seem to find the answer anywhere. I'm not taking my card apart to find out, but I'm curious which memory chips the Sapphire R9 270X Vapor-X 2GB uses: Hynix, Elpida, etc.


----------



## JoeDirt

Quote:


> Originally Posted by *AlphaC*
> 
> http://www.hkepc.com/10576
> 
> 
> 
> 
> Gigabyte r9 280X review over on HKEPC shows the new cooler has the same lesser quality Low RDS(on) mosfets , in 6+2 configuration
> 
> Three 8mm heatpipes
> 
> 
> 
> 
> 
> 
> 
> .


I think we all got kind of screwed on these cards. I'm also wondering if the way they weld the copper contact plate to the heat pipes has been damaging some of them and causing the heating issues a lot of these have. I have a Prolimatech PRO-MK-26 with two Corsair AF140s arriving tomorrow to replace the WindForce 3.


----------



## Luckael

Guys,

are there any issues when it comes to 270X CrossFire, like screen tearing?

TIA


----------



## TempAccount007

Quote:


> Originally Posted by *[CyGnus]*
> 
> TempAccount007 that is maybe a spike not constant voltage...


----------



## Buehlar

Definitely not a spike... what does the GPU Tweak monitor report?


----------



## TempAccount007

GPU Tweak shows the same thing. However, as I said before, this is without a heavy and varying load.

While gaming, the most I have seen the voltage drop by is 80mv.

In that way, the screenshot you see (which was set by 'Force constant voltage' in AB) is almost never reached while gaming due to vdroop; the voltage is usually between 1.22 and 1.25.

If there were a way to consistently hold 1.277v while gaming on this card, I would be very happy... I think Matrix cards can adjust vdroop, but AFAIK DC2Ts cannot.
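The vdroop being discussed here is just the gap between the set voltage and what the card actually holds under load. A trivial sketch of the arithmetic, using the numbers from this exchange (1.300v set, 1.22 to 1.25v observed while gaming):

```python
# Quick vdroop arithmetic for the numbers in this exchange: voltage set
# to 1.300 V, with 1.22-1.25 V observed under a gaming load.

def vdroop_mv(set_v, observed_v):
    """Droop in millivolts between the set and observed core voltage."""
    return round((set_v - observed_v) * 1000)

print(vdroop_mv(1.300, 1.220))  # 80
print(vdroop_mv(1.300, 1.250))  # 50
```

So the "up to 80mv" figure quoted above corresponds to the 1.22v worst case against a 1.300v setpoint.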


----------



## gnemelf

PowerColor 280X with 1.3v set in MSI Afterburner, but yeah, I get the vdroop when load hits the card.


----------



## Devildog83

Quote:


> Originally Posted by *Luckael*
> 
> guys,
> 
> is there any issue's when it comes to 270x Crossfire? like screen tearing?
> 
> Tia


I have none.


----------



## Mackem

Anyone who has a 280X DirectCU II TOP: can you tell me what drivers you're using and whether you have any issues? I'm on the latest drivers and still get the odd glitch in League of Legends, as well as the thing where whatever is displayed on my monitors moves down vertically for a split second and then goes back to normal.


----------



## JCH979

I've been using beta 9.5 for almost 3 weeks now and haven't had any issues. I don't play League of Legends, so I can't comment on that.


----------



## Mackem

Quote:


> Originally Posted by *JCH979*
> 
> I'm using beta 9.5 for almost 3 weeks now, I haven't had any issues
> 
> 
> 
> 
> 
> 
> 
> I don't play League of Legends so can't comment on that.


I've had issues with almost all drivers I've tried. Fresh install of Windows too.


----------



## pac08

A question for MSI 280X owners: is the card voltage-locked, and if so, is there a BIOS around that can unlock voltage control? I've ordered one, but if I can't undervolt it I'm going to have to cancel the order.


----------



## madorax

Quote:


> Originally Posted by *pac08*
> 
> Question to MSI 280X owners. Is the card voltage locked, and if that's the case is there a BIOS around that can unlock voltage control? I 've ordered one but if i can't undervolt it i'm going to have to cancel the order.


I had an MSI 280X and I can tell you it's not locked. It can go to 1300 mV; the default voltage was 1156 mV on my card. I haven't tried to undervolt it... and I don't know why you'd want to do that, but I can assure you the card is not voltage locked.

Look for the BIOS in here and match it with your card. If the last digits are still 331 (or 3XX) instead of 435, then you need to update the BIOS. Take it from HERE, or you can go to the official MSI forum to ask for one; either way you will get the same BIOS. I did try both BIOSes and compare, though; it is the same BIOS.


----------



## Pis

Mine.

Sapphire TOXIC R9 280X.


Spoiler: Warning: Spoiler!


----------



## Devildog83

Pis, you have been added. Welcome and that's a very nice card.


----------



## jamor

^ wow thats gorgeous

Asus 280X - Is overclocking an art, or do I just have a poor card? I can't push either clock more than +40, even with overvolting. If I want 100% stability and 0 errors, I am unable to get past 1100 MHz / 1640 MHz.

From 1.2 V up to 1.3 V... the TechPowerUp GPU Tool stability test shows errors/artifacts instantly, and the Valley benchmark shows major artifacts above the stated clocks.

Seems like most of the reviews cranked it up to around 1200/1800 @ 1.25v/1.3v without a hiccup...

OD was disabled using GPU tweak.

One weird thing - sometimes it wouldn't show errors for a while at higher clocks like 1800 MHz @ 1.3 V, but I would eventually get an orange or red screen of death that doesn't go away, and I have to power down unsafely. Never seen that before.


----------



## Gabkicks

okay i had my order changed to 2 ASUS DC2T 280x's .







Now I just need to figure out how I'm going to get my power supply running them, since the PSU only has one 6-pin and one 6+2-pin PCIe cable.


----------



## jamor

Quote:


> Originally Posted by *Gabkicks*
> 
> okay i had my order changed to 2 ASUS DC2T 280x's .
> 
> 
> 
> 
> 
> 
> 
> Now i just need to figure out how I'm going to get my power supply running them since thePSU only has one 6 and one 6+2 pin PCIE cables.


You don't have 3 available 6-pins? The connectors ARE 6 & 8, so you should be fine.

But it comes with a 2-in-1 PCIe adapter, so I actually have 6+6+6 (3 PCIe) going into my card.


----------



## Devildog83

Quote:


> Originally Posted by *Gabkicks*
> 
> okay i had my order changed to 2 ASUS DC2T 280x's .
> 
> 
> 
> 
> 
> 
> 
> Now i just need to figure out how I'm going to get my power supply running them since thePSU only has one 6 and one 6+2 pin PCIE cables.


In all honesty I would get a new PSU. A really good 850 W 80 Plus Gold or Platinum. You really don't want to sell yourself short on the heart of your system.


----------



## [email protected]

I have a Gigabyte R9 280X v2 that is voltage locked. Can anyone please tell me how to unlock the voltage?

The BIOS is locked to 1.2 V and of course cannot be moved in any way. Is it possible to bypass the voltage lock with a BIOS mod, or by flashing the BIOS from one of the unlocked cards?

With Asus GPU Tweak it's possible to slide up to 1.3 V, for example, but in GPU-Z the VDDC shows only 1.2 V.


----------



## Devildog83

I would like to wish everyone in the club a "HAPPY NEW YEAR". If you party, "Party On" if not try to anyway. I hope 2014 is a great year for *ALL!!!*

*GO HAWKS*


----------



## jamor

So it works at 1110/1700? I don't understand this card.


----------



## DizzlePro

I'm looking to buy any minute now, but does the Toxic have unlocked voltage?

*Sapphire Toxic £275.99* http://www.overclockers.co.uk/showproduct.php?prodid=GX-324-SP
*
Asus DirectCUII TOP is £281.99* http://www.overclockers.co.uk/showproduct.php?prodid=GX-323-AS&groupid=701&catid=56&subcat=1842


----------



## Stay Puft

Quote:


> Originally Posted by *Pis*
> 
> Mine.
> 
> Sapphire TOXIC R9 280X.
> 
> 
> Spoiler: Warning: Spoiler!


That card is beautiful


----------



## JoeDirt

Because why not? MK-26 with two Corsair AF140's on my Gigabyte 280x. Wow. Such board flex. Much cooling. Super weight. Wow


----------



## Pis

Quote:


> Originally Posted by *Devildog83*
> 
> Pis, you have been added. Welcome and that's a very nice card.


Thx ^^


----------



## Pis

Quote:


> Originally Posted by *DizzlePro*
> 
> i'm looking to buy any minute now but does the Toxic have Unlocked voltage?
> 
> *Sapphire Toxic £275.99* http://www.overclockers.co.uk/showproduct.php?prodid=GX-324-SP
> *
> Asus DirectCUII TOP is £281.99* http://www.overclockers.co.uk/showproduct.php?prodid=GX-323-AS&groupid=701&catid=56&subcat=1842





Spoiler: Warning: Spoiler!







You can adjust the voltage, I guess....Sapphire TRIXX required.


----------



## Pis

Quote:


> Originally Posted by *Stay Puft*
> 
> That card is beautiful


Yeah, Zotac's theme lol


----------



## GoLDii3

Quote:


> Originally Posted by *Pis*
> 
> Yeah, Zotac's theme lol


Your name in Spanish means pee, lol.


----------



## Pis

Quote:


> Originally Posted by *GoLDii3*
> 
> Your name in spanish means pee,lol.


lol. It is just my nickname anyway hehehe

Go to youtube, type: Pis R9 280X

x3


----------



## GoLDii3

Quote:


> Originally Posted by *Pis*
> 
> lol. It is just my nickname anyway hehehe
> 
> Go to youtube, type: Pis R9 280X
> 
> x3


Unboxing


----------



## chaics

Hi guys, can my PSU run an R9 280X nicely? Planning an upgrade. Below is the spec of my 80+ PSU, a Gigabyte Odin Pro 550 W.. thanks and happy new year..


----------



## JoeDirt

I'd say you're borderline. Only one way to find out.


----------



## Pis

Quote:


> Originally Posted by *chaics*
> 
> hi guys, can my psu run r9 280x nicely? planning for an upgrade. below is the spec of my 80+ psu. gigabyte odin pro 550watt.. thanks and happy new year..


An R9 280X can consume up to 300 W. Hmm, which R9 280X are you planning to buy?


----------



## sterob

Does the 280X have the coil whine problem like the 7970? And how is the cooler quality on the MSI, Sapphire, and Asus?


----------



## chaics

Quote:


> Originally Posted by *Pis*
> 
> R9 280X can consume up to 300W. hmm, which R9 280X you planned to buy?
Quote:


> Originally Posted by *JoeDirt*
> 
> I'd say your border-line. Only one way to find out.


Define "one way to find out", plz.. buy it and put it through a test run? I wish to buy and use it straight away instead of buy, test, sell, then buy another.. quite a hassle for me..

I'm looking at the MSI R9 280X GAMING 3G or the GIGABYTE R9 280X 3GB. Will this be a good upgrade compared to my aging CFX 6850s?


----------



## madorax

Quote:


> Originally Posted by *chaics*
> 
> R9 280X can consume up to 300W. hmm, which R9 280X you planned to buy?
> Define one way to find out plz.. buy and put it for test run? i wish to buy and use straight away instead of buy, test, sell then buy others.. quite a hassle for me..
> 
> im looking at MSI R9 280X GAMING 3G or GIGABYTE R9 280X 3GB. Will this be a good upgrade comparing my aging CFx 6850?


FYI, the Giga is voltage locked and the MSI is not, so if you're planning to get more performance by OCing it, the MSI is the better buy.


----------



## chaics

Quote:


> Originally Posted by *madorax*
> 
> FYI, Giga is voltage locked and MSI is not, so if you planning to get more performance by OC-ing it, MSI will be more worth to buy.


Are you using a 550 W PSU on an R9 280X? Did you OC yours? Planning to use it on my 550 W PSU too.


----------



## madorax

Quote:


> Originally Posted by *chaics*
> 
> are you using a 550watt psu on a r9 280x? did you OCed yours? planning to use it on my 550watt psu too.


Yup, a Golden Green 550 W Gold. It has been tested with SLI 570s and SLI 670s with 4 HDDs; never had a problem. The OCCT test result was great. 5-year warranty, just like Seasonic; a good PSU with the funny name "Super Flower".









My proc is not OC-able since it's not a K version. The 280X runs at 1.256 V @ 1160/1600 daily.


----------



## DiceAir

Whenever I bump my memory any higher on my Club3D R9 280X, my games crash or similar. I can't even bump the core clock any higher than stock. My clocks are 1100/1500 MHz with 1.2663 V core and 1.5 V memory.

If I overclock, I want to achieve 1600 MHz on memory and at least 1200 MHz on core to actually make it worth overclocking.


----------



## madorax

Quote:


> Originally Posted by *DiceAir*
> 
> I see whenever I bump my Memory anything higher on my CLub3d R9-280x my games either crash or so. I can't even ump the core clock any hhigher than my stock. My clock is 1100/1500mhz with 1.2663V core and 1.5V memory.
> 
> If I overclock I want to achieve 1600MHz on memory and 1200MHz at least on core to actually make it worth overclocking


Try upping the voltage with Afterburner or Trixx. The memory needs to be set to 1.6 V to be able to reach 1600 or 1650. There's no performance benefit, though, in raising only the memory without also raising the core.


----------



## DiceAir

Quote:


> Originally Posted by *madorax*
> 
> try upping the voltage with afterburner or trixx. memory need set to 1.6v to be able to raise 1600 or 1650, there's no performance benefit though with only raising the memory without also raising the core.


OK, so what do you think I should raise the core to? I can go to 1.3 V in MSI Afterburner. Like I said many times before, I'm on 2560x1440 @ 96Hz.


----------



## chaics

Quote:


> Originally Posted by *madorax*
> 
> yup. golden green 550w gold, have benn tested with SLI 570 and SLI 670 with 4 HDD, never got a problem. OCCT test result was great, 5 years warranty just like seasonic, good PSU with funny name "Super Flower"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> my proc is not OC-able since it's not a K version, 280X run in 1.256v @ 1160 / 1600 daily.


Wow.. didn't know a gold 550 W PSU could SLI 670s.. anyway, thanks for clarifying.. guess I'll just grab an MSI R9 280X..


----------



## madorax

Quote:


> Originally Posted by *chaics*
> 
> wow.. didnt know a gold 550watt psu can sli 670.. anyway, thanks for clarifying.. guess i'll just grab one msi r9 r280x..


Yup, a friend of mine brought dual 670s to my house specifically to test this PSU back in the day, and we tested it with a watt meter or something (I forget the name; the kind you put on the outlet that has a digital display). We tested it on Metro: Last Light and Heaven to get it to max load. Maybe because my proc is only a 4570, a non-OC proc, the result wasn't too high; if I were using a 4770K OCed @ 4.4 GHz, the wattage might go up to 600 W. It will surely eat more power than a default i5, no?


----------



## Devildog83

Quote:


> Originally Posted by *chaics*
> 
> wow.. didnt know a gold 550watt psu can sli 670.. anyway, thanks for clarifying.. guess i'll just grab one msi r9 r280x..


Here's what you should consider, IMO. That PSU is most likely enough to handle your Phenom and a 280X, BUT how old is that thing, and do you want to trust it with a very expensive GPU? Here is how I think: I just spent $500 on graphics; do I want to trust that to an old PSU with just enough power, or get something newer with a tad more? For less than $100 you can get a very good unit that will give you both and be more efficient. If it were me I would be PSU shopping, but that's just me.


----------



## JoeDirt

Quote:


> Originally Posted by *madorax*
> 
> FYI, Giga is voltage locked and MSI is not, so if you planning to get more performance by OC-ing it, MSI will be more worth to buy.


Gigabyte BIOS F60 are set to 1.200v, F30 are set to 1.200v, F3 & F12 are at 1.256v. And I unlocked the F60 & F30 to 1.300v.
These Gigabyte cards are hot boxes and don't over clock for ****, even with extra voltage above 1.200v

I believe the MSI BIOS are set to 1.300v. That's some damn high power consumption and heat.


----------



## jamponget9

Guys, I know some of us here are "miners", so I want to ask... is mining at 460 kH/s profitable, or rather, is it worth it nowadays? I'm using a Toxic 270X 2GB... that's as far as my kH/s can go... my card temps sit at 67-70 degrees max.

I'm quite concerned that I may be spending more on electricity than I earn in LTC... looking at the mining calculator, 460 kH/s earns more or less only 4 LTC a month... that's quite low for me... or was it just an estimate?

website for LT Calculator is here... CLICK ME
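As a sanity check on what those calculators report, here is a minimal sketch of the standard expected-value arithmetic for scrypt mining. All inputs are placeholder assumptions (difficulty, LTC price, wall-socket draw, electricity rate); it ignores pool fees and difficulty growth, so real earnings will be lower:

```python
# Minimal Litecoin mining profitability sketch. Every input is an assumed
# placeholder -- substitute the current network difficulty, coin price,
# your card's measured wattage, and your electricity rate.
def ltc_per_day(hashrate_khs, difficulty, block_reward=50):
    """Expected LTC mined per day at a given scrypt hashrate."""
    hashes_per_day = hashrate_khs * 1000 * 86400
    return hashes_per_day * block_reward / (difficulty * 2**32)

def net_profit_per_day(hashrate_khs, difficulty, ltc_price_usd,
                       card_watts, usd_per_kwh):
    """Revenue minus electricity cost, in USD per day."""
    revenue = ltc_per_day(hashrate_khs, difficulty) * ltc_price_usd
    power_cost = card_watts / 1000 * 24 * usd_per_kwh
    return revenue - power_cost

# Example with made-up numbers: 460 kH/s (as in the post above),
# difficulty 3000, $25/LTC, 180 W at the wall, $0.12/kWh.
print(round(ltc_per_day(460, 3000), 3), "LTC/day")
print(round(net_profit_per_day(460, 3000, 25.0, 180, 0.12), 2), "USD/day")
```

At these assumed values the result is roughly 0.15 LTC/day, i.e. around 4-5 LTC a month, which is consistent with the calculator figure quoted above.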


----------



## JoeDirt

Quote:


> Originally Posted by *chaics*
> 
> Define one way to find out plz.. buy and put it for test run? i wish to buy and use straight away instead of buy, test, sell then buy others.. quite a hassle for me..
> 
> im looking at MSI R9 280X GAMING 3G or GIGABYTE R9 280X 3GB. Will this be a good upgrade comparing my aging CFx 6850?

Buy it and use it. If it doesn't hold up under load, then it's time for a bigger PSU. It will power on for sure, but at 100% load it will all depend on what the rest of your system is consuming (CPU, RAM, motherboard, fans, USB devices, lights). If you can't afford the extra $120 for an 800 W PSU, maybe hold off on the 280X until you can.
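A rough way to do that component-by-component tally before buying is a simple power budget. Every wattage below is a generic assumed ballpark, not a measurement of any specific build; substitute your own components' ratings:

```python
# Back-of-the-envelope PSU headroom check. All figures are assumed
# ballpark wattages -- replace them with your own components' ratings.
budget = {
    "R9 280X (peak)": 250,
    "CPU (95 W TDP, loaded)": 95,
    "Motherboard + RAM": 50,
    "HDDs/SSDs, fans, USB": 40,
}

total = sum(budget.values())
psu_watts = 550
headroom = psu_watts - total
print(total, "W load,", headroom, "W headroom on a", psu_watts, "W unit")
```

A positive but thin headroom (here about 20% of the unit's rating) is exactly the "borderline" situation described above: it will run, but an aging unit delivering less than its label leaves little margin.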


----------



## chaics

Quote:


> Originally Posted by *Devildog83*
> 
> Here's what you should consider IMO. That PSU is most likely enough to handle your Phenom and a 280x, BUT, how old is that thing and do you want to trust it with a very expensive GPU. Here is how I think, I just spent $500 on graphics do I want to trust that to an old PSU that has just enough power or get something with a tad more power that is newer. For less than $100 you can get a very good unit that will give you both and be more efficient. If it were me I would be PSU shopping but that's just me.


Your input is highly appreciated. Yeah, I'm concerned about my aging PSU too. I was actually thinking of skipping a PSU upgrade for a few more years and using it to the fullest.

Hmm.. if the PSU dies after I put in the 280X, will it harm any hardware connected to it? Hmmm...


----------



## JoeDirt

Quote:


> Originally Posted by *jamponget9*
> 
> Guys I know some of us here are "miners" now i want to ask... is mining at 460khash profitable or rather is it worth it nowadays? Im using a Toxic 270x 2gb... thats as far as my khash can go... my card temps plays at 67-70 degrees max.


If you just do it as a hobby while you're not actively gaming or watching HD movies, then yes, in a month you can make a little extra cash. Once you factor in power consumption it won't be much at all. I'd just let it all add up over months and then cash out at that hash rate. I'm making an extra $140 a month after expenses @ 1.1 MH/s. This is still just a hobby for me.


----------



## JoeDirt

Quote:


> Originally Posted by *chaics*
> 
> your input is highly appreciated. Ya, i'm concern about my aging PSU too. i was actually thinking to skip PSU upgrade until few more years to come and utilize it to the fullest.
> 
> hmm.. if the PSU died after putting in 280x, will it harm any hardware connected to it? hmmm...


It would be like a car running out of gas. Sputter, sputter, clunk. But nothing will be broken; you will just need more power. Dude, get a new PSU.


----------



## Devildog83

You might be good for a couple more years, but then again maybe one week. I am not saying it's going to die, or that if it died it would damage other parts, but it could. My main point was: do you want to take that chance? Everything in your system relies on your PSU; it is the heart, and squeezing by is not an option for me. Example - I had an Ultra LSP Pro 750 W PSU. It worked great for what I had and had plenty of power, but once I had spent $230 on a mobo and $180 on a CPU and was going to upgrade my GPU, I decided that a better brand and better efficiency were the way to go to protect my investment, kind of like insurance. I guess I might be a bit cautious, but that's just me.


----------



## Sgt Bilko

I'd like to say thank you for all of the answers and advice everyone has given me, but I'm afraid I won't be joining this club.

I've put an order in for two XFX R9 290s, so I'm looking forward to that very much









And congrats on taking over the club Devildog, it will flourish in your very capable hands


----------



## Devildog83

Thanks Sarge !!


----------



## kpo6969

Quote:


> Originally Posted by *Devildog83*
> 
> Here's what you should consider IMO. That PSU is most likely enough to handle your Phenom and a 280x, BUT, *how old* is that thing and do you want to trust it with a very expensive GPU. Here is how I think, I just spent $500 on graphics do I want to trust that to an old PSU that has just enough power or get something with a tad more power that is newer. For less than $100 you can get a very good unit that will give you both and be more efficient. If it were me I would be PSU shopping but that's just me.


+1
Very good and often overlooked point.


----------



## GTR Mclaren

QUESTIONNNNN

About to OC my 270X with the AMD Overdrive options....

The "Power control settings"... is it OK to put that at +20?


----------



## Jaffi

Today I had a strange flicker for a split second. It was a vertical white bar while browsing the web. Hope it was a driver/software failure!


----------



## alkusoittow

So I've noticed the price has been soaring for the 280x... Anyone think they know when it will come back down to ~$300 again?


----------



## Sgt Bilko

Quote:


> Originally Posted by *alkusoittow*
> 
> So I've noticed the price has been soaring for the 280x... Anyone think they know when it will come back down to ~$300 again?


When the value of Litecoin drops?


----------



## GoLDii3

Quote:


> Originally Posted by *Sgt Bilko*
> 
> When the value of Litecoin drops?


Nah,mostly if bitcoin crashes or scrypt mining gets non-profitable over all cryptos or ASIC miners come out.


----------



## hyp3rtraxx

Anyone else running the "XFX Radeon R9 280X 3GB (Tahiti XTL)", or have any useful info about those cards? I plan to mine litecoins and other alt coins with 2 of those cards crossfired, and to use my CPU to mine as well, an AMD [email protected], with 4 threads of the CPU for mining. Do you think I can make 1500 kH/s? And aren't the Tahiti XTL cards supposed to draw a little less wattage than other 280X cards?
And how are those cards for mining primarily litecoins? Are they good at that? Or should I combine and mine alt coins such as mooncoins, dogecoins, etc.? I'm a complete noob when it comes to mining, so it seems simplest for me to register at Multipool and strictly go for litecoins straight to my wallet. At least I'm thinking of doing that for the first month of mining, while I read up on different mining forums about alt coin mining and playing with the exchange rates and such. And is it worth overclocking these cards for more kH/s? I think I've read that OCing them doesn't do much, and that you rather want to undervolt them so the cards last and stay cool, and that they hash better when cooler. But the cooler seems good on those cards; let's hope it works as well as it looks!


----------



## jamponget9

Quote:


> Originally Posted by *JoeDirt*
> 
> If you just do it has a hobby while your not actively gaming or watching HD movies then yes in a month you can make a little extra cash. Once you take in power consumption it won't be much at all. Now I'd just let it all add up over months then cash out at that hash rate. I'm making an extra $140 a month after expenses @ 1.1Mh/s. This is still just a hobby for me.


First of all, thanks for the reply... Actually, I'm planning to build a dedicated mining rig in the coming months. Can you give me your mining specs and your kH/s rate? The only problem I have with farming is that my temps ramp up to the 70s, which has got me a little worried... I've read some GPUs go as high as 95 degrees (290X) and 80-85 degrees (280X and 270X) while mining. I have never gone above the 70s with my intensity set to 15-16 @ 460 kH/s... as I push my worker beyond intensity 18-20, the kH/s rate increases dramatically, but it ramps my temps up to 80-85 degrees in a matter of seconds. I have never stayed at the 80-degree mark for more than 30 minutes because I'm afraid it may toast my system. Is it really safe to push that far? Any further advice you can give?


----------



## JoeDirt

Quote:


> Originally Posted by *jamponget9*
> 
> First of all thanks for the reply... Actually im planning to make a dedicated mining rig in the coming months. can you give me your mining specs and your khash rate? the only problem i have with farming is that my temps ramp up to the 70s which has got me a little worried... ive read some GPUs mining go as high as 95 degrees(290x) and 80-85 degrees (280x and 270x). I have never gone up to above the 70s with my intensity setting 15-16 @ 460khash... as i push my worker beyond 18-20 (intensity level) Khash rate also dramatically increases but it ramps up my temps up to 80-85 degrees in just a matter of seconds. I have never stayed at the 80 degree marker for more than 30 minutes coz im afraid it my toast my system. is it really safe to push that far? any further advice you can give?


I'm running a water-cooled i7 3770K clocked to 4.7 GHz with a Gigabyte Z77 motherboard, 16GB DDR3-2133 G.Skill RAM,
a 1000 W PSU, and Gigabyte 280X and HD 5770 video cards. I use my PC for everything, not just mining.

280X clocked at 1075 MHz @ 1.076 V, hashing in the 750s kH/s; memory at stock speed.

5770 clocked at 900 MHz @ 1.000 V, hashing at 215 kH/s; memory at stock speed.

i7 puts out 50-60 kH/s with only 4 threads being used.

Temps in the 70s are fine. You start getting a lot of rejected hashes in the 80s. I put an MK-26 on my 280X to keep the temp at 72C under full load. The Windforce 3 had temps in the mid 80s all the time, even with fans at 100%. I plan to get a 780 for daily use and games, and another 280X for mining alongside my current 280X. PM me and we can talk more about mining.


----------



## F3ERS 2 ASH3S

Hey guys, just poking my head in; will catch up on the posts later. 2800 posts is a bit much to go through.

I will be getting my xfx r9 280x double d tomorrow as long as USA does not fail


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nastytime*
> 
> Sorry guys for all the updates. But I am returning both cards in the morning. I just had my first card blow a cap while i was playing BF4. It was loud and I immediately shut the pc to figure out what had happened. Well took the first card out and I heard something rolling around.....yep found it.
> 
> Used to look like this:
> http://s239.photobucket.com/user/na...r Stuff/2013-10-14213742_zps6fa0fc7d.jpg.html
> 
> Then like this:
> http://s239.photobucket.com/user/na...r Stuff/2013-10-14213306_zps868fa342.jpg.html
> and
> http://s239.photobucket.com/user/na...r Stuff/2013-10-14213231_zpsb42fd366.jpg.html
> 
> Well looks like I will just wait for the 290x. If anyone sees a reduced price sticker on this model at frys in Houston beware.


Glad I dodged that bullet by accident. Glad to see someone else from Houston.. which Fry's was it? There are only 3 of them.


----------



## JoeDirt

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> Anyone else running the "XFX Radeon R9 280X 3GB (Tahiti XTL)" or have any usefull info about those cards? I plan to mine litecoins and other alt coins with 2 of those cards crossfired, and use my cpu to mine as well, a amd [email protected] and use 4 threads of the cpu for mining. Do you think i can make 1500khash/s? And arent the Tahiti XTL cards supposed to draw a little less watt then other r280 cards?
> And how are those cards for mining primary litecoins? Are they good at that? Or should i combine and mine alt coins such as mooncoins, dogecoins etc? Im a complete noob when it comes to mining so it seems most simply for me to reg at multipool and strictly just go for litecoins straight to my wallet or? At least im thinking of doing that for the first month of mining, while i read up on different mining forums about alt coin mining and playing with the exchange rates and such. And is it worth to overclock these cards for more khash? I think ive read that oc them doesnt do much, that you rather want to undervolt them to make the cards last and be cool and that they hash better when cooler. But i think the cooler seems good on those cards, lets hope it functions as good as it looks!


Litecointalk.org


----------



## F3ERS 2 ASH3S

OK, caught up on all posts.. so I see Asus is #1 in most purchased.

What I was hoping to find was more info on XFX; it seems that reviews and even owners have not really said that much about it.
Quote:


> Originally Posted by *Devildog83*
> 
> Thanks Sarge !!


Congrats on the ownership.


----------



## Mikewakefield

Wondering if anyone can help me. Since changing my CPU and GFX card to an R9 270, I have been suffering numerous blue screens while playing games.


----------



## Sgt Bilko

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> OK caught up on all posts.. so I see Asus is #1 in most purchased
> 
> what I was hoping to find was more info for xfx seems that all reviews and even owners have not really said that much about it.
> Congrats on the ownership.


I agree. When I get my cards I'll run them through their paces pretty hard and see how they handle it.

I don't think XFX are as bad as people are saying......


----------



## [email protected]

GBT 280X rev2: is it possible to unlock the 1.2 V cap with a new BIOS?


----------



## GTR Mclaren

1160/1520 is the max OC on my Gigabyte R9 270X.

It's a shame these cards are not voltage unlocked.

The WF3 cooler is a wonderful thing: 68C max temp after 3 hours of Battlefield 3, and with an ambient temp of 30-32C.


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> Wondering if anyone can help me since changing my cpu and gfx card to a r9 270 i have been suffering numerous blue screens while playing games


Give us a bit more info and we can surely help. A sig rig too, if you will, so we can see what you have. Top right corner of this page, click Rig Builder.


----------



## Devildog83

Quote:


> Originally Posted by *GTR Mclaren*
> 
> 1160/1520 is the max OC in my Gigabyte R9 270X
> 
> its a shame this cards are not voltage unlocked
> 
> the WF3 cooler is a wonderful thing, 68c max temp after 3 hours of Battlefield 3...and with an ambient temp of 30-32c


Updated my friend.

I hear ya about the unlocked part; if I wasn't running Crossfire I could get 1230/1575 out of mine. I can run 1235/1450 nicely though. The 7870 is memory-weak and the locked 270X is core-weak, but the two work nicely together.


----------



## Mikewakefield

I think I did it right :s The graphics card on it is wrong though, as I could not get mine to come up in search: Sapphire Dual-X R9 270 OC Edition.


----------



## Mikewakefield

Quote:


> Originally Posted by *Mikewakefield*
> 
> I think i did it right :s the Graphics card on it is wrong though as i could not get mine to come up on search Sapphire Duel-X R9 270 OC Edition


Oh yes, and the more info part, sorry. It all started when I upgraded the CPU and GFX card, as stated earlier. Whenever I'm playing games on Steam (not tried any outside Steam yet), I get a BSOD with error code 6.1.7601.2.1.0.256.1. Everyone online says it's memory related and to run Memtest, which I have, but it takes hours just to run 2 passes and I don't want to leave the PC on 24/7. I had 0 problems before changing the card/CPU, and I also had a vBIOS from Sapphire which fixed the flashing/tears, but now I'm clueless as to what to do.

Also, again, the gaming rig card is wrong; the correct one is Sapphire Dual-X R9 270 OC Edition.


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> I think i did it right :s the Graphics card on it is wrong though as i could not get mine to come up on search Sapphire Duel-X R9 270 OC Edition


OK, to start out with, what driver version do you have, and what clocks and voltage are you running?

Also - Trixx
Afterburner
GPU Tweak - which are you using, and have you turned off Graphics Overdrive in CCC, or Catalyst Control Center?


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> OK, what driver version do you have to start out with and what clocks and voltage are you running?


13.250.18.0 is the driver it says in dxdiag, and everything is as it came. How do I check the clocks and voltage?


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> 13.250.18.0 is the driver it says in dxdiag and everything is as it came how do i check the clocks and voltage ?


Well, CCC, or since it's a Sapphire card, Sapphire Trixx is a good place to start. If you don't have it, get the newest version from their website.


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> Well CCC or it's a sapphire card so sapphire Trixx is a good place to start. If you don't have it get the newest from their website.


Ok so according to trixx
GPU Clocks: 1070/1400
Memory Clocks: 1400
VDDC : 1225


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> Ok so according to trixx
> GPU Clocks: 1070/1400
> Memory Clocks: 1400
> VDDC : 1225


Open CCC, disable Graphics Overdrive, and try it that way. Those are stock clocks, I believe, so the cards themselves should not be causing it.


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> Open CCC and disable graphics overdrive and try it that way. Those are stock clocks I believe so the cards themselves should not cause it.


OK, I did that, so it has AMD Overdrive enabled, CPU Overdrive disabled, and Graphics Overdrive disabled. Is that right?


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> oh yes and the more info part sorry, Well it all started when i upgraded the cpu and GFX card as stated earlier whenever im playing games on steam (not tried any outside steam yet though) i get the BSOD with error code 6.1.7601.2.1.0.256.1 and everyone online is saying it's memory related and to run memtest which i have but it takes hours just to run 2 passes and i dont want to be leaving the pc on 24/7 and i had 0 problems before changing card/cpu and i also had a vbios from sapphire which fixed flashing/tears but now im clueless as to what i have to do
> 
> also again the gaming rig card is wrong the correct one is Sapphire Duel-X R9 270 OC Edition


Try checking whether that motherboard is actually compatible with a Piledriver chip. I know it works with Bulldozer and Phenom, but some boards don't work well with Piledriver.


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> Try checking to see if that motherboard actually is compatible with a piledriver chip. I know it works for bulldozer and phenom but some don't work well with piledriver.


OK, will have a look, but my previous CPU was an AMD FX-4100 and it was fine as far as I know. Is that not the same thing?


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> ok i did that so it has AMD overdrive enable cpu overdrive disabled and graphics overdrive disabled that right ?


Yes. What BIOS version do you have? You need BIOS F6 to support that chip. If you don't have it, you will need to flash the BIOS.

Here - http://www.gigabyte.com/support-downloads/cpu-support-popup.aspx?pid=4122

The 4100 uses BIOS version F1.


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> Yes, what bios version do you have. You need bios F6 to support that chip. If you don't have it you will need to flash the bios.
> 
> Here - http://www.gigabyte.com/support-downloads/cpu-support-popup.aspx?pid=4122


Award Modular BIOS v6.00PG is what dxdiag is telling me, and I don't have a clue what that would be lol


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> Award Modular Bios v6.00PG is what dxdiag is telling me


If you download CPU-Z, the Mainboard tab will tell you what BIOS version you have.

Like here, in the middle you will see bios version 1801.


----------



## Mikewakefield

Version F7A


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> Version F7A


It seems F7A is a beta version of the BIOS. If flashing is not a big deal, flash back to F6 and try that.


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> It seems that the F7A is Beta version of the bios. If flashing is not a big deal flash back to F6 and try that.


Luckily that is one thing I actually know how to do.

I will give it a go and see what happens. Could that be the reason for the blue screens then?


----------



## Devildog83

http://www.gigabyte.com/webpage/20/HowToReflashBIOS.html Try here.


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> Luckily that is 1 thing i actually know how to do
> 
> 
> 
> 
> 
> 
> 
> i will give it a go and see what happens could that be the reason for the blue screens then ?


I truly am not sure; I'm just taking you through the steps I would take to try to solve the problem. You said it started when you replaced the CPU and GPU, so you should troubleshoot both.


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> http://www.gigabyte.com/webpage/20/HowToReflashBIOS.html Try here.


ok will run it now thanks


----------



## Mikewakefield

OK, so it is now saying the version is F6. I'm guessing I should try running some games and see what happens. I have unplugged my second monitor, which was run through HDMI, to see if that makes a difference also.


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> I truly am not sure, just taking you thru steps I would take to try and solve the problem. You said that it started when you replaced the CPU and GPU so you should trouble shoot both.


OK, well that's new: I started a new game in Skyrim, and during the intro on the cart my screen went white, although I could still hear the sound.


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> ok well that's new started a new game in skyrim and during the intro on the cart my screen went white although i could still hear the sound


The only other thing I can think of is to sweep the GPU driver and download the newest one from the AMD website. I have 13.12, but 13.11 Beta 9.5 worked well too. It might sound dumb, but you do have both 6-pin power cables hooked up, right?


----------



## Mikewakefield

I'm pretty sure the one I'm using is 13.11 at the moment. I'll try going to 13.12, I guess. And what do you mean by sweeping it?


----------



## Devildog83

Quote:


> Originally Posted by *Mikewakefield*
> 
> im preety sure the one im using is the 13.11 at the moment will try going back to the 13.12 i guess and what do you mean by to sweep it ?


It is a good idea to use a driver sweep tool to make sure the old driver is cleaned all the way off before you install the new one.

I have to go, but I hope you get it figured out and don't have to RMA the card or anything. You could also try the Bulldozer/Piledriver club or the FX-8350 club; they have a ton of smart individuals in them.


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> It is a good idea to use a driver sweep tool to make sure you clean the old one all the way off before you download the new one
> 
> I have to go but I hope you get it figured out and don't have to RMA the card or anything. You could also try the bulldozer/piledriver club or fx 8350 club has a ton of smart individuals in it.


OK man, thanks a lot for the help, I appreciate it. I should go soon also, it's 3:15 AM here :|


----------



## neurotix

Quote:


> Originally Posted by *GTR Mclaren*
> 
> 1160/1520 is the max OC in my Gigabyte R9 270X
> 
> its a shame this cards are not voltage unlocked
> 
> the WF3 cooler is a wonderful thing, 68c max temp after 3 hours of Battlefield 3...and with an ambient temp of 30-32c


Mine does 1250/1500mhz @ 1.3v and never passes 55c.

Your cooler is not great.


----------



## Pesmerrga

I had originally planned on an R9 290X, but with the ridiculous prices I picked up a Sapphire R9 270X 4GB Dual-X card. Pretty much the same performance as the GTX 580 I had in my system, with less power, and I can run 3 displays.

Stock was 1070 core and 1400 memory, but I can push mine to 1230 core and 1550 memory. Barely any performance increase between stock and max OC in Fire Strike. Pushing my 8350 over 5 GHz only really pushes up the physics score a bit.

I was going to do the Red Mod to this card, but it's not really necessary. The Dual-X fan does a pretty good job of keeping the core cool after putting on some MX-4 to replace the stock TIM.

And what's up with the WHQL driver not being accepted all of a sudden? I'm positive I had valid results yesterday on the same driver. Unless I messed something up that I'm not aware of...


----------



## sterob

Does anyone have a list of which 280X models/brands do not have their voltage locked?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *sterob*
> 
> anyone have a list of what model/brand of 280x does not have its voltage locked?


That would be a good thing to put in the OP once the list is available.


----------



## madorax

Quote:


> Originally Posted by *sterob*
> 
> anyone have a list of what model/brand of 280x does not have its voltage locked?


Well, as far as I know MSI, HIS, and PowerColor are unlocked. I don't know about ASUS, but I think it will be unlocked too since it OCs very well. The Sapphire series is locked AFAIK, as is Gigabyte, CMIIW.


----------



## Durvelle27

Quote:


> Originally Posted by *Mikewakefield*
> 
> im preety sure the one im using is the 13.11 at the moment will try going back to the 13.12 i guess and what do you mean by to sweep it ?


What CPU are you using?


----------



## GTR Mclaren

Quote:


> Originally Posted by *neurotix*
> 
> Mine does 1250/1500mhz @ 1.3v and never passes 55c.
> 
> Your cooler is not great.


Come to my country without A/C and let's see if you maintain your 55c xD


----------



## Durvelle27

Quote:


> Originally Posted by *neurotix*
> 
> Mine does 1250/1500mhz @ 1.3v and never passes 55c.
> 
> Your cooler is not great.


Ambients and fan speed


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> That would be a good thing to put in OP once the list is available


Great idea. If all of the members that have 280Xs could send a PM to me and let me know, I will post it. And if anyone has an unlocked 270X or finds a way to unlock them, please let me know.


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> What CPU you using


He switched from a 4100 to a 6300 and upgraded to an R9 270X, and he gets white screens and freezes in games. The motherboard had a beta BIOS, so he flashed to the correct one. We tried a bunch of stuff (reinstalling drivers, turning off graphics Overdrive), but I haven't heard back yet whether it worked. After a driver sweep and reinstall he might fare better. I think he is all at stock. It baffles me; I doubt it's memory, but who knows.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> He switched from a 4100 to a 6300, and upgraded to an R9 270x and he get's white screens and freezes in games. The motherboard had a beta bios so he flashed to the correct one, we tried a bunch of stuff, reinstalling drivers, turn off graphics overdrive, but I haven't heard back yet if it has worked. After a driver sweep and reinstall he might fair better. I think he is all at stock. Baffled me, I doubt it's memory but who knows.


You still planning to make the switch to the 290?


----------



## benjamen50

So if someone posts an unlocked 270X bios, does that mean I could flash it onto my Gigabyte AMD R9 270X Windforce OC Edition with no issues?


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> You still planing to make the switch to the 290


I might consider it when the non-ref cards come out and the prices come down but for now I get all of the performance I really need...not just yet.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> I might consider it when the non-ref cards come out and the prices come down but for now I get all of the performance I really need...not just yet.


Why non-reference


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Why non-reference


Cooler and quieter. Also better looking I am sure.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> Cooler and quieter. Also better looking I am sure.


Watercooling FTW but I don't like non-reference as most are voltage locked.


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Watercooling FTW but I don't like non-reference as most are voltage locked.


If I were going to watercool my card/cards I would think ref, but I am not thinking of doing that right now. Maybe someday. I heard EK was moving to make waterblocks for some of the upcoming non-ref cards. The Sapphire is the only one out yet, and it's hard to get and $550. I was considering it when the 290's were $400, and then it would have made some sense, but now I have $440 into my cards, which get at least close to the 290's performance. For another $120 or so I am not sure the gains would be worth it. At least not to me.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> If I was going to watercool my card/card's I would think ref, but I am not think of doing that right now. Maybe someday. I heard that EK was moving to make waterblocks for some of the non-ref cards coming up. The Sapphire is the only one out yet and it's hard to get and $550. When I was considering it when the 290's were $400 and it would have made some sense, but now I have $440 into my cards that get at least close to the 290's performance wise. For another $120 or so I am not sure the gains would be worth it. At least not to me.


Actually, there are about 5 non-reference 290s out right now that I know of. OC'd, a 290 would be a slight upgrade, but nothing major.


----------



## Jaffi

Is solo mining Litecoin any good? It's more like playing the lottery, right? So joining a pool is required?


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Actually there's about 5 non-reference 290w out right that I know of. OC'd a 290 would be a slight upgrade but nothing major


Yeah, I see that Giga and MSI have some too. I really like my setup now. If I decide to start a new system some day, which I may, it will be completely different: Intel, gold and black, full watercooling, and maybe CrossFire 290s, but if I do that it might take a year to build. I have to make the wife crazy somehow.

If I do I will start with a Caselabs case and go from there.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> Yea, I see that Giga and MSI have some too, I really like my set up now. If I decide to start a new system some day, which I may, it will be completely different. Intel, gold and black, and full watercooling and maybe cross-fire 290's but if I do that it might take a year to build. I have to make the Wife crazy somehow.
> 
> If I do I will start with a Caselabs case and go from there.


Welp lucks to you. I'm playing with a 780 right now


----------



## Devildog83

Playing with a 780 sounds like fun.


----------



## Mikewakefield

Quote:


> Originally Posted by *Devildog83*
> 
> He switched from a 4100 to a 6300, and upgraded to an R9 270x and he get's white screens and freezes in games. The motherboard had a beta bios so he flashed to the correct one, we tried a bunch of stuff, reinstalling drivers, turn off graphics overdrive, but I haven't heard back yet if it has worked. After a driver sweep and reinstall he might fair better. I think he is all at stock. Baffled me, I doubt it's memory but who knows.


Sadly nothing seemed to work. I contacted the seller I got the card from and am getting a refund, and I will probably go back to an Nvidia card. I put my old EVGA 460 v2 back in and got no blue screens at all, so there must have been something fishy with the 270.


----------



## JoeDirt

Quote:


> Originally Posted by *neurotix*
> 
> Mine does 1250/1500mhz @ 1.3v and never passes 55c.
> 
> Your cooler is not great.


1250 @ 1.3v @ 55c lulz. Troll harder.


----------



## JoeDirt

Quote:


> Originally Posted by *Jaffi*
> 
> Is solo mining litecoin any good? It is more like playing the lottery, right? So joining a pool is required?


Join a pool. If not, then yes it will be like playing the lottery.
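To put rough numbers on the "lottery" point, here's a quick Python sketch of the expected time to solve a block solo. The hashrates below are made-up illustrative figures, not current network stats, so plug in your own:

```python
# Rough expected time for a solo miner to find a Litecoin block.
# The 150 s block interval is Litecoin's target; the hashrates are
# illustrative assumptions only.
def expected_block_time_days(my_hashrate_khs, network_hashrate_khs,
                             block_interval_s=150):
    """Your expected time to solve a block solo scales inversely
    with your share of the total network hashrate."""
    blocks_per_day = 86400 / block_interval_s
    my_share = my_hashrate_khs / network_hashrate_khs
    return 1 / (blocks_per_day * my_share)

# e.g. one 280X-class card (~700 kH/s) against an assumed 100 GH/s network:
days = expected_block_time_days(700, 100_000_000)
print(f"expected wait for one block: {days:.0f} days")
```

At those assumed numbers you'd wait the better part of a year per block on average, with huge variance, which is exactly why pools exist: they pay out your expected share continuously instead.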


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> Playing with a 780 sounds like fun.


Ehhh I was a little disappointed with it at first


----------



## pac08

Quote:


> Originally Posted by *madorax*
> 
> i had MSI 280X i can tell you it's not locked. it can go to 1300, the default voltage is 1156 in my card. i haven't try to undervolt it... and i don't know why you want to do that, but i can assure you the card is not voltage locked.
> 
> look for the bios in here, and match this with your card later, if the last digit is still 331 (or 3XX) instead 435 then you need to update the bios, take it from HERE or you can go to official MSI forum to ask one, either way you will get the same BIOS, i DO try both bios and compare though, it is the same bios.


Thanks for replying. I actually bought a Sapphire 280X. As for the undervolting part, I use the PC for mining when I'm not at home or not gaming, so I want to save some power.

Got another "problem" I could use help with: can my HX750W support two Sapphire 280X OC cards (undervolted)? The rest of the rig is in my signature (the CPU runs at stock voltage when mining).


----------



## F3ERS 2 ASH3S

Someone make USPS deliver my card... it says out for delivery, the office says they already came, and I still have nada... booo


----------



## JoeDirt

Quote:


> Originally Posted by *pac08*
> 
> Thanks for replying. I actually bought a Sapphire 280X. As for the undervolting part, i use the pc for mining when i'm not at home or not gaming so i want to save some power.
> 
> Got another "problem" i could use help with. Can my HX750W support two Sapphire 280X OC cards (undervolted)? Rest of the rig in my signature (the cpu runs on stock voltage when mining).


Should be good undervolted.
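For anyone wanting to sanity-check that themselves, a back-of-the-envelope sum is enough. Every wattage below is my rough assumption for an undervolted mining load, not a measured figure, so substitute your own components' specs:

```python
# Rough PSU headroom estimate for two undervolted 280Xs on an HX750 (750 W).
# All wattages here are assumptions; check your components' actual specs.
draws_w = {
    "R9 280X #1 (undervolted, mining)": 200,
    "R9 280X #2 (undervolted, mining)": 200,
    "CPU (stock voltage)": 125,
    "motherboard + RAM + drives + fans": 75,
}
total = sum(draws_w.values())
headroom = 750 - total
print(f"estimated load: {total} W, headroom: {headroom} W")
```

With those guesses you land around 600 W, which leaves some margin on a 750 W unit; gaming with both cards at full voltage would be much tighter.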


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Someone make usps deliver my card.. says out for delivery office says they already came and I still have nada... booo


I'm so impatient. It just got here. Pics out of the case taken; will follow up shortly.


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Im so impatient just got here.. Pics out of case taken. will follow up shortly


With bated breath we wait for pics of your new arrival. OK, that sounds a bit like you're having a baby. LOL


----------



## MoraisGT

Does any one know if this 280X (http://www.gigabyte.pt/products/page/vga/gv-r928xoc-3gd-garev_20/) has a compatible waterblock for it?


----------



## Devildog83

Quote:


> Originally Posted by *MoraisGT*
> 
> Does any one know if this 280X (http://www.gigabyte.pt/products/page/vga/gv-r928xoc-3gd-garev_20/) has a compatible waterblock for it?


It looks like EK does not, and I can't find anyone else who does. Your best bet may be to trade it in for a ref card to do water cooling. The only non-ref card I can see that they have one for is the Matrix.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> It looks like EK does not. I can't find anyone who does. Your best bet may be to trade it in for a ref card to do water cooling. The only non ref, card I can see that they have one for is the Matrix.


Or go universal or Kraken


----------



## Devildog83

Quote:


> Originally Posted by *Devildog83*
> 
> It looks like EK does not. I can't find anyone who does. Your best bet may be to trade it in for a ref card to do water cooling. The only non ref, card I can see that they have one for is the Matrix.


I did find WB's for the Asus DCII Top cards.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> With bated breath we wait for pics of your new arrival. OK, that sounds a bit like your having a baby. LOL


They are not pretty, but they'll have to do, Donkey.

So far on this card I have found that I need to have the 2nd monitor plugged into the top DVI port (in this case the grey one); primary (bottom DVI) is red.





Spoiler: Lookie here even says on the box voltage unlocked :)

Spoiler: Actual pics of the card

Spoiler: In meh rig


----------



## Durvelle27

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> They Are not pretty but'll have to do donkey
> 
> So far on this card I have found that I need to have the 2nd monitor plugged in to the top DVI port. (in this case the grey one) Primary (bottom DVI) is red
> 
> 
> 
> 
> 
> Spoiler: Lookie here even says on the box voltage unlocked :)
> 
> Spoiler: Actual pics of the card
> 
> Spoiler: In meh rig


Sexy card you got there


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Durvelle27*
> 
> Sexy card you got there


I know!!!

Fans at 100%, pfft, like buttah! Can't hear them with my case closed.

Fans at 100% with stock clocks after a Valley run nets me 51c max temp.

Stock volts are 1.075 at 1000/1500.


----------



## Durvelle27

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I know!!!
> 
> Fans at 100% pfft like buttah! can't here them with my case closed
> 
> fans at 100% with stock clock after a valley run nets me 51c at max temp
> 
> Stock volts are 1.075 at 1000/1500


Nice


----------



## F3ERS 2 ASH3S

http://www.3dmark.com/3dm/2089816 Fire Strike normal preset: 7020, 2k more than my SLI 460s OC'd to 900.

That is on stock clocks.

I will get around to overclocking, but there are so many new things to learn about overclocking these cards (or I was living under a rock).


----------



## Devildog83

F3ERS 2 ASH3S has been added. Welcome! Nice card there. If you want the clocks changed later, let me know.


----------



## arzel94

Hi, I recently bought a new Sapphire 280X Toxic. I want to OC it, but I don't know what the max voltage and max temp are for 24/7 use.

Thanks for your help, and sorry for my poor English.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> F3ERS 2 ASH3S has been added. Welcome ! Nice card there. If you want the clocks changed later let me know.


Thank you

Not bad for a $325 purchase. It will be about $200 once my accounts clear from selling the 460s.
Quote:


> Originally Posted by *arzel94*
> 
> hi, recently I have buy I new sapphire 280x toxic. I want to oc it but I don't know what is the voltage max and temp max for the 24/7.
> 
> thank for your help and sorry for my poor English.


85c under constant load, and from what I am reading, 1.3v.


----------



## arzel94

But when I run FurMark at stock the temp jumps to 90°C, while in benchmarks or BF4 the temp doesn't go over 75°C.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *arzel94*
> 
> but when I run furmark on stock the temp jump to 90°c.


I could be wrong on the temps


----------



## Devildog83

Quote:


> Originally Posted by *arzel94*
> 
> hi, recently I have buy I new sapphire 280x toxic. I want to oc it but I don't know what is the voltage max and temp max for the 24/7.
> 
> thank for your help and sorry for my poor English.


Post a pic and some clocks and I will add you.


----------



## pac08

Quote:


> Originally Posted by *JoeDirt*
> 
> Should be good undervolted.


Thanks. I also used a PSU calculator, and it seems I will be fine as long as I don't overdo it.


----------



## arzel94

I don't know, but I think I have read that the max safe temp is 95°C, though I could be wrong too. My card has 6 LEDs for the temp and just 4 are lit. It's much harder to find answers or a tutorial when the card is new.


----------



## DiceAir

So I got my card to sort of accept 1180MHz on the core, but then I see some green dots. I was running at 1.3V with a 120% power limit. Guess my cards won't overclock.


----------



## wilflare

i can't decide whether to get a 280X or a 290... (or maybe save and get a 7950 or GTX760)


----------



## Durvelle27

Quote:


> Originally Posted by *wilflare*
> 
> i can't decide whether to get a 280X or a 290... (or maybe save and get a 7950 or GTX760)


I don't understand how this is a hard decision considering a 290 is much faster than all of those options


----------



## wilflare

Quote:


> Originally Posted by *Durvelle27*
> 
> I don't understand how this is a hard decision considering a 290 is much faster than all of those options


my bad for not explaining... the Asus 280X is $490 (Singapore dollars) with Battlefield 4
the 290 would require another $100, and it's all the ref cooler/design


----------



## gnemelf

Quote:


> Originally Posted by *DiceAir*
> 
> So I got my card to sort of accept 1180MHz on core but then I see some green dots. I was running at 1.3V with 120% power limit. guess my cards won't overclock


What is your memory at? What are you using to OC with? And what is the starting clock speed on your cards?


----------



## Durvelle27

Quote:


> Originally Posted by *wilflare*
> 
> my bad for not explaining... the Asus 280X is $490 (Singapore Dollars) with BattleField 4
> the 290 would require another $100 and it's all the ref cooler/design


Whoa, that's some serious cash. What's your budget, the res you will play at, the games intended, and your current PC?


----------



## Sgt Bilko

Quote:


> Originally Posted by *wilflare*
> 
> my bad for not explaining... the Asus 280X is $490 (Singapore Dollars) with BattleField 4
> the 290 would require another $100 and it's all the ref cooler/design


some of the 290 Ref cards come with BF4 as well.

And the ref cooler does a good job of keeping the card cool, but it trades heat for noise. I will say that my old 6970 CF setup was louder than the 290X I had, but it's still a bit on the noisy side for some.

If you want a comparison between a 7970 (280x) and a hawaii card then here you go:

http://www.3dmark.com/compare/fs/1441733/fs/1151742


----------



## GTR Mclaren

QUESTION

anyone have a guide to optimize Skyrim like this one for Nvidia??? :

http://www.geforce.com/optimize/guides/five-fast-elder-scrolls-v-skyrim-tweaks-guaranteed-to-make-your-game-look-even-better#1


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> some of the 290 Ref cards come with BF4 as well.
> 
> And the ref cooler does a good job at keeping the card cool but it trades heat for noise, I will say that my old 6970 CF set-up was louder than the 290x i had but it's still a bit on the noisy side for some.
> 
> If you want a comparison between a 7970 (280x) and a hawaii card then here you go:
> 
> http://www.3dmark.com/compare/fs/1441733/fs/1151742


Here's another:

HD 7970 1280/1850 vs R9 290 1215/1450

http://www.3dmark.com/compare/3dm11/7626071/3dm11/7462575


----------



## Loktar Ogar

Quote:


> Originally Posted by *Durvelle27*
> 
> Here another
> 
> HD 7970 1280/1850 vs R9 290 1215/1450
> 
> http://www.3dmark.com/compare/3dm11/7626071/3dm11/7462575


Nice comparison. Thanks. I'm also looking for their differences.


----------



## Durvelle27

Quote:


> Originally Posted by *Loktar Ogar*
> 
> Nice comparison. Thanks. I'm also looking for their differences.


Thx

When it comes to gaming there is no comparison with the 290. The 512-bit bus and 4GB of VRAM really help when cranking up the AA in games. So far it can max almost all games at 1080p, whereas the 7970/280X has to dial back some settings and eye candy.
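The bus-width point can be made concrete with the standard GDDR5 bandwidth formula (GDDR5 moves 4 bits per pin per clock). The clocks and widths below are the stock specs as I understand them, so double-check them against your own card:

```python
# Peak GDDR5 bandwidth = memory clock x 4 (quad data rate) x bus width.
# Clocks and bus widths below are assumed stock specs; verify for your card.
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    effective_mts = mem_clock_mhz * 4                 # quad data rate
    return effective_mts * bus_width_bits / 8 / 1000  # bits -> bytes, MB -> GB

r9_280x = gddr5_bandwidth_gbs(1500, 384)  # 280X: 1500 MHz on a 384-bit bus
r9_290 = gddr5_bandwidth_gbs(1250, 512)   # 290: 1250 MHz on a 512-bit bus
print(f"280X: {r9_280x} GB/s, 290: {r9_290} GB/s")
```

So at stock the 290 comes out around 320 GB/s against the 280X's 288 GB/s, and the gap matters most at high AA settings where bandwidth is the bottleneck.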


----------



## wilflare

Quote:


> Originally Posted by *Durvelle27*
> 
> Woah that's some serious cash. What's your budget, res you will play at, games inteneded, and current PC


this is my present rig - but the motherboard died so I'm doing some upgrades

Processor: AMD Phenom II X2 550BE @ X4 3.5GHz w/ Noctua U12P
Mainboard: MSi 790FX-GD70
Graphics Card: Sapphire HD5830 Xtreme
Memory: Kingston 2x4GB
Display: DELL 2209WA
Storage: WD 640GB Black (OS) + WD 1TB Green (Storage)
Optical Storage: Pioneer DVR-217BK (20x)
Casing: Silverstone FT01
PSU: Corsair HX620
Audio: Audinst HUD-MX1 @ AE Aego M
OS: Windows 7 (x64) Home Premium

was thinking of getting
Asrock B85 Pro4 + i5 4570
Asus R9 280X TOP with BF4
(and a Plextor M5Pro 128GB SSD to go along)

but seeing how the 290 offers so much more performance... I'm kinda changing my mind

my current monitor runs at 1680x1050
I'm thinking of either going with the Dell U2412M for some 1080p gaming or the Qnix for 1440p

1USD = 1.3 Singapore Dollars
Budget is $1000 (SGD) for CPU+Mobo+GFX


----------



## wilflare

Quote:


> Originally Posted by *Durvelle27*
> 
> Woah that's some serious cash. What's your budget, res you will play at, games inteneded, and current PC


replied with my specs.. but it's being held for moderation :X


----------



## Sgt Bilko

Quote:


> Originally Posted by *wilflare*
> 
> replied with my specs.. but it's being held for moderation :X


http://www.overclock.net/t/1258253/how-to-put-your-rig-in-your-sig


----------



## wilflare

hopefully it appears now

thinking of upgrading my rig with the following...
Asrock B85 Pro4 + i5 4570
Asus R9 280X TOP with BF4
gonna install my new Plextor M5Pro 128GB SSD as well

currently gaming at 1680x1050 but thinking of upgrading to either
(1) Dell U2412M for 1080p gaming
(2) Qnix for 1440p

Budget for CPU+Mobo+GFX is $1000 Singapore Dollars
1USD = 1.3SGD

I was thinking, at $490... should I just pay a little more for a 290? idk


----------



## Archea47

Quote:


> Originally Posted by *wilflare*
> 
> was thinking at $490... I should just pay a little more for a 290? idk


Honestly, if I could do it all over again I would have gotten one or (well, who are we kidding) two reference 290/290x cards when the prices were at MSRP.

I went with the 280x because a lot of people were saying the 290 reference was unstable/overheating and aftermarket cards weren't out yet. Now a number of owners here are saying that's not the case if you don't mind the noise of a high fan speed (I don't), and I'm kicking myself for going aftermarket because there are no full waterblocks for mine (Gigabyte Rev 2).

And the cool factor is lacking compared to Hawaii, AMD's latest and greatest.

That said, my two 280x's do eat BF4 for breakfast, so it's all moot for now.


----------



## gnemelf

I'm just glad I got my 280X when they were 299, because if I'd paid 410 for this card I don't think I'd be as happy with it. I want the prices to come back down so I can do CrossFire, which I doubt they will (if they are selling them at that price and making 100 dollars more, why not, from a retail standpoint). Still happy with it for the price I got it for.


----------



## DiceAir

Quote:


> Originally Posted by *gnemelf*
> 
> what is your memory at? What are you using to oc with? What is your clock speed to start on your cards?


Ref clock is 1100 core and 1500MHz memory.


----------



## Jaffi

Any idea if I have to register my card with Asus to get the 3-year warranty? I think it was like that with EVGA.


----------



## cookiesowns

Just built a system for a client with an R9 280X.

Asus DC2T, has Hynix GDDR5. Does around 1800MHz when running cgminer. Also stable at 1120 w/ 1800 (7.2 GHz effective) in Valley.

H5GQ2H24AFR is the IC part # as reported from memory info.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Jaffi*
> 
> Any ideas if I have to register my card at Asus to get the 3 years warranty? I think it was like that with evga.


If their card support is like their motherboard support, they go off the serial number, which tracks the date the card was made.


----------



## Devildog83

I will be catching up a bit later on adding folks; my power is out and I only have 4G LTE at my office, so it's too slow to work and play.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> I will be catching up a bit later on adding folks, my Power is out and I only have 4Glte at my office so it too slow to work and play.


Sorry to hear it, Power outages suck


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sorry to hear it, Power outages suck


Typical 4 to 6 hrs here. I think somebody ran into a power pole.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> Typical 4 to 6 hrs here. I think somebody ran into a power pole.


Ours are anywhere between 30mins and 7 hrs.

They almost always happen at night when i have work the next day..........Hard to wake up for work with no alarm lol


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> Typical 4 to 6 hrs here. I think somebody ran into a power pole.


That right there is true NW thinking, LOL. I grew up in Idaho so I know that feeling.

Update on the overclock of the XFX 280X: looks like I am only going to get 1100MHz out of it (this is out of the box, haven't repasted yet).
Next step will be to see how the RAM clocks.

Update:

3DMark11 score 10560
http://www.3dmark.com/3dm11/7761595

Clock 1100 / mem 1850

ASIC quality is 70% for this chip

Called XFX and was told that, in North America specifically, the warranty stickers on the screws are OK to remove and removing them will not affect the warranty.









More updates, with a question: I may have lucked out and gotten a copy of BF4 as well; that is still pending.
Anyone know when the promotion started where Newegg began bundling BF4 with the XFX cards? Got the answer for this.

Edit: repasted the card; here are pics of my findings. They are using Elpida memory.


Spoiler: Warning: Spoiler!


----------



## MoraisGT

Hi guys, I just sold my R9 280X Windforce Rev. 2 and I'm getting the MSI version because it has a reference PCB, so I can watercool it!

Now, I've been reading a lot, including in reviews, about this card throttling because the VRMs get too hot. My question is: since I'm watercooling it with a full-cover block, will the VRMs stop overheating, or does the block only affect core temperatures?

Thanks in advance!


----------



## JoeDirt

Just ordered an ASUS R9 270 to replace my 5770. Will flash the BIOS to a 270X. Its only purpose will be to mine LTC. It will look good under my Gigabyte 280X with a Prolimatech PRO-MK-26 on it.


----------



## JoeDirt

Quote:


> Originally Posted by *MoraisGT*
> 
> Hi guys, I just sold my R9 280X Windforce Rev.2 and I'm getting the MSI version because it has a reference pcb, so i can watercool it!
> 
> Now, I've been reading alot, including in reviews about this card throttling because the vrm's get too hot. My question is, since I'm watercooling it with a full cover block, would the vrm's stop overheating, or does the block only have an impact on the core temperatures?
> 
> Thank's in advance!


Good move dude.


----------



## MoraisGT

Quote:


> Originally Posted by *JoeDirt*
> 
> Good move dude.


Do you care to explain that remark?
And maybe help me understand why I made a good or bad decision?


----------



## JoeDirt

Quote:


> Originally Posted by *MoraisGT*
> 
> Do you care to explain that remark?
> And maybe help me understand why I made a good or bad decision?


For the water cooling. Water cooling systems always look slick and function great. If I were going to OC these cards, I would do the same thing or get an NZXT Kraken G10 with an X60 cooler.


----------



## MoraisGT

Quote:


> Originally Posted by *JoeDirt*
> 
> For the water cooling. Water cooling systems always look slick and function great. If I were going to OC these cards, I would do the same thing or get an NZXT Kraken G10 with an X60 cooler.


I'm sorry, I misinterpreted you; I thought you were being sarcastic.









Yes, I plan to overclock and already have a custom loop cooling my CPU.

I got a really good deal on a Watercool Heatkiller GPU-X3 block.









Can't wait to get my hands on the card


----------



## JoeDirt

Quote:


> Originally Posted by *MoraisGT*
> 
> I'm sorry, I misinterpreted you; I thought you were being sarcastic.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, I plan to overclock and already have a custom loop cooling my CPU.
> 
> I got a really good deal on a Watercool Heatkiller GPU-X3 block.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't wait to get my hands on the card


Sounds like a plan brother







If you plan to go beyond 1.3 V, let me know and we can run some tests to see if I can mod the BIOS to allow it.


----------



## LatinLover

*Question: I recently sold my Sapphire 7950 on Craigslist for $340, and right now I need a card to play BF3 on a 1920x1080 IPS screen.
I don't need an overly expensive card to kick arses. My question is: is the R9 270X 4GB worth it over the 2GB version for $20 more, or is it purely a marketing card, only useful in CrossFire?
Thanks in advance...*


----------



## Durvelle27

Quote:


> Originally Posted by *LatinLover*
> 
> *Question: I recently sold my Sapphire 7950 on Craigslist for $340, and right now I need a card to play BF3 on a 1920x1080 IPS screen.
> I don't need an overly expensive card to kick arses. My question is: is the R9 270X 4GB worth it over the 2GB version for $20 more, or is it purely a marketing card, only useful in CrossFire?
> Thanks in advance...*


A 270X would be a downgrade


----------



## LatinLover

Quote:


> Originally Posted by *Durvelle27*
> 
> A 270X would be a downgrade


*I know, but the temptation to sell my 7950 to a miner overcame me; it's done.
I trust my skills, and I don't need ultra; max or medium settings are fine for me.
I just need something until the next generation of graphics cards.
So is 4GB worth it over 2GB, or not?*


----------



## Durvelle27

Quote:


> Originally Posted by *LatinLover*
> 
> *I know, but the temptation to sell my 7950 to a miner overcame me; it's done.
> I trust my skills, and I don't need ultra; max or medium settings are fine for me.
> I just need something until the next generation of graphics cards.
> So is 4GB worth it over 2GB, or not?*


For $350 I'd spring for a GTX 680 or GTX 770


----------



## GuestVeea

Hello all. I'm new to this specific forum. I recently switched from Nvidia to AMD because my GTX 670 crapped out on me. Here is my current build with an R9 270X Toxic edition



Other specs:
i7-3770k
MSI-Z77a-GD55
8gb DDR3
Sound blaster z sound card
750w PSU
64gb SSD
750GB HDD
HAF-XB


----------



## Devildog83

Welcome Guest Veea,

You have been added. Could you let us know what clock you run at and, if you want, your max overclock? A real nice pic would be good too, and if you click on the "rigbuilder" link at the top right of this page you can complete a sig rig; that way, when you have questions or need help, folks will be better equipped to assist.


----------



## Miiksu

I have found something interesting with the iTurbo software and the HIS R9 280X. First I raised the card's MVDDC to 1.575 V, then closed iTurbo and opened it again, and boom: you can now set MVDDC up to 1.653 V, so you can overclock the memory even higher. I have been playing GW2 for 8-10 hours without problems, running the memory at 1800 MHz with 1.618 V MVDDC.


----------



## F3ERS 2 ASH3S

forget this post..


----------



## Devildog83

I managed the top spot on 3DMark11 for an FX-8350 and 7870 x2.

http://www.3dmark.com/3dm11/7768500 VALID

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/3dm11/P/1541/765/500000?minScore=0&cpuName=AMD%20FX-8350&gpuName=AMD%20Radeon%20HD%207870


----------



## F3ERS 2 ASH3S

I got to 1200/1700 for a 3DMark11 score of 11k

http://www.3dmark.com/3dm11/7768995


----------



## GuestVeea

Quote:


> Originally Posted by *Devildog83*
> 
> Welcome Guest Veea,
> 
> You have been added. Could you let us know what clock you run at and if you want, max overclock. A real nice pic would be good too and if you click on the "rigbuilder" link at the top right of this page you could complete a sig rig and when you have questions and or need help folks will be better equipped to do so.


Alrighty, will do. thanks for your response. I'll get back to you with all that info momentarily.


----------



## F3ERS 2 ASH3S

http://www.3dmark.com/3dm11/7769051

I think this might be my cap now: 1210/1750, #2 for a single-card setup


----------



## Durvelle27

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> http://www.3dmark.com/3dm11/7769051
> 
> I think this might be my cap now.. 1210/1750 #2 for single card setup


Still a ways to go to catch my 7970









http://www.3dmark.com/3dm11/7462575


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Durvelle27*
> 
> Still ways to go to catch my 7970
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7462575


Yarp.. but getting close. Only I keep getting driver failures any higher than that bench, so I'll have to see what's going on

My temps are only 70°C


----------



## Durvelle27

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Yarp.. but getting close. Only I keep getting driver failures any higher than that bench, so I'll have to see what's going on
> 
> My temps are only 70°C


Water FTW


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Durvelle27*
> 
> Water FTW


?? That's on stock fans







air


----------



## Durvelle27

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> ?? That's on stock fans
> 
> 
> 
> 
> 
> 
> 
> 
> air


Yes, I know, but I was at 1280/1850 at 1.3 V, 40°C under load









I have my 780 on water now, and so far 1317/1600 @ 1.3 V, max 49°C


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Durvelle27*
> 
> Yes, I know, but I was at 1280/1850 at 1.3 V, 40°C under load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have my 780 on water now, and so far 1317/1600 @ 1.3 V, max 49°C


Gotcha, so comparatively I have good clocks then


----------



## Durvelle27

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Gotcha, so comparatively I have good clocks then


Yep, clocks better than my 7970's


----------



## cremelo

Sup








This is my first test on Stock Vs OC








Stock 1070 MHz to 1150 MHz, still without touching the voltage









Stock: 
OC:


----------



## Durvelle27

Quote:


> Originally Posted by *cremelo*
> 
> Sup
> 
> 
> 
> 
> 
> 
> 
> 
> This is my first test on Stock Vs OC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock 1070Mhz to 1150Mhz still without touching the voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock:
> OC:


Nice


----------



## GuestVeea

Alright, I made my RigBuilder setup. As of right now I have my 270X running at a 1150 MHz core clock.
Hopefully these pictures are better. (I'm not good at photography in any way.)


----------



## Devildog83

Quote:


> Originally Posted by *GuestVeea*
> 
> Alright, I made my RigBuilder set up. As of right now I have my 270x running at 1150mhz Core clock.
> Hopefully these pictures are better. (Im not good at photography in any way)


That will do then, thanks.


----------



## hamzta09

Differences between

XFX 280X DD-Edition Ghost 2.0
and
XFX 280X Ghost 2.0?

The cards look exactly the same, but the upper one, the "DD," has Dual Dissipation for better cooling? How come, if they look the same? And why does it not have a turbo clock?
Is non-turbo better or worse, or is there zero difference? Does it work just like Nvidia's boost?

Which one of these for Crossfire?

CS 750M 80+ gold (92%)
http://www.komplett.se/corsair-cs-750m-750w-psu/802644#!tab:extra

or

CX 750M 80+ bronze (85%)
http://www.komplett.se/corsair-cx-750m-750w-psu/773377#!tab:extra


----------



## diggiddi

I believe the Gold modular psu is the better choice
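For what it's worth, the efficiency figures quoted above (92% Gold vs. 85% Bronze) translate into wall draw roughly as follows. A quick sketch: the 500 W DC load and the hours-per-day figure are my own illustrative assumptions, not measurements.

```python
# Rough wall-draw comparison between the two PSUs mentioned above.
# The 92% (Gold) and 85% (Bronze) efficiencies are the figures quoted in
# the post; the 500 W DC load and 4 h/day of gaming are illustrative guesses.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """AC watts pulled from the wall to deliver dc_load_w to the components."""
    return dc_load_w / efficiency

def yearly_kwh(wall_w: float, hours_per_day: float = 4.0) -> float:
    """Energy consumed per year at a constant wall draw."""
    return wall_w * hours_per_day * 365 / 1000

load = 500.0                    # assumed DC load while gaming, watts
gold = wall_draw(load, 0.92)    # CS 750M class
bronze = wall_draw(load, 0.85)  # CX 750M class

print(f"Gold:   {gold:.0f} W at the wall")
print(f"Bronze: {bronze:.0f} W at the wall")
print(f"Extra energy per year on Bronze: {yearly_kwh(bronze) - yearly_kwh(gold):.0f} kWh")
```

At typical electricity prices the yearly difference is modest, which is why build quality and warranty usually matter more than the efficiency sticker alone.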


----------



## hamzta09

Quote:


> Originally Posted by *diggiddi*
> 
> I believe the Gold modular psu is the better choice


Alright.

Regarding microstutter when running CF: has AMD done anything about this? Has it improved a lot? Is it noticeable?

And is there an updated List of CF supported games?


----------



## jamponget9

It's almost 3 weeks now that I have owned my Toxic 270X, and I've noticed something with my temps. Back in December my idle temps were around 30-33°C. Now, in its third week, it's been idling at 38°C. I checked the card for dust or anything else that might be contributing to the added heat... nothing. I also noticed the room temperature is a bit warmer than before December, but not by much; even on cold nights the card still idles at 38°C. Weird. Any thoughts?


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Differences between
> 
> XFX 280X DD-Edition Ghost 2.0
> and
> XFX 280X Ghost 2.0?
> 
> The cards look exactly the same, but the upper one "DD" has Dual Dissipation for better heat? How come? Looks the same? And why does it not have turbo?
> Is non-Turbo better or worse?
> Or zero difference? Does it work just like Nvidias?
> 
> Which one of these for Crossfire?
> 
> CS 750M 80+ gold (92%)
> http://www.komplett.se/corsair-cs-750m-750w-psu/802644#!tab:extra
> 
> or
> 
> CX 750M 80+ bronze (85%)
> http://www.komplett.se/corsair-cx-750m-750w-psu/773377#!tab:extra


You want 850 W for Crossfire at least. I've pulled up to 630 W on my rig with a *single* 7970, and that was with no OC on the card, just the CPU and RAM


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You want 850w for Crossfire at least, I've pulled up to 630w on my rig with a *Single* 7970, that was with no OC on the card, just the CPU and Ram


But 630 W != 750 W, and you run an 8350 @ 5 GHz; isn't there a big difference compared to a 2500K?

Here's a 4960X @ 4.4GHz
16GB RAM
1 SSD
Asus Rampage IV Extreme


I've seen other people with 750 W PSUs running 7970s in CrossFire, 770s in SLI, etc.

And the price difference between an 850 W PSU and a 750 W Gold is ~60 bucks.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Differences between
> 
> XFX 280X DD-Edition Ghost 2.0
> and
> XFX 280X Ghost 2.0?
> 
> The cards look exactly the same, but the upper one "DD" has Dual Dissipation for better heat? How come? Looks the same? And why does it not have turbo?
> Is non-Turbo better or worse?
> Or zero difference? Does it work just like Nvidias?
> 
> Which one of these for Crossfire?
> 
> CS 750M 80+ gold (92%)
> http://www.komplett.se/corsair-cs-750m-750w-psu/802644#!tab:extra
> 
> or
> 
> CX 750M 80+ bronze (85%)
> http://www.komplett.se/corsair-cx-750m-750w-psu/773377#!tab:extra


The DD helps a lot. TBH I love it; I can push 1.3 V and stay at 70°C


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> But 630w != 750w and you run a 8350 @ 5GHz, isnt there a big diff compared to a 2500K?
> 
> Ive seen other people with 750W specs running 7970 in crossfire, 770 sli etc.
> 
> And the price difference between an 850W PSU and 750W Gold is ~60 bucks.


That is true; a 2500K will draw less than an 8350 for sure, and 750 W will be fine if there is no overclock, but IMO spending the extra $60 is worth it just for the headroom.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That is true; a 2500K will draw less than an 8350 for sure, and 750 W will be fine if there is no overclock, but IMO spending the extra $60 is worth it just for the headroom.


Now I wish I had a watt meter. I'm running a 750 W PSU with everything OC'd and a single XFX 280X.

As for the turbo, it's a non-issue, as it doesn't matter much once you overclock.

Side note: I'm thinking of doing a full review of the card itself. If there's anything you would like to see, let me know


----------



## Sgt Bilko

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Now I wish I had a watt meter. I'm running a 750 W PSU with everything OC'd and a single XFX 280X.
> 
> As for the turbo, it's a non-issue, as it doesn't matter much once you overclock.
> 
> Side note: I'm thinking of doing a full review of the card itself. If there's anything you would like to see, let me know


Kill-A-Watts are fairly cheap. I tested the Firestrike combined test and Valley + Prime95 to stress it out.

I'd love to see an OCN member's opinion of the DD 280X, seeing as I'm getting two of the 290 versions


----------



## cremelo

Is anyone having problems with the new driver, 13.12?
After I installed it, my Asus R9 280X DCU2 Top sits at 72°C under full load, whereas with the previous driver it did not reach 65°C.....


----------



## Devildog83

Quote:


> Originally Posted by *jamponget9*
> 
> It's almost 3 weeks now that I have owned my Toxic 270X, and I've noticed something with my temps. Back in December my idle temps were around 30-33°C. Now, in its third week, it's been idling at 38°C. I checked the card for dust or anything else that might be contributing to the added heat... nothing. I also noticed the room temperature is a bit warmer than before December, but not by much; even on cold nights the card still idles at 38°C. Weird. Any thoughts?


It's most likely poor application of thermal paste; if you can, clean it off and properly apply some good TIM.


----------



## Devildog83

I just do not understand why people try to skimp on the PSU. If you already have a good PSU and are deciding whether or not to upgrade, I can see worrying about 100 W and whether you have enough; but if you are going to buy a new PSU, by all means spend a few bucks and get the headroom if nothing else. I recommend getting the highest quality and as much headroom as you can afford. The PSU is the heart of your system and should not be skimped on. It may tap the pocketbook a bit more, but it's worth every penny to not stress about it.
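The headroom argument can be put in rough numbers. A back-of-the-envelope sketch: every component wattage below is my own TDP-class estimate, not a measurement, and the "keep sustained load under ~80% of the rating" figure is just a common rule of thumb.

```python
# Back-of-the-envelope PSU headroom check for a CrossFire 280X rig.
# All wattages are assumed, TDP-class estimates, not measurements.

def estimate_load(component_draws: dict) -> int:
    """Sum the estimated DC draw of all components, in watts."""
    return sum(component_draws.values())

def has_headroom(psu_watts: int, load_watts: int, max_fraction: float = 0.8) -> bool:
    """Common guideline: keep sustained load under ~80% of the PSU rating."""
    return load_watts <= psu_watts * max_fraction

rig = {
    "CPU (est.)": 125,
    "R9 280X #1, overclocked (est.)": 230,
    "R9 280X #2, overclocked (est.)": 230,
    "board/RAM/drives/fans (est.)": 65,
}

load = estimate_load(rig)  # 650 W estimated
for psu in (750, 850):
    verdict = "comfortable" if has_headroom(psu, load) else "tight"
    print(f"{psu} W PSU at {load} W estimated load: {verdict}")
```

Under these assumptions a 750 W unit works but leaves little margin for extra voltage or a third card, which is the headroom argument in a nutshell.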


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> I just do not understand why people have to try and skimp on the PSU. If you have a good PSU and are deciding whether or not to upgrade I can see the worrying about 100w and do I have enough but if you are going to buy a new PSU by all means spend a few bucks and get the head room if nothing else. I recommend getting the highest quality and as much head room you can afford. The PSU is the heart of your system and should not be skimped on. It may tap the pocketbook a bit more but it's worth every penny to not stress about it.


this^^^^^^^^^^^^^^^^^^^^^^ !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!


----------



## cowmoo32

Are any of you running linux? I'm on Linux Mint 16 Cinnamon and the AMD driver keeps screwing everything up. I tried installing fglrx, fglrx-updates (both native to mint) and the AMD Catalyst driver from their website but it screws up my system. The only driver that works is the native xserver-xorg-video-ati. Running anything else breaks xorg and I'm not able to boot. Any ideas?


----------



## hamzta09

Quote:


> Originally Posted by *Devildog83*
> 
> I just do not understand why people have to try and skimp on the PSU. If you have a good PSU and are deciding whether or not to upgrade I can see the worrying about 100w and do I have enough but if you are going to buy a new PSU by all means spend a few bucks and get the head room if nothing else. I recommend getting the highest quality and as much head room you can afford. The PSU is the heart of your system and should not be skimped on. It may tap the pocketbook a bit more but it's worth every penny to not stress about it.


But I'm not gonna run Furmark on the PC -.-
And games tend not to stress GPUs like Furmark does

How about this: since you guys obviously have wattage meters, boot up BF4, play for an hour, and monitor your wattage with a camera or something.

I did Google 750W 7970 crossfire, 280x crossfire, 770 crossfire, etc., and the majority says it's fine.
There is also a thread here on OCN where a guy asks about 750 W and Crossfire, and the majority seems to say it's fine.
http://www.overclock.net/t/1419675/is-a-quality-750w-psu-enough-for-7970-crossfire

The 280X is mentioned on page 5 or so.


----------



## F3ERS 2 ASH3S

Hey guys! I have just completed my review of the XFX 280X Double D; let me know how you like it. Sorry in advance that I don't have a full run of games, but I do want to add that at my max OC I can play Crysis 3 at max settings, aside from post processing and particle density.

http://www.overclock.net/products/xfx-radeon-double-d-r9-280x-1000mhz-boost-ready-3gb-ddr5-2xmdp-hdmi-2xdvi-graphics-cards-r9-280x-tdfd/reviews/6485


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> But Im not gonna run Furmark on the PC -.-
> And games tend not to stress GPUs like Furmark
> 
> How about, since you guys obviously got wattage meters. Boot up BF4, play for an hour and monitor your wattage with a camera or something.
> 
> I did google: 750W 7970 crossfire, 280x crossfire, 770 crossfire etc and seems the majority says its fine.
> There is also a thread here on OCN where a guy asks about 750W and Crossfire and seems majority seem to say its fine.
> http://www.overclock.net/t/1419675/is-a-quality-750w-psu-enough-for-7970-crossfire
> 
> 280X mentioned on page 5 or something.


Then don't take my advice and get a 750 W; that's fine if you think it's fine. At least go with the Gold-rated one, though. You need to get what you think is best; all any of us can do is make suggestions based on our experience. My experience, and talks with others, tell me I would not go less than 850 W if I wanted to push 2 x 7970s or 280Xs. I have more than $1,500 in my system; I will not compromise.


----------



## hamzta09

Quote:


> Originally Posted by *Devildog83*
> 
> Then don't take my advice and get a 750w, that's fine if you think it's fine. At least go with the Gold rated though. You need to get what you think is best, all any of us can do is make suggestions based on our experience. My experience and from talking with others tells me I would not go less than 850w for if I wanted to push 2 x 7970's or 280x's. I have more than $1500 into my system, I will not compromise.


Don't PSUs have a failsafe in case something goes wrong or they're drawing too much juice, so the PC shuts down?

If so: at the first sign of trouble, get a new PSU / return the PSU (if within the return period)


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Don't PSUs have a failsafe in case something goes wrong or they're drawing too much juice, so the PC shuts down?
> 
> If so: at the first sign of trouble, get a new PSU / return the PSU (if within the return period)


Most do; however, it depends on their quality, and on how much you want to rely on a failsafe that may not function correctly, since all electronics carry that risk. I was running 2 GTX 460s pushing 1.087 V on a gold-rated 750 W PSU (the key thing is that it was built with quality); I will need to check the actual wattage each card was drawing.

What DevilDog is saying is: why put that much money on the line? If a PSU fails, or a failsafe does not work correctly, a power surge can literally take out your entire system, and you would then be out a grand. The thing about headroom and PSUs is that if you work one (just like any other component) at 100% at all times, its lifespan will be greatly shortened. Not saying it can't do it, but why not spring for a little more, so that in the long run you don't A: have to buy another PSU, or B: run the risk of pulling too much power and having the PSU fail.

As has been stated, it is your choice, but it is also a well-advised suggestion. I have had a PSU kill a board, RAM, and a chip, so I am now more cautious.


----------



## hamzta09

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Most do however it depends on the quality of them and also how much do you want to rely on a fail safe that may not function correctly as all other electronics always carry that risk, I was running 2 GTX 460s pushing 1.087 with a gold rated 750w PSU (but key thing is that it is built with quality) I will need to check on the actual wattage that was being used by each card.
> 
> What DevilDog is saying is that why put that much money on the line as if a PSU fails or if a fail safe does not work correctly a surge of power can literally take out your entire system. You then would be out a grand. The thing about over head and PSUs is that if you work (just like any other component) it 100% at all times the life span would be greatly shortened. Not saying that it can't do it but why not spring for a littler more so that in the long run you do not A: have to buy another PSU and B: run the risk of pulling too much power and having the PSU fail.
> 
> As what has been stated it is your choice but is also a well advised suggestion. I have had a PSU kill a board, ram and chip so I am now more cautious.


Why would the system draw 750 W continuously during gaming?
It makes no sense at all.

Also, 280X cards all seem to have this massive shroud blocking the CrossFire connectors, meaning you have to buy additional bridges 23094 meters long and bend them (possibly breaking them) in order to connect two cards.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Why would the system draw 750 W continuously during gaming?
> It makes no sense at all.
> 
> Also, 280X cards all seem to have this massive shroud blocking the CrossFire connectors, meaning you have to buy additional bridges 23094 meters long and bend them (possibly breaking them) in order to connect two cards.


You most likely will be OK with the 750 W, but as stated, we are only suggesting what would be a better long-term solution, more or less future-proofing it.


----------



## hamzta09

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> You most likely will be OK with the 750 W, but as stated, we are only suggesting what would be a better long-term solution, more or less future-proofing it.


No such thing as "future proof" in the world of computers.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> No such thing as "future proof" in the world of computers.


It was a loose term. You asked for suggestions, and we gave answers. I'm not sure if you mean to argue, but it doesn't make sense to ask a question and then, when you get the answer, knock it down completely.

Yes, your i5 consumes less power. However, yes, it would be more future-proof to get a higher-wattage PSU if you wanted to add a 3rd or 4th card in a year or so (still within the lifespan of a PSU). Also, depending on the game, and whether you overclock the card, total system wattage can go up. So again, it depends, and we have given you suggestions and explanations.


----------



## hoevito

So I decided to ditch my CrossFire setup and try something a little different. I've had quite a bit of bad luck since I went with two 280Xs, and I just did not have a good experience, sadly, especially between all the driver issues and the amount of heat these cards were throwing out. I was also noticing lower framerates and an overall worse experience in a lot of games when using two cards as opposed to a single card (echoing a lot of what I've been reading around the net), and my Toxic 280X had to be RMA'd because it was just messed up. Newegg also managed to screw me on the RMA: I wanted an exchange, but they processed a refund instead and wouldn't replace it (they claimed to have none in stock when it was returned, but magically had some available for sale less than 12 hours after they processed my return... for $120 MORE than what I originally paid).

Since I got my refund from Newegg, I was contemplating either upgrading to a couple of 290s or ditching this whole multi-GPU thing and going with a single 780 Ti instead, but last night, after much consternation, I bit the bullet, decided to go Nvidia, and ordered two 780 Lightnings at $499 each.

So guys, other than missing out on Mantle, am I gaining a lot here or what should I expect?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hoevito*
> 
> So I decided to ditch my CrossFire setup and try something a little different. I've had quite a bit of bad luck since I went with two 280Xs, and I just did not have a good experience, sadly, especially between all the driver issues and the amount of heat these cards were throwing out. I was also noticing lower framerates and an overall worse experience in a lot of games when using two cards as opposed to a single card (echoing a lot of what I've been reading around the net), and my Toxic 280X had to be RMA'd because it was just messed up. Newegg also managed to screw me on the RMA: I wanted an exchange, but they processed a refund instead and wouldn't replace it (they claimed to have none in stock when it was returned, but magically had some available for sale less than 12 hours after they processed my return... for $120 MORE than what I originally paid).
> 
> Since I got my refund from newegg, I was contemplating either upgrading to a couple 290's or ditching this whole multi-gpu crap and going with a single 780ti instead, but last night after much consternation I bit the bullet and decided to go Nvidia and ordered two 780 Lightning's at $499 instead.
> 
> So guys, other than missing out on Mantle, am I gaining a lot here or what should I expect?


Looks like it may have been better to go for the 290X:
http://www.legitreviews.com/nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-at-4k-ultra-hd_130055/10

So I would say your results will fall in between the two options..


----------



## hoevito

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Looks like it may have been better for the 290x
> http://www.legitreviews.com/nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-at-4k-ultra-hd_130055/10
> 
> So I would say your findings will be in between the 2 options..


Well, I know the 290X is better than a 780, but a non-reference 290X currently goes for $200 MORE than what I just paid for a 780 Lightning. Since all of AMD's cards are selling for $100 to $200 over MSRP right now, it just doesn't make sense to buy, and sadly it doesn't look like prices are going down anytime soon... that's what sealed the deal for me.

I know a single 780 should be fine for 1440p, and SLI 780s will have me set for a while... but I just wonder what, if anything, I'll be missing out on by switching from AMD to Nvidia. I do miss my Toxic 280X though... that was a nice card while it lasted


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hoevito*
> 
> Well, I know the 290x is better than a 780, but a non-reference 290x is currently going for $200 MORE than what I just paid for a 780 Lightning. Since all of AMD's cards are selling for $100 to $200 over MSRP right now, it just doesn't make sense to buy right now, and sadly it doesn't look like prices are going down anytime soon...that's what sealed the deal for me.
> 
> I know a single 780 should be fine for 1440p, and sli 780's will have me set for awhile......but I just wonder what I'll be missing out on if anything by switching from AMD to Nvidia? I do miss my Toxic 280x though...that was a nice card while it lasted


I would say Mantle would be the biggest thing, as you have already stated.. that, and most games in the near future are going to be better aimed at AMD GPUs via OpenCL, so it's a break-even point, to be honest..


----------



## hamzta09

So I found an 850 W PSU by XFX

This one, the XFX ProSeries XXX Edition 850W PSU:
http://www.komplett.se/xfx-proseries-xxx-edition-850w-psu/760109#!tab:extra 80+ Bronze, apparently.

Is it a good PSU, or should I stick with Corsair?
It has a 60-month factory warranty as well, and it's ~50 bucks cheaper than the Corsair 850 W version.

Is it enough for two 280Xs in CrossFire?


----------



## GoLDii3

Quote:


> Originally Posted by *hamzta09*
> 
> So I found an 850W PSU by XFX
> 
> This one XFX ProSeries XXX Edition 850W PSU
> http://www.komplett.se/xfx-proseries-xxx-edition-850w-psu/760109#!tab:extra 80+ Bronze apparently.
> 
> Is it a good PSU or should I stick with Corsair?
> It has 60 month factory warranty aswell and its ~50 bucks cheaper than the Corsair 850W versin.
> 
> Is it enough for 2 280x in xfire?


More than enough.

It's made by Seasonic, like all XFX PSUs.


----------



## GTR Mclaren

Far Cry 3... a nice first hour

One big problem: the virus called UPlay

BTW... SSAO, HDAO, or HBAO for AMD cards?


----------



## hamzta09

Are there any titles that struggle with CrossFire? Is there a list of modern game titles with scaling and other info?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Are there any titles that struggle with crossfire? Is there a list of modern gametitles that have scaling and other info?


As for the problems, I think AMD fixed the microstutter... as for a list of scaling, I'm not sure


----------



## Takla

Quote:


> Originally Posted by *GTR Mclaren*
> 
> Far Cry 3....nice first hour
> 
> one big problem: the virus called UPlay
> 
> BTW...SSAO, HDAO or HBAO for AMD cards??


SSAO = AMD & Nvidia
HDAO = AMD
HBAO = Nvidia


----------



## Jaffi

Quote:


> Originally Posted by *Takla*
> 
> SSAO = AMD & Nvidia
> HDAO = AMD
> HBAO = Nvidia


So what happens if I set HBAO in BF4? I think it is possible in the game settings.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jaffi*
> 
> So what if I set HBAO in BF4? I think it is possible in the game settings.


You could just try them all out and pick which one works best for you?


----------



## Takla

Quote:


> Originally Posted by *Jaffi*
> 
> So what if I set HBAO in BF4? I think it is possible in the game settings.


It's ambient occlusion done by Nvidia. It does work with AMD cards, but DICE didn't make the effort to port HDAO (AMD's ambient occlusion) to Battlefield 4.


----------



## jamponget9

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I would say Mantle would be the biggest thing as you have already stated.. that and most games in the near future are going to be better aimed towards AMD GPUs with opencl so you have a a break even point to be honest..


Agree with you, bro... And with Mantle coming, it seems Nvidia is focusing on their mobile products right now. Nvidia seems to be avoiding Mantle for now because they feel they can't compete with it once it comes out (if it really comes out). Nvidia is also currently backing SteamOS, but many believe SteamOS would be in jeopardy once Mantle is out for AMD cards.


----------



## Devildog83

Quote:


> Originally Posted by *jamponget9*
> 
> Agree with you bro... And right now I think with the coming of Mantle, Nvidia's sidelines is currently focusing on their Mobile products right now. Seems Nvidia is avoiding the Mantle for now coz they feel that they cant compete with it once it comes out (if it comes out really). Nvidia is also currently backing up SteamOS, but with the coming of Mantle, many believe SteamOS would be in jeopardy once Mantle is out for AMD cards.


I am really stoked to see what Mantle brings to the table; it should be out very soon for BF4. I hope they have the rest of the BF4 bugs fixed when Mantle is finally released, which may be why it's delayed.


----------



## hamzta09

So I ordered

A Corsair 850W, 100% modular, 80+ Gold
And 2x Gigabyte 280X 3GB Windforce 3X; apparently each comes with a BF4 voucher, and I already own it... facepalm, but whatever.

Hopefully I get them this week. Strange how the site had 18 in stock at the time of order; the day after, 0 in stock, and the expected delivery of a whopping 8 more isn't until Jan 31...
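For anyone sizing a PSU for a dual-280X build like this, here's a rough back-of-the-envelope sketch; the wattage figures are ballpark assumptions for illustration, not official specs for any of these parts:

```python
# Rough PSU headroom check for a dual-GPU build.
# All wattages are ballpark assumptions, not official board specs.
def psu_headroom(psu_watts, gpu_watts, n_gpus, cpu_watts, rest_watts=75):
    """Return the wattage left over at an estimated full load."""
    load = gpu_watts * n_gpus + cpu_watts + rest_watts
    return psu_watts - load

# Two ~250 W cards plus a ~125 W CPU on an 850 W unit:
print(psu_headroom(850, gpu_watts=250, n_gpus=2, cpu_watts=125))  # 150
```

About 150 W of slack at estimated full load under these assumptions, which is why an 850 W Gold unit is a comfortable pick for two Tahiti cards.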


----------



## shilka

Quote:


> Originally Posted by *hamzta09*
> 
> So I ordered
> 
> A Corsair 850W, 100% modular, 80+ Gold.


The Cooler Master V850 is the same PSU just better and cheaper


----------



## hamzta09

Quote:


> Originally Posted by *shilka*
> 
> The Cooler Master V850 is the same PSU just better and cheaper


Here it isn't; it's actually priced higher.


----------



## Devildog83

Quote:


> Originally Posted by *shilka*
> 
> The Cooler Master V850 is the same PSU just better and cheaper


If it's the same how could it be better?


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> So I ordered
> 
> A Corsair 850W, 100% modular, 80+ Gold
> And 2x Gigabyte 280X 3GB Windforce 3X; apparently each comes with a BF4 voucher, and I already own it... facepalm, but whatever.
> 
> Hopefully I get them this week. Strange how the site had 18 in stock at the time of order; the day after, 0 in stock, and the expected delivery of a whopping 8 more isn't until Jan 31...


Now you're cookin', Jack.


----------



## Archea47

Quote:


> Originally Posted by *hamzta09*
> 
> And 2x Gigabyte 280X 3GB Windforce 3X; apparently each comes with a BF4 voucher, and I already own it... facepalm, but whatever.


Welcome to the club!









I have the same cards and got the same promotion. You don't have to tie the BF4 copies to your card serials or your own purchase in any way - I gave mine to friends and they activated theirs no problem


----------



## shilka

Quote:


> Originally Posted by *Devildog83*
> 
> If it's the same how could it be better?


It has a better, bigger fan that's better made and quieter.

It also has less coil whine than the Corsonic versions.


----------



## hamzta09

Quote:


> Originally Posted by *shilka*
> 
> It has a better, bigger fan that's better made and quieter.
> 
> It also has less coil whine than the Corsonic versions.


But it's the same!


----------



## shilka

Quote:


> Originally Posted by *hamzta09*
> 
> But its the same!


Same platform was what I meant.

The V is newer, so it's been improved and tweaked compared to the older units.

Almost forgot: the AX750/850 are KM2,

but the AX760/860 and the V are KM3.


----------



## Jaffi

I think the horizontal white bar that flashes from time to time while I am browsing the web appears when there is a flash video loaded in one of the tabs.

Tapatalked from my Nexus 5


----------



## Amhro

Woah, prices of 280xs dropped again







Anywhere else or is it just here? Asus is 284€ now.


----------



## Devildog83

Quote:


> Originally Posted by *Amhro*
> 
> Woah, prices of 280xs dropped again
> 
> 
> 
> 
> 
> 
> 
> Anywhere else or is it just here? Asus is 284€ now.


Not in the US; they are still over $410, or 300+ euros. You might find a better deal if you search hard, but that's the norm. I would either wait or go CrossFire 270Xs for the same price and far better performance.
My opinion: if you can do X-Fire, DO IT NOW, while the 270Xs are still cheap.
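For comparing prices like the 284-euro ASUS card above against US dollar prices, a quick sketch; the exchange rate here is an assumption for illustration, not a live quote:

```python
# Quick cross-currency price comparison.
# The 1.35 USD/EUR rate is an assumption for illustration, not a live quote.
def eur_to_usd(eur, rate=1.35):
    return round(eur * rate, 2)

print(eur_to_usd(284))  # 383.4 at the assumed rate
```

At that assumed rate, 284 euros works out to roughly $383, still well under the $410+ US street price mentioned above.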


----------



## dr.evil

Hi, I want to ask something: will an R9 270/X be able to handle gaming at 1080p? I don't care about high settings, only FPS in online games.
I also want to know where I can find a 270X; there are none on Amazon, and I can't buy from Newegg because I'm from Venezuela. Thanks, need help.


----------



## InsideJob

Do we non x version r9 270 owners join here too?


----------



## rdr09

Quote:


> Originally Posted by *dr.evil*
> 
> Hi, I want to ask something: will an R9 270/X be able to handle gaming at 1080p? I don't care about high settings, only FPS in online games.
> I also want to know where I can find a 270X; there are none on Amazon, and I can't buy from Newegg because I'm from Venezuela. Thanks, need help.


My quad-core Phenom at 4GHz with an HD 7770 handles 1080p easily, but at medium settings. It plays BF4, BF3, PS2, etc. Pretty sure the 270 will do better, even at high settings in BF4.


----------



## InsideJob

Quote:


> Originally Posted by *dr.evil*
> 
> Hi, I want to ask something: will an R9 270/X be able to handle gaming at 1080p? I don't care about high settings, only FPS in online games.
> I also want to know where I can find a 270X; there are none on Amazon, and I can't buy from Newegg because I'm from Venezuela. Thanks, need help.


I just got my 270 today and have only played some DayZ so far, but I'm getting frames equivalent to what I was getting with my 7970. That's paired with a 6300 at 4.5GHz and 8GB of RAM.


----------



## dr.evil

Quote:


> Originally Posted by *rdr09*
> 
> My quad-core Phenom at 4GHz with an HD 7770 handles 1080p easily, but at medium settings. It plays BF4, BF3, PS2, etc. Pretty sure the 270 will do better, even at high settings in BF4.


Thanks. Right now I only play BF3, RO2, CS:GO, GW2; I only care about max performance in online games.
I will buy a Sapphire R9 270X Toxic for $234; is that a good deal??


----------



## benjamen50

That is a good deal; I got my Gigabyte AMD R9 270X Windforce 3X OC 2GB GDDR5 for $245.

And if the card you're buying has more than 2GB of VRAM, then I'd say go for it.


----------



## Devildog83

Quote:


> Originally Posted by *InsideJob*
> 
> Do we non x version r9 270 owners join here too?


Sure, I see no reason why not. I will see about changing the heading.


----------



## Devildog83

Quote:


> Originally Posted by *dr.evil*
> 
> Thanks. Right now I only play BF3, RO2, CS:GO, GW2; I only care about max performance in online games.
> I will buy a Sapphire R9 270X Toxic for $234; is that a good deal??


Yep, I think that's a good deal. 2 would be rockin'.









I have noticed after a search that all 270Xs are starting to become hard to find too. The Hawk and Toxic seem to get the highest clocks but are the hardest to find.

One more thing: the 270Xs are Mantle-ready, and BF4 with Mantle will be out soon, followed by a bunch of other titles, I hope.


----------



## InsideJob

Add me in then








Got this after selling my 7970 to get the most cash out of it while I still could during the mining craze. I was low on cash and needed a new mic setup, so I used the leftover cash from selling the card to get a new desktop condenser mic.












Spoiler: Warning: Spoiler!


----------



## Devildog83

Quote:


> Originally Posted by *InsideJob*
> 
> Add me in then
> 
> 
> 
> 
> 
> 
> 
> 
> Got this after selling my 7970 to get the most cash out of it while I still could with the mining craze. Low on cash and needed a new mic set up so used the left over cash from selling the card to get a new desktop condenser mic
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


What model is that, DD? What are the clocks and I will add you.

Never mind, got the DD


----------



## InsideJob

1025/1556 are my current clocks
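Clocks get posted in this thread as core/memory, e.g. 1025/1556. As a rough sketch of what that memory clock means for bandwidth (assuming the 256-bit bus commonly listed for the R9 270, and GDDR5's 4x effective data rate):

```python
# Effective GDDR5 bandwidth from a posted memory clock (e.g. 1556 MHz).
# GDDR5 transfers 4 bits per pin per clock, so effective rate = 4x the clock.
# The 256-bit bus width is the figure commonly listed for the R9 270/270X;
# treat it as an assumption here.
def mem_bandwidth_gbs(mem_clock_mhz, bus_bits=256):
    effective_mts = mem_clock_mhz * 4           # mega-transfers per second
    return effective_mts * bus_bits / 8 / 1000  # MB/s -> GB/s

print(mem_bandwidth_gbs(1556))  # 199.168 GB/s under these assumptions
```

That's why a bump from 1400 to 1556 MHz on the memory is a meaningful overclock: bandwidth scales linearly with the memory clock.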


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *InsideJob*
> 
> Add me in then
> 
> 
> 
> 
> 
> 
> 
> 
> Got this after selling my 7970 to get the most cash out of it while I still could with the mining craze. Low on cash and needed a new mic set up so used the left over cash from selling the card to get a new desktop condenser mic
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Let me know how that DD fares. I have the 280X and wrote a decent review of it.

And welcome


----------



## Devildog83

Insidejob has been added and the club now includes R9 270 owners. Welcome !!!


----------



## InsideJob

Quote:


> Originally Posted by *Devildog83*
> 
> Insidejob has been added and the club now includes R9 270 owners. Welcome !!!


You rock!


----------



## [CyGnus]

InsideJob, why did you switch to a worse card? Was the 7970 a good sell?


----------



## InsideJob

Quote:


> Originally Posted by *[CyGnus]*
> 
> InsideJob, why did you switch to a worse card? Was the 7970 a good sell?


Yeah, it was a good sell with the mining craze, and I only use a single 1080p monitor, so I had a whole GB of VRAM on the 7970 going unused. If I need more power in the future I'll grab another 270. The main game I play (DayZ) gets the same or slightly better performance from the 270, so I'm not losing anything.









I'm liking this card so far. Runs cool, clocked well out of the box, it's a much more manageable size than the 7970 and much quieter. I can see myself getting another one when I have the funds just because it's so darn nice


----------



## Devildog83

What do you guys think about posting Valley scores on the 1st post. I would like to see the score from 4 categories -

1) 280x/280 single card

2) 280x/280 X-fired - any other card will do as long as it's X-Fired

3) 270x/270 single

4) 270x/270 X-Fired - w/any card

All must be run at Extreme HD; if there are enough scores I will post them in the OP.

Please post a screenshot and the card/cards, type and clocks. Like this if you can.
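If scores start coming in, tallying the best run per category could be sketched like this; the member names and scores below are placeholders for illustration, not real submissions:

```python
from collections import defaultdict

# Placeholder submissions: (member, category, Valley Extreme HD score).
runs = [
    ("member_a", "280x/280 single", 2272),
    ("member_b", "280x/280 single", 2150),
    ("member_c", "270x/270 X-Fired", 1800),
]

# Track the highest score seen in each of the four categories.
best = defaultdict(lambda: (None, 0))
for member, category, score in runs:
    if score > best[category][1]:
        best[category] = (member, score)

for category, (member, score) in sorted(best.items()):
    print(f"{category}: {member} with {score}")
```

Keeping only the top run per category keeps the OP table short even if lots of runs get posted.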


----------



## Archea47

Quote:


> Originally Posted by *[CyGnus]*
> 
> It was a good sell the 7970?


FWIW, my 79*50* sold in an hour for $380 just a few days ago, when I paid $280 for it new and got almost a year's enjoyment out of it.

So yeah I think they are still good sells


----------



## F3ERS 2 ASH3S

Here is my run... c'mon, 280X guys, beat meh score



XFX 280x DD Clock 1211 Mem 1700


----------



## hamzta09

I snatched a 290X Tri-X from Sapphire as well, due in stock tomorrow.

Dunno which ones I'll keep, the 290X or the 280Xs.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> I snatched a 290X Tri-X from Sapphire as well, due in stock tomorrow.
> 
> Dunno which ones I'll keep, the 290X or the 280Xs.


2x 280X > 1x 290X,

as long as drivers don't mess up CrossFire... so it's a give and take.
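A rough way to think about that trade-off; the 80% scaling factor is an assumption for illustration, since real CrossFire scaling varies per game and driver (and can be near zero without a profile):

```python
# Rough CrossFire throughput estimate.
# The 0.80 scaling factor for the second card is an assumption; real
# scaling varies per game and driver profile.
def crossfire_fps(single_card_fps, scaling=0.80):
    return single_card_fps * (1 + scaling)

print(crossfire_fps(60))  # 108.0 fps if the second card scales at 80%
```

With good scaling, two 280Xs come out well ahead of one 290X; with a broken profile, you're back to single-card performance, which is the "give and take" above.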


----------



## Durvelle27

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Here is my run.. com'on 280x guys beat meh score
> 
> 
> 
> XFX 280x DD Clock 1211 Mem 1700


I can beat it


----------



## InsideJob

Here's my run on stock clocks. The card isn't playing nicely running Valley at the clocks I tried. Going to have to do some fooling around with it I guess


----------



## Warl0rdPT

Quote:


> Originally Posted by *Devildog83*
> 
> What do you guys think about posting Valley scores on the 1st post. I would like to see the score from 4 categories -
> 
> 1) 280x/280 single card
> 
> 2) 280x/280 X-fired - any other card will do as long as it's X-Fired
> 
> 3) 270x/270 single
> 
> 4) 270x/270 X-Fired - w/any card
> 
> All must be run at extreme HD, if there is enough scores I will post them in the OP.
> 
> Please post a clip and the card/cards type and clocks. Like this if you can.


What driver settings?


----------



## [CyGnus]

[CyGnus] --- 4770K @ 4.6GHz --- R9 280X @ 1225/1847 --- 2272 --- 54.3FPS


----------



## JCH979

Well my PC pretty much spent last weekend trying to get 4.3GHz stable on my subpar 3570K (over 1.35V for 4.5GHz.. come on!..), but I digress.. In the time I was waiting on my PC I came back to read some of the posts I've missed and decided to OC my gpu to see how far I could push my Asus Top 280X. Ended up being able to do 1290/1830 @ GPUTweak's 1.3V in 3DMark 11 and Firestrike.

Valley on the other hand artifacts at clocks higher than 1270/1840.

I had completely forgotten that I bought 3DMark 13 for around $2 during the Steam Winter Sales. I am no bencher by any means, curiosity just gets to me sometimes lol. Here are my stock to max OC results.



Spoiler: Stock 1070/1600 @ 1.2V



Firestrike - Valid Result: http://www.3dmark.com/fs/1484340

3DMark 11 - Valid Result: http://www.3dmark.com/3dm11/7783165

Valley







Spoiler: Overclocked using GPUTweak to 1290/1830 @ 1.3V



Firestrike - Valid Result: http://www.3dmark.com/fs/1490857

3DMark 11 - Valid Result: http://www.3dmark.com/3dm11/7783545

Valley @ 1270/1840



I'm pretty sure that is as far as the card will go without getting some bad artifacts.



Now just need to get my 24/7 OC and get back to my games


----------



## [CyGnus]

JCH979 great scores and what a nice card you have there


----------



## JCH979

Same goes for you







Wish I had your temps because it's only gonna get hotter here, currently around 17C ambient.


----------



## Devildog83

Quote:


> Originally Posted by *Warl0rdPT*
> 
> What driver settings?


You just cannot disable Tessy (tessellation). If the driver version is not valid that's OK, because a lot of folks are still on the 13.11 beta drivers.


----------



## Devildog83

Everyone looks good so far. Here is my high with a 270x and 7870 @ 1231/1445.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> Everyone looks good so far. Here is my high so far with a 270x and 7870 @ 1231/1445.


A little behind my R9 290 & GTX 780, but nice score.


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Little behind my R9 290 & GTX 780 but nice score










- Yea but they are so much more expensive.


----------



## bokchoi

here is my single 270X 1100/1500...


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> 
> 
> 
> 
> 
> 
> 
> - Yea but they are so much more expensive.


Not really

HD 7870/270X cost $199-$230

I paid $425 for my 780 and $495 after Water block

&

Paid $430 for my 290


----------



## Devildog83

Quote:


> Originally Posted by *Durvelle27*
> 
> Not really
> 
> HD 7870/270X cost $199-$230
> 
> I paid $425 for my 780 and $495 after Water block
> 
> &
> 
> Paid $430 for my 290


Those are pretty good deals. Not easy to find deals like that nowadays, if you can even find the 290s at all. Having said that, I wish I had bought one, because the resale on them is astronomically high.


----------



## Pis

Mine :3


----------



## neurotix

DevilDog83, here's mine bro.

neurotix - Sapphire R9 270X Vapor-X @ 1300/1500mhz - FX-8350 @ 4.7ghz


Spoiler: Click







21C ambient and after 3 loops the card is only 56C. See screenshot.


----------



## Durvelle27

Quote:


> Originally Posted by *Devildog83*
> 
> That's pretty good deals. Not easy to find those deals nowadays. If you can even find the 290's at all. Having said that I wish I would have bought one because the resale on them is astronomically high.


I've seen 5 in the past 3 days for sale for $425


----------



## the9quad

Man I would be so afraid to buy a used 290/290x after miners have been running the crap out of them hardcore non stop 24/7.


----------



## neurotix

http://www.ebay.com/sch/i.html?_odkw=R9+290&LH_BIN=1&_osacat=0&_from=R40&_trksid=p2045573.m570.l1313.TR0.TRC0.XR9+290+tri+x&_nkw=R9+290+tri+x&_sacat=0

lol


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> http://www.ebay.com/sch/i.html?_odkw=R9+290&LH_BIN=1&_osacat=0&_from=R40&_trksid=p2045573.m570.l1313.TR0.TRC0.XR9+290+tri+x&_nkw=R9+290+tri+x&_sacat=0
> 
> lol


Nice score there.

I have e-mailed companies like that with outrageous prices and asked why they're so high. They said it's because they are out of stock at the moment, so they jack the price up so high that no one will buy, and when the cards come back in stock they readjust the price to normal.


----------



## wilflare

I ended up with a new ASUS R9 280X and a Dell U2412M.

Does DVI cable quality matter a lot?
I wonder if I should get a good-quality one or use the bundled one.


----------



## Butternut101

I'm not really sure if this has been said, but when I play League of Legends I used to get well over 200 FPS; now, with the R9 280X, I'm at 170... which doesn't really bother me much, because I'm still up there in other games.







But then again, I haven't really played League in a while, so I'm not really sure what frames I used to get.


----------



## jamponget9

I'm so, so itching for Mantle... I hope the guys at EA get their act together on BF4... It would be a disaster for AMD if BF4 fails just as Mantle arrives...


----------



## hamzta09

Question about the Sapphire 280X Dual-X version.

I've read that it has flickering problems and strange noises (not coil whine) caused by the fans?


----------



## DizzlePro

Does the Gigabyte R9 280X Rev 2.0 have unlocked voltage?

I wanted to get the Asus DCU2, but it's out of stock everywhere in my country.


----------



## hyp3rtraxx

Quote:


> Originally Posted by *DizzlePro*
> 
> Does the Gigabyte R9 280X Rev 2.0 have unlocked voltage?
> 
> I wanted to get the Asus DCU2, but it's out of stock everywhere in my country.


I've heard that all R9 280X cards have unlocked voltage! At least my XFX Radeon R9 280X 3GB (Tahiti XTL) does.


----------



## hyp3rtraxx

I have 2x XFX Radeon R9 280X 3GB (Tahiti XTL). Does anyone else have this card? I wonder what the (XTL) stands for, or what it is good for? Lower wattage, I think I've heard, maybe around 20 watts lower than the regular 280X? Or am I wrong?

And one more question: is anyone else mining with these cards? When mining, the temperature of the cards reaches about 90C; is that safe? I mean, it is a big cooler with 2x 100mm fans on it, so they shouldn't get that warm!

I've also heard that XFX uses bad thermal paste on a lot of their products; can anyone confirm that? Is it worth changing the original paste to Arctic MX-2?


----------



## F3ERS 2 ASH3S

I have it. Stock voltage for me was 1.075V. I also did a full teardown review of it.

I changed my paste to Antec Formula 7.


----------



## Archea47

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> I've heard that all R9 280X cards have unlocked voltage! At least my XFX Radeon R9 280X 3GB (Tahiti XTL) does.


The Gigabyte Rev2 280X, at least, does have its voltage locked via a value in the card's BIOS.

I know the user JoeDirt, who has been active in this topic, also has a Gigabyte 280X and has experimented with modifying the BIOS. I'll let him chime in with any results.


----------



## hyp3rtraxx

Quote:


> Originally Posted by *Archea47*
> 
> The Gigabyte Rev2 280x at least does have voltage locked via a value in the card's BIOS
> 
> I know the user JoeDirt who has been active in this topic also has a Gigabyte 280x and has experimented with modifying the BIOS. I'll let him chime in with any results


I could change the voltage and undervolt mine to 1.00V for mining in Afterburner anyway; still, they get to 90 Celsius even undervolted. I think that's way too high. For just ordinary use I think the temp is around 45C. Anyone know if 90C is a dangerous temp for them? And should I change the TIM? Damn XTL version and still so warm.

Anyway, now I'm off to do my first 3DMark 11 run with them; it'll be fun to compare against my previous SLI GTX 660 cards. I don't overclock these cards; at the moment I run them fully stock, voltage as well, because I uninstalled Afterburner.

Do you guys know of any good program to see the temps and overclock/downclock the cards? I don't want to use Afterburner! Does XFX have its own program for this like EVGA does?
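On the undervolting point: dynamic power scales roughly with voltage squared times frequency, which is why dropping from 1.2V toward 1.0V cuts heat noticeably even at the same clock. A rough sketch (the 250 W baseline and the voltages are illustrative figures, not measured specs):

```python
# Rough dynamic-power estimate: P scales approximately with V^2 * f.
# The 250 W baseline is an illustrative figure, not a measured card spec.
def scaled_power(p0_watts, v0, v1, f0=1.0, f1=1.0):
    return p0_watts * (v1 / v0) ** 2 * (f1 / f0)

# Undervolting 1.2 V -> 1.0 V at the same clock:
print(round(scaled_power(250, 1.2, 1.0), 1))  # 173.6
```

Roughly a 30% cut in dynamic power from that undervolt alone under this model, though leakage and VRM losses mean real cards won't follow the formula exactly.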


----------



## hyp3rtraxx

http://www.3dmark.com/3dm11/7788738

12587 score with the crossfired R9 280Xs in my sig. But now my CPU has downclocked itself to 4.2GHz? From 5GHz. No idea if my result is good; I only know my physics score isn't good, it should be around 9k!


----------



## tsm106

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> http://www.3dmark.com/3dm11/7788738
> 
> 12587 score with the crossfired R9 280Xs in my sig. But now my CPU has downclocked itself to 4.2GHz? From 5GHz. No idea if my result is good; I only know my physics score isn't good, it should be around 9k!


I would say you are having issues. A stock or mildly clocked Tahiti should be around 10-11K-ish for a single card. The GS (graphics score) doesn't look that terrible, but the rest of your config is taking a beating.


----------



## Durvelle27

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> http://www.3dmark.com/3dm11/7788738
> 
> 12587 score with the crossfired R9 280Xs in my sig. But now my CPU has downclocked itself to 4.2GHz? From 5GHz. No idea if my result is good; I only know my physics score isn't good, it should be around 9k!


seems low for 2x 280Xs


----------



## GoLDii3

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> http://www.3dmark.com/3dm11/7788738
> 
> 12587 score with the crossfired R9 280Xs in my sig. But now my CPU has downclocked itself to 4.2GHz? From 5GHz. No idea if my result is good; I only know my physics score isn't good, it should be around 9k!


Maybe it's downclocking due to high VRM temps, instability, or too much heat?
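For what it's worth, the thermal-throttle behavior being suggested here is roughly this shape; the 90C threshold and 100MHz-per-degree step are assumptions for illustration, not AMD's actual PowerTune algorithm:

```python
# Simplified thermal-throttle logic, roughly how boost limits behave.
# The 90 C threshold and 100 MHz step are assumptions for illustration.
def throttled_clock(base_mhz, temp_c, limit_c=90, step_mhz=100):
    if temp_c < limit_c:
        return base_mhz
    # Pull the clock back the further over the limit we are,
    # but never below half the base clock.
    drop = (temp_c - limit_c + 1) * step_mhz
    return max(base_mhz - drop, base_mhz // 2)

print(throttled_clock(1000, 85))  # 1000: under the limit, full clock
print(throttled_clock(1000, 92))  # 700: over the limit, clock pulled back
```

The takeaway: a card that sits at 90C while mining can silently lose clock speed, which would show up exactly as a lower-than-expected benchmark score.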


----------



## hyp3rtraxx

http://www.3dmark.com/3dm11/7788931

14332 score now, with my CPU at 5GHz, stable. But *** 3DMark isn't recognizing my CPU? And now I get around a 19k graphics score; I didn't know that was bad for two R9 280Xs? Can anyone else with crossfired 280Xs please run it so we can compare? And yes, the cards are completely stock. But as I said before, they reach 90C when mining, so maybe they reach that temp now and downclock, as someone said? Well, I'll have to wait until someone else runs 3DMark 11 with crossfired 280Xs so we can compare.

Here I have searched for other crossfired 280Xs running 3DMark:

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/gpuname/fs/P/AMD%20Radeon%20R9%20280X&gpuName=AMD%20Radeon%20R9%20280X

And *** my score beats them, so why do you think I have a bad score? Or am I missing something (probably am, lol)?

Meh, forget it; I saw they were running Firestrike. I guess I'll go download Firestrike, try it, and see what I get.

Meh, I don't get it. Comparing now, I see they are almost all overclocked anyway, so I think my score is normal for a completely stock crossfired rig?


----------



## Durvelle27

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> http://www.3dmark.com/3dm11/7788931
> 
> 14332 score now, with my CPU at 5GHz, stable. But *** 3DMark isn't recognizing my CPU? And now I get around a 19k graphics score; I didn't know that was bad for two R9 280Xs? Can anyone else with crossfired 280Xs please run it so we can compare? And yes, the cards are completely stock. But as I said before, they reach 90C when mining, so maybe they reach that temp now and downclock, as someone said? Well, I'll have to wait until someone else runs 3DMark 11 with crossfired 280Xs so we can compare.
> 
> Here I have searched for other crossfired 280Xs running 3DMark:
> 
> http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/gpuname/fs/P/AMD%20Radeon%20R9%20280X&gpuName=AMD%20Radeon%20R9%20280X
> 
> And *** my score beats them, so why do you think I have a bad score? Or am I missing something (probably am, lol)?


That's better but your Physics is terrible for 5GHz. I get that at 4.8GHz

Can do better


----------



## tsm106

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> http://www.3dmark.com/3dm11/7788931
> 
> 14332 score now, with my CPU at 5GHz, stable. But *** 3DMark isn't recognizing my CPU? And now I get around a 19k graphics score; I didn't know that was bad for two R9 280Xs? Can anyone else with crossfired 280Xs please run it so we can compare? And yes, the cards are completely stock. But as I said before, they reach 90C when mining, so maybe they reach that temp now and downclock, as someone said? Well, I'll have to wait until someone else runs 3DMark 11 with crossfired 280Xs so we can compare.
> 
> Here I have searched for other crossfired 280Xs running 3DMark:
> 
> http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/gpuname/fs/P/AMD%20Radeon%20R9%20280X&gpuName=AMD%20Radeon%20R9%20280X
> 
> And *** my score beats them, so why do you think I have a bad score? Or am I missing something (probably am, lol)?
> 
> Meh, forget it; I saw they were running Firestrike. I guess I'll go download Firestrike, try it, and see what I get.
> 
> Meh, I don't get it. Comparing now, I see they are almost all overclocked anyway, so I think my score is normal for a completely stock crossfired rig?












You posted a 3dmark 11 run, and compared it to Firestrike. 3dm11 runs are in the 18K-23K range in pscore and 25K range for gs.


----------



## hamzta09

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You posted a 3dmark 11 run, and compared it to Firestrike. 3dm11 runs are in the 18K-23K range in pscore and 25K range for gs.


http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/gpu/3dm11/X/897/?minScore=0&gpuName=AMD Radeon R9 280X
Looks like 10k to me and not 18-23k.

Heres extreme.
http://www.3dmark.com/3dm11/7506134
6.7k


----------



## hyp3rtraxx

3DMark 11, 19400 graphics score; isn't that OK for a stock Tahiti? Or is there anything wrong with my cards? I know I have a low physics score for 5GHz and can't understand why. Hmm, maybe parked cores? Anyone got any idea?

http://valid.canardpc.com/c9vhm5

http://www.techpowerup.com/gpuz/d9xw/

Anyone got any ideas what's going on?

And what's this "Tahiti XTL"? The "XTL" part, what does it do?


----------



## tsm106

Quote:


> Originally Posted by *hamzta09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You posted a 3dmark 11 run, and compared it to Firestrike. 3dm11 runs are in the 18K-23K range in pscore and 25K range for gs.
> 
> 
> 
> http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/gpu/3dm11/X/897/?minScore=0&gpuName=AMD Radeon R9 280X
> Looks like 10k to me and not 18-23k.
> 
> Heres extreme.
> http://www.3dmark.com/3dm11/7506134
> 6.7k











Who are you replying to? You linked 3dm11 EXTREME. The OP is asking about 3dm11 Performance.

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> 3DMark 11, 19400 graphics score; isn't that OK for a stock Tahiti? Or is there anything wrong with my cards? I know I have a low physics score for 5GHz and can't understand why. Hmm, maybe parked cores? Anyone got any idea?
> 
> http://valid.canardpc.com/c9vhm5
> 
> http://www.techpowerup.com/gpuz/d9xw/
> 
> Anyone got any ideas what's going on?
> 
> And what's this "Tahiti XTL"? The "XTL" part, what does it do?


Yeah, your GS looks OK. As far as the app goes, is it updated to the current one?

XTL is just the designation for the latest stepping.


----------



## hyp3rtraxx

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Who are you replying to? You linked 3dm11 EXTREME. The OP is asking about 3dm11 Performance.
> Yea, you're gs looks ok. As far as the app goes, is it updated to the current one?
> 
> XTL is just the designation for the lastest stepping.


Thanks for your answer and for explaining "XTL". What do you mean by "As far as the app goes, is it updated to the current one?" Sorry, I didn't understand that part :/


----------



## tsm106

Futuremark routinely updates its SystemInfo app. When you start it up, does it ask to update? Also, your chipset drivers and other base system drivers need to be set up as well.


----------



## Devildog83

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> http://www.3dmark.com/3dm11/7788738
> 
> 12587 score with the crossfired R9 280Xs in my sig. But now my CPU has downclocked itself to 4.2GHz? From 5GHz. No idea if my result is good; I only know my physics score isn't good, it should be around 9k!


There is something wrong there; is it supposed to downclock itself? The graphics score is low too. I get 18,000+ with my R9 270X and 7870, but the clocks are somewhat high. By the way, do you have graphics OverDrive enabled in CCC? If so, try disabling it.


----------



## hamzta09

Quote:


> Originally Posted by *tsm106*
> 
> Who are you replying to? You linked 3dm11 EXTREME. The OP is asking about 3dm11 Performance.


Why do you run Performance and not Extreme?


----------



## tsm106

Quote:


> Originally Posted by *hamzta09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Who are you replying to? You linked 3dm11 EXTREME. The OP is asking about 3dm11 Performance.
> 
> 
> 
> Why do *you* run Performance and not Extreme?











Who? I'm not running anything.


----------



## hamzta09

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Who? I'm not running anything.


"You" != you specifically.

10k seems to be fine for Performance.
7k seems fine for Extreme.

The guy got 13k.

Maybe 3DMark doesn't scale well with CrossFire?!


----------



## Lisjak

Hey guys, I have a small problem with my GPU and was wondering if anyone has a solution. I wanted to update my GPU's BIOS, but when I click "find update" I get this error:



I then select the new bios version and click update and when installing I get this error.



Anyone know a fix for this?


----------



## NinjaToast

^ If I remember correctly, that BIOS version isn't for the Asus 280X; it was for the Asus 270X, but for some reason 280X users could see it and were told it was for their cards. Basically, Asus goofed, and GPU Tweak will likely crash every time you try to update.


----------



## Devildog83

Quote:


> Originally Posted by *Lisjak*
> 
> Hey guys, I have a small problem with my gpu and was wondering if anyone has a solution to this. I wanted to update my gpu but when I click find update I get this error:
> 
> 
> 
> I then select the new bios version and click update and when installing I get this error.
> 
> 
> 
> Anyone know a fix for this?


Uninstall your drivers and GPU Tweak, do a driver sweep *after restarting* (something like Driver Cleaner Pro will work), then reinstall the new drivers and GPU Tweak and restart. You could also try Afterburner, which I like better.


----------



## gnemelf

http://www.3dmark.com/3dm/2149792
Well, that's my new high... at this point my CPU is the bottleneck, IMO.


----------



## Sgt Bilko

280X DirectCUII V2 TOP's are back in stock at PCCG for those in Aus.

http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1559&products_id=25323


----------



## Lisjak

Quote:


> Originally Posted by *NinjaToast*
> 
> ^ If I remember correctly that bios version isn't for the Asus 280x, it was for the Asus 270x but for some reason 280x users could see it and be told it's for their cards. Basically asus goofed and GPUTweak will likely crash everytime you try to update


That's weird. If it really is this problem then I don't mind it.








Quote:


> Originally Posted by *Devildog83*
> 
> Uninstall your drivers and GPU Tweak, do a driver sweep *after restarting* , something like 'driver'cleaner pro' will work, then reinstall new drivers and GPU Tweak and restart. You could also try "Afterburner" which I like better.


I doubt it will help because the latest driver sweep was done less than 2 weeks ago. But I might try it anyway. Thank you.


----------



## [CyGnus]

Are you guys on 13.12 or 13.11 b9.5? I never tried 13.12; how is it?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *[CyGnus]*
> 
> Are you guys on 13.12 or 13.11 b9.5? I never tried 13.12; how is it?


I am on 13.12; it is OK. I see flickering until Windows loads; after that, not bad. Still not great drivers though. TBH, Nvidia had better drivers, coming from what I was using before.


----------



## hyp3rtraxx

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> 3DMark 11, 19400 graphics score; isn't that OK for a stock Tahiti? Or is there anything wrong with my cards? I know I have a low physics score for 5GHz and can't understand why. Hmm, maybe parked cores? Anyone got any idea?
> 
> http://valid.canardpc.com/c9vhm5
> 
> http://www.techpowerup.com/gpuz/d9xw/
> 
> Anyone got any ideas what's going on?
> 
> And what's this "Tahiti XTL"? The "XTL" part, what does it do?


Can anyone please help me and look at my validations? Does it seem like anything's wrong? And is a 19400 graphics score really bad for my crossfired 280Xs? Remember, they are all stock!


----------



## hyp3rtraxx

Quote:


> Originally Posted by *Durvelle27*
> 
> That's better but your Physics is terrible for 5GHz. I get that at 4.8GHz
> 
> Can do better


What do you think might be wrong, since I get such a bad Physics score? I have overclocked with the multiplier and FSB combined! What could be the problem? I'm clueless and very annoyed, since I consider that I know a little bit about overclocking; I'm far from the best but not a total noob either, and I have no clue what's going on









http://valid.canardpc.com/k2hl2e


----------



## hamzta09

Since I have to replace my power supply with the new cards, do I need to reset my BIOS to default clocks (CPU) and such?
I don't need to format though, right?

They finally packed the cards.... 3 days later, so I won't have them until next week -.-


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Since I have to replace my power supply with the new cards, do I need to reset my BIOS to default clocks (CPU) and such?
> I don't need to format though, right?
> 
> They finally packed the cards.... 3 days later, so I won't have them until next week -.-


Your BIOS should be fine as-is. Also, no need to reload Windows, and hopefully I'll have your card soon


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> What do you think might be wrong, since I get such a bad Physics score? I have overclocked with the multiplier and FSB combined! What could be the problem? I'm clueless and very annoyed, since I consider that I know a little bit about overclocking; I'm far from the best but not a total noob either, and I have no clue what's going on
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://valid.canardpc.com/k2hl2e


When you check your RAM, see if bumping the CPU-NB helps


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> Are you guys on 13.12 or 13.11 b9.5? I never tried 13.12; how are they?


I am on 13.12 and it's fine for me.


----------



## hamzta09

31 minute mark.


----------



## Durvelle27

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> What do you think might be wrong, since I get such a bad Physics score? I have overclocked with the multiplier and FSB combined! What could be the problem? I'm clueless and very annoyed, since I consider that I know a little bit about overclocking; I'm far from the best but not a total noob either, and I have no clue what's going on
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://valid.canardpc.com/k2hl2e


Unstable maybe


----------



## Matheus

Hi, I'm Brazilian.
I bought an ASUS V2 R9 280x DC2T exactly this model https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5V2/
My card is at stock; when running Heaven the temp is 78°C, with an ambient temperature of 30°C. Is this normal?
Searching the internet I found another User with problems: http://forums.hexus.net/help-quick-relief-tech-headaches/304401-r9-280x-high-temps.html
Sorry for my bad english.


----------



## Devildog83

Quote:


> Originally Posted by *Matheus*
> 
> Hi, I'm Brazilian.
> I bought an ASUS V2 R9 280x DC2T exactly this model https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5V2/
> My card is at stock; when running Heaven the temp is 78°C, with an ambient temperature of 30°C. Is this normal?
> Searching the internet I found another User with problems: http://forums.hexus.net/help-quick-relief-tech-headaches/304401-r9-280x-high-temps.html
> Sorry for my bad english.


I don't think 78°C is horrible if you are just using auto for the fans; I have been to 80°C on my top card under Valley. I always run my fans at 80%+ when benching because that's when you will get the most heat. I would make a user profile that ramps the fans up when you get to 70°C so the card will stay cooler.
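For anyone setting this up, the ramped profile described above is just a piecewise-linear curve mapping GPU temperature to fan speed, the same kind of curve you draw in MSI Afterburner or GPU Tweak. Here's a minimal sketch of the idea; the curve points are illustrative assumptions, not anyone's actual profile:

```python
# Illustrative fan curve: ramp hard once the card passes 70C.
# (temp C, fan %) points are assumed values for demonstration only.
CURVE = [(40, 30), (70, 50), (80, 90), (90, 100)]

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan % between the curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # interpolate between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pinned at max beyond the last point
```

The steep 70-80°C segment is the "ramp up at 70" part: small temperature increases above that threshold produce large fan-speed increases.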


----------



## JCH979

I had the day off so finally got around to getting my 24/7 clocks set, 1200/1666









@[CyGnus] I was experiencing some "flickering" similar to what F3ERS 2 ASH3S describes, except at random; I just noticed it a few times in Windows but had no issues in games. That was with 9.5 BETA though.

I switched to 13.12 just a few days ago and I haven't experienced anything abnormal in Windows or in games so far.

P.S. GPUTweak is starting to annoy me..


----------



## Devildog83

Quote:


> Originally Posted by *JCH979*
> 
> I had the day off so finally got around to getting my 24/7 clocks set, 1200/1666
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @[CyGnus] I was experiencing some "flickering" similar to what F3ERS 2 ASH3S describes, except at random; I just noticed it a few times in Windows but had no issues in games. That was with 9.5 BETA though.
> 
> I switched to 13.12 just a few days ago and I haven't experienced anything abnormal in Windows or in games so far.
> 
> P.S. GPUTweak is starting to annoy me..


Overclock updated.


----------



## Arizonian

*Announcement*

Seeking someone to take on thread starter role at the *xXCrossXFire ClubXx --Because one's not enough *

Anyone interested, please PM me.


----------



## Pis

Quote:


> Originally Posted by *[CyGnus]*
> 
> Are you guys on 13.12 or 13.11 b9.5? I never tried 13.12; how are they?


I am on 13.12 and it's fine for me too.

http://unitedrighunter.blogspot.com/2014/01/sapphire-toxic-r9-280x-benchmarks.html


----------



## G2O415

Not sure if it's the 13.12 drivers or my card, but I seem to get some artifacts in League of Legends. This is the only game I get artifacts in, and it's not during an entire match; I randomly get a green/purple flash in random areas of my screen for split seconds.


----------



## [CyGnus]

Thanks for the opinions guys


----------



## hoevito

Hey peoples, due to the issues I was having with Crossfire, I decided to sell my remaining R9 280x (Matrix Platinum) and go in a different direction. The Crossfire setup I was running with the Matrix and Sapphire Toxic was just throwing off way too much heat, and even with a full pci slot between cards, 4 200mm fans and 5 140mm fans circulating air around my case, they just weren't enough to keep the cards cool after more than a few minutes of heavy gaming.

I originally had an Asus DC2T in my crossfire setup but it couldn't clock any higher than 1150mhz without artifacting badly, and the Toxic could only hit 1200mhz. The Matrix on the other hand could hit 1240mhz semi reliably but I just wasn't satisfied with the performance I was getting out of the setup. I also was having driver issues seemingly constantly, and the Xstar monitor I have is capable of 110hz but the stupid drivers would cause issues with flash, videos, and basically anything other than games for some reason when the monitor was overclocked. No set of drivers I tried (and I tried the last 5 or 6 betas AND the WHQL set) seemed to help with my setup.

I really didn't want to get rid of the cards, but Newegg screwed me when I attempted to return and exchange my Toxic: they refunded my money and claimed not to have any cards in stock for exchange. Frustratingly, less than 12 hours later they had cards in stock but marked them up to $469 and expected me to pay the new amount when I originally got it for $341. Since the miners have inflated AMD prices so badly, I took my money and got 2 Lightning 780's instead, and the experience has been quite a bit smoother so far. The best part is, each card was only $499 brand new









Here's the old setup vs. new...

The scores with the old setup were with both 280x's overclocked as high as I could run them (both at 1200/1700mhz), and the 780's are NOT overclocked at all and still running out of the box specs...

Started with this...



Stepped up to this...




Now we here...





In my opinion, two 280x's are a great value....AT MSRP lol. Notice the Valley scores: the 280's actually have a significantly higher minimum framerate than the 780's, and the max is just 2 fps lower. It's only the average fps where the 780's open up a lead; other than that the 280x's performed fantastically (when they worked). I will miss them dearly, as they represented my first attempt at running multiple GPUs and this is the first rig I've ever built


----------



## Jawwwwsh

Proud new owner of an XFX R9 270!!

http://www.techpowerup.com/gpuz/h4pay/

One question though, in my current sig rig, would crossfiring another 270 (or possibly a 270X if that is doable?) be worth it?


----------



## neurotix

Devildog would probably tell you it's worth it. He has a 7870 Crossfired with a 270X. He gets about the performance of a single R9 290(X).


----------



## Jawwwwsh

Oh wow, I may look into it then! I've never X-fired in my life so I have no experience of it! How compatible are other models of ATI card (for example, a 270X and a 270 in X-fire), and should you try to stick to the same manufacturer etc.?


----------



## Archea47

Quote:


> Originally Posted by *Jawwwwsh*
> 
> Oh wow, I may look into it then! I've never X-fired in my life so I have no experience of it! How compatible are other models of ATI card (for example, a 270X and a 270 in X-fire), and should you try to stick to the same manufacturer etc.?


You might stick to the same manufacturer for aesthetics or vendor management benefits but it's not necessary

There should be no issues pairing a 270X with your 270; the faster card will downclock to the speed of the slower card. So really it doesn't make sense to buy a card better than your current one unless you think it will have better cooling


----------



## Jawwwwsh

Quote:


> Originally Posted by *Archea47*
> 
> You might stick to the same manufacturer for aesthetics or vendor management benefits but it's not necessary
> 
> There should be no issues pairing a 270X with your 270; the faster card will downclock to the speed of the slower card. So really it doesn't make sense to buy a card better than your current one unless you think it will have better cooling


Ah ok the downclocking information is crucial, as there is no point in me purchasing a 270X then! I think I'll stick to just purchasing a 2nd XFX R9 270 as it does help to keep things simple!

Take a +rep good sir!

On the downclocking note, I presume if I purchased a 4GB version of the 270X, it wouldn't disable 2GB of the RAM in the same manner though, right?


----------



## neurotix

The 270 and 270X are the exact same card, just with different clock speeds. They have the same number of shaders; the 270 just comes clocked slower. Even with a 270X, the stock speed varies depending on the manufacturer. I believe the Curacao XT's (270X) base speed is 1000MHz. My Sapphire Vapor-X came stock at 1100MHz, but that doesn't matter since I'm overclocking it anyway. A 270 will come clocked slower and is probably binned lower, so you might get less impressive overclocks with a 270 compared to a 270X. However, this probably doesn't matter much. If budget is an issue then go with the 270.

If you have a 4GB card and a 2GB card then they will only use a 2GB frame buffer in Crossfire. Don't quote me on that, but someone else asked recently and other people told them that to use all the RAM, both cards need to have the same amount.


----------



## Jawwwwsh

I did not realise that the cards were the same! Thanks, I'll bear it all in mind. Most likely going to grab a 2nd 270 and some new HDDs when February's pay comes around!


----------



## neurotix

Quote:


> Originally Posted by *Jawwwwsh*
> 
> I did not realise that the cards were the same! Thanks, I'll bear it all in mind. Most likely going to grab a 2nd 270 and some new HDDs when February's pay comes around!


They are pretty much the same card, yeah.

What resolution do you game at and with how many monitors? If it's at 1080p on a single display, a single 270 should be able to run everything except for Crysis 3/Far Cry 3 and Battlefield 4 on Ultra with high fps.

Still, more than one card is always better assuming you have the power supply for it and the room in your case.

Personally, I stick with one very powerful card because my PSU is only 650w with 2 PCI-E connectors, so I'm limited with that. I don't have the money to get two cards and a better PSU and I'm too lazy to tear my whole system apart to install the new PSU. I'd have to take my motherboard out and run the new wires behind it, and my drive cage is a mess.

I'm getting a R9 290 Tri-X in the next few days, so that will have to do for my setup for a few years.


----------



## Jawwwwsh

At the moment, honestly, only a single (albeit 42") 1080p monitor, but this machine is rather a work in progress, and I'm planning on twin 24-26" monitors in the near future. I'd rather the PC be ready for the monitors when they arrive than have the monitors arrive and then need to get a 2nd 270!
If you fancy sending that Sapphire 270X of yours over when you're done with it, that'd be just dandy








PSU-wise, I've got a 620W modular PSU by Enermax (the newer one broke ;( ), and when I say the current PSU is old, I mean it's got a "P4 Prescott Ready" sticker on the side of it.


----------



## neurotix

LOL P4 Prescott ready. That thing must be from 2002.

I *have* seriously considered selling this 270X on the marketplace and asking more than I paid for it because of the high ASIC. When I put my 290 in, this one is going into another rig my brother uses for gaming. He's on a 4670 right now, and as of late I think the fan is wearing out. He was playing Skyrim the other day and the card's core was at 98°C. I made him turn it off, and I'm surprised the old 4670 didn't explode. *poof* So I think I'll hang on to my 270X









Are you sure you only want to do 2 monitors? It depends on the games you play, but with only two monitors the bezels are right in the center of the playfield. That makes it impossible to aim in FPS games. Other games with the character in the center of the screen would be hard too, stuff like Torchlight or Diablo III.


----------



## taem

Quote:


> Originally Posted by *hoevito*
> 
> Hey peoples, due to the issues I was having with Crossfire, I decided to sell my remaining R9 280x (Matrix Platinum) and go in a different direction. The Crossfire setup I was running with the Matrix and Sapphire Toxic was just throwing off way too much heat, and even with a full pci slot between cards, 4 200mm fans and 5 140mm fans circulating air around my case, they just weren't enough to keep the cards cool after more than a few minutes of heavy gaming:


Thanks for that post. I returned my 280x for various reasons and preordered an MSI 290 Gaming, but I'm thinking more and more I should go with a reference 780 just for this reason. Surprised you got Lightnings instead of reference if heat is the issue though. The Lightning is a great card, but even if you can run them cool it's still pretty much the same amount of hot air in your case. Even a single 280x was killing my CPU/mobo temps.


----------



## Devildog83

Quote:


> Originally Posted by *Jawwwwsh*
> 
> I did not realise that the cards were the same! Thanks, I'll bear it all in mind. Most likely going to grab a 2nd 270 and some new HDDs when February's pay comes around!


The 270 is basically a 7850 with a few minor upgrades; 2 of those will do nicely, I am sure. The 270x is a 7870, which has a higher clock speed capability than the 7850's or 270's. I don't get double the performance but I get very close to it, so X-Fire to me has been worth every penny. If you look at the cost of a 290, I saved some money. 2 x 7850's should get slightly better performance than a 280x, I would think.


----------



## Devildog83

Jawwwwsh has been added. Welcome !!


----------



## hoevito

Quote:


> Originally Posted by *taem*
> 
> Thanks for that post. I returned my 280x for various reasons and preordered an MSI 290 Gaming, but I'm thinking more and more I should go with a reference 780 just for this reason. Surprised you got Lightnings instead of reference if heat is the issue though. The Lightning is a great card, but even if you can run them cool it's still pretty much the same amount of hot air in your case. Even a single 280x was killing my CPU/mobo temps.


I was very close to pulling the trigger on two reference cards myself, but considering I snagged the Lightnings for the same price ($499) I went ahead and got them. Surprisingly, even as close as they are in my case, they both idle in the low 30's and the top card has only gotten up to 78 degrees at stock voltages. Now I don't expect them to stay at those temps once I start overclocking, but as strong as they've been so far I might not need to! I may also venture into water cooling, so I've set myself up nicely if I ever go that route. All in all they run MUCH cooler than the 280x's, hands down


----------



## kersoz2003

I bought a new Sapphire Vapor-X OC R9 280X and overclocked it to 1240/1800.

However, my EXTREME HD Valley benchmark results are low, I guess:



So what could be the reason? Is it really low for this overclock?


----------



## Devildog83

Quote:


> Originally Posted by *kersoz2003*
> 
> I bought a new Sapphire Vapor-X OC R9 280X and overclocked it to 1240/1800.
> 
> However, my EXTREME HD Valley benchmark results are low, I guess:
> 
> 
> 
> So what could be the reason? Is it really low for this overclock?


RAM speed and CPU clocks affect Valley. I would expect mid-40's for a 280x; try to get to 4.7 GHz if you can.


----------



## Devildog83

kersoz2003 has also been added. Welcome !!


----------



## kersoz2003

Quote:


> Originally Posted by *Devildog83*
> 
> RAM speed and CPU clocks affect Valley, I would expect in the mid 40's for a 280x, try to get to 4.7 Ghz if you can.


A friend of mine with a 5.2 GHz Intel CPU and an HD 7950 @ 1225/1750 gets more than me:



So why?


----------



## kersoz2003

Oh OK, after completely wiping my drivers and reinstalling them I got very good results. They are coming now









edit:

Here are my new results:


----------



## hamzta09

Quote:


> Originally Posted by *kersoz2003*
> 
> A friend of mine with a 5.2 GHz Intel CPU and an HD 7950 @ 1225/1750 gets more than me:
> 
> 
> 
> So why?


He has a better CPU.


----------



## kersoz2003

Quote:


> Originally Posted by *hamzta09*
> 
> He has a better CPU.


But as you can see from my latest post, I win over him


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> He has a better CPU.


NOT!!!! a 2500k is not better than an FX 8350, not even close.


----------



## Falconx50

Well, I was asked to post some numbers, so here is what I have:
InWin BUC case
GA-990XA-UD3 motherboard
AMD FX 6100 @ 4.0 GHz
G.Skill Sniper memory @ 1600 MHz, 16GB
Cooler Master 750W power supply (Bronze ed.)
PowerColor R9 270x "Devil" @ 1180/1400 (stock)

Benchmarks:
MSI - Kombuster:

Heaven: (Ultra)


----------



## neurotix

Your image isn't showing up, and I think Devildog83 was asking for Valley, not Heaven.


----------



## hamzta09

Quote:


> Originally Posted by *Devildog83*
> 
> NOT!!!! a 2500k is not better than an FX 8350, not even close.


What are you talking about?

Intel > AMD in IPC.
Doesn't matter if you've got more cores.

http://cdn.overclock.net/e/e6/e62d7182_bit-tech.png
http://cdn.overclock.net/c/ca/cab45429_51141.png


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> What are you talking about?
> 
> Intel > AMD in IPC.
> Doesnt matter if you got more cores.
> 
> http://cdn.overclock.net/e/e6/e62d7182_bit-tech.png
> http://cdn.overclock.net/c/ca/cab45429_51141.png


The 2500k is faster in single-threaded applications, but overall the 8350 is the better CPU.


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The 2500k is faster in single-threaded applications, but overall the 8350 is the better CPU.


You really need a beefy OC on an AMD CPU in order to get near Intel CPUs in anything but synthetics.
In terms of rendering videos and other things: QuickSync.

And tell me what games benefit from 4+ cores though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> You really need a beefy OC on an AMD CPU in order to get near intel CPUs in anything but synthetics.
> In terms of rendering videos and other things: QuickSync.
> 
> And tell me what games benefit from 4+ cores though.


No games that I know of will run faster on 8 cores rather than 4 but then again we expect Mantle to change the landscape again by removing the CPU bottleneck.


----------



## resetes12

Hey, anybody know if XT2 GPUs and XTL GPUs are compatible in Crossfire?

Another question: is the VTX3D 280x good for OC? Does it have the XTL GPU?

Thanks!


----------



## Archea47

Quote:


> Originally Posted by *resetes12*
> 
> Hey, anybody know if XT2 GPUs and XTL GPUs are compatible with crossfire?
> 
> Another question, is VTX3D 280x good for OC? It has the XTL GPU?
> 
> Thanks!


Yes the Tahiti XT2 and XTL (280x) are both crossfire capable. I have two XTLs in X-fire with no issues. There should be no problem pairing an XT2 with an XTL either


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> What are you talking about?
> 
> Intel > AMD in IPC.
> Doesn't matter if you got more cores.
> 
> http://cdn.overclock.net/e/e6/e62d7182_bit-tech.png
> http://cdn.overclock.net/c/ca/cab45429_51141.png


There is no sense in even having this stupid argument. If you do a fair comparison across many games and many resolutions, which I have seen, the 8350 comes out better than the 3570k and just under the 3770k. It's a fact, it's not disputable; 4 cores, 8 cores, it doesn't matter. You can't show me screenshots of a specific game at a specific resolution with different clocks and tell me that that's what I should believe. If you said the 3770k is better I could see it, but the 2500k is not even close.


----------



## Devildog83

Quote:


> Originally Posted by *Falconx50*
> 
> Well, I was asked to post some numbers, so here is what I have:
> InWin BUC case
> GA-990XA-UD3 motherboard
> AMD FX 6100 @ 4.0 GHz
> G.Skill Sniper memory @ 1600 MHz, 16GB
> Cooler Master 750W power supply (Bronze ed.)
> PowerColor R9 270x "Devil" @ 1180/1400 (stock)
> 
> Benchmarks:
> MSI - Kombuster:
> 
> Heaven: (Ultra)
> 

OP updated. Yes, I was looking for Valley; I just thought that if I could get enough scores I would post them in the OP for all to look over. I will be setting that up soon. Thanks.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> You really need a beefy OC on an AMD CPU in order to get near intel CPUs in anything but synthetics.
> In terms of rendering videos and other things: QuickSync.
> 
> And tell me what games benefit from 4+ cores though.


Crysis 3
Battlefield 3 and 4
Metro 2033
Metro Last light

^just to name a few


----------



## hamzta09

Quote:


> Originally Posted by *Devildog83*
> 
> There is no sense in even having this stupid argument. If you do a fair comparison across many games and many resolutions, which I have seen, the 8350 comes out better than the 3570k and just under the 3770k. It's a fact, it's not disputable; 4 cores, 8 cores, it doesn't matter. You can't show me screenshots of a specific game at a specific resolution with different clocks and tell me that that's what I should believe. If you said the 3770k is better I could see it, but the 2500k is not even close.


Sounds like someone is very fond of his highly overpriced CPU.

Why is a 3770K better than a 2500K?

http://www.dslreports.com/forum/r28517633-I5-2500K-vs-I7-4770K-Gaming-Benchmarks


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Sounds like someone is very fond of his highly overpriced CPU.
> 
> Why is a 3770K better than a 2500K?
> 
> http://www.dslreports.com/forum/r28517633-I5-2500K-vs-I7-4770K-Gaming-Benchmarks


----------



## hamzta09

Quote:


> Originally Posted by *F3ERS 2 ASH3S*


Grab your popcorn then.
He insists that Intel < AMD.
Which is rather interesting.

AMD > Nvidia though, I agree with that, in everything but ShadowPlay.

I get my cards on Monday; we'll see if I encounter any problems. I've read the drivers have some issues with Crossfire.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Grab your popcorn then.
> He insists that Intel < AMD.
> Which is rather interesting.
> 
> AMD > Nvidia though, I agree with that, in everything but ShadowPlay.
> 
> I get my cards on Monday; we'll see if I encounter any problems. I've read the drivers have some issues with Crossfire.


13.12 should work out for you; they cleaned up a lot in that set.

Good news... I corrupted my OS, and after a reinstall I get better scores, woot! http://www.3dmark.com/3dm11/7805852


----------



## kersoz2003

Quote:


> Originally Posted by *hamzta09*
> 
> Sounds like someone is very fond of his highly overpriced CPU.
> 
> Why is a 3770K better than a 2500K?
> 
> http://www.dslreports.com/forum/r28517633-I5-2500K-vs-I7-4770K-Gaming-Benchmarks


There are a lot of games that use 4 or more cores, and BF4 uses 8 cores; most games are going to use 8 cores. So the FX 8350 is better than both.


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> Sounds like someone is very fond of his highly overpriced CPU.
> 
> Why is a 3770K better than a 2500K?
> 
> http://www.dslreports.com/forum/r28517633-I5-2500K-vs-I7-4770K-Gaming-Benchmarks


OK, this is a graphics card forum, not a place for Intel fanboys. By the way, the 2500k is more expensive than an 8350 and doesn't perform as well. Please do not come to this thread and tell folks that their scores are low due to the CPU they use when it's not true.


----------



## jamponget9

Mind me, because I haven't tried this yet... can an A10 FM2 Llano APU Crossfire or perform in tandem with an R9 GPU?


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> Grab your popcorn then.
> He insists that Intel < AMD.
> Which is rather interesting.
> 
> AMD > Nvidia though, I agree with that, in everything but ShadowPlay.
> 
> I get my cards on Monday; we'll see if I encounter any problems. I've read the drivers have some issues with Crossfire.


No I don't. I insist that as an overall processor the 8350 is better than a 2500k; of this there is no doubt. Anything beyond that all depends on what you do with your system.


----------



## Devildog83

Quote:


> Originally Posted by *jamponget9*
> 
> Mind me, because I haven't tried this yet... can an A10 FM2 Llano APU Crossfire or perform in tandem with an R9 GPU?


They do not X-fire; if you use them, the GPU would run solo, just like it would with a regular CPU.


----------



## hamzta09

Quote:


> Originally Posted by *kersoz2003*
> 
> There are a lot of games that use 4 or more cores, and BF4 uses 8 cores; most games are going to use 8 cores. So the FX 8350 is better than both.


Rofl.

2 fps difference, seriously? AND it runs at 4GHz.

AND no, BF4 does not use 8 cores, as evidenced by your image.


----------



## tsm106

Lmao, is he still arguing for the Vishera?


----------



## Delphi

Add me up: Radeon R9 270X IceQ X2 2GB in Crossfire. Although I am not sure everything looks correct; I've been out of the game for a while, so maybe you guys can give me some pointers.


----------



## cremelo

Well, I needed to change my OC settings to 1100MHz on my Asus R9 280x DCU2 Top








I was having problems in GTA IV and Infestation: Survivor Stories. I believe it was the driver, but I didn't have these problems before with the beta....
Here is my "new" score =/
(Yes, my CPU is at stock)


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> Rofl.
> 
> 2 fps difference, seriously? AND it runs at 4GHz.
> 
> AND no, BF4 does not use 8 cores, as evidenced by your image.


If you want to continue to argue this, go to the Vishera thread and tell those guys; this is not the place for it.

By the way, it looks like 11 FPS, not that it even matters.


----------



## GTR Mclaren

I strongly believe that a good 3570k at 4.2GHz or more is a better gaming CPU than any AMD one, and the cost is almost the same, at least here in my country.

I would really like to get an AMD, but the power consumption is too high and the AM3+ mobos are behind in some specs


----------



## sterob

Hi, does anyone have experience with both the Sapphire and MSI R9 280x? How are the temperature and fan noise? Can their voltage be adjusted? I wanted to go for the Asus R9 280x because of its superior cooling and quiet fan, but it is out of stock everywhere, so I have to make do with either MSI or Sapphire.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *sterob*
> 
> Hi, does anyone have experience with both the Sapphire and MSI R9 280x? How are the temperature and fan noise? Can their voltage be adjusted? I wanted to go for the Asus R9 280x because of its superior cooling and quiet fan, but it is out of stock everywhere, so I have to make do with either MSI or Sapphire.


For temps and fan noise I would have to say the XFX 280x is a great buy: 70 to 73°C max at 100% fan, and I still hear my case fans over the card fans


----------



## madorax

Quote:


> Originally Posted by *sterob*
> 
> Hi, does anyone have experiences witb both sapphire and msi r9 280x? How is the tempurature, and fan noise? Can their voltage be adjusted? I wanted to go for the asus r9 280x because of its supperior cooling and quiet fan. However it is oos everywhere so i have to make do with either msi or sapphire.


My MSI 280X runs at about 70-72°C @ 100% fan in BF4 online (full HD, maxed out, about 1 hour), and runs 75°C flat after mining with GUIMiner for about 5 hours. The fan noise is still acceptable; my case fan (Aerocool Shark) is louder than the MSI fan. Its voltage can be adjusted up to 1.3v with Afterburner or Trixx. The Sapphire one is voltage-locked and its fan is a bit louder than the MSI's, AFAIK.


----------



## Z0K1

280x Toxic










1250/1660, 0% PT, fan 50%, 65-66°C
5th run at 1920x1080 *native for my monitor* with everything maxed: 8x AA, vsync, shadows, etc.


1250/1660, 0% PT, fan 50%, 65-66°C
5th run at 1600x900
settings

results


----------



## Archea47

Some people complain about the Toxic color scheme, but it fits your case perfectly, Z0K1!







Congrats on the nice clock


----------



## benjamen50

Can my graphics card get damaged by litecoin mining? I'm using a Gigabyte AMD R9 270X Windforce 3X Edition, overclocked at stock voltage, 65°C max. I think it's my low-quality-ish power supply that is causing it to crash with driver errors and possibly also artifacts.


----------



## hamzta09

Might have a problem.

My mobo has only an x16 and an x8 slot.

1 x PCI Express x16 slot, running at x16 (PCIEX16)
1 x PCI Express x16 slot, running at x8 (PCIEX8)

Is this a problem for X-fire?


----------



## sterob

Quote:


> Originally Posted by *madorax*
> 
> My MSI 280X runs at about 70-72°C @ 100% fan in BF4 online (full HD, maxed out, about 1 hour), and runs 75°C flat after mining with GUIMiner for about 5 hours. The fan noise is still acceptable; my case fan (Aerocool Shark) is louder than the MSI fan. Its voltage can be adjusted up to 1.3v with Afterburner or Trixx. The Sapphire one is voltage-locked and its fan is a bit louder than the MSI's, AFAIK.


Wait, what? You need to run the fan at 100% to cool the MSI card to 70°C while gaming? Will it shorten the fan life? And I suppose you also used 100% fan speed to achieve the 75°C temp in GUIMiner


----------



## Z0K1

Thanks, Archea47

TBH that's what drew me to it; same thing with my TT A30 case. A lot of people didn't like the way it looked, and for good reason, because on paper it seemed like a great idea but real-world cooling is awful (could have also been because I zip-tied the 920 to the back of that case with an XSPC bracket and a non-reference card)








my 10in honeywell fixed the mobo overheating issue









@whom it may concern:















100% fan speed 24/7, or even 12/7, will (in my experience) most definitely kill the fan prematurely. I would try pointing a fan at your GPU and finding the happy medium of fan speed so it isn't starved of fresh air, plus downclocking the mem & undervolting the GPU, since the core is what you want and the mem only adds to the heat... [(I take no responsibility) but if you have a voltage-locked card like mine you can use VBE7 to undervolt]. CryptoBadger has a great guide, alongside the hardware wiki.
GL, *always* watch for new







, look out for the trolls in the trollboxes, and have fun hashing away


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Might have a problem.
> 
> My mobo has only an x16 and an x8 slot.
> 
> 1 x PCI Express x16 slot, running at x16 (PCIEX16)
> 1 x PCI Express x16 slot, running at x8 (PCIEX8)
> 
> This a problem with xfire?


PCIe 2.0 or 3.0?

If 3.0 then no probs......if 2.0 then you might lose some speed


----------



## nubki11a

I've got a question for Gigabyte 280X owners: how loud is it during gaming? I read that it can hit 45 dB, which seems pretty loud compared to other brands like MSI and ASUS.
Also, is its voltage unlocked? I've read lots of conflicting posts about that.

Thanks!









P.S. The reason I'm asking is because I can get the Gigabyte for about the same price as the MSI (which is quieter), but the Gigabyte has BF4 included.


----------



## Farih

Is it OK to go over 1.3V with a 270X?
I've got mine to 1.268V but I'd like to get more out of it, of course.

ATM the fastest 270x on 3DM11.




Overvolted 260x coming up next and yes it will surpass an overvolted 7850


----------



## kersoz2003

Sapphire Vapor-X R9 280X: a very different PROBLEM









Hi friends, I have a slight problem. My card never exceeds 74 degrees, because when the GPU hits 74°C, VRM 1 is at 109°C and the card throttles down (cuts clocks). The VRM heat runs ahead of my GPU heat, so I never see a GPU temperature above 74°C. The automatic fan never goes above 50% at a 74°C GPU temp (it doesn't take the VRM heat into account), so I set it to 60-70% myself.

example scenarios:

with some overclocking:

VRM temperature about 70°C while the GPU is 55°C (60% fan)

VRM temperature around 80°C while the GPU is 65°C (60% fan)

VRM heat 90-100°C while the GPU is 70°C (still holding the fans at 60%)

and finally the throttle scenario:

GPU temperature is 74°C and VRM 109°C (even when I keep the fans at 100% the GPU temperature never goes above 74°C, but the VRM hits 109°C and throttles the card, so my overclocking limit is really the 109°C VRM temperature)

So I use this optimal scenario:

At a 1240/1800 overclock,

VRM temperature around 80°C while the GPU is 65°C (60% fan)

In short: is it possible to cool the VRMs better?
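The root of the problem above is that the stock fan curve only watches the GPU diode and ignores the VRM sensor. A custom curve driven by whichever sensor is effectively hotter (Afterburner's user-defined curve can approximate this) avoids the silent VRM throttle. A rough sketch of the logic, with made-up breakpoints and a made-up VRM offset, not vendor numbers:

```python
# Sketch: pick a fan duty from whichever sensor is effectively hotter, so
# the VRM can't silently cook while the GPU diode sits at 65°C.
# Breakpoints and the 25°C offset are illustrative, not vendor figures.

def fan_duty(gpu_temp_c: float, vrm_temp_c: float) -> int:
    """Return a fan duty cycle (percent) from the hotter sensor.

    The VRM reading is shifted down 25°C before comparison because VRMs
    tolerate higher temperatures than the GPU core (an assumption; tune
    the offset for your card).
    """
    effective = max(gpu_temp_c, vrm_temp_c - 25)
    curve = [(50, 30), (60, 45), (70, 60), (80, 80)]  # (temp °C, duty %)
    duty = 100  # above the last breakpoint, run flat out
    for temp, d in curve:
        if effective <= temp:
            duty = d
            break
    return duty

print(fan_duty(65, 80))   # VRM-25 = 55 < 65, so the GPU governs -> 60
print(fan_duty(74, 109))  # VRM-25 = 84 governs, past every breakpoint -> 100
```

With the stock behaviour the second case sits at ~50% fan; the max() makes it hit 100% well before the VRM reaches its throttle point.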


----------



## Devildog83

Delphi has been added. Welcome to the club. If you want me to post your Valley score in the OP which is coming soon, please run it in ExtremeHD. Thanks


----------



## Devildog83

Quote:


> Originally Posted by *GTR Mclaren*
> 
> I strongly believe that a good 3570k at 4.2Ghz or more is a better gaming CPU than any AMD one, and the cost is almost the same at least here in my country
> 
> I would really like to get an AMD, but the power consumptions is too high and the AM3 plus mobos are behind is some specs


Now that I can believe. In all reality they are very close in gaming from what I have seen in tests. Anyone can tweak their configuration to beat the other one either way. It's very hard to test them with exactly the same specs because some components like RAM may work better with one or the other. When you are talking about a couple of frames here or there, most will never notice any difference. How the hardware works with a particular game makes the most difference when it's all that close. A 2600K too, but a 2500K, no way.


----------



## Devildog83

Z0K1 has been added. Welcome to my nightmare. !!!! LOL Your Toxic is in jail.


----------



## madorax

Quote:


> Originally Posted by *sterob*
> 
> wait what you need to run fan at 100% to cool the msi card to 70 C while gaming? Will it shorten the fan life? And i suppose that you also used 100% fan speed to archive 75 C temp in guiminer


nope, I don't run 100% fan for daily usage. I tend to test every new part in my PC so later I know exactly what it does and how, so in this case I ran the fan at 100% on purpose while playing BF4 online all maxed out at 1080p, so I'd know my VGA temp at 100% fan, how the noise is, and so on. For daily usage I use an 80°C/100% fan setup; it usually only hits 74-75°C so the fan doesn't run at 100%. For mining I use the 80°C/100% fan setup too, and the fan ran at 87% if I'm not mistaken. Never tried mining again after that until today...


----------



## cremelo

For those who have the Sapphire R9 280X Toxic: what's the temperature at full load?
I'm thinking of swapping my 280X DCU2 TOP for a 280X Toxic.


----------



## DrClaw

the best 280X is the ASUS Matrix ROG: best heatsink, easily overclocks past the Sapphire cards, and a longer warranty


----------



## Devildog83

Quote:


> Originally Posted by *DrClaw*
> 
> the best 280x is the ASUS Matrix rog, best heatsink, easily overclock to be better than sapphire gpu and a longer warranty


OK, but I am sure there are a few folks in here who would disagree. The Toxic might not have as much headroom to overclock (that's debated by some), but it's a very fast card and I hear it's rather quiet too.


----------



## Takla

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> For temps and fan noise I would have to say the xfx 280x is a great buy. 70 to 73c max at 100percent fan and I still hear my case fans over the card fans


your case must sound like a jet engine if you can't hear GPU fans @ 100%. I can't believe there are people out there who find 73°C at 100% fan speed acceptable.

meanwhile my R9 280X Asus DC2 at 1160MHz / 7300MHz with a 120% power limit does not go above 69°C at an inaudible 35% fan speed


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Takla*
> 
> your case must sound like a jet engine if you can't hear gpu fans @ 100%. i can't believe that there are people out there who find 73c with 100% fan speed acceptable.
> 
> meanwhile my r9 280x asus d2c at 1160MHz / 7300MHz 120% PLimit does not go above 69c at an un-audible 35% fan speed


You missed what I said. In comparison this is the quietest video card. The 73°C is full load at 1.3V and a 1210 core clock with 1730 RAM. My case is pretty quiet, a dull hum, so I'm not worried. With my old cards I used to have to turn my TV up; now I use the same volume whether the computer is on or off.

I will also mention that this is a dual-slot card, not triple. Big selling point for me.


----------



## cremelo

Quote:


> Originally Posted by *DrClaw*
> 
> the best 280x is the ASUS Matrix rog, best heatsink, easily overclock to be better than sapphire gpu and a longer warranty


In my country Asus doesn't handle warranty service locally.

I've been waiting two months for my Asus Xonar DGX to come back under warranty; I had to send it to the U.S.A. =/


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> PCIe 2.0 or 3.0?
> 
> If 3.0 then no probs......if 2.0 then you might lose some speed


Its 2.0 with SB
3.0 with 22nm CPU, which I dont have.

Lose some speed? 2%?


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Its 2.0 with SB
> 3.0 with 22nm CPU, which I dont have.
> 
> Lose some speed? 2%?


This is always worth a read: http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review/1

Since you will have one card in an x8 slot you will lose more speed than what's in the article.


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> This is always worth a read: http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review/1
> 
> Since you will have one card in a x8 slot you will lose more speed than whats in the article.


Are there any non-triplescreen ridiculous resolution tests?

http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review/11

Seems like the difference is little to none at 2560, and in Batman apparently 2.0 is better? lol.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Are there any non-triplescreen ridiculous resolution tests?


You could always search for it yourself you know?

And at 1080p you are more CPU-bound, but at higher res you are generally more GPU-bound, so it's better to test at higher res rather than lower.

But no... in general for CF it's not worth upgrading to PCIe 3.0 just for a few extra frames. Tri-fire and quad-fire, sure, but just two cards will be fine at PCIe 2.0 x16/x16
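For rough context, the bandwidth gap being discussed can be worked out from the per-lane rates in the PCIe specs: 2.0 runs 5 GT/s with 8b/10b encoding, 3.0 runs 8 GT/s with 128b/130b. A quick sketch:

```python
# Usable one-direction bandwidth per PCIe link, encoding overhead included.
# Per-lane figures follow the PCIe 2.0/3.0 specs:
#   2.0: 5 GT/s * 8/10  -> 500 MB/s per lane
#   3.0: 8 GT/s * 128/130 -> ~984.6 MB/s per lane
PER_LANE_MBPS = {
    "2.0": 5_000 * 8 / 10 / 8,
    "3.0": 8_000 * 128 / 130 / 8,
}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    """Return usable bandwidth in GB/s for a given generation and link width."""
    return PER_LANE_MBPS[gen] * lanes / 1000

for gen, lanes in [("2.0", 8), ("2.0", 16), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth_gbs(gen, lanes):.1f} GB/s")
```

So a 2.0 x8 slot offers 4 GB/s against ~15.8 GB/s for 3.0 x16, which is why the x8 slot matters far more than the 2.0-vs-3.0 question; whether a game actually saturates either is what the [H] article tests.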


----------



## G2O415

Anybody having artifact issues with the latest drivers 13.12 on the game League of Legends?


----------



## nubki11a

Quote:


> Originally Posted by *G2O415*
> 
> Anybody having artifact issues with the latest drivers 13.12 on the game League of Legends?


Not as far as I know, you could always check www.reddit.com/r/leagueoflegends


----------



## cremelo

Quote:


> Originally Posted by *G2O415*
> 
> Anybody having artifact issues with the latest drivers 13.12 on the game League of Legends?


I had it in GTA IV, ISS, BF4 and 3DMark... I went back to driver 13.11 Beta 9 and it stopped


----------



## F3ERS 2 ASH3S

Since this is my first AMD card in a long time: what is the max safe VRM temp?


----------



## JCH979

90C should be a safe max vrm temp, 120C is the limit iirc.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *JCH979*
> 
> 90C should be a safe max vrm temp, 120C is the limit iirc.


Thank you


----------



## GTR Mclaren

Soooooo

Any bios mod to be able to get 1.3 voltage??

My Gigabyte R9 270x needs that.


----------



## JoeDirt

Quote:


> Originally Posted by *GTR Mclaren*
> 
> Soooooo
> 
> Any bios mod to be able to get 1.3 voltage??
> 
> My Gigabyte R9 270x needs that.


Send me your BIOS and I should be able to make that happen.


----------



## Ccaution

Quote:


> Originally Posted by *JoeDirt*
> 
> Send me your BIOS and I should be able to make that happen.


That's the latest for my GV-R927XOC-2GD -> https://docs.google.com/file/d/0Byw67CyZYE4QOEN2UzVFaGx4bkk/edit?pli=1 any input would be appreciated







VBE7 just won't do it, atm :/


----------



## JoeDirt

Quote:


> Originally Posted by *Ccaution*
> 
> That's the latest for my GV-R927XOC-2GD -> https://docs.google.com/file/d/0Byw67CyZYE4QOEN2UzVFaGx4bkk/edit?pli=1 any input would be appreciated
> 
> 
> 
> 
> 
> 
> 
> VBE7 just won't do it, atm :/


With your BIOS VBE7 let me set it. Was it not allowing you to change the value?


----------



## Ccaution

Quote:


> Originally Posted by *JoeDirt*
> 
> With your BIOS VBE7 let me set it. Was it not allowing you to change the value?


I can change the value in VBE7 too & save the BIOS. The CRC's good and the flash goes OK, but the actual multimeter-measured voltage stays the same.

There's an IR3567A controller on those ones,

which is a rather sophisticated controller. The latest HWiNFO will measure both voltages (vcore & vmem) pretty accurately and you can adjust them too using OC Guru II. BUT the actual core voltage cap is set @ 1.206, which is pretty much the stock voltage.

I'm sure that cap is dictated by the BIOS (since I can undervolt successfully; I've MM'd it). So we need a hex-savvy guy to take that cap to e.g. 1.3V.
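For anyone who wants to go hunting for that cap, a naive first pass is to scan the ROM dump for the millivolt value stored as a little-endian 16-bit integer. This is purely illustrative; real Tahiti/Pitcairn voltage tables have headers and VID codes that this ignores, so treat any hit only as a candidate:

```python
# Illustrative only: scan a ROM dump for candidate 16-bit little-endian
# millivolt values. A real voltage table has structure (entry headers,
# VID codes) that this naive byte search knows nothing about.
import struct

def find_mv_candidates(rom: bytes, target_mv: int) -> list[int]:
    """Return byte offsets where target_mv appears as a little-endian uint16."""
    needle = struct.pack("<H", target_mv)
    offsets, start = [], 0
    while (pos := rom.find(needle, start)) != -1:
        offsets.append(pos)
        start = pos + 1
    return offsets

# Synthetic dump with 1206 mV (bytes b6 04) planted at offsets 4 and 10:
dump = b"\x00\x00\x00\x00\xb6\x04\x00\x00\x00\x00\xb6\x04"
print(find_mv_candidates(dump, 1206))  # -> [4, 10]
```

Each candidate offset then needs checking against the table layout before touching anything, and of course a bad flash can brick the card.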


----------



## GoLDii3

Quote:


> Originally Posted by *Ccaution*
> 
> I can change the value by VBE7 too & save the BIOS. CRC's good and the flash goes ok. But the actual multimeter-measured voltage remains the same.
> 
> There's a IR3567A controller on those ones
> 
> which is a rather sophisticated controller. Latest HWiNFO will measure both voltages (vcore&vmem) pretty accurately and you can adjust them too using OC GURU II. BUT the actual core voltage cap is set @ 1.206. Which is the stock voltage pretty much.
> 
> I'm sure that cap, is dictated by the BIOS (since I can undervolt successfully, I've MM'd it). So we need a hex-savvy guy to take that cap to e.g. 1.3v.


That doesn't mean anything; the card is surely hardware-limited, so even if you flash the BIOS with unlocked voltage it won't change.


----------



## Ccaution

Quote:


> Originally Posted by *GoLDii3*
> 
> Doesn't mean nothing,for sure the card is hardware limited,so even if you flash the bios with unlocked voltage it wont change.


There's no "hard" VID topology around the controller, and if you check its datasheet -> http://www.irf.com/product-info/datasheets/data/pb-ir3567a.pdf that's a fully fledged CHiL/IR 6+2 controller, PMBus-controlled.

So unless I'm missing something major (in which case, please correct me) there's no "hardware limitation" below 1.6V; that's where those controllers draw the line, because they lose their characteristics above that voltage.


----------



## Luckael

hi,

time to upgrade..

From Asus R9 280X DCU2T V2 to Sapphire R9 290 Tri X


----------



## Farih

Quote:


> Originally Posted by *Ccaution*
> 
> There's no "hard" VID topology around the controller and if you check it's pdf -> http://www.irf.com/product-info/datasheets/data/pb-ir3567a.pdf that's a fully fleged CHiL/IR 6+2 controller, PMBus controlled.
> 
> So unless I miss something major (in such case, please correct me) there's no "hardware limitation" to go up to 1.6v since those controllers draw the line there, cause they loose their characteristics above that voltage.


I can change it easily in VBE7 myself with a 270X Toxic.

Try this bios:

GB270x1.268V.zip 97k .zip file


----------



## Delphi

So I have only managed to get 1125 core/1500 mem on my R9 270x crossfired setup before it crashes to the desktop. Looks like I will need to start saving more pennies for some water blocks, couple rads and what not and then look into bios modding the voltage or see if there is a way to hard mod.


----------



## Devildog83

Quote:


> Originally Posted by *Delphi*
> 
> So I have only managed to get 1125 core/1500 mem on my R9 270x crossfired setup before it crashes to the desktop. Looks like I will need to start saving more pennies for some water blocks, couple rads and what not and then look into bios modding the voltage or see if there is a way to hard mod.


What are you using for an overclock tool?


----------



## Devildog83

Quote:


> Originally Posted by *JoeDirt*
> 
> With your BIOS VBE7 let me set it. Was it not allowing you to change the value?


My 7870 Devil goes to 1.3V in Afterburner; my 270X Devil does not. Do you think there is a way to get the 270X to match the 7870's 1.3V too, or would it affect the crossfire and cause glitches?


----------



## sterob

Quote:


> Sapphire one is voltage locked and the fan is a bit louder than MSI AFAIK.


So I guess the Sapphire is a no-go if I want to put it in my bedroom


----------



## hamzta09

So, they're here.
Yep, those aren't Gigabyte cards and that's not an RM850!

I did read around about the RM and noticed it wasn't made by Seasonic, Corsair, or Channel Well, but some random no-name company? So I changed the order (since it wasn't packed anyway) and replaced it with the V850 from Cooler Master.

And the Gigabyte cards weren't in stock for some reason even though they were at the time of ordering, so I contacted support and had them change the cards to the XFX DD Ghost 2.0 version. It's not a Black Edition but whatever, not gonna OC for a while if all works well.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> 
> 
> So, theyre here.
> Yep, those arent Gigabyte cards and thats not an RM850!
> 
> I did read around about the RM and noticed it wasnt made by: seasonic, corsair or channel well. But some random no-name company? So I changed the order (since it wasnt packed anyway) and replaced it with the V850 from Cooler Master.
> 
> And the Gigabyte cards werent in stock for some reason even though they were at the time of order, so I contacted the support and had them change the cards to the XFX DD Ghost 2.0 version, its not a black edition but whatever, not gonna OC for a while if all works well.


Congrats, you should be happy with those cards. Feel free to read the review I did on them. They overclock decently when you get to it.


----------



## shilka

Quote:


> Originally Posted by *hamzta09*
> 
> 
> 
> So, theyre here.
> Yep, those arent Gigabyte cards and thats not an RM850!
> 
> I did read around about the RM and noticed it wasnt made by: seasonic, corsair or channel well. But some random no-name company? So I changed the order (since it wasnt packed anyway) and replaced it with the V850 from Cooler Master..


Good choice

And here is why you should not buy a Corsair RM

http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu/10#post_21573254


----------



## Devildog83

hamzta09 has been added.


----------



## Delphi

Quote:


> Originally Posted by *Devildog83*
> 
> What are you using for an overclock tool?


I have been using MSI Afterburner. Is there anything I can do to try and hit higher overclocks?


----------



## Devildog83

Quote:


> Originally Posted by *Delphi*
> 
> I have been using MSI Afterburner. Is there anything I can do to try and hit higher overclocks?


That's what I am trying to figure out. I have tried a bunch of different tools but all of them keep me locked at stock VDDC.


----------



## Delphi

Quote:


> Originally Posted by *Devildog83*
> 
> That's what I am trying to figure out. I have tried a bunch of different tools but all of them keep me locked at stock VDDC.


Hmm,

Hmm,

If I have time tonight (putting a new head unit in my car) I will see if I can do any BIOS mods on these cards. I know I have a lot of temperature headroom as I am only hitting 65°C after 5 loops of Heaven. Farih did put up a BIOS I could try. I wonder if flashing BIOSes in XFire makes it different. Haven't flashed a BIOS since my 8800GT!


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Delphi*
> 
> Hmm,
> 
> If I have time tonight (putting in a new head unit for my car) I will see if I can do any bios mods to these cards. I know I have a lot of temperature head room as I am only hitting 65C after 5 loops of heaven. Farih did put up a bios I could try. I wonder if flashing bios's in XFire make it different. Haven't flashed a bios since my 8800GT!


If its anything like when I flashed my 460's in SLI you would need to specify which cards get updated.


----------



## ricko99

I'm going to get an R9 280X to replace my 660 Ti. I was looking at the Asus DirectCU II but the card is too long to fit inside my case, so my alternative is MSI.

1) Do the MSI and Asus cards already come with Tahiti XTL?
2) I saw issues with the MSI card's new BIOS and a loud fan needed to cool down the VRM, but how loud is it under full load? How hot can the VRM get after the BIOS flash?


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> If its anything like when I flashed my 460's in SLI you would need to specify which cards get updated.


I would think that you would have to only have 1 installed at a time and update the bios separately.


----------



## GTR Mclaren

Quote:


> Originally Posted by *Farih*
> 
> I can change it easely in VBE7 myself with an 270x Toxic.
> 
> Try this bios:
> 
> GB270x1.268V.zip 97k .zip file


can I try that bios with my gigabyte?


----------



## JoeDirt

Friends, I'm cooking a 104 fever with the h1n1 flu right now. I will get back with everyone in regards to the 1.3v edits asap. Just wanted to give a heads up, not ignoring anyone.


----------



## Delphi

Quote:


> Originally Posted by *JoeDirt*
> 
> Friends, I'm cooking a 104 fever with the h1n1 flu right now. I will get back with everyone in regards to the 1.3v edits asap. Just wanted to give a heads up, not ignoring anyone.


I just had that. It was the worst flu I've ever had. Get better man!!!


----------



## Devildog83

Quote:


> Originally Posted by *JoeDirt*
> 
> Friends, I'm cooking a 104 fever with the h1n1 flu right now. I will get back with everyone in regards to the 1.3v edits asap. Just wanted to give a heads up, not ignoring anyone.


Yes, get well, people have died from that flu.


----------



## hamzta09

Get well.



Spoiler: Warning: Spoiler!



So, the V850.

Talk about a PITA, the SATA cables on that thing; I needed to hire the Hulk to get them into the hard drives and SSDs.
The PSU is also enormous, so I couldn't exactly do any good cable management since it blocks the holes I used to use.

Currently I have to use Realtek instead of my Asus card (Xonar DX), and I have an issue that's not relevant here but I'll ask anyway: is there a way to get my keyboard's Mute and Volume Up/Down keys to work with Realtek? I can't change ANY volume with the keyboard for some reason, and I know you should be able to, even with Realtek, since the keyboard wasn't designed with Asus cards in mind. It's a SideWinder X4.

I remember my old G15 worked fine with Realtek, changing volume etc...

Fixed it; I had to reinstall the Microsoft app for some reason.



Anyway.

Why can I not see the temperature or adjust fan speed on GPU #2?

Running Valley, GPU usage on #2 is 100% while #1 is all over the place.

Booted up BF4.
Lower FPS than with my 680 on the same settings, i.e. low.

Changed to Ultra, apply - crash.
"Total Resource memory: 0kb. Make sure you have a supported graphics card with at least 512 MB."

----------



## Delphi

Quote:


> Originally Posted by *hamzta09*
> 
> Get well.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> So the V850
> 
> Talk about PITA, SATA cables on that thing, needed to hire the Hulk in order to get them into the Harddrives and SSDs.
> The PSU is also enormous so I couldnt exactly do any good cable management due to it blocking the holes I used to use.
> 
> Currently i Have to use Realtek instead of my asus card (xonar DX) and I have an issue thats not relevant here but I ask anyway: Is there a way for my Keyboard Mute and Volume up/Down to function together with Realtek? I cant change ANY volume with the keyboard for some reason and I do know that you should be able to, even with realtek. Cause the keyboard wasnt designed with Asus cards in mind. Its a Sidewinder X4.
> 
> I remember my old G15 worked fine with Realtek changing volume etc......
> 
> Fixed it had to reinstall the microsoft app for some reason.
> 
> 
> 
> Anyway.
> 
> Why can I not see Temperature or adjust fanspeed on GPU #2?
> 
> Running Valley GPU Usage on #2 is 100% while #1 is all over the place.
> 
> Booted up BF4.
> Lower FPS than wiht my 680 on the same settings i.e. low.
> 
> Changed to Ultra, apply - Crash.
> "Total Resource memory: 0kb. Make sure you have a supported graphics card with at least 512 MB."


I get the same results with GPU usage: it is either 0% or 100%, while GPU1 is all over the place.

I am at work right now, but I think I was able to change the fan speed of both GPUs using MSI Afterburner. I just made sure the setting would sync both cards.

Try BF4 with Crossfire disabled and see if it works; I suspect a driver issue there. If it works with it disabled, re-enable it and see if it works correctly.


----------



## hamzta09

Quote:


> Originally Posted by *Delphi*
> 
> I get the same results with the gpu usage. It is either 0% or 100% while gpu1 is everywhere.
> 
> I am at work right now but I think was able to change the fan speed of both GPU's using MSI Afterburner. I just made sure the setting would sync both cards.
> 
> Try BF4 with Crossfire disabled and see if it works. I suspect a driver issue there. If it works with it disabled re-enable it and see if it works correctly.


My performance is definitely worse with 2 cards. I've seen videos on YouTube of 280Xs in crossfire running the game just fine at ~100 fps on Ultra;
I get 50 with dips to 30.

Is there a driver that's supposedly better than 13.12?


----------



## Delphi

Quote:


> Originally Posted by *hamzta09*
> 
> My performance is definately worse with 2 cards, Ive seen videos on youtube with 280x running the game just fine in crossfire with ~100 fps on ultra.
> I get 50 with dips to 30.
> 
> Is there a driver thats supposedly better than 13.12?


I think a new beta driver is coming out soon. I really hope it provides some performance increases. Some games play amazingly while others need some loving if you know what I mean.


----------



## hamzta09

Quote:


> Originally Posted by *Delphi*
> 
> I think a new beta driver is coming out soon. I really hope it provides some performance increases. Some games play amazingly while others need some loving if you know what I mean.


The new driver comes out sometime in Jan, yes, with Mantle support.


----------



## Farih

Quote:


> Originally Posted by *GTR Mclaren*
> 
> can I try that bios with my gigabyte?


Just send me your original BIOS and I will change the voltage to anything you want (if possible).

I have done it with great success on a 270X Toxic.
Here's some proof:

Just checking if it could handle the extra heat from 1.25V

Will play with 1.3V in a few days


----------



## Ccaution

It won't work on the GV-R927XOC-2GD; I've tried it. GPU-Z reported the new VDDC (1.268 nominal), but HWiNFO reports the same voltages as before, the multimeter agrees, and not a single °C or MHz gained for us. It's the same story with the VBE7 edits.

It's as if VBE7 only changes the voltage identifier on those Gigas, not the actual voltage hex value. Either VBE7 needs an update, or someone must do the dirty work for all of us. I know how to use a hex editor and I know all the "reverse hex values" and "voltage table & ASIC" theory, but I'm not yet able to find that damn voltage table in that mess


----------



## Devildog83

I am a bit confused, which is not unusual. I ran Valley for a bit with HWiNFO64 and Afterburner running; Afterburner is set at 1.269V but HWiNFO64 says VDDC maxed out at 1.219. That's not right, is it? Or is VDDC not core volts? The VRM volts go quite a bit higher; is that what I should be reading?


----------



## Ccaution

The latest HWiNFO is the one that agrees with my multimeter most of the time. You are getting up to 1259mV (real) when you ask for 1269mV. That's an OK vdroop; you're good as gold. You might be able to offset your VRM further to minimize that vdroop, but it's a lot of trial & error and it isn't worth it, imho
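For anyone unfamiliar with the term, vdroop is just the gap between the voltage you request and what the VRM actually delivers under load; the numbers above work out like this:

```python
# Vdroop: requested voltage minus the voltage actually measured under load.
# A small positive droop is normal VRM behaviour, not a fault.

def vdroop_mv(requested_mv: float, measured_mv: float) -> float:
    """Return droop in millivolts (positive = sagging under load)."""
    return requested_mv - measured_mv

droop = vdroop_mv(1269, 1259)
print(f"{droop:.0f} mV ({droop / 1269:.1%} of requested)")  # 10 mV (0.8%)
```

Under 1% of the requested voltage, as here, is the "good as gold" case; a much larger gap would point at a reporting problem or an aggressive load-line.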


----------



## Devildog83

Quote:


> Originally Posted by *Ccaution*
> 
> Latest HWiNFO is the one agreeing with my multimeter most of the times. You are going up to 1259mV (real) and you ask for 1269mV. That's an ok vdroop, you're good as gold. You might be able to offset your VRM further - to minimize that vdroop further - but it's a lot of trial & error and it doesn't worth it, imho


Thanks, I do like HWiNFO; it works best with AMD for sure. My 270X is locked, but it actually gets a bit higher than the 7870, which lets me go to 1.3V.


----------



## Ccaution

Quote:


> Originally Posted by *Devildog83*
> 
> Thanks, I do like HWinfo. It works with AMD best for sure. My 270x is locked but it acually get's a bit higher than the 7870 which let's me go to 1.3v.


Higher ASIC on that 270X probably, which is nice!

Finding a way to go up to 1.35V would be even nicer though, especially on beefy 270Xs like yours, with the extra phases and the baller coolers


----------



## Farih

Quote:


> Originally Posted by *Ccaution*
> 
> It won't work on GV-R927XOC-2GD. I've tried it. The GPUz reported the new VDDC (1.268 nominal), but the HWiNFO reports the same voltages as before, the multimeter agrees too and not a single C or MHz up for us. It's the same story with the VBE7 edits.
> 
> It's as if the VBE7 only changes the voltage identifier on those Giga's, but not the actual voltage hex value. It's either a VBE7 update, or someone must do the dirty work for all of us. I know how to use a hex editor and I know all the "reverese hex values" and "voltage table & ASIC" theory. But I'm not yet able to find tha damn voltage table in that mess


That's a shame really.

On the 270X Toxic it really does increase the voltage, and not just the numbers.
I couldn't get stable past 1220MHz before (1.219V) and now have 1280MHz stable at 1.268V.

No idea how high I can go though; I'll go for 1.3V next time, maybe 1.35V if possible


----------



## Devildog83

Quote:


> Originally Posted by *Ccaution*
> 
> Higher ASIC on that 270x probably, which is nice!
> 
> Finding a way to go up to 1.35v would be even nicer though, especially on those beefy 270x like yours, with the extra phases and the baller coolers


Don't know how much difference it makes but the Asic for my 270x is 78.2% and for the 7870 it's 78.8%.


----------



## Ccaution

Quote:


> Originally Posted by *Farih*
> 
> Thats a shame really
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On the 270x Toxic it does really increasy the voltage and not just the numbers.
> I couldnt get stable past 1220mhz before (1.219V) and now got 1280mhz stable at 1.268V.
> 
> No idea how high i can go though, ill go for 1.3V next time, maybe if possible 1.35V


Your Toxic has a CHiL 8214 (if memory serves), which has been around since HD 68xx-era cards, so VBE7 does its work properly with that one, which is super nice. 1280MHz is damn nice; I struggle for 1.2GHz @ stock volts.

Tips to try going higher:

- If you're pushing the memory hard, see if there's a tradeoff there (relaxing the memory to get a bit more out of the core).
- If you think you are hovering around the default +20% power limit, edit the BIOS to extend that to 50% and see if that helps.
- Go even colder









Quote:


> Originally Posted by *Devildog83*
> 
> Don't know how much difference it makes but the Asic for my 270x is 78.2% and for the 7870 it's 78.8%.


You're right, the difference there is minimal indeed. They both ought to be decent overclockers. Anything below 65% is usually rubbish at ambient temperatures, but 70%+ will definitely do


----------



## Archea47

Quote:


> Originally Posted by *hamzta09*
> 
> Why can I not see Temperature or adjust fanspeed on GPU #2?


Try setting all occurrences of EnableUlps in your registry to 0. It solved that problem for me with my crossfired 280Xs. I have to do it again every time I reinstall drivers, but afterward I can monitor temps and control the fan on GPU 2.
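The actual edit is done in regedit (or programmatically with `winreg` on Windows), flipping every `EnableUlps` value under the display-driver keys to 0 so the second GPU isn't power-gated. A platform-independent sketch of just the decision logic, with hypothetical value names:

```python
# Sketch of the ULPS fix: any EnableUlps-style value left non-zero lets
# the driver power-gate the second GPU, which hides its sensors.
# This models which values to flip; the real edit happens in the Windows
# registry (regedit, or winreg) across all occurrences of "EnableUlps".

def ulps_patches(values: dict[str, int]) -> dict[str, int]:
    """Given registry value names -> data, return the ones to set to 0."""
    return {
        name: 0
        for name, data in values.items()
        if "enableulps" in name.lower() and data != 0
    }

sample = {
    r"0000\EnableUlps": 1,               # hypothetical key paths
    r"0001\EnableUlps": 1,
    r"0000\EnableUlps_NA": 0,            # already disabled, leave alone
    r"0000\PP_SclkDeepSleepDisable": 1,  # unrelated value, leave alone
}
print(ulps_patches(sample))
```

As Archea47 notes, a driver reinstall rewrites these values, so the check has to be repeated after every driver update.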


----------



## Devildog83

The really strange difference between the 2 Devils is that the 7870 can run at 1275/1450 while the 270X can run at 1230/1590. The memory goes way higher on the 270X and the core goes higher on the 7870 (shrug). If I could increase the volts some, I think I could run them both close to 1300/1500, at least for benches.


----------



## kersoz2003

Those who use Sapphire Vapor-X R9 280X , can you write your overclock values and voltages ?


----------



## hamzta09

So how does one bench using 3DMark without paying 1k USD for it?
The runs I do look like Nintendo graphics.


----------



## Z0K1

*update using HWinfo64 playing Dirt 3 for a few hours


----------



## Devildog83

Quote:


> Originally Posted by *Z0K1*
> 
> *update using HWinfo64 playing Dirt 3 for a few hours


I love those temps.


----------



## GTR Mclaren

Quote:


> Originally Posted by *Farih*
> 
> Just send me your original BIOS and i will change voltage to anything you want. (if possible)
> 
> I have done with great succes on a 270x Toxic.
> Here some proof:
> 
> Just checkling if it could handle the extra heat from 1.25V
> 
> Will play with 1.3V in a few days


How can I send you the bios ??

super noob, sorry xD


----------



## hamzta09

I flashed the BIOS on card #1 using ATIWinflash via cmd.

The first card was a success.
The second card, I don't know really; it gave me an error at the end: Failed to read ROM, ERROR: 0FL01

But in GPU-Z both cards show the same BIOS!?

ASIC on #1 is 70%
ASIC on #2 is 64%


----------



## hamzta09

Must've had a glitch with the drivers earlier.

I uninstalled everything (again) and ran Driver Sweeper in safe mode (ugghhhh Windows 8... why do you have to be so complicated) and Driver Sweeper apparently found my 680, LOL!

Anyway, I cleaned EVERYTHING including the AMD drivers.

Went back to Windows in normal mode.
Installed the 13.x Beta (the latest one on AMD's site, not the WHQL) and voila, I can now see temps, fan speed, VRM temps and everything on both GPUs. However, GPU #2 is apparently running at 1000MHz constantly.

OK, so when I disable crossfire and check the sensors in GPU-Z I see everything on both GPUs, and the core is at 300MHz on #2.

When I enable crossfire and check the sensors in GPU-Z, on #2 I only see the core clock, which is suddenly at 1000MHz, the mem and fan speed, which sits at 20% and won't budge, and GPU load: 99%

Kay.

Fixed it by disabling both lines of EnableULPS in regedit.
Now the sensors work in crossfire and card #2 idles.

Card #1 though runs at 500/1500 due to dual monitors; is it possible to lower this?! 45°C idle is high.


----------



## madorax

Just curious about Bitcoin mining... if my 280X averages 600 Mhash/s with GUIMiner, how many hours/days/months do I need to get 1 BTC? The calculator just makes me even more confused, though... I'm just trying to make sense of it. Can it really make a profit, or am I just wasting my time, power, and hard-disk space on the Bitcoin data (about 10 GB, I think)?
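The math behind those calculators is simple enough to sketch. A block takes on average difficulty × 2³² hashes to find, so expected BTC/day is just your share of the network's work times the block reward. A minimal sketch; the difficulty figure used in the example is an illustrative late-2013 ballpark, not a quoted value, and it ignores pool fees and difficulty growth:

```python
def btc_per_day(hashrate_hs, difficulty, block_reward=25.0):
    """Expected BTC per day for a given hashrate (hashes/sec) at a given
    network difficulty, assuming ideal proportional pool payouts, no fees."""
    hashes_per_block = difficulty * 2**32       # avg hashes to solve one block
    blocks_per_day = hashrate_hs * 86400 / hashes_per_block
    return blocks_per_day * block_reward

def days_for_one_btc(hashrate_hs, difficulty, block_reward=25.0):
    """Expected days of continuous mining to accumulate 1 BTC."""
    return 1.0 / btc_per_day(hashrate_hs, difficulty, block_reward)

# 600 Mhash/s at an assumed difficulty around 7.1e8:
print(days_for_one_btc(600e6, 7.1e8))  # thousands of days, i.e. years
```

At SHA-256 difficulties of that era, a single GPU works out to years per BTC, which is why GPU miners had moved to scrypt coins; the power bill alone usually exceeds the payout.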


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Must've had a glitch with the drivers earlier.
> 
> I uninstalled everything (again) and ran Driver Sweeper in safe mode (ugghhhh, Windows 8... why do you have to be so complicated) and Driver Sweeper apparently found my 680, LOL!
> 
> Anyway, I cleaned EVERYTHING, including the AMD drivers.
> 
> Went back to Windows in normal mode.
> Installed the 13.X beta (the latest one on AMD's site, not the WHQL) and voila, I can now see temps, fan speed, VRM temps and everything on both GPUs. However, GPU #2 is apparently running at 1000 MHz constantly.
> 
> Okay, so when I disable crossfire and check the sensors in GPU-Z, I see everything on both GPUs, and the core is at 300 MHz on #2.
> 
> When I enable crossfire and check the sensors in GPU-Z, I only see (this is just #2) the core clock, which is suddenly at 1000 MHz, the memory/fan speed, which is at 20% and won't budge, and GPU load: 99%.
> 
> Kay.
> 
> Fixed it by setting both EnableULPS lines to 0 in regedit.
> Now sensors work in crossfire and card #2 idles.
> 
> Card #1, though, runs at 500/1500 due to dual monitors. Is it possible to lower this? 45°C idle is high.


Glad you are getting that worked out. As for 3DMark, choose the edition that is not the Professional one. It should be (if memory serves me correctly) around $24.95 USD.

Just scroll all the way down to the bottom: http://www.futuremark.com/benchmarks/3dmark


----------



## Farih

Quote:


> Originally Posted by *GTR Mclaren*
> 
> How can I send you the bios ??
> 
> super noob, sorry xD


Save your BIOS to file with GPU-Z.
Then ZIP it and post it here as an attachment.


----------



## Tasm

I can grab a GA 280X WF r2.0 for just 230 bucks.
I know it's voltage locked... any way around that besides BIOS editing?
Should I go for it?
Currently I have a Tahiti LE doing 1230 MHz, 10K in 3DMark 11.


----------



## nubki11a

I've done quite a bit of research on the card, and as long as you get the Rev 2.0, you're good. It's a great card and apparently doesn't run as loud as some of the reviews say. I'd say go for it! I'm ordering mine right now.

The BIOS shouldn't be a problem, as it has a BIOS switch.


----------



## hamzta09

Does anyone here run 280Xs in crossfire and not get a red screen in BF4 when loading a level!?

For some reason GPU #1 refuses to drop to 2D clocks; it remains at 500/1500 rather than 300/whatever.

The secondary monitor is plugged into my HD3000 now, and yet the GPU does not downclock!? Why does the GPU act as if it's running dual monitors when it only drives one?


----------



## Archea47

Quote:


> Originally Posted by *hamzta09*
> 
> Does anyone here run 280Xs in crossfire and not get a red screen in BF4 when loading a level!?


Hey hamzla09,

I've never had that issue with BF4 and run two (gigabyte rev2) 280x cards. I only run one monitor though. Settings are ultra @ 1080p


----------



## GoLDii3

Quote:


> Originally Posted by *Tasm*
> 
> I can grab a GA 280x WF r2.0 for just 230bucks.
> I know its voltage locked...anyway around besides BIOS editing?
> Should i go for it?
> Currently i have a Tahiti LE making 1230mhz, 10k 3D11.


No sense; my 280X at 1200 MHz gets a 12K graphics score.

I had a 7950 before; sold it and bought a 280X just because it came with BF4. Wasn't a great deal.


----------



## hamzta09

Quote:


> Originally Posted by *Archea47*
> 
> Hey hamzla09,
> 
> I've never had that issue with BF4 and run two (gigabyte rev2) 280x cards. I only run one monitor though. Settings are ultra @ 1080p


I just tried BF4 on a single monitor, with the second monitor unplugged.

Singleplayer works fine, but if I alt-tab I get a crash saying something about memory 0 KB.

Haven't tried MP yet, but likely I'll get the red screen.

If I plug in the second monitor, SP doesn't work anymore.


----------



## hyp36rmax

Which 280x work with the current 7970 water blocks?


----------



## Tasm

Quote:


> Originally Posted by *GoLDii3*
> 
> No sense; my 280X at 1200 MHz gets a 12K graphics score.
> 
> I had a 7950 before; sold it and bought a 280X just because it came with BF4. Wasn't a great deal.


http://www.3dmark.com/3dm11/7820897


----------



## [CyGnus]

Anyone tried the 13.30?


----------



## hamzta09

Strange issue.

Playing some BioShock Infinite.

Both cards work fine (though I get hiccups when moving towards "new" areas, but I got that as well with a 680).

I get alt-tabbed after ~15 min of gaming.
Windows says:

Close programs to prevent information loss.

Your computer is low on memory. Save your files and close these programs:
BioShock Infinite

Why in the world would I do that?
I check the Performance tab in Task Manager: 3.3/8 GB of my physical RAM is used.
VRAM is at 5.5 GB... that doesn't make any sense!? The cards have 3 GB, not 6. Or is Afterburner reporting both GPUs' VRAM as one? 3+3=6, but in reality each card only has 3.

And as I wrote this message, Chrome crashed, lmao..

Why is the OS trying to use VRAM for background applications!?

----------



## Jaffi

Has anyone seen this? It happens occasionally for me, for a millisecond, perhaps 2 times a day: http://www.youtube.com/watch?v=21yL7FZ-I68

I read it has something to do with the memory clocks switching while in 2D mode.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Strange issue.
> 
> Playing some BioShock Infinite.
> 
> Both cards work fine (though I get hiccups when moving towards "new" areas, but I got that as well with a 680).
> 
> I get alt-tabbed after ~15 min of gaming.
> Windows says:
> 
> Close programs to prevent information loss.
> 
> Your computer is low on memory. Save your files and close these programs:
> BioShock Infinite
> 
> Why in the world would I do that?
> I check the Performance tab in Task Manager: 3.3/8 GB of my physical RAM is used.
> VRAM is at 5.5 GB... that doesn't make any sense!? The cards have 3 GB, not 6. Or is Afterburner reporting both GPUs' VRAM as one? 3+3=6, but in reality each card only has 3.
> 
> And as I wrote this message, Chrome crashed, lmao..
> 
> Why is the OS trying to use VRAM for background applications!?


I can't remember: did you do an OS reload? It sounds as if you are having driver conflicts or registry errors.

Will you run a test for me?

Try running the Windows system performance test and see if that helps any.

You may want to look at a registry cleaner... not sure if you've done this yet.
Quote:


> Originally Posted by *Jaffi*
> 
> Has anyone seen this? It happens occasionally for me, for a millisecond, perhaps 2 times a day: http://www.youtube.com/watch?v=21yL7FZ-I68
> 
> I read it has something to do with the memory clocks switching while in 2D mode.


Odd, I haven't, but after switching to AMD... I feel the drivers need work. The release drivers still feel like betas.


----------



## [CyGnus]

Just installed the 13.30s and all is good. Don't know if it's placebo or not, but BF4 feels better with these.


----------



## G2O415

Quote:


> Originally Posted by *[CyGnus]*
> 
> Anyone tried the 13.30?


I have. It dropped my overall Firestrike score by 7: decreased my graphics score by a little bit, increased my physics score by about 130. I also got some artifacts in Saints Row 3 that I had never gotten before, so I had a bad experience. Not that long an experience with it, but enough for me to uninstall and go back to 13.12.

Of course, it's never the same for everyone, but it does work with the R9 200 series, that's for sure.


----------



## hamzta09

Played Crysis 2.
Scaling seems poor, 50% on both GPUs.
50fps in some areas.

Quit the game.

Grey screen.
PC Lockup.
Reboot.

Did a 3dmark11 720p test on each GPU individually.

GPU 1

















GPU 2

















Quite a massive difference. Is it supposed to be like this? I think not!

Why does it say (2x) on the top one when the bridge isn't even connected?


----------



## Archea47

Quote:


> Originally Posted by *hamzta09*
> 
> Played Crysis 2.
> Scaling seems poor, 50% on both GPUs.
> 50fps in some areas.
> 
> Quit the game.
> 
> Grey screen.
> PC Lockup.
> Reboot.
> 
> Did a 3dmark11 720p test on each GPU individually.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> GPU 1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU 2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quite a massive difference. Is it supposed to be like this? I think not!


Are you able to monitor your VRM temperatures?

I hate to bring this up, but... I had initially ordered XFX 280Xs but returned them and got the Gigabytes after reading that the XFX cards' VRMs got too hot in crossfire (the heat spreader on those is made of plastic). That wasn't the 2.0 though, so it may not be an issue on yours.

If the VRM temperatures are OK I would check Windows Event Viewer for any interesting errors and investigate those


----------



## hamzta09

Quote:


> Originally Posted by *Archea47*
> 
> Are you able to monitor your VRM temperatures?
> 
> I hate to bring this up, but ... I had initially ordered XFX 280Xs but returned them and got the Gigabytes after reading the XFX cards' vrms got too hot (the heat spreader on those is made of plastic) when in crossfire. That wasn't the 2.0 though, so it may not be an issue on yours
> 
> If the VRM temperatures are OK I would check Windows Event Viewer for any interesting errors and investigate those


VRM temps are 40-50°C as far as I can see in GPU-Z.

Gonna boot BF4 on this Mr. Slowly card now and check temps and framerate.


----------



## rdr09

Quote:


> Originally Posted by *hamzta09*
> 
> Played Crysis 2.
> Scaling seems poor, 50% on both GPUs.
> 50fps in some areas.
> 
> Quit the game.
> 
> Grey screen.
> PC Lockup.
> Reboot.
> 
> Did a 3dmark11 720p test on each GPU individually.
> 
> GPU 1
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU 2
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quite a massive difference. Is it supposed to be like this? I think not!
> 
> Why does it say (2x) on the top one when the bridge isn't even connected?


You need to OC your i5. My Thuban gets a higher physics score than that at 4 GHz. You are bottlenecking your GPUs big time.


----------



## hamzta09

Quote:


> Originally Posted by *rdr09*
> 
> You need to OC your i5. My Thuban gets a higher physics score than that at 4 GHz. You are bottlenecking your GPUs big time.


The CPU is at stock.

And overclocking the CPU doesn't fix the red screen when booting BF4, or the grey screen when quitting games in crossfire.

Or the inconsistent performance between the two cards.

I benched BioShock Infinite, and the results are within 1-5 fps.
I guess the cards are both fine?

But 3DMark 11 disagrees, as #2 managed to halve the score of #1.. lol


----------



## rdr09

Quote:


> Originally Posted by *hamzta09*
> 
> The CPU is at stock.
> 
> And overclocking the CPU doesn't fix the red screen when booting BF4, or the grey screen when quitting games in crossfire.
> 
> Or the inconsistent performance between the two cards.
> 
> I benched BioShock Infinite, and the results are within 1-5 fps.
> I guess the cards are both fine?
> 
> But 3DMark 11 disagrees, as #2 managed to halve the score of #1.. lol


I would not crossfire with that i5.

Edit: go check post 6 . . . not gonna sugar coat it . . .

http://www.overclock.net/t/1459028/crossfire-asus-hd7970-dc2t-3gd5#post_21581151


----------



## hamzta09

Quote:


> Originally Posted by *rdr09*
> 
> i would not crossfire with that i5.


If you're just gonna troll, go away.

It doesn't bottleneck; even at stock I had 120 fps in the little multiplayer gameplay I actually got out of these two together, in BF4 on Ultra with 4x MSAA.


----------



## hamzta09

I believe I have found the problem.

120 Hz + 60 Hz in dual monitor on AMD cards is apparently a no-go.

So I have to stick with 60 Hz on both? Really...

If I run both at 60 Hz, no problems with xfire.

If I run the main monitor at 120 Hz and the other at 60 Hz, problems. It seems the cards can't sync the refresh rates, so they don't know what to do!?

When I run the main at 120 and the other at 60, the 60 Hz monitor has a strange lag-like effect and increased ghosting.
If I run both at 60, the other monitor has no lag or ghosting.

Video example of the refresh-rate issue running xfire with dual monitors, both at 60 Hz.




At 60 Hz, BF4 works.
At 120+60, BF4 red-screens, and other games (on exit mostly) grey-screen and the PC reboots.

It seems that AMD cannot handle a refresh rate above 60 Hz in BF4.

http://battlelog.battlefield.com/bf4/forum/threadview/2955065670139205856/


----------



## hoevito

Totally academic, I know, but just in case anyone wants to see a nearly clock-for-clock comparison between a cream-of-the-crop R9 280X and a GTX 780, here are some Firestrike scores I pulled. The 280X is a Matrix Platinum clocked at 1275 core/1750 mem, and the 780 is an MSI Lightning at 1164 core boosting to 1280, with 1652 mem. The Matrix was pretty much maxed out and was essentially a suicide run, as it was pretty unstable, but the Lightning has quite a bit more headroom and can easily be pushed over 1300 core. Both are still using stock air coolers, and the system specs are otherwise the same (in my sig).

R9 280X: http://www.3dmark.com/fs/1354611

GTX 780: http://www.3dmark.com/fs/1542802


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> I believe I have found the problem.
> 
> 120 Hz + 60 Hz in dual monitor on AMD cards is apparently a no-go.
> 
> So I have to stick with 60 Hz on both? Really...
> 
> If I run both at 60 Hz, no problems with xfire.
> 
> If I run the main monitor at 120 Hz and the other at 60 Hz, problems. It seems the cards can't sync the refresh rates, so they don't know what to do!?
> 
> When I run the main at 120 and the other at 60, the 60 Hz monitor has a strange lag-like effect and increased ghosting.
> If I run both at 60, the other monitor has no lag or ghosting.
> 
> Video example of the refresh-rate issue running xfire with dual monitors, both at 60 Hz.
> 
> 
> 
> 
> At 60 Hz, BF4 works.
> At 120+60, BF4 red-screens, and other games (on exit mostly) grey-screen and the PC reboots.
> 
> It seems that AMD cannot handle a refresh rate above 60 Hz in BF4.
> 
> http://battlelog.battlefield.com/bf4/forum/threadview/2955065670139205856/


No, you can't run a 120 Hz and a 60 Hz screen together... I hate to say it, but it seems you did little to no research about these cards before you bought them.


----------



## nubki11a

Quote:


> Originally Posted by *Sgt Bilko*
> 
> No, you can't run a 120 Hz and a 60 Hz screen together... I hate to say it, but it seems you did little to no research about these cards before you bought them.


Is the issue exclusive to crossfire, or does it happen with a single card too? I've got an HD 6870 atm with no problems.


----------



## Sgt Bilko

Quote:


> Originally Posted by *nubki11a*
> 
> Is the issue exclusive to X-fire? Or also with a single card? I've got a HD 6870 atm with no problems.


Tbh I'm not 100% sure; I've always stuck to the RRR rule with multiple screens: keep the Resolution and Refresh Rate the same across the board... makes life much simpler.

If I ever upgrade my monitors, I'm replacing both of them.


----------



## nubki11a

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Tbh I'm not 100% sure; I've always stuck to the RRR rule with multiple screens: keep the Resolution and Refresh Rate the same across the board... makes life much simpler.
> 
> If I ever upgrade my monitors, I'm replacing both of them.


Even if you use one just for desktop and the other for gaming?


----------



## Sgt Bilko

Quote:


> Originally Posted by *nubki11a*
> 
> Even if you use one for just desktop and the other one for gaming?


Yeah. I don't play a lot of FPSes, so 120 Hz doesn't mean that much to me atm, tbh.


----------



## rdr09

Quote:


> Originally Posted by *hamzta09*
> 
> If you're just gonna troll, go away.
> 
> It doesn't bottleneck; even at stock I had 120 fps in the little multiplayer gameplay I actually got out of these two together, in BF4 on Ultra with 4x MSAA.


I don't mean to piss you off; I was just gonna suggest OCing your CPU a little. I play BF4 with the i7 (HT off) @ 4.5 GHz with a 290 and it's so smooth, but I only use 60 Hz. Your 280s are much faster.


----------



## rdr09

Quote:


> Originally Posted by *hoevito*
> 
> Totally academic, I know, but just in case anyone wants to see a nearly clock-for-clock comparison between a cream-of-the-crop R9 280X and a GTX 780, here are some Firestrike scores I pulled. The 280X is a Matrix Platinum clocked at 1275 core/1750 mem, and the 780 is an MSI Lightning at 1164 core boosting to 1280, with 1652 mem. The Matrix was pretty much maxed out and was essentially a suicide run, as it was pretty unstable, but the Lightning has quite a bit more headroom and can easily be pushed over 1300 core. Both are still using stock air coolers, and the system specs are otherwise the same (in my sig).
> 
> 
> R9 280X: http://www.3dmark.com/fs/1354611
> 
> 
> GTX 780: http://www.3dmark.com/fs/1542802


here is the 290 at 1260 . . .



http://www.3dmark.com/3dm/1895751


----------



## hoevito

Quote:


> Originally Posted by *rdr09*
> 
> here is the 290 at 1260 . . .


Well, that definitely shows that a 290 is every bit as capable, if not more so, than a 780. Not bad at all!


----------



## Devildog83

That is a nice Firestrike score for one card. It also makes me happy that my graphics score is very close to it.


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> No you can't run a 120hz and 60hz screen together........I hate to say it but it seems you did little to no research about these cards before you bought them.


Research? Lmao.

I ran the monitor at 144 Hz and 120 Hz (strobed) with the other monitor at 60 Hz just fine on my 680.

And why would the 280X struggle with it? I don't see the reason.


----------



## gnemelf

Get a new monitor or video card... simple. Deal with it. AMD doesn't like the different refresh rates, so it is what it is; it's not a bad card, etc.


----------



## hamzta09

Quote:


> Originally Posted by *gnemelf*
> 
> Get a new monitor or video card... simple. Deal with it. AMD doesn't like the different refresh rates, so it is what it is; it's not a bad card, etc.


That logic.


----------



## GTR Mclaren

Checking some RAM prices... holy cow, 16 GB of 2400 MHz at $200 or more... this is insane.


----------



## hamzta09

I am getting flickering now running crossfire. On the main monitor, when running 3DMark 11 or the BioShock Infinite benchmarking utility, there's this strange flickering, like massive rectangles moving downwards and upwards and vice versa. Like the flicker you see if you grab a camera and record an old CRT monitor.


----------



## Archea47

Quote:


> Originally Posted by *hamzta09*
> 
> I am getting flickering now running crossfire. On the main monitor, when running 3DMark 11 or the BioShock Infinite benchmarking utility, there's this strange flickering, like massive rectangles moving downwards and upwards and vice versa. Like the flicker you see if you grab a camera and record an old CRT monitor.


Okay, I'm pulling down BioShock Infinite to see if I have the same issue. I haven't experienced anything like that yet, but the only games I've been playing are BF4, MWO and Skyrim.

Do you still get the flickering with one monitor? I only have one 60 Hz monitor to test with.


----------



## hamzta09

Quote:


> Originally Posted by *Archea47*
> 
> Ok, I'm pulling down Bioshock Infinite to see if I have the same issue. I haven't experienced anything like that yet, but the only games I've been using are BF4, MWO and Skyrim.
> 
> Do you still get the flickering with one monitor? I only have one 60Hz monitor to test with


With just one monitor I had no flickering.

I plugged the other monitor into the Intel HD3000 again, via DVI.
Flicker again, and the GPU does not downclock in 2D.

I then changed the cable from DVI on the secondary to a cheap HDMI and plugged it into the HD3000: no more flickering... this is strange, and the GPU also downclocks now to 300/150.


----------



## Amph

I need the modded BIOS for the 280X Toxic; does someone have it, please?


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Research? Lmao.
> 
> I ran the monitor at 144hz and 120hz (strobelight) with the other monitor at 60hz just fine on my 680.
> 
> And why would the 280X struggle with it? I dont see the reason.


AMD cards don't support different refresh rates at the same time, IIRC.

I think it's fine with a single card, but not CF.

And you can't compare your 680 to it: different architecture, drivers and tech.

It is probably a driver issue, but I've never looked into it too hard, because when I go 120 Hz I'm replacing both my screens.

And "research", lmao... yeah, sorry, but I don't have any sympathy for your problems now.


----------



## Lisjak

So after reading about the 120 Hz with 60 Hz monitor problem, I am curious: is this only a problem when using crossfire, or would I have problems running them on just one card as well? I am planning on buying a 120 Hz monitor in the near future, so I would be in the same situation.


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> AMD cards don't support different refresh rates at the same time, IIRC.
> 
> I think it's fine with a single card, but not CF.
> 
> And you can't compare your 680 to it: different architecture, drivers and tech.
> 
> It is probably a driver issue, but I've never looked into it too hard, because when I go 120 Hz I'm replacing both my screens.
> 
> And "research", lmao... yeah, sorry, but I don't have any sympathy for your problems now.


Please direct me to a reliable Google source that says: can't use 120-144 Hz together with 60 Hz in dual-monitor mode when using AMD GPUs in crossfire.

See, I can't take you seriously either, because no such source exists; thus I could not research it.

Quote:


> Originally Posted by *Lisjak*
> 
> So after reading about the 120hz with 60hz monitor problem I am curious. Is this only a problem when using crossfire or would I have problems running them on just 1 card as well? I am planning on buying a 120 hz in the near future so I would be in the same situation.


Likely crossfire related.


----------



## hks215

I just bought an R9 270X Hawk, and it's really not that much different from the HD 7950.
The 7870 would be way weaker; benchmark scores alone don't mean anything, it's in the gameplay.
So when people say the R9 270X is like the 7870, that's just stupid, because it's more like the 7950: there is no 280, so the 270X replaced the HD 7950.
I just had a TF3 HD 7950, and the 270X is better in games, so I think my theory is correct. Why, then, are people saying the 270X is a 7870?


----------



## hks215

Has anyone managed to get triple overvoltage to work on the Hawk?


----------



## nubki11a

Quote:


> Originally Posted by *hks215*
> 
> I just bought an R9 270X Hawk, and it's really not that much different from the HD 7950.
> The 7870 would be way weaker; benchmark scores alone don't mean anything, it's in the gameplay.
> So when people say the R9 270X is like the 7870, that's just stupid, because it's more like the 7950: there is no 280, so the 270X replaced the HD 7950.
> I just had a TF3 HD 7950, and the 270X is better in games, so I think my theory is correct. Why, then, are people saying the 270X is a 7870?


Punctuation is appreciated


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Please direct me to a reliable Google source that says: can't use 120-144 Hz together with 60 Hz in dual-monitor mode when using AMD GPUs in crossfire.
> 
> See, I can't take you seriously either, because no such source exists; thus I could not research it.
> Likely crossfire-related.


Well, unplug one of the cards and see if that fixes the problem. After reading through a few threads on other forums, it seems that it's a dual-GPU problem more than anything. I'm guessing you are using the right ports and cables.

I've heard that some drivers work better than others as well.

Or you could just buy another 120 Hz screen.


----------



## hks215

ok thanks


----------



## hamzta09

VRM temps on GPU #1:
Max 85°C
Avg 83/77°C

In BF4. Quite high.


----------



## hks215

On beta 17 you can't get the core voltage unlocked, just the memory voltage, and the aux voltage works. Is there something wrong with my card?
Can someone help me?


----------



## dmfree88

Yo Devil, how you been, man?

Finally making my appearance here at the club, as I will be joining soon!

Had a question for you guys. I was going to get a 280X, but I realized it's almost double the price while only sometimes hitting 600 kH/s mining (not often, but it's possible). I was thinking it would be easier for me to get a 270X. Then I started looking into them and noticed that the Windforce version on Newegg is listed with 4 GB of memory while all the others are 2 GB. If I were to go for the 4 GB Windforce, would I likely get better results than others with the 2 GB versions? Or was it a typo and it's not 4 GB? I get 400 kH/s out of my 7870 Hawk, which is just an old 270X, so I was hoping it would at least get 450-500. If it does, it would be worth the buy, doubling my speed with a little extra. Anyone have one who can let me know?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125496

Any input greatly appreciated
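Since the question is really price versus hashrate, it's easy to sketch the comparison. The rates and prices below are hypothetical ballpark placeholders based on the numbers being thrown around, not real quotes, so swap in whatever you actually see:

```python
def kh_per_dollar(khs, price_usd):
    """Scrypt-mining hashrate per dollar of card cost (higher is better)."""
    return khs / price_usd

# Hypothetical figures - substitute real hashrates and street prices:
cards = {
    "270X 2 GB": (450, 200),   # ~450 kH/s at ~$200
    "280X":      (600, 330),   # ~600 kH/s at ~$330
}

best = max(cards, key=lambda name: kh_per_dollar(*cards[name]))
for name, (khs, price) in sorted(cards.items()):
    print(f"{name}: {kh_per_dollar(khs, price):.2f} kH/s per $")
print("Best value:", best)
```

With placeholder numbers like these, the cheaper card wins on kH/s per dollar even at a lower absolute hashrate, which is the usual argument for multiple mid-range cards over one high-end one (power draw and slots permitting).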


----------



## GuestVeea

My build!







The 12.22" Toxic R9 270X fits into the Mini-ITX Prodigy XD
*Sorry about the bad picture quality; I used my cell phone.*

What do you guys think?
Specs:
Bitfenix Prodigy Mini-ITX case
i7-3770K
Sapphire Toxic Edition R9 270X
8 GB DDR3
Asus P8H61-I R2.0
Cooler Master Hyper N520 CPU cooler
64 GB Crucial M4 SSD
750 GB Western Digital Green HDD
80 GB Western Digital Black HDD (for movies/homework)


----------



## Devildog83

Quote:


> Originally Posted by *hks215*
> 
> i just brought a r9 270x hawk and it really not that much different than the hd 7950
> the 7870 would be way weaker just because in benchmarks the scores and dont mean nothing its in the gamplay
> so people say the r9 270x is like the 7870 thats just stupid coz its not more like the 7950 there is no 280 so the 270x is replaced the hd 7950
> i just had a tf3 hd 7950 and the 270x is better in games i think my theory would be correct the why are people saying the 270x is a 7870?


Performance-wise it's close to a 7950, but it's the same as a 7870 in most ways. First, they both have 2 GB of memory on a 256-bit bus, while the 7950 has 3 GB on a 384-bit bus; second, it's a Pitcairn chip and the 7950 is a Tahiti. I have both a 7870 and an R9 270x, and they are almost the same except for some internals; the 270x will clock a ton higher on the memory, so the memory controller has to be different. Both of mine use 6 GHz Elpida memory chips, but the 7870 will barely hit 1500, if that. The PCB of the 270x is also based on the 7870's, so I guess all of those reasons are why. High-end 7870s and 270x's will clock much higher than most 7950s but have less memory and fewer stream processors, so it kind of evens out.


----------



## dmfree88

Quote:


> Originally Posted by *GuestVeea*
> 
> 
> 
> 
> My build!
> 
> 
> 
> 
> 
> 
> 
> The 12.22" Toxic R9 270X fits into the Mini-ITX Prodigy XD
> *Sorry about the bad picture quality; I used my cell phone.*
> 
> What do you guys think?
> Specs:
> Bitfenix Prodigy Mini-itx Case
> i7-3770k
> Sapphire Toxic edition R9 270x
> 8gb DDR3
> Asus P8H61-1 R2.0
> Cooler Master Hyper N520 CPU Cooler
> 64gb Crucial M4 SSD
> 750gb Western Digital Green HDD
> 80gb Western Digital Black HDD (For Movies/Homework)


Looks perdy. Kinda looks like a fat Hackintosh.


----------



## Devildog83

Quote:


> Originally Posted by *dmfree88*
> 
> Yo devil how you been man?
> 
> Finally making my appearance here at the club as i will be joining soon!
> 
> Had a question for you guys. I was going to get a 280x but i realized that being almost double the price and sometimes performing at 600kh/s mining (not often but its possible). I was thinking it would be easier for me to get a 270x. Then i started looking into them and noticed that the windforce version on newegg is marked at 4gb memory while all the others are 2gb. If i were to go for the 4gb windforce would I likely get better results then others with 2gb versions? Or was it a typo and its not 4gb? I get 400kh/s out of my 7870 hawk which is just a old 270x so I was hoping it would at least get 450-500? If it does it would be worth the buy doubling my speed with a little extra. Anyone have one that can let me know?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125496
> 
> Any input greatly appreciated


Hey man, how's it goin'? I love it, a 4 GB 270x. I can't wait to see how that matches up against my Devils. Show me pics and clocks when it arrives and I'll sign you right up.


----------



## dmfree88

Quote:


> Originally Posted by *Devildog83*
> 
> Hey man, how's it goin'? I love it, a 4 GB 270x. I can't wait to see how that matches up against my Devils. Show me pics and clocks when it arrives and I'll sign you right up.


Oh, you know I will. If it's a legit 4 GB, I'm gonna buy it. I just didn't want to buy it if it was some dumb typo by a guy at Newegg.

I don't think I have quite enough yet, though. Maybe 2-3 more days, if Bitcoin prices stay nice to me.


----------



## Devildog83

Quote:


> Originally Posted by *GuestVeea*
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> My build!
> 
> 
> 
> 
> 
> 
> 
> 
> The 12.22" Toxic R9 270x Fits into the MIni-itx Prodigy XD
> *Sorry about the bad picture quality. I used me cell phone
> 
> 
> 
> 
> 
> 
> 
> *
> What do you guys think?
> Specs:
> Bitfenix Prodigy Mini-itx Case
> i7-3770k
> Sapphire Toxic edition R9 270x
> 8gb DDR3
> Asus P8H61-1 R2.0
> Cooler Master Hyper N520 CPU Cooler
> 64gb Crucial M4 SSD
> 750gb Western Digital Green HDD
> 80gb Western Digital Black HDD (For Movies/Homework)


That looks like a screamin' little mini ITX there. Nice job.


----------



## dmfree88

Oh, also one more question if anyone knows: does the Windforce come with Elpida or Hynix memory? If it's 4 GB of Hynix, I'm sold.


----------



## Devildog83

Quote:


> Originally Posted by *dmfree88*
> 
> Oh, you know I will. If it's a legit 4 GB, I'm gonna buy it. I just didn't want to buy it if it was some dumb typo by a guy at Newegg.
> 
> I don't think I have quite enough yet, though. Maybe 2-3 more days, if Bitcoin prices stay nice to me.


It's legit and should perform well; it's only $10 more than my 270x.

Here: http://www.techpowerup.com/196152/gigabyte-also-rolls-out-radeon-r9-270x-oc-with-4-gb-memory.html


----------



## dmfree88

http://www.amazon.com/Gigabyte-GDDR5-4GB-2xDVI-Graphics-GV-R927XOC-4GD/dp/B00H707TIW

It says Hynix or Samsung memory. Maybe I could get super lucky and get Samsung! I've heard it's actually better than Hynix but is very rarely put in cards. I'm sold, man; I wish I had enough now.

Also saw the MSI Gaming 270x has a 4GB edition. Couldn't find much info on it yet, though.


----------



## Juventas

Can someone check for me, when you set your fan speed to manual, what is the lowest value? Does the fan turn off completely? Preferably using MSI Afterburner 3 on a 270X, but any R9 utility should be the same.


----------



## benjamen50

Quote:


> Originally Posted by *Juventas*
> 
> Can someone check for me, when you set your fan speed to manual, what is the lowest value? Does the fan turn off completely? Preferably using MSI Afterburner 3 on a 270X, but any R9 utility should be the same.


I think 20-30% is the minimum fan speed you can set, and no, the fan should not turn off completely; they're held at a safety limit so the fans never stop spinning entirely.


----------



## kpo6969

Quote:


> Originally Posted by *TempAccount007*
> 
> when burger flippers get into cryptocurrency it's time to sell


Quote:


> Originally Posted by *dmfree88*
> 
> I wish there was a neg-rep button for discrimination. Just because someone flips burgers for a living doesn't make them any better or worse than you. Money discrimination is still discrimination.
> 
> Plus, if you're referring to me, I make a decent living working from home, which is better than most are doing right now. I'm the only one working in my family, so it's not much right now, but I feel lucky. No reason for you to act like you're better than anyone because your pockets are fatter.


Well said.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> Please direct me to a reliable Google source that says: "Can't use 120-144 Hz together with 60 Hz in dual-monitor mode when using AMD GPUs in Crossfire."
> 
> See, I can't take you seriously either, because no such source exists, thus I could not research it.
> Likely Crossfire-related.


http://hardforum.com/showthread.php?t=1789178

2nd post: try rolling back to the 13.9 WHQL driver.


----------



## Tasm

I flashed my GA R9 280x Rev 2.0 with the voltage-unlocked BIOS; now I can push the voltage up to 1.3 V.

Did I void my warranty by flashing the modded BIOS?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Tasm*
> 
> I flashed my GA R9 280x Rev 2.0 with the voltage-unlocked BIOS; now I can push the voltage up to 1.3 V.
> 
> Did I void my warranty by flashing the modded BIOS?


Yeah, but if you need to return it you can just flash it back.


----------



## JCH979

I have some weird stuff going on when watching some videos on YouTube with the 13.12 drivers. My clocks seem to be fluctuating between 501/1600 and the stock 300/150 2D clocks, making some (but not all) video playback on YouTube jittery. I don't remember seeing this with the 9.5 BETA, so I'm thinking it's some sort of driver issue. Not that big a deal for me, since it just happens on YouTube, and only on some videos at that.

VLC playback seems to be unaffected, and I still haven't seen anything abnormal while playing games. At least I'm not getting any more flickering, and I haven't seen any black screens, freezing, or anything else so far.

Just thought I would share my experience with the 13.12 drivers.

Also found this earlier: AMD Catalyst 13.35 BETA Driver With Mantle and HSA Support Scheduled For End of January.

Well, off to work with me.


----------



## JoeDirt

Quote:


> Originally Posted by *Ccaution*
> 
> I can change the value with VBE7 too & save the BIOS. The CRC's good and the flash goes OK, but the actual multimeter-measured voltage remains the same.
> 
> There's an IR3567A controller on those ones, which is a rather sophisticated controller. The latest HWiNFO will measure both voltages (vcore & vmem) pretty accurately, and you can adjust them too using OC GURU II. BUT the actual core voltage cap is set @ 1.206, which is pretty much the stock voltage.
> 
> I'm sure that cap is dictated by the BIOS (since I can undervolt successfully; I've MM'd it). So we need a hex-savvy guy to take that cap to e.g. 1.3 V.


Give this one a try and see if the multimeter shows 1.3.

r927xo2d.zip 192k .zip file


----------



## Ccaution

Quote:


> Originally Posted by *JoeDirt*
> 
> Give this one a try and see if the multimeter shows 1.3.
> 
> r927xo2d.zip 192k .zip file


Nope - still no voltage gains :/

Your BIOS works properly (idle/load clocks&voltages) nonetheless


----------



## JoeDirt

A few days before I became sick, my Asus R9 270 came in. I replaced the TC and added some VRAM heatsinks, put it in, and ran it for a few days. It ran great, except it would get very hot like my 280x. So I opened my case to re-position a fan to give it better airflow. Figured I'd just do a quick move, and made the bonehead move of not powering down... The fan header extension slipped from my hand and touched the backside of the 270... POP, dead. Lesson: always power down your computer when you do any work inside it.

I'm not even going to try to RMA it. It was my fault and I will pay for it. So I did, by ordering a PowerColor 270. The PowerColor is not clocked as high but has 3 heatpipes instead of 2 like the ASUS card. I am very pleased with it so far. It keeps cool and works perfectly. Going to leave this one stock until I get a 290x.


----------



## JoeDirt

Quote:


> Originally Posted by *Ccaution*
> 
> Nope - still no voltage gains :/
> 
> Your BIOS works properly (idle/load clocks&voltages) nonetheless


Well, crap... I'm sure the solution will present itself to us in time.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *JoeDirt*
> 
> A few days before I became sick, my Asus R9 270 came in. I replaced the TC and added some VRAM heatsinks, put it in, and ran it for a few days. It ran great, except it would get very hot like my 280x. So I opened my case to re-position a fan to give it better airflow. Figured I'd just do a quick move, and made the bonehead move of not powering down... The fan header extension slipped from my hand and touched the backside of the 270... POP, dead. Lesson: always power down your computer when you do any work inside it.
> 
> I'm not even going to try to RMA it. It was my fault and I will pay for it. So I did, by ordering a PowerColor 270. The PowerColor is not clocked as high but has 3 heatpipes instead of 2 like the ASUS card. I am very pleased with it so far. It keeps cool and works perfectly. Going to leave this one stock until I get a 290x.


yowza but respectable


----------



## Ccaution

Quote:


> Originally Posted by *JoeDirt*
> 
> Well, crap... I'm sure the solution will present itself to us in time.


I hope so.

Thanks for trying! (Monstrous cooler on that card - nice!)


----------



## Archea47

Quote:


> Originally Posted by *JoeDirt*
> 
> I'm not even going to try to RMA it. It was my fault and I will pay for it. So I did, by ordering a PowerColor 270. The PowerColor is not clocked as high but has 3 heatpipes instead of 2 like the ASUS card. I am very pleased with it so far. It keeps cool and works perfectly. Going to leave this one stock until I get a 290x.


1.) good on ya!

2.) I see you have some heatsinks on the rearside of your 280X - thanks for the idea!


----------



## JoeDirt

Quote:


> Originally Posted by *Archea47*
> 
> 1.) good on ya!
> 
> 2.) I see you have some heatsinks on the rearside of your 280X - thanks for the idea!


They do eat up a good amount of space, so be warned about clearance! The plus side is they do help a good amount with cooling. I put them on after I burnt my hand on the backplate. The cooler itself is the Prolimatech MK-26; with fans it eats up 6 slots of space. It keeps this pig right at 70-73C at all times. I'm going to swap to some Corsair SP120's for better cooling; these silent 140's just don't move enough air.


----------



## Warl0rdPT

Quote:


> Originally Posted by *Jaffi*
> 
> Anyone seen this, it is happening occasionally for me for a millisecond, perhaps 2 times a day: http://www.youtube.com/watch?v=21yL7FZ-I68
> 
> I read it has something to do with memclocks switching whilst in 2D mode.


It happens to me when I'm browsing. Outside the browser I don't notice it...


----------



## Devildog83

hamzta09,

Could you please stop being so rude to folks that try to help? If what you say is true, then just respond with "I already have the issue resolved, but thanks anyway." Being rude only creates problems that I do not want in this thread.

Thanks.


----------



## F3ERS 2 ASH3S

So I found something interesting with multi-resolution setups. I have a 1080p and a 720p monitor set up.

When I move the cursor to the smaller monitor, it can get stuck at the top and bottom edges. It sounds like similar effects to Hamzta's, but no flicker like his Hz issue (which makes sense).

My conclusion: this has to be a driver issue and not a card issue. Mine is not that noticeable. Just figured I'd check things out and report back with updates.

I'm on driver 13.2.

So no need for the bickering; I found the root cause of both scenarios (sadly I don't have hard facts, boo).


----------



## kersoz2003

With my Sapphire Vapor-X R9 280X overclocked to 1200/1700 at 1.225 V, I get a maximum of 85 C on VRM1. Is it OK? If I don't overclock (1070/1550 @ 1.131 V), VRM1 is 65 degrees. So is 85 bad? Do I have to worry?


----------



## nubki11a

Quote:


> Originally Posted by *kersoz2003*
> 
> With my Sapphire Vapor-X R9 280X overclocked to 1200/1700 at 1.225 V, I get a maximum of 85 C on VRM1. Is it OK? If I don't overclock (1070/1550 @ 1.131 V), VRM1 is 65 degrees. So is 85 bad? Do I have to worry?


If that's in stress tests I wouldn't be too worried. You might wanna tone it down a little if you get 85 during gaming. It's still safe though; around 92 would be the absolute maximum, which you generally wanna avoid.
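The rule of thumb in this exchange can be written down as a tiny checker. Note the thresholds (85 C acceptable while deliberately stress testing, ~92 C as the hard ceiling) are this thread's lore, not an official AMD spec:

```python
# Rough VRM-temperature sanity check based on the advice above.
# Thresholds are this thread's rule of thumb, NOT official AMD limits.
def vrm_verdict(temp_c, stress_test=False):
    """Classify a Tahiti/Pitcairn VRM temperature reading."""
    if temp_c >= 92:
        return "back off"  # treated as the absolute maximum in this thread
    if temp_c >= 85:
        # 85 C is tolerable during a deliberate stress test,
        # but worth toning down if it shows up in normal gaming
        return "ok under stress test" if stress_test else "tone it down"
    return "fine"
```

By that yardstick, the 85 C stress-test reading above is tolerable, and the stock-clock 65 C run is comfortably fine.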


----------



## kersoz2003

Quote:


> Originally Posted by *nubki11a*
> 
> If that's in stress tests I wouldn't be too worried. You might wanna tone it down a little if you get 85 during gaming. It's still safe though; around 92 would be the absolute maximum, which you generally wanna avoid.


Also, with my card's stock voltage (1.131 V) I can go up to 1130/1650. Is it worth the extra heat to run 1.225 V at 1200/1700?

I also tested that I can undervolt my card and run it at 1.080 V with 1070/1555 (the stock MHz values), and the VRMs are just around 40-50.

These are the temperatures while benchmarking. So is 85 OK? Or should I just use my card's stock voltage (1.131 V), go up to 1130/1650, and stay there? Or find some range between the two?


----------



## nubki11a

Quote:


> Originally Posted by *kersoz2003*
> 
> Also, with my card's stock voltage (1.131 V) I can go up to 1130/1650. Is it worth the extra heat to run 1.225 V at 1200/1700?
> 
> I also tested that I can undervolt my card and run it at 1.080 V with 1070/1555 (the stock MHz values), and the VRMs are just around 40-50.
> 
> These are the temperatures while benchmarking. So is 85 OK? Or should I just use my card's stock voltage (1.131 V), go up to 1130/1650, and stay there? Or find some range between the two?


I assume you use your card for mining, then? I must admit I'm far from an expert on that, but if you need the extra power I'd say you should OC; imo 85C is pretty good under stress tests, and normal use is likely to be a bit lower.


----------



## gt12345

Man, Newegg is getting really greedy lately. I tried to price match the Asus R9 280X against cdw.com, and Newegg removed the Iron Egg guarantee from their Asus R9 280X page, lmao.


----------



## neurotix

Arguing, sigh.


----------



## cookiesowns

So instead of having a buddy pick up some R9 290s overseas, he ended up getting 2x R9 280X PowerColor TurboDuo cards. Reviews so far seem nice, and I've physically confirmed that they have Hynix memory. Not sure if I should sell, use, or mine with them. It's a late-December build, so hopefully they have Tahiti XTL chips. Advertised as native DX11.2 support, so who knows.

Any recommendations?

Anyone have 280X's in Crossfire on a Z87 platform? Any driver bugs? Eyefinity issues?


----------



## Tasm

Got my GA R9 280x.

The cooler is by far better than the Asus DC 3-slot cooler I had on my 7950.

It has no VRM temperature sensor, and the fan speed meter is totally bugged; the readings are at 76,000 RPM!

Anyone with this bug?


----------



## Modovich

Guys, what's the XFX R9 280X like? Looking to get two to add to my 2x 7970 rig for mining. Is it good when it comes to temp/noise/heat ratio?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Modovich*
> 
> Guys, what's the XFX R9 280X like? Looking to get two to add to my 2x 7970 rig for mining. Is it good when it comes to temp/noise/heat ratio?


http://www.overclock.net/products/xfx-radeon-double-d-r9-280x-1000mhz-boost-ready-3gb-ddr5-2xmdp-hdmi-2xdvi-graphics-cards-r9-280x-tdfd/reviews/6485

F3ERS 2 ASH3S wrote a review on it.


----------



## Devildog83

Quote:


> Originally Posted by *gt12345*
> 
> Man, Newegg is getting really greedy lately. I tried to price match the Asus R9 280X against cdw.com, and Newegg removed the Iron Egg guarantee from their Asus R9 280X page, lmao.


Quote:


> Originally Posted by *Modovich*
> 
> Guys, what's the XFX R9 280X like? Looking to get two to add to my 2x 7970 rig for mining. Is it good when it comes to temp/noise/heat ratio?


Prices on those R9 cards seem to be getting ridiculous, except for the 270x and below. I am not even going to consider one until the mass psychosis is gone. In a year, maybe.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Modovich*
> 
> Guys, what's the XFX R9 280X like? Looking to get two to add to my 2x 7970 rig for mining. Is it good when it comes to temp/noise/heat ratio?


I get 730 kH/s at 70C with a single card.


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I get 730 kH/s at 70C with a single card.


How much money (I mean actual cash) do you miners think you make in a day? Is it worth building a separate mining system? Say someone put together a system with 2 x 290's with just the basics and good cooling; who cares what it looks like as long as it's cool and efficient. I have a case and power supply already. I don't think it would be worth it when the video cards themselves will cost over $1,100, unless you love a lot of heat and noise. If you could make enough mining, though, the investment might be worth it. What do you think?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> How much money (I mean actual cash) do you miners think you make in a day? Is it worth building a separate mining system? Say someone put together a system with 2 x 290's with just the basics and good cooling; who cares what it looks like as long as it's cool and efficient. I have a case and power supply already. I don't think it would be worth it when the video cards themselves will cost over $1,100, unless you love a lot of heat and noise. If you could make enough mining, though, the investment might be worth it. What do you think?


It really depends. You have to work it out as: (amount of coin mined × coin value), minus power costs, exchange/selling fees, and transfer costs.
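That back-of-the-envelope arithmetic can be sketched in a few lines. All the numbers in the example call are made-up placeholders (hash rates, coin prices, and electricity rates move constantly), so treat this as arithmetic only, not a profitability claim:

```python
# Daily mining profit, per the formula above:
# revenue (coins/day x coin price) minus power, exchange, and transfer costs.
def daily_profit(coins_per_day, coin_price, watts, kwh_price,
                 exchange_fee_rate=0.0, transfer_fee=0.0):
    revenue = coins_per_day * coin_price
    power_cost = (watts / 1000.0) * 24 * kwh_price  # kWh per day x rate
    fees = revenue * exchange_fee_rate + transfer_fee
    return revenue - power_cost - fees

# Hypothetical single-card example: a 280X drawing ~250 W at $0.12/kWh
profit = daily_profit(coins_per_day=0.05, coin_price=25.0,
                      watts=250, kwh_price=0.12,
                      exchange_fee_rate=0.002, transfer_fee=0.01)
```

Scale that across cards, then subtract the hardware cost amortized over its useful life, and you have a rough answer to the "is a dedicated rig worth it" question for whatever prices hold at the time.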


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> It really depends. You have to work it out as: (amount of coin mined × coin value), minus power costs, exchange/selling fees, and transfer costs.


I just want to know if there is enough money in it to justify spending $1,500 or more on a rig just for mining, or whether it would be a waste of money.


----------



## Farih

Check this 270x on fire!

Nr. 1 in 3DMark11 with a 2600K. (I can't browse through other processors on their website; if somebody can, then please do so.)

Also Nr. 1 in Firestrike with a 2600K.
Again, when I browse results to compare with other CPUs, the site messes up.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Farih*
> 
> Check this 270x on fire!
> 
> Nr. 1 in 3DMark11 with a 2600K. (I can't browse through other processors on their website; if somebody can, then please do so.)
> 
> Also Nr. 1 in Firestrike with a 2600K.
> Again, when I browse results to compare with other CPUs, the site messes up.


my 3dmark11 http://www.3dmark.com/3dm11/7834966


----------



## Archea47

Quote:


> Originally Posted by *Tasm*
> 
> Got my GA R9 280x.
> 
> The cooler is by far better than the Asus DC 3-slot cooler I had on my 7950.
> 
> It has no VRM temperature sensor, and the fan speed meter is totally bugged; the readings are at 76,000 RPM!
> 
> Anyone with this bug?


I have a couple (Rev 2).

Yes, they don't appear to have a VRM sensor (my only complaint).

In my Crossfire setup I couldn't control GPU2's fan without disabling ULPS (set all instances of EnableUlps to 0 in the registry). I thought that was just an issue when in Crossfire, though.
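For anyone trying the same tweak: the `EnableUlps` values live under the display-adapter keys in the Windows registry (commonly reported under `HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\00xx` for Catalyst drivers - that path and the `EnableUlps`/`EnableUlps_NA` names are from community reports, not AMD documentation). Here is a small sketch of the "force every EnableUlps to 0" logic, written as a pure function so you can dry-run it on the values you read out (e.g. via `winreg` or regedit) before changing anything:

```python
# Sketch of the ULPS tweak described above: find every EnableUlps-style
# value in an adapter key and plan to force it to 0 (ULPS disabled).
# Pure function: feed it the {value_name: data} dict read from one
# registry key and apply the returned updates yourself.
def plan_ulps_disable(values):
    return {name: 0
            for name, data in values.items()
            if "enableulps" in name.lower() and data != 0}

# Example: a typical Catalyst adapter key (names assumed from forum reports)
adapter_key = {"EnableUlps": 1, "EnableUlps_NA": 1, "MemoryClock": 1250}
updates = plan_ulps_disable(adapter_key)  # both ULPS values planned -> 0
```

Keeping the decision logic separate from the registry writes makes it easy to review exactly which values would change on a multi-GPU system before committing.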


----------



## Devildog83

Quote:


> Originally Posted by *Farih*
> 
> Check this 270x on fire!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> Nr. 1 in 3DMark11 with a 2600K. (I can't browse through other processors on their website; if somebody can, then please do so.)
> 
> Also Nr. 1 in Firestrike with a 2600K.
> Again, when I browse results to compare with other CPUs, the site messes up.


That is on fire for a 270x. Nice work.


----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> [/SPOILER]
> 
> That is on fire for a 270x. Nice work.


Should I go for 1.35 V now?
Temps are good and the fan isn't near 100% yet, but I'm getting scared, lol.

If it breaks, I've only got a 260x to fall back on...


----------



## Devildog83

Quote:


> Originally Posted by *Farih*
> 
> Should I go for 1.35 V now?
> Temps are good and the fan isn't near 100% yet, but I'm getting scared, lol.
> 
> If it breaks, I've only got a 260x to fall back on...


Are the temps on the VRM's good too? I think it would handle it as long as you can keep the temps down. You could wait until someone beats your score and then take the crown back.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> Are the temps on the VRM's good too? I think it would handle it as long as you can keep the temps down. You could wait until someone beats your score and then take the crown back.


That's the only thing I'm having an issue with on this XFX 280x: the VRMs are pretty hot. I think I could OC a lot more if I could cool them down further. I will have to think hard about how to do it.

Oh, Devil:

Think about mining like stocks. There is money to be made in it, and a coin is only worth as much as the demand for it. As for the payout and making a profit, there are several ways to do it: you can mine and then start buying, trading, and playing the other coins off each other. Kind of like how Litecoin is supposed to be the silver to Bitcoin's gold.

As for investing, it is very hard to say, since something like a government putting sanctions on it could cripple the price. But then again, Bitcoin dropped in half when China cracked down on it, and about 3 weeks later it is now at 85% of where it was before.

There is no black-and-white answer, but I hope that puts it in the perspective you are looking for.


----------



## punkrocker

Can someone tell me which BIOS might be best for my MSI 280X cards with respect to the lowest possible temperature? I found these four:


MSI.R9280X.3072.131009.rom
MSI.R9280X.3072.131029.rom
MSI.R9280X.3072.131011.rom
MSI.R9280X.3072.130912.rom


----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> Are the temps on the VRM's good too? I think it would handle it as long as you can keep the temps down. You could wait until someone beats your score and then take the crown back.


The VRM on the 270x Toxic has no sensor, so I have no idea how hot it really is.

I just use 60% fan speed when running at 1.3 V, and the core doesn't get over 66 degrees.

I would like to know if my Toxic is world Nr. 1 now compared to other CPUs too, but Futuremark's website keeps acting funny when I browse results.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Farih*
> 
> The VRM on the 270x Toxic has no sensor, so I have no idea how hot it really is.
> 
> I just use 60% fan speed when running at 1.3 V, and the core doesn't get over 66 degrees.
> 
> I would like to know if my Toxic is world Nr. 1 now compared to other CPUs too, but Futuremark's website keeps acting funny when I browse results.


World Number 3 & 4 (you got the 4th spot):

http://www.3dmark.com/compare/fs/1556751/fs/1421321

World number 1 & 2:

http://www.3dmark.com/compare/fs/1447148/fs/1501187


----------



## Farih

Quote:


> Originally Posted by *Sgt Bilko*
> 
> World Number 3 & 4 (you got the 4th spot):
> 
> http://www.3dmark.com/compare/fs/1556751/fs/1421321
> 
> World number 1 & 2:
> 
> http://www.3dmark.com/compare/fs/1447148/fs/1501187


Thanks for the links.

Something seems out of place with those Firestrike scores.
I got a lot more in the physics score than all of them; for them to have that many more points in graphics is like having their cards over 1500 MHz.
Not saying there can't be something wrong with my own PC, though.


----------



## F3ERS 2 ASH3S

Here is something... more evidence that AMD needs to do more work on the driver side.

After the driver failed due to an overclock, I started noticing similarities to what Hamzta had, though not quite the same. A few others also noticed screen tears; I saw that too.

The only difference is that I see it once the drivers fail. Not sure what that means exactly, but I'm pretty sure it is driver-related.


----------



## Durvelle27

Quote:


> Originally Posted by *Farih*
> 
> Check this 270x on fire!
> 
> Nr. 1 in 3DMark11 with a 2600K. (I can't browse through other processors on their website; if somebody can, then please do so.)
> 
> Also Nr. 1 in Firestrike with a 2600K.
> Again, when I browse results to compare with other CPUs, the site messes up.


Come on man, catch my 7970, lol.


----------



## Devildog83

Quote:


> Originally Posted by *Farih*
> 
> The VRM on the 270x Toxic has no sensor, so I have no idea how hot it really is.
> 
> I just use 60% fan speed when running at 1.3 V, and the core doesn't get over 66 degrees.
> 
> I would like to know if my Toxic is world Nr. 1 now compared to other CPUs too, but Futuremark's website keeps acting funny when I browse results.


Does HWiNFO64 show it like this? I don't have a sensor in the OC tool, but in this I do.


Spoiler: Warning: Spoiler!


----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> Does HWiNFO64 show it like this? I don't have a sensor in the OC tool, but in this I do.
> 
> 
> Spoiler: Warning: Spoiler!


No, I don't have VRM readings in HWiNFO64 either.


----------



## Devildog83

Here is mine compared to 270x's in Crossfire; the high for 7870's is 13,560. As 7870's I would be second, and as 270x's I would be 1st. Only valid results are counted.

http://www.3dmark.com/compare/3dm11/7749206/3dm11/7790663

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/3dm11/P/1541/765/500000?minScore=0&cpuName=AMD%20FX-8350&gpuName=AMD%20Radeon%20HD%207870


----------



## kersoz2003

*Solution found for the Sapphire Vapor-X R9 280X VRM overheating and throttling problem - GOOD NEWS!*

After a lot of searching into my new Sapphire Vapor-X R9 280X card's VRM overheating problem, I finally found the solution.

It is actually a BIOS problem. Cards from this series with the old BIOS never have VRM overheating, but with the new BIOS the VRMs get hot. Especially when overclocked to high frequencies at around 1.25-1.3 V, you can see the card go up to 100-110 C VRM heat and throttle. NOW TAKE A DEEP BREATH, because I found the solution.

Sapphire uses 2 different VRAM types in these cards, and one of them gets hot after a certain level of overclock; the latest BIOS (which shipped with the latest batch of this series) is not suitable for cards with that VRAM. In a forum (a GPU mining forum, actually) I found the solution: people there mod the bad BIOS for the new series and make the VRMs run 20-30 C cooler. For example, mine was 90-95 in a stress test and now it is 70-75. :thumb:

You just need to go to the forum and find the correct BIOS in the topic:

https://litecointalk.org/index.php?topic=10313.0

and also here:

https://litecointalk.org/index.php?topic=12369.225

Or just use the BIOS I use for the Sapphire Vapor-X R9 280X:

http://www.upload.gen.tr/d.php/www/eTyTg/Sapphire_E21004_V44_K2_NT_AGR.rom.html

You can use "winflash" to flash the BIOS from Windows 8.

Now I am enjoying a fully overclocked, throttle-free, cool, and very high-performance card.

I am sure Sapphire is going to make a new BIOS like these people did and solve it officially, but if you don't want to wait for Sapphire, just do as I described above.

Other 2xx-series Sapphire users (and maybe other brands) can also use this as a guide; this link in particular has modded BIOSes for each card:

https://litecointalk.org/index.php?topic=12369.225


----------



## Devildog83

Quote:


> Originally Posted by *Farih*
> 
> No, I don't have VRM readings in HWiNFO64 either.


Bummer. If you are only in the 60's on the core I think you would be OK, but I would hate to steer you wrong. My VRMs will get a few degrees hotter than the core, but they also take more heat. I just turn my fans to 80%+ when I am benching with high clocks.


----------



## F3ERS 2 ASH3S

Is the only difference between these BIOSes the clocks?

http://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=XFX&model=&interface=&memType=&memSize=&did=1002-6798--226f%2C1002-6798--2273%2C1002-6798--2775%2C1002-6798--2776%2C1002-6798--2777%2C1002-6798--3001%2C1002-6798--3002%2C1002-6798--3003%2C1002-6798--3004%2C1002-6798--3005%2C1002-6798--3006


----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> Bummer. If you are only in the 60's on the core I think you would be OK, but I would hate to steer you wrong. My VRMs will get a few degrees hotter than the core, but they also take more heat. I just turn my fans to 80%+ when I am benching with high clocks.


Last attempt:

1345/1575 MHz @ 1.337 V (max 61 degrees at 85% fan)

Still gotta figure out why I score lower than I should in Firestrike (or why the other 3 score higher than they should, lol).

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Is the only difference between these BIOS just clocks?
> 
> http://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=XFX&model=&interface=&memType=&memSize=&did=1002-6798--226f%2C1002-6798--2273%2C1002-6798--2775%2C1002-6798--2776%2C1002-6798--2777%2C1002-6798--3001%2C1002-6798--3002%2C1002-6798--3003%2C1002-6798--3004%2C1002-6798--3005%2C1002-6798--3006


See for yourself by opening it in VBE7 or something similar


----------



## Devildog83

I have an issue with Firestrike too: I get good graphics scores and not-bad physics, but the combined is horrible.


----------



## neurotix

Same thing happens to me. My 290 should do 10k+ in Firestrike but my combined is so low I only get 7500.

Consider me disappointed.


----------



## hamzta09

Why do VRAM and core show the SAME temperatures, just with VRM #1 and #2 inverted?

Much temps.
Very 97c.
Wow.

GPU 1 is crazy hot.
GPU 2's max is 80c.

This was accomplished running Valley at Ultra with 8x AA. The more AA, the hotter they get.

And apparently Afterburner shows the diode temp; a shame, really, when you think it's 10+ C cooler than it really is.


----------



## cookiesowns

Quote:


> Originally Posted by *hamzta09*
> 
> 
> Why do VRAM and core show the SAME temperatures, just with VRM #1 and #2 inverted?
> 
> Much temps.
> Very 97c.
> Wow.
> 
> GPU 1 is crazy hot.
> GPU 2's max is 80c.
> 
> This was accomplished running Valley at Ultra with 8x AA. The more AA, the hotter they get.
> 
> And apparently Afterburner shows the diode temp; a shame, really, when you think it's 10+ C cooler than it really is.


wat.

Read the description: HWiNFO is reporting off the PMBus (power monitoring bus) of the CHiL VRM controller. VRM temps @ 97 C are perfectly fine, especially when your GPU core (diode) is at 87. Poor mount/ventilation?


----------



## hamzta09

Quote:


> Originally Posted by *cookiesowns*
> 
> wat.
> 
> Read the description: HWiNFO is reporting off the PMBus (power monitoring bus) of the CHiL VRM controller. VRM temps @ 97 C are perfectly fine, especially when your GPU core (diode) is at 87. Poor mount/ventilation?


My core is not 87? Did you mean 83?

But there are 2 VRMs: one shows 97 and the other 89.

OK, so the diode is the core - you sure?

I have a fan at the front running at full blast and one in the rear sucking the air out, plus one at the bottom blowing cold air in. I will buy more fans, but I can't do anything on the side panel; whenever I place a fan on it, it causes this annoying weeeeeeeee whine noise.

Is there a way to undervolt the card? Changing the voltage in Afterburner does nothing.

Card #1 runs at 1.225 V.
Card #2 runs at 1.275 V.

Under load, I can't get them lower.


----------



## hamzta09

Or are Afterburner/GPU-Z reading the wrong VDDC?

HWiNFO shows 1.200 V for both cards, but Afterburner/GPU-Z show 1.225 for #1 and 1.275 for #2.

Is there a way to unlock voltage adjustment via VBE7? Or can you simply not change the voltage in Crossfire?


----------



## madorax

Quote:


> Originally Posted by *kersoz2003*
> 
> *Solution found for the Sapphire Vapor-X R9 280X VRM overheating and throttling problem - GOOD NEWS!*
> 
> After a lot of searching into my new Sapphire Vapor-X R9 280X card's VRM overheating problem, I finally found the solution.
> 
> It is actually a BIOS problem. Cards from this series with the old BIOS never have VRM overheating, but with the new BIOS the VRMs get hot. Especially when overclocked to high frequencies at around 1.25-1.3 V, you can see the card go up to 100-110 C VRM heat and throttle. NOW TAKE A DEEP BREATH, because I found the solution.
> 
> Sapphire uses 2 different VRAM types in these cards, and one of them gets hot after a certain level of overclock; the latest BIOS (which shipped with the latest batch of this series) is not suitable for cards with that VRAM. In a forum (a GPU mining forum, actually) I found the solution: people there mod the bad BIOS for the new series and make the VRMs run 20-30 C cooler. For example, mine was 90-95 in a stress test and now it is 70-75. :thumb:
> 
> You just need to go to the forum and find the correct BIOS in the topic:
> 
> https://litecointalk.org/index.php?topic=10313.0
> 
> and also here:
> 
> https://litecointalk.org/index.php?topic=12369.225
> 
> Or just use the BIOS I use for the Sapphire Vapor-X R9 280X:
> 
> http://www.upload.gen.tr/d.php/www/eTyTg/Sapphire_E21004_V44_K2_NT_AGR.rom.html
> 
> You can use "winflash" to flash the BIOS from Windows 8.
> 
> Now I am enjoying a fully overclocked, throttle-free, cool, and very high-performance card.
> 
> I am sure Sapphire is going to make a new BIOS like these people did and solve it officially, but if you don't want to wait for Sapphire, just do as I described above.
> 
> Other 2xx-series Sapphire users (and maybe other brands) can also use this as a guide; this link in particular has modded BIOSes for each card:
> 
> https://litecointalk.org/index.php?topic=12369.225


I tried the MSI 280X Gaming version for my card, and I can confirm with GUIMiner that both VRM temps on my card dropped from 74C/59C to 62C/46C, and the MHash rate also increased... thanks for the share, man.


----------



## Necrochain

Sapphire R9 270x Dual-X

Stock 1070/1400
Overclock 1200/1600

http://www.3dmark.com/fs/1561271


----------



## Inacoma79

Quote:


> Originally Posted by *hamzta09*
> 
> I have a fan at the front running at full blast and one in the rear suck the air out. One at the bottom blowing cold air in. I will buy more fans, but I cant doanything on the sidepanel, whenver I place a fan on it, it causes this annoying weeeeeeeeeeeeeeeeeeeeeeeee whine noise.


I had that problem too with side-panel fans. To fix the noise, you need spacers and 1" screws with a thread pattern similar to the fan's mounting holes. Here's a pic of what I did; the fan is silent now. The spacers and screws from Home Depot cost me $2-3. You can even take a black Sharpie to the screw heads to match the color of the panel.


Spoiler: Warning: Spoiler!


----------



## Farih

Check this:



Rank 1 270x









1362/1550 MHz @ 1.35 V (56°C max at 84% fan)


----------



## nubki11a

Quote:


> Originally Posted by *Farih*
> 
> Check this:
> 
> 
> 
> Rank 1 270x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1362/1550mhz @ 1.35V (56 degrees max at 84% fan)


Congratz


----------



## Devildog83

Necrochain has been added, welcome.

You have shed more light on an issue I have been having with Fire Strike. If you look at our two scores, my graphics score is way higher, as is my physics score, but for some unknown reason my combined score is extremely low; I have not been able to figure it out. I get about 10 FPS in the combined test no matter what I do.

My score,


http://www.3dmark.com/3dm/2240189

Here is yours,


http://www.3dmark.com/fs/1561271


----------



## kersoz2003

Quote:


> Originally Posted by *madorax*
> 
> tried the MSI 280X gaming version for my card, and i can confirm with guimining both VRM temp in my card drop from before 74c / 59c to 62c/46c, and the Mhash rate is also increasing... thanks for the share man


Not at all.

I heard a lot of people couldn't find a solution to the bad BIOS that causes VRM overheating, so I searched for it and found the fix.

Some of the most recent BIOS versions seem to alter the configuration of the VRM controller in a way it shouldn't be done. They do not raise voltages, but change the VRM parameters instead. That causes the VRM temperatures to rise by over 20°C, and therefore the card throttles pretty easily. 7970 and 280X cards are affected, and possibly the 7950 too (if equipped with the CHL8228 controller).

This BIOS mod sets the VRM controller back up the way it should be.


----------



## kersoz2003

Quote:


> Originally Posted by *hamzta09*
> 
> 
> Why does VRAM and Core show the SAME temperatures just VRM #1 and #2 inverted?
> 
> Much temps.
> Very 97c.
> Wow.
> 
> GPU 1 is crazy hot.
> GPU 2's max is 80c.
> 
> This was accomplished running Valley at Ultra with 8x AA.
> The more AA the hotter they get.
> 
> And apparently, afterburner shows the Diode temp, shame really, when you think its 10+c cooler than it really is.


*GOOD NEWS: a solution has been found for the Sapphire Vapor-X R9 280X VRM overheating and throttling problem; see the modded-BIOS guide posted earlier in the thread.*



----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> Necrochain has been added, welcome.
> 
> You have shed more light on an issue I have been having with Firestrike. If you look at our 2 scores my Graphics are way higher as is my Physics but for some unknown reason my combined score is extremely low, I have not been able to figure this out. I get about 10 FPS in the combined test no matter what I do.
> 
> My score,
> 
> 
> http://www.3dmark.com/3dm/2240189
> 
> Here is yours,
> 
> 
> http://www.3dmark.com/fs/1561271


What I have noticed is that it isn't my score that is strange; it's the others.

My scores are higher than all other 270Xs by rather a lot, yet two systems with a stock CPU and GPU outdo me by thousands of points in the graphics score. In theory their cards would need to be at something like 2 GHz on the core to show such a difference from mine. Meanwhile there are thousands of people below me and just a very few getting close.

I bet those few systems cheated and Futuremark hasn't seen/noticed it. They get more performance with one stock card and a stock CPU than some results with two cards, lol... Impossible!
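The "2 GHz" estimate above can be sanity-checked with a rough rule: on the same GPU, graphics score scales at best about linearly with core clock, so the clock a rival card would need is roughly their score divided by yours, times your clock. A minimal sketch; the 7000/9000 scores below are made-up illustrative numbers, not the actual results from the thread:

```python
def implied_core_mhz(my_mhz: float, my_graphics: float, their_graphics: float) -> float:
    """Core clock a rival card would need to reach their graphics score,
    assuming score scales (at best) linearly with core clock."""
    return my_mhz * their_graphics / my_graphics

# A card at 1362 MHz scoring 7000 versus a "stock" result scoring 9000
# implies a ~1750 MHz core, far beyond what a stock 270X can run.
print(round(implied_core_mhz(1362, 7000, 9000)))  # -> 1751
```

The linear-scaling assumption is generous (memory bandwidth and CPU limits usually make scaling sub-linear), which only strengthens the argument that such scores are implausible on a stock card.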


----------



## JCH979

I think it's just the Futuremark hardware monitoring being weird and not detecting people's hardware or OCs correctly. It's happened with a couple of results of mine, where it showed my CPU and GPU at stock clocks even though they were not. I just restarted the app and everything was detected as it should have been.

Example:

My result with 1200/1666: http://www.3dmark.com/fs/1507352

same OC as above showing stock clocks: http://www.3dmark.com/fs/1507378


----------



## Necrochain

Quote:


> Originally Posted by *Devildog83*
> 
> Necrochain has been added, welcome.
> 
> You have shed more light on an issue I have been having with Firestrike. If you look at our 2 scores my Graphics are way higher as is my Physics but for some unknown reason my combined score is extremely low, I have not been able to figure this out. I get about 10 FPS in the combined test no matter what I do.
> 
> My score,
> 
> 
> http://www.3dmark.com/3dm/2240189
> 
> Here is yours,
> 
> 
> http://www.3dmark.com/fs/1561271


To be honest, I don't know why either. I've looked up the top results for my hardware and seen people 200+ points ahead in Fire Strike on much lower clocks. Then again, it does depend on the cooling and whether you have a good chip.


----------



## ricko99

So I just received my new ASUS R9 280X TOP, but it seems I can't overclock the card from AMD OverDrive or MSI Afterburner. Every time I change the slider for the core clock or memory clock, the slider moves, but when I check in GPU-Z, both values stay the same. I was, however, able to alter the fan speed.

Anyone know what the problem is here?


----------



## Farih

Quote:


> Originally Posted by *JCH979*
> 
> I think it's just the Futuremark hardware monitoring being weird and not detecting people's hardware or OC's correctly. It's happened with a couple results of mine where it showed that my cpu and gpu where at stock clocks even though they were not. I just rebooted the app and everything was detected as it should have been.
> 
> Example:
> 
> My result with 1200/1666: http://www.3dmark.com/fs/1507352
> 
> same OC as above showing stock clocks: http://www.3dmark.com/fs/1507378


Yes, I know that does happen sometimes; you see a lot of benches stating idle clocks as well.

Thing is, they get a few thousand points more while my card is already at 1360 MHz; such a difference should mean they have their cards clocked to something like 2000 MHz. It's only 3 people, by the way; that's why I think maybe something was wrong or different on their end, or Futuremark messed up. My physics score is also much higher than those 3; it makes no sense that they can score that much higher.

Here, have a look:
http://www.3dmark.com/compare/fs/1562726/fs/1447148
It just makes no sense; look at the difference in graphics while mine is already at 1360 MHz. From his physics score you can tell his CPU runs stock.


----------



## JCH979

I am pretty sure I have seen other scores like that, but it is rather odd. I'm not sure what is going on with that guy's score; they either have some really good OC'ing cards or something isn't being read correctly.

Nice overclock btw









Just noticed that person's graphic score is as high as my 280X's max OC of 1290/1830: http://www.3dmark.com/compare/fs/1490857/fs/1447148


----------



## [CyGnus]

ricko99, try ASUS GPU Tweak; it works fine for me.


----------



## Devildog83

Quote:


> Originally Posted by *ricko99*
> 
> so i just received my new asus r9 280x TOP but it seems like i can't overclock the card from amd overdrive or msi afterburner. everytime i change the slider for core clock or memory clock, it changes but when i check on GPU Z, both value do not change. But i was managed to alter the fan speed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> anyone knows what is the problem here?


Go into the settings, check "extend overclocking limits" and disable ULPS, then restart the computer and see if that fixes it.


----------



## Devildog83

Quote:


> Originally Posted by *Necrochain*
> 
> To be honest I don't know why either. I've looked up the top results for my hardware and seen people with 200+ points in Fire Strike on much lower clocks. Again it does depend on the cooling and if you have a good chip.


My temps for the CPU core, socket, and NB stay very nice: NB in the high 40s, and CPU socket and core in the high 50s. I don't know if the NB temps cover the VRMs, but they are under that same heatsink. The GPU temps barely go over 70°C, and the bottom card maxes out in the low 60s. It might be how I am overclocking the CPU; I've thought about going back to stock and seeing what happens.

ricko99 has been added, Welcome!!!


----------



## ricko99

Quote:


> Originally Posted by *Devildog83*
> 
> Go into the settings and click extend overclocking limits and disable ULPS and restart the computer, see if that fix's it.


Tried both, but same result. The weird thing is that when I start a game, the clock actually shows the one I set, but GPU-Z stays the same.

Quote:


> Originally Posted by *[CyGnus]*
> 
> ricko99 try to use Asus Gpu Tweak with me it works fine


I don't really fancy ASUS GPU Tweak; it doesn't have an on-screen display for temps and such.


----------



## hamzta09

I just modified the BIOS on both cards and lowered the Performance VDDC #2 from 1.200 down to 1.168, and it still runs, ~10°C cooler so far.

But I don't know what to use to monitor the VDDCs, as Afterburner shows 1.225 and 1.275 and HWiNFO shows 1.200. GPU-Z fluctuates constantly between 0.850 and 1.154; the max is 1.154, which GPU #2 reached, and 1.146, which GPU #1 reached.


----------



## Farih

Slowly discovering why some scores are so far off.

I noticed first that all of the way-too-high scores are from people running SystemInfo version 4.23 instead of 4.24. I opened a thread about this on Futuremark.com and got a quick response.

We have pretty much ruled out the SystemInfo version; instead, the new-series AMD cards are sometimes being mislabeled, with 270Xs showing up as a 280X or 290X, which explains why they score so high.

Look at this 15K+ graphics score from a 270X, lol. Impossible!
http://www.3dmark.com/3dm11/7754850

Futuremark hopes to get back on this Monday.

I hope they solve it fast; I want my #1 positions in 3DMark 11 and Fire Strike like I should have.


----------



## Farih

Quote:


> Originally Posted by *hamzta09*
> 
> I just modified by BIOS on both cards and lowered the Performance VDDC #2 from 1.200 down to 1.168 and it still runs, and ~10c cooler so far.
> 
> BUt I dont know what to use to monitor the VDDCs as Afterburner shows 1.225 and 1.275 and HWInfo shows 1.200.
> GPU-Z Fluctuates constantly between 0.850 and 1.154. Max is 1.154 which GPU #2 reached and 1.146 which GPU #1 reached.


MSI AB is messed up on the voltage reading for me too.
GPU-Z does show me proper values, though, and they are the same as in HWiNFO.


----------



## hamzta09

Quote:


> Originally Posted by *Farih*
> 
> MSI AB is messed up on voltage reading for me to.
> GPU-Z does show me proper value's though and they are the same as in HWinfo.


After a couple of reboots, my Afterburner now shows the VDDC on both GPUs at a constant 1.169 V.
GPU-Z fluctuates between 1.050 and 1.154.
HWiNFO is still stuck at 1.200.


----------



## Jawwwwsh

Out of interest, how are people overclocking their R9 cards nowadays?


----------



## Farih

Quote:


> Originally Posted by *hamzta09*
> 
> My afterburner now after a couple reboots, now shows the VDDC on both GPUs at 1.169v constant.
> GPUZ fluctuates between 1.050 and 1.154
> HWINfo still is stuck at 1.200


Strange...
When in doubt, I would just go by the highest reading.
Quote:


> Originally Posted by *Jawwwwsh*
> 
> Out of interest, how are people overclocking their R9 cards nowadays?


I do it by changing the BIOS and adjusting clocks in MSI AB.


----------



## hamzta09

Does anyone here have The Witcher 2 and can run it at Ultra with ubersampling turned off and the memory size on the second-largest setting?

I get a black screen when accessing the various in-game menus after playing for 10-15 minutes. I want to know if it's related to the voltages or just the game.


----------



## Jawwwwsh

BIOS GPU OC?! I have never seen that as an option before! I'd love BIOS overclocking for a GPU like with CPUs, but I'm not sure my mobo supports that!


----------



## Farih

Quote:


> Originally Posted by *Jawwwwsh*
> 
> BIOS GPU OC?! I have never seen that as an option before!! I'd love a BIOS overclocking ability for a GPU like with CPUs, but not sure my mobo supports that!


You change the BIOS on the GPU, not the motherboard. You change it to increase the voltage when you can't in software like MSI AB, or when you want more than what it allows.

You can change the clocks in the BIOS too, but I use MSI AB for that; I only change the voltage and the power state in the BIOS.


----------



## cookiesowns

Interesting.

I've been having some problems with CrossFire (I really should have just gone with the single 290 option). In BF4 I constantly get OOM errors, even though 8 GB of RAM should be plentiful. Latest 13.12 drivers. I even raised my page file to 8 GB, and BF4 instantly commits the full 8+4 GB.

Another thing is that these cards run HOT! It looks like the first card gets heat-soaked. 2x R9 280X PowerColor PowerDUO. The VRM temps are quite good, though: < 80°C under heavy load at around 1.2 VDDC. It's just unfortunate that MSI Afterburner can't interface with these cards, so I still need to find a utility that can.

I think I'm maybe one of the first people with cards that run hotter on the diode than on the VRMs, lol.


----------



## Jawwwwsh

Quote:


> Originally Posted by *Farih*
> 
> You change the BIOS on the GPU, not the motherboard.
> You change it to increase voltage when you cant in software like MSI AB or want more then what they give.
> 
> You can change clocks in the BIOS to but i use MSI AB for that, i only change the voltage in the BIOS and the powerstate.


Well, now that makes a lot more sense! I'm only running an R9 270 (single 6-pin), so I probably won't bother modding the voltage. I'll try out MSI Afterburner though, as I'm not a fan of the Catalyst suite one bit!


----------



## hamzta09

Quote:


> Originally Posted by *cookiesowns*
> 
> Interesting.
> 
> I've been having some problems with CrossFire ( really should have just stuck with 1x 290 option ). In BF4 I will constantly get OOM errors, even though 8GB of RAM should be plentiful. Latest 13.12 drivers. I even raised my page file to 8GB and BF4 instantly will commit to the full 8+4GB setup.
> 
> Another thing is these cards run HOT! Looks like the first card gets heat soaked. 2x R9 280X Powercolor PowerDUO. VRM temps are quite good though, < 80C under heavy load and around 1.2VDDC. It's just unfortunate that MSI afterburner can't interface with these cards so I still need to find a utility that can.
> 
> I think I'm maybe one of the first people with cards that run hotter on diode than VRM lol.


Raise your page file and it's solved.

I got OOM after just a couple of minutes when using a 200-1024 MB page file on the SSD. I raised it to 400-4096 (or 3072, not sure) and the problem went away. Just the OS being terrible.

If you need to lower your voltage, just use VBE7 (a .rom tweaker) and lower the voltages by ~0.010-0.030 V on the performance state, the one that runs at the highest clocks.
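The logic behind the page-file fix: the page file has to absorb however much the game's peak commit charge exceeds physical RAM. A back-of-the-envelope sketch; the 25% headroom factor is my own assumption, not something from this thread:

```python
def min_pagefile_gb(peak_commit_gb: float, ram_gb: float, headroom: float = 1.25) -> float:
    """Smallest page file (GB) covering peak commit beyond physical RAM,
    with a safety margin so the OS isn't hitting OOM right at the limit."""
    shortfall = max(0.0, peak_commit_gb - ram_gb)
    return round(shortfall * headroom, 1)

# BF4 committing ~12 GB on an 8 GB machine (as described above) leaves a
# 4 GB shortfall; with 25% headroom that suggests a ~5 GB page file.
print(min_pagefile_gb(12, 8))  # -> 5.0
```

A tiny 200-1024 MB page file, like the one that caused the OOMs above, covers almost none of that shortfall, which is why raising it makes the errors go away.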


----------



## JCH979

Quote:


> Originally Posted by *Farih*
> 
> Slowly discovering why some score's are so far off.
> 
> I noticed first that all the way to high score's are from people running systeminfo version 4.23 instead of 4.24.
> I opened a threat about this on Futuremark.com and got a quik responce.
> 
> We kinda ruled out that it isnt the systeminfo version but the new series AMD cards sometime's beeing mislabeled.
> 270x's showing up as 280x or 290x wich explains why they score so high.
> 
> Look at this 15K+ graphics score of a 270x lol
> Impossible !
> http://www.3dmark.com/3dm11/7754850
> 
> Futuremark hope's to get back on this monday.
> 
> Hope they solve it fast, i want my NR1 positions in 3D11 and Fire Strike like i should have


Great investigative work. Yeah, that 15k score seems pretty ridiculous, lol. I barely manage 13k @ 1290/1830: http://www.3dmark.com/3dm11/7783545
Quote:


> Originally Posted by *Jawwwwsh*
> 
> Out of interest, how are people overclocking their R9 cards nowadays?


I just disabled OverDrive in CCC and use GPU Tweak.


----------



## cookiesowns

Quote:


> Originally Posted by *hamzta09*
> 
> Raise your pagefile and it is solved.
> 
> I got OOM after just a couple minutes when using a 200-1024MB pagefile on the SSD. Raised it to 400-4096 (or 3072 not sure) and problem got away. Just the OS being terrible.
> 
> If you need to lower your voltage, just use VBE7 (.rom tweaker) and lower the voltages by ~.010-30. On the performance tab, the one that runs at highest clocks.


Looks like it did. Meh. The first card is quite a poor clocker: at around 1.26 V set (1.2 V actual VDDC), it can just barely do 1180 @ 1700 mem. It seems to be stable in BF4.

The second card has a higher ASIC, so I'll most likely swap them around. It's unfortunate the fans don't move much air, but the heatsink seems to do quite a nice job wicking heat away (it gets pretty hot).

I wonder why Afterburner doesn't support voltage monitoring/control on these cards.


----------



## cinnamoncider

I bought an ASUS R9 280X last December, and I love this bad boy.

At stock speeds with the DCUII cooler, under load running FurMark for 30 minutes, it runs at 77°C with a 31°C ambient room temperature.


----------



## Devildog83

Quote:


> Originally Posted by *cinnamoncider*
> 
> I recently bought a ASUS R9 280x last December, and I love this bad boy.
> 
> At stock speed and with DCUII cooler - on load running Furmark for 30 minutes - it runs at 77°C with 31°C ambient room temperature.


Show me a pic and some clocks and I will add you to the club.


----------



## Devildog83

This looks like it would be a great addition to My Man-cave/Office/Hide-out.

http://www.newegg.com/Product/Product.aspx?Item=9SIA05F05X9402&nm_mc=EMC-MP011914&cm_mmc=EMC-MP011914-_-EMC-011914-Index-_-index-_-9SIA05F05X9402


----------



## Devildog83

Quote:


> Originally Posted by *cookiesowns*
> 
> Interesting.
> 
> I've been having some problems with CrossFire ( really should have just stuck with 1x 290 option ). In BF4 I will constantly get OOM errors, even though 8GB of RAM should be plentiful. Latest 13.12 drivers. I even raised my page file to 8GB and BF4 instantly will commit to the full 8+4GB setup.
> 
> Another thing is these cards run HOT! Looks like the first card gets heat soaked. 2x R9 280X Powercolor PowerDUO. VRM temps are quite good though, < 80C under heavy load and around 1.2VDDC. It's just unfortunate that MSI afterburner can't interface with these cards so I still need to find a utility that can.
> 
> I think I'm maybe one of the first people with cards that run hotter on diode than VRM lol.


How do I raise the page file? I only use 2 GB max right now; it could be contributing to my low combined scores.


----------



## goldswimmerb

My XFX R9 270X Best Buy DD Edition. I'm overclocking the core by 50 MHz; anything more causes the card to crash. The memory won't overclock (to a point where there is a performance increase). I'm also running three 1280x1024 monitors off this card.


----------



## cinnamoncider

Quote:


> Originally Posted by *Devildog83*
> 
> Show me a pic and some clocks and I will add you to the club.


I still use MSI Afterburner from my previous R5670. I felt too lazy to replace it with the ASUS counterpart; it still looks the same to me.


Running inside my rig.


Pic before installation.

R9 280x DCUII, Y SO SEXY?


----------



## cremelo

Does anyone know if there is a backplate yet for the ASUS R9 280X DCUII TOP?
I'm almost thinking of placing a custom order =/


----------



## hamzta09

Quote:


> Originally Posted by *Devildog83*
> 
> How do I raise the pagefile? - I only use 2Gbs max right now. It could be leading to my low combined scores.


Control Panel - System - Advanced system settings - Advanced tab - Performance Settings - Advanced tab - Virtual Memory


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Controlpanel - System - Advanced - Advanced - Performance - Advanced - Virtual Memory


^beat me to it


----------



## neurotix

Does enabling the page file really increase performance in Fire Strike?

I usually run with mine off, as it makes the OS feel faster overall.


----------



## cinnamoncider

@cremelo
My ASUS R9 280X has no backplate. It's kind of frustrating to know that the triple-slot R9 280X, the one similar to the Matrix, has one.


----------



## cremelo

Quote:


> Originally Posted by *cinnamoncider*
> 
> @cremelo
> My Asus R9 280x has no backplate. It's kind of frustrating to know that the triple slotted R9 280x that is similar to Matrix has one.


Yes, I find it somewhat nonsensical... and the biggest problem is finding someone where I live to make one, or even importing one.
I have thought about doing an acrylic one, since I may run CrossFire in maybe 6 months (I would have to import another card, as there is no official reseller here), but I believe over time it would cause problems. So I think I either have to wait, or I'll personally go for a Vapor-X or Toxic.


----------



## graycat

Rock solid, stock voltages, still has more room to go.


----------



## benjamen50

So it turns out the Gigabyte AMD R9 270X does have VRM sensors; checking with HWiNFO64, I'll watch the VRM temperatures at full load (probably while mining).
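One convenient way to do that kind of check: log the sensors to CSV while the card is under load, then scan the VRM columns for their maxima afterwards. A minimal sketch; the column headers below are placeholders, since the actual names in an HWiNFO export vary by card and sensor:

```python
import csv
import io

def max_by_column(csv_text: str, columns: list[str]) -> dict[str, float]:
    """Return the maximum value seen in each named column of a sensor log."""
    rows = csv.DictReader(io.StringIO(csv_text))
    peaks: dict[str, float] = {}
    for row in rows:
        for col in columns:
            val = float(row[col])
            peaks[col] = max(val, peaks.get(col, val))
    return peaks

# Toy log standing in for a sensor CSV export (headers are assumed).
log = """Time,VRM Temp 1 [C],VRM Temp 2 [C]
00:01,62,48
00:02,74,59
00:03,71,55
"""
print(max_by_column(log, ["VRM Temp 1 [C]", "VRM Temp 2 [C]"]))
```

For a real log, replace the toy string with the contents of the exported file and adjust the column names to whatever your card's sensors are actually labeled.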


----------



## Pesmerrga

I've had the Sapphire R9 270x 4GB Dual-X card for a while now. I was intending on getting a R9 290 or 290x but I refuse to pay $100-150+ more than MSRP on those. So I figured grab one of these for the time being. Maybe I'll grab another one and CF them or throw this in my GF's PC and grab a 290(x) if the prices come back down.

Highest stable run on FS - 5654

http://www.3dmark.com/fs/1473225

*Pics*


Spoiler: Warning: Spoiler!


----------



## Majentrix

Can I flash a 270 to a 270x BIOS?

Overclocking on a 270 is currently extremely limited.


----------



## cinnamoncider

Quote:


> Originally Posted by *cremelo*
> 
> Yes I find it somewhat nonsensical.... and the biggest problem is finding someone here where I live to do or even import...
> Ever thought about doing acrylic though if I would have her do crossfire maybe in 6 months (until you import another for here does not have official reseller) but I believe that in time would give any problems, or think we have to wait or I'll personally go for Vapor-X our Toxic


I have never thought of creating a custom acrylic backplate for it. It might just trap the heat, making the PCB even hotter.


----------



## nubki11a

Apparently the shop I ordered my

EDIT: Dunno *** happened here, looks like I made a mistake by posting with my mobile phone


----------



## Farih

Good news, guys and girls!

At Futuremark they think they have found a problem: some people's CF results are being counted as single-GPU results. That's why some scores are so far off from our own overclocks.

Here is a quote from a Futuremark rep:

> EMPLOYEE
> *FM_Jarnis (Official Rep) about 4 hours ago*
> 
> Okay, first thing we found is that there are some results here that claim to have 1 GPU (no crossfire), but in reality the results were run with 2x card crossfire.
> 
> This would indicate a bug of some sort in result processing. We're working to track it down and will look into correcting any mis-labeled results as part of that.


----------



## Devildog83

Cinnamoncider, Goldswimmerb and Pesmerrga have been added. Welcome to the club.

I am sorry for my virtual absence yesterday; I was watching my Seattle Seahawks slap down the 49ers and send them home for the season. On to the Super Bowl!


----------



## Delphi

Great results Farih! I should really get a bios to you so I can up my volts!


----------



## goldswimmerb

I wouldn't recommend it unless you know who makes your VRAM modules and you know for a fact that you can flash that BIOS to your card. My 270X has Elpida VRAM, so I can't get a good overclock, yet when I had an R7 240 with Hynix VRAM I could get a 25-30% OC.


----------



## Farih

Quote:


> Originally Posted by *Delphi*
> 
> Great results Farih! I should really get a bios to you so I can up my volts!


Just get GPU-Z and save your own BIOS (then you have a good original backup, too). Then change your BIOS with VBE7; it's really easy.

If you want me to do it, that's fine too, but know that everything is always at your own risk.


----------



## Delphi

Quote:


> Originally Posted by *Farih*
> 
> Just get GPU-Z and save your own bios. (you have a good original back-up to then)
> Then change your bios with VBE7, its really easy.
> 
> If you want me to do it thats fine to but know everything is always on your own risks.


Oh, I know; I modded my 8800GT years back. I'll take a look at VBE7; if I can't figure it out, I'll come and ask you.


----------



## mikemykeMB

XFX R9 270x 1050M Boost 2GB 5D


----------



## Devildog83

Quote:


> Originally Posted by *mikemykeMB*
> 
> XFX R9 270x 1050M Boost 2GB 5D


mikemykeMB has been added. Welcome !!

Rainy Washington - well at least some of it.









*GO HAWKS !!!*


----------



## mikemykeMB

Thanks for the welcome, and yeah, some places are less rainy... and, of course, we are SB-bound; to Jersey we go!!

Now just some tweaks to get the card fine-tuned and to see where these temps end up.


----------



## MLJS54

Is there anything similar to Nvidia's ShadowPlay available for my R9 280X?

Fraps and Bandicam really drop my FPS. I wonder if there's a more AMD-friendly recording program, or if it's my PC.

Thanks in advance.


----------



## rdr09

Quote:


> Originally Posted by *MLJS54*
> 
> Is there anything similar to Nvidia's Shadowplay available for my R9 280x?
> 
> Fraps and Bandicam really drop my FPS. Wonder if there's a more ATI-friendly recording program, or is it my PC?
> 
> Thanks in advance


Are you recording to the same HDD where the OS or game is installed? The CPU shouldn't be an issue unless you turn off HT.


----------



## gibby1690

Hi people, just a few wee questions about the R9 280X, which I am currently looking at buying.

My budget is preferably £250-260. I have been looking at these:

http://www.overclockers.co.uk/showproduct.php?prodid=GX-244-MS&groupid=701&catid=56&subcat=2750
http://www.overclockers.co.uk/showproduct.php?prodid=GX-160-PC&groupid=701&catid=56&subcat=2750

I would be able to push closer to £300 if it was in my interest for a better card. I won't be doing much OCing, so I'm really looking for something that will run BF4/CoD at (preferably) ultra settings straight out of the box.

Also, how much of a bottleneck will my i3 3240 @ 3.4 GHz with HT be? It ran my HD 7870 OC 2GB no bother, but I can't believe it will handle an R9 280X.

I will be running a 1080p TV and a VGA monitor to start, but will soon be getting a dedicated monitor, probably also 1080p. All three will be hooked up, but only the monitor will be used for gaming.


----------



## hamzta09

Managed 111°C on VRM #1 in cgminer.

Why do they insist on passively cooling... well, on not putting heatsinks on the VRMs at all?


----------



## Delphi

Quote:


> Originally Posted by *hamzta09*
> 
> Managed 111c on VRM #1 in cgminer.
> 
> Why do they insist on passively cooling...well not putting heatsinks at all on the VRMs?


Because XFX. They are the worst for that. My HIS card never goes above 70c at stock voltage.


----------



## hamzta09

Quote:


> Originally Posted by *Delphi*
> 
> Because XFX. They are the worst for that. My HIS card never goes above 70c at stock voltage.


Well, you've got a 270X; less heat because less power?

Anyway, I'm going to see if I can undervolt the cards even more, but I doubt it has an effect on the VRM temps, because isn't it the RAM voltage that affects the VRMs?


----------



## King Nothing

Previous setup: XFX HD5830 1GB Crossfire 900/1300

Current setup: Asus DCUII TOP R9 270x 1120/1400

A major thing I noticed was the temps and noise. The CF setup would get up to around 85°C at about 80% fan; the ASUS got to 67°C at around 53% fan and was barely noticeable.


----------



## nubki11a

Unfortunately, the retailer I was buying from just informed me, after a week of waiting, that he can't get new supply anytime soon. So now I have to pick a different brand (not Gigabyte). I can get the MSI for 270, or the ASUS TOP or Sapphire Toxic for 290, which is a bit over my budget.

So would the MSI be the best choice, or are the more expensive models really worth it?


----------



## hamzta09

Quote:


> Originally Posted by *nubki11a*
> 
> Unfortunately the retailer I was buying from just informed me after a week of waiting that he can't get a new supply anytime soon. So now I have to pick a different brand (not Gigabyte). I can get the MSI for 270 and the Asus TOP or Sapphire Toxic for 290, which is a bit over my budget.
> 
> So would the MSI be the best choice? Or are the more expensive models really worth it?


Doesn't the Toxic perform like a 780?

Anyway, I undervolted even more, down to 1.100 from 1.2.
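Roughly what an undervolt like that buys: dynamic power scales with about the square of voltage at fixed clocks, so 1.2 V down to 1.1 V should cut the core's dynamic power (and thus the VRM load) by around 16%. A back-of-the-envelope sketch; the square law is the standard CMOS approximation, not a measurement of these particular cards:

```python
def dynamic_power_ratio(v_old: float, v_new: float) -> float:
    """Approximate dynamic-power ratio after a voltage change at fixed
    clocks, using the P ~ V^2 * f CMOS rule of thumb."""
    return (v_new / v_old) ** 2

# 1.2 V down to 1.1 V keeps ~84% of the dynamic power, i.e. ~16% savings.
print(round(dynamic_power_ratio(1.2, 1.1), 3))  # -> 0.84
```

This is also why undervolting helps VRM temps even though it changes no VRM settings: the regulators simply have less power to deliver.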


----------



## dexterprog

Hey, I'm new here. I have a HIS 270X IceQ X² Turbo Boost. It seems I'm the only one listed with this card, but perhaps the list is not up to date. I was wondering if anyone has tried overclocking this card or any similar one (stock clocks are 1140/1400).


----------



## Devildog83

*dexterprog* has been added. Welcome!! Haven't heard of anyone else in here with that card.


----------



## neurotix

Quote:


> Originally Posted by *dexterprog*
> 
> Hey, I'm new here. I got a HIS 270X IceQ X² Turbo Boost. It seems I'm the only one listed with this card, but perhaps the list is not up to date. I was wonder if anyone tried overclocking this card or anyone similar to this one (stock clocks are 1140 / 1400)


repo_man has this card but it's a 270, not a 270X.


----------



## Deicidium

Sapphire Radeon R9-270x Toxic user here..









will soon change to 280x if my budget permits me


----------



## Delphi

Just a thanks to Farih,

I got both my cards flashed with new BIOSes. The new voltage setting is 1.275. I've been mining on them all night at 1210/1500, so it is safe to say it is decently stable! Thanks for pointing me in the right direction with VBE7.


----------



## Devildog83

Quote:


> Originally Posted by *Deicidium*
> 
> Sapphire Radeon R9-270x Toxic user here..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> will soon change to 280x if my budget permits me


If you wish to join post a pic and clocks and I will add you.


----------



## repo_man

Quote:


> Originally Posted by *dexterprog*
> 
> Hey, I'm new here. I got a HIS 270X IceQ X² Turbo Boost. It seems I'm the only one listed with this card, but perhaps the list is not up to date. I was wondering if anyone has tried overclocking this card, or one similar to it (stock clocks are 1140/1400).


Quote:


> Originally Posted by *neurotix*
> 
> repo_man has this card but it's a 270, not a 270X.


Indeed! I just picked up the 270 version. Pic below. I'm still working on the OC. Right now I have it up to 1000/1440. Really nice cards it seems like. I LOVE the cooler. I've had it folding 24/7 for two days now and it hasn't broken 53C.


----------



## dexterprog

apparently "Delphi" also has it (the 270X)

On the card: the cooler is great, very quiet and effective. My card doesn't go above 60 degrees at full load (it's currently 30 degrees or more outside, so that is really good).

What are the stock speeds of the 270?


----------



## repo_man

Quote:


> Originally Posted by *dexterprog*
> 
> apparently "Delphi" also has it (the 270X)
> 
> On the card, the cooler is great, it's very quiet and effective. My card doesn't go above 60 degrees on full load (currently we are having 30 degrees or more outside, so that is really good).
> 
> What are the stock speeds of the 270?


The stock speeds on my 270 were 950 core/1400mem. I've got mine pushed to 1050/1440 though. It could probably go higher than 1050, but that's all Afterburner or Catalyst will let me give it. Would flashing it to a 270X let me go higher?


----------



## dexterprog

Quote:


> Originally Posted by *repo_man*
> 
> The stock speeds on my 270 were 950 core/1400mem. I've got mine pushed to 1050/1440 though. It could probably go higher than 1050, but that's all Afterburner or Catalyst will let me give it. Would flashing it to a 270X let me go higher?


I don't really know; I didn't even know you could flash cards to another model. My 270X is 1140/1400 stock and I haven't tried OCing it, because this is the first video card I've bought in more than 10 years, lol (back to gaming, you know), so I am not sure how to test for stability.


----------



## Farih

Quote:


> Originally Posted by *Delphi*
> 
> Just a thanks to Farih,
> 
> I got both my cards flashed with new bios's. New voltage setting is 1.275. Been mining them all night at 1210/1500 so it is safe to say it is decently stable! Thanks for pointing me in the right direction with VBE7.


Great!
Watch out though, 1.275 is rather high; I can reach over 1300MHz with that.
Max reference card voltage for the 270X is 1.219V.
I bet you can run 1210MHz too with just around 1.235V.
Quote:


> Originally Posted by *repo_man*
> 
> The stock speeds on my 270 were 950 core/1400mem. I've got mine pushed to 1050/1440 though. It could probably go higher than 1050, but that's all Afterburner or Catalyst will let me give it. Would flashing it to a 270X let me go higher?


Just change your own BIOS with VBE7.

You can also upload yours here (save it with GPU-Z) and Devildog or I will gladly help you out.


----------



## nubki11a

About to order the 280X Vapor-X, as it seems to be a very good card and superior to the MSI (I can get them at the same price). I just have one more question: my PSU, a Corsair TX650 v2, only has two 6+2-pin PCI-E connectors. The Vapor-X has two 8-pin connections. Those 6+2 connectors will fit the Vapor-X without a problem, right?


----------



## Lisjak

IF 6+2=8 then there shouldn't be any problems







And in case you were wondering, yes, a 650W PSU is enough for that card
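If you ever want to sanity-check a claim like that yourself, a rough power budget is enough. Here's a minimal Python sketch; the wattage figures are ballpark assumptions (roughly 250W peak for a 280X-class card, 100W for a mid-range CPU, 75W for everything else), not measured values for any particular build:

```python
# Rough PSU headroom check. All wattages are ballpark assumptions,
# not measured values for any specific build.
def psu_headroom(psu_watts, loads):
    """Return (total estimated draw, spare watts) for a dict of component loads."""
    total = sum(loads.values())
    return total, psu_watts - total

loads = {
    "R9 280X (peak)": 250,        # assumed typical board power
    "CPU (load)": 100,            # assumed mid-range quad core
    "Board/RAM/drives/fans": 75,  # assumed everything else
}

total, spare = psu_headroom(650, loads)
print(f"Estimated draw: {total}W, headroom on a 650W unit: {spare}W")
```

With a couple hundred watts of headroom left over on paper, the TX650 clears even an overclocked card comfortably.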


----------



## Farih

Hooray !

No. 1 270X in Fire Strike now








Futuremark has deleted all those weird and invalid scores

Now to wait till they do the same for 3DM11 and get me the No. 1 position there too


----------



## nubki11a

Quote:


> Originally Posted by *Lisjak*
> 
> IF 6+2=8 then there shouldn't be any problems
> 
> 
> 
> 
> 
> 
> 
> And in case you were wondering, yes, a 650W PSU is enough for that card


Alright thanks for the confirmation, just wanted to be 100% sure haha


----------



## hamzta09

Think I've partially solved my VRM problem on GPU 1.
Reducing the voltage to 1.1 from 1.2 made quite a difference.

VRM temps went from 111C to 96C whilst mining.


----------



## Delphi

Quote:


> Originally Posted by *Farih*
> 
> Great!
> Watch out though, 1.275 is rather high; I can reach over 1300MHz with that.
> Max reference card voltage for the 270X is 1.219V.
> I bet you can run 1210MHz too with just around 1.235V.
> Just change your own BIOS with VBE7.


I think I should be fine; I have been keeping an eye on temps. While mining, the VRMs peaked at 72C and the core peaked at 68C. I think that is in the safe zone. What do you think? I'll push the clocks more eventually.


----------



## neurotix

Quote:


> Originally Posted by *repo_man*
> 
> The stock speeds on my 270 were 950 core/1400mem. I've got mine pushed to 1050/1440 though. It could probably go higher than 1050, but that's all Afterburner or Catalyst will let me give it. Would flashing it to a 270X let me go higher?


You might need to use the HIS software, I think it's called iTurbo. http://www.hisdigital.com/un/news_show-237.shtml

If that doesn't work you could try Sapphire Trixx: https://www.sapphireselectclub.com/ssc/TriXX/

1050 is low and those cards can go much higher. My 270X's sliders go up to 1600/2100 in Trixx. No way it'll ever actually run that, but the software allows it.

Personally, I would try out iTurbo first and see if it lets you raise the clocks higher before you consider flashing the card.


----------



## Farih

Quote:


> Originally Posted by *Delphi*
> 
> I think I should be fine, I have been keeping an eye on temps. Mining VRM's peaked at 72C, core temp peaked at 68C. I think that is in the safe zone. What do you think? I'll push the clocks more eventually


Temps are in the safe zone, but I just don't know about the voltage.
I run mine at 1.243V 24/7 and that gets me around 1250MHz on the core.
1.275 is really high for 1210MHz.
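For anyone wondering why a few hundredths of a volt matter so much: dynamic power scales roughly with frequency times voltage squared. A quick illustrative sketch (this is the standard CMOS approximation, not card-specific data; real cards add leakage and VRM losses on top):

```python
# Relative dynamic power, P ~ f * V^2 (standard CMOS approximation).
# Leakage and VRM losses are ignored; this is only an illustration.
def relative_power(f_mhz, v, f_ref_mhz, v_ref):
    """Power at (f, V) relative to a reference operating point."""
    return (f_mhz / f_ref_mhz) * (v / v_ref) ** 2

# Same 1210MHz core clock, 1.235V vs 1.275V:
ratio = relative_power(1210, 1.235, 1210, 1.275)
print(f"1.235V draws ~{ratio:.1%} of the power at 1.275V")  # ~93.8%
```

A ~6% cut in core power doesn't sound like much, but it lands almost entirely on the VRMs, which is why undervolting cools them down so noticeably.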


----------



## repo_man

Quote:


> Originally Posted by *neurotix*
> 
> You might need to use the HIS software, I think it's called iTurbo. http://www.hisdigital.com/un/news_show-237.shtml
> 
> If that doesn't work you could try Sapphire Trixx: https://www.sapphireselectclub.com/ssc/TriXX/
> 
> 1050 is low and those cards can go much higher. My 270X supports up to 1600/2100 in Trixx. No way it'll ever do it, but it's possible.
> 
> Personally, I would try out iTurbo first and see if it lets you raise the clocks higher before you consider flashing the card.


I found the option in Afterburner that "unlocks" the OC slider. I'll be playing with it for the next night or so and see how high I can get it. It handles 1050 really easily, so I was assuming the cards would run higher. Thanks for the tips!


----------



## kersoz2003

I have a Sapphire Vapor-X 280X and I want to do GPU mining. My temps are 70 degrees on the GPU, with the VRMs heating up at max load. So will GPU mining do harm to my card?


----------



## Devildog83

*neurotix* has kindly pointed out that I might need SP1 for Windows 7 to solve my issue with the low combined score in Fire Strike. I will try that and pray it works. I thought I had it installed, but I do not. Thanks *neurotix !!*


----------



## Devildog83

Is this too red?


----------



## madorax

Quote:


> Originally Posted by *Devildog83*
> 
> Is this too red?


There's no such thing as "Too Red" for the Devil man









Great BTW


----------



## neurotix

Your system looks awesome Devil.

Really like the sleeved cables, I wish I knew how to do custom sleeving.

Red ftw:


Spoiler: Clicky







Hope that fix works for you like it did for me.

The other nice thing is that IBT AVX actually works now and gives the proper GFLOPS from AVX instructions. Before I would only get 45 GFLOPS and now I get 90.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> Is this too red?


It's funny: I love AMD but I hate the colour red... but I love green and blue and dislike Intel and Nvidia... seem odd?
Quote:


> Originally Posted by *Devildog83*
> 
> *neurotix* has kindly pointed out that I might need SP1 for Windows 7 to solve my issue with low combined score for Firestrike. I will try that and pray it works. I thought I had it but do not. Thanks *neurotix !!*


That might be my problem as well. My graphics scores are fine, but my combined and physics scores have dropped dramatically.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's funny: I love AMD but I hate the colour red... but I love green and blue and dislike Intel and Nvidia... seem odd?
> That might be my problem as well. My graphics scores are fine, but my combined and physics scores have dropped dramatically.


I am having an issue with Fire Strike on the second test. Whenever I apply any OC, it will load the second test but then the driver fails when it tries to run it. This happens with the mildest overclock, and the same OC works in 3DMark 11 and all games. Not sure why this is happening. Anyone else having the same issue?


----------



## Sgt Bilko

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I am having an issue with Fire Strike on the second test. Whenever I apply any OC, it will load the second test but then the driver fails when it tries to run it. This happens with the mildest overclock, and the same OC works in 3DMark 11 and all games. Not sure why this is happening. Anyone else having the same issue?


Actually, the couple of tests I tried to run tonight did that, but my RAM timings probably aren't stable.


----------



## ANDR01D

Rig - SnowFlake
GPU - R9 270X
Brand - Sapphire
Model - Sapphire R9 270X Dual-X OC


----------



## Farih

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I am having an issue with Fire Strike on the second test. Whenever I apply any OC, it will load the second test but then the driver fails when it tries to run it. This happens with the mildest overclock, and the same OC works in 3DMark 11 and all games. Not sure why this is happening. Anyone else having the same issue?


When my OC isn't stable, that's the test where it shows up first for me too.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Farih*
> 
> When my OC isn't stable, that's the test where it shows up first for me too.


I figured that; what threw me off was that it didn't matter if it was a 20-200MHz OC, lol. I think I've narrowed it down to the vRAM, though; I will test when I get home. I was just curious if anyone else saw it. I'm kinda hoping it's not the vRAM, as that would mean I can't bump it up to anything higher than 1500.


----------



## ricko99

Guys, I'm having artifacts with my ASUS R9 280X DCII TOP. It's still on its default factory clocks, 1070/1600. Every time I play games like AC4 and BF4 (where the card only reaches a max of 70C), I sometimes get chequered and black/disoriented artifacts. I have updated the BIOS using GPU Tweak to the latest, 15.41.0.0.0.0 (according to GPU Tweak). I never touched the GPU voltage; it's still at 1.2V. I wonder whether the card is defective or it's a driver problem, because I've seen quite a lot of people having the same problem with this card and even other R9 280X brands. Anyone here have the same problem and managed to solve it? I'm using the 13.12 driver.




This one is the chequered one




This one is the black artifact. It happened not only when playing games, but when doing benchmarks using heaven or valley

The videos are not mine but I'm having similar issues.

UPDATE: apparently I fixed it by reverting to Catalyst 13.11 beta 5. AMD and their crap WHQL driver...


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's funny, I love AMD but i hate the colour red........but i love Green and Blue but dislike Intel and Nvidia......seem odd?
> That might be my problem as well, My Graphics scores are fine but my Combined and Physics scores have dropped dramatically.



That's OK 'cause I hate the color green. I like cobalt blue, and that will be the theme for my next build now that this one is almost done.


----------



## MissStinkyBum

Hey, I'm hoping someone might be able to help me. I recently bought an MSI 280X. When I purchased it, the manufacturer's code said it should be V277-053R, but on the box it says 12-V277-068, and I can't find anything about this on the internet. Also, the PCB is version 1.2. I'm hoping I'm reading the wrong bit, as the 053R is the reference board and I was wanting to watercool it.
Cheers


----------



## cremelo

Quote:


> Originally Posted by *ricko99*
> 
> Guys I'm having artifacts with my ASUS R9 280x DCII TOP. It's still on its default factory clock 1070/1600. Everytime I play game like AC4 and BF4 (which only reaches max of 70c) sometimes I get chequered and black/disoriented artifacts. I have updated the BIOS using GPU Tweak to the latest 15.41.0.0.0.0 (according to GPU Tweak). I never touched the GPU voltage it's still at 1.2v. I wonder whether the card is defective or it's driver problem because I saw quite a lot of people having the same problem with this card and even other r9 280x brand. Anyone here have the same problem and managed to solve it? using 13.12 driver
> 
> 
> 
> 
> This one is the chequered one
> 
> 
> 
> 
> This one is the black artifact. It happened not only when playing games, but when doing benchmarks using heaven or valley
> 
> The videos are not mine but I'm having similar issues.
> 
> UPDATE: apparently i fixed it by reverting to catalyst 13.11 beta 5. amd and their crap whql driver


I have the same problem =/
When I get home I'm gonna try changing my driver too...


----------



## ricko99

Quote:


> Originally Posted by *cremelo*
> 
> I have the same problem =/
> When i get home i gonna try change my driver too.....


After I reverted the driver there are fewer artifacts in BF4, but the same amount still shows up in AC4. Looks like the card is not friendly towards Nvidia-sponsored games. I assume the 1600MHz memory clock that Asus ships can't be handled at the stock voltage. Try increasing the card's voltage.


----------



## cremelo

Quote:


> Originally Posted by *ricko99*
> 
> After i revert the driver, there is less artifacts in BF4, but same amount of artifacts still exist in ac4. Looks like the card is not friendly towards nvidia games. I assume the 1600mhz clock that asus provide can't be handled by the voltage. Try to increase the voltage of the card


To me it's funny, because I have seen artifacts even in Google Chrome...
I contacted Asus support and in less than 2 hours they responded, telling me the best solution would be to send it in for RMA, but that I could try lowering the clocks to see if the problems stop...


----------



## ricko99

Quote:


> Originally Posted by *cremelo*
> 
> To me it's funny because even in Google Chrome I have seen artifacts....
> I contacted Asus support and in less than 2 hours I responded told me that the best solution would be to send for RMA but I could try to lower the clocks to see if the problems stop.....


I think it's better to RMA the card; you paid more for the Asus compared to other brands because of the higher clock speed, and you wouldn't want to waste that money by downclocking it.


----------



## cremelo

Quote:


> Originally Posted by *ricko99*
> 
> I think it's better to RMA the card because you paid more for the asus card compared to other brands because of the higher clock speed. You wouldn't want to waste your money downclocking the card


I reinstalled driver 13.12 and apparently it stopped giving problems. I'll wait another 2-5 days to see how it goes.


----------



## CptAsian

Quote:


> Originally Posted by *Devildog83*
> 
> Is this too red?
> 
> 
> Spoiler: Warning: Spoiler!


I know I'm not a part of this club; I'm just a lurker, but I have to add:

No.


Spoiler: Warning: Spoiler!


----------



## Delphi

Quote:


> Originally Posted by *Farih*
> 
> Temps are in the safezone but i just dont know about the voltage.
> I run mine on 1.243V 24/7 and that gets me around 1250mhz on the core.
> 1.275 is really high for 1210mhz.


Set it at 1.235. Running 1210 stable right now. Thanks for the pointers.


----------



## Devildog83

Quote:


> Originally Posted by *CptAsian*
> 
> I know I'm not a part of this club; I'm just a lurker, but I have to add:
> 
> No.
> 
> 
> Spoiler: Warning: Spoiler!


That is red!!


----------



## Durvelle27

Too much red, bud. Needs more blue


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Too much red, bud. Needs more blue


Better?


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> Better?












So much better


----------



## jamponget9

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> Better?


WOW! We have the same case... I have a few LED fans installed in mine... Once I saw yours, my head started screaming blue LED lights...


----------



## Sgt Bilko

Quote:


> Originally Posted by *jamponget9*
> 
> WOW! we have the same case... I have few Led Fans installed in my case... Once i saw yours my head is screaming blue led lights...


These are the LEDs around the side: http://www.pccasegear.com/index.php?main_page=product_info&cPath=1354_1356&products_id=16002

And these are the Fans in the front: http://www.pccasegear.com/index.php?main_page=product_info&cPath=9_1159&products_id=24577

You should be able to find them in places other than Aus pretty easily


----------



## eremos

You guys seem helpful and knowledgeable so I'd really appreciate your help narrowing down the cause of my problem.

My brand new PC is freaking out on me, and I suspect the graphics card, but I'd like to narrow it down.


Spoiler: Warning: Spoiler!















That's an example of what's happening, usually after 1-2 hours, but the corruption will happen on a variety of objects. If I look away from the affected object, everything is fine, but look back and it spazzes out. If I minimize GW2 and restore it, it's fixed for a little while, but it seems like it becomes a lot more frequent from that point on, and I've had BSODs twice.

My question is, is this "definitely" indicative of something specific, or could it really be anything? My suspicion is VRAM since it seems like a stored texture is getting corrupted, but for all I know any number of things could be the cause.

Specifics:
Motherboard: Intel DH87 Round Lake
CPU: Intel i5-4440
Graphics card: Sapphire Dual-X R9 270X
RAM: 2x 4GB Kingston HyperX Black 1333MHz
Graphics drivers: AMD Catalyst 13.12

No overclocking aside from the card which is Factory OCd to 1070MHz. I've taken it back to the stock 1000 and it was the same.

My CPU is running a little hot at up to 80 degrees under heavy load, so I'd like to know if that is a plausible cause or if an overheating CPU would cause a different kind of corruption. As far as I can tell, 80 degrees is quite hot for this CPU but not critically so.

GPU gets no higher than 65 degrees.

Thanks all!


----------



## ricko99

Quote:


> Originally Posted by *eremos*
> 
> You guys seem helpful and knowledgeable so I'd really appreciate your help narrowing down the cause of my problem.
> 
> My brand new PC is freaking out on me, and I suspect the graphics card, but I'd like to narrow it down.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's an example of what's happening, usually after 1-2 hours, but the corruption will happen on a variety of objects. If I look away from the affected object, everything is fine, but look back and it spazzes out. If I minimize GW2 and restore it, it's fixed for a little while, but it seems like it becomes a lot more frequent from that point on, and I've had BSODs twice.
> 
> My question is, is this "definitely" indicative of something specific, or could it really be anything? My suspicion is VRAM since it seems like a stored texture is getting corrupted, but for all I know any number of things could be the cause.
> 
> Specifics:
> Motherboard: Intel DH87 Round Lake
> CPU: Intel i5-4440
> Graphics card: Sapphire Dual-X R9 270X
> RAM: 2x 4GB Kingston HyperX Black 1333MHz
> Graphics drivers: AMD Catalyst 13.12
> 
> No overclocking aside from the card which is Factory OCd to 1070MHz. I've taken it back to the stock 1000 and it was the same.
> 
> My CPU is running a little hot at up to 80 degrees under heavy load, so I'd like to know if that is a plausible cause or if an overheating CPU would cause a different kind of corruption. As far as I can tell, 80 degrees is quite hot for this CPU but not critically so.
> 
> GPU gets no higher than 65 degrees.
> 
> Thanks all!


Try reverting to Catalyst 13.11. 80C for the CPU seems too hot to me; try to get a better cooler to keep it around 50 to 60ish.


----------



## hamzta09

So I ran Unigine Heaven on Extreme 8x AA, Ultra.

I get scores lower than a single 780.. what.

FPS: 65.2
Score: 1641
Min FPS: 8.3
Max FPS: 119.1

I get lower scores than 2 7950s in crossfire.
http://forums.overclockers.co.uk/showpost.php?p=25214820&postcount=1742

Raising GPU Core from 1k to 1080 gave me a 120 score increase.. wat.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> So I ran Unigine Heaven on Extreme 8x AA, Ultra.
> 
> I get scores lower than a single 780.. what.
> 
> FPS: 65.2
> Score: 1641
> Min FPS: 8.3
> Max FPS: 119.1
> 
> I get lower scores than 2 7950s in crossfire.
> http://forums.overclockers.co.uk/showpost.php?p=25214820&postcount=1742
> 
> Raising GPU Core from 1k to 1080 gave me a 120 score increase.. wat.


Are the cards at 100% usage when Heaven is running?


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Are the cards at 100% usage when Heaven is running?


98-99%.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> 98-99%.


That usually indicates a CPU Bottleneck but try running it with a single card only and see what you get then.

Crossfire might not be working properly.


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That usually indicates a CPU Bottleneck but try running it with a single card only and see what you get then.
> 
> Crossfire might not be working properly.


Hmmmm, my 780 never hits 100%. It's always pegged at 99%


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Hmmmm my 780 never hits 100%. Its always pegged at 99%


When I'm running one 290 I'll hit 100% in games; the 7970 I had previously, paired with an FX 8150, ran at 98%.

I've done some more reading on it now, and anything above 95% means the GPU is at max.

Must be a CrossFire issue then; I've noticed that some things will run slower with CrossFire on mine compared to a single card.
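That rule of thumb is easy to write down as a tiny Python helper; the 95% cutoff is just the heuristic from this thread, not an official figure:

```python
# Heuristic from the discussion above: sustained GPU usage >= 95%
# suggests the GPU is the limit; well below that, suspect the CPU
# (or, with multiple cards, broken CrossFire scaling).
def likely_bottleneck(gpu_usage_samples, threshold=95):
    """Classify a benchmark run from average reported GPU usage (percent)."""
    avg = sum(gpu_usage_samples) / len(gpu_usage_samples)
    return "GPU-bound" if avg >= threshold else "CPU-bound or scaling issue"

print(likely_bottleneck([98, 99, 99, 97]))  # sustained near-max usage
```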


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> When i'm running one 290 i'll hit 100% in games, the 7970 i had previously paired with an FX 8150 ran at 98%
> 
> I've done some more reading on it now and anything that's above 95% means the GPU is at max.
> 
> Must be a Crossfire issue then, I've noticed that some things will run slower on Crossfire in mine compared to a single card.


Note I'm running a single GTX 780. Usage ranges from 97-99% in BF4 & Crysis 3


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Note I'm running a single GTX 780. Usage ranges from 97-99% in BF4 & Crysis 3


Yes, I understand that. I also said that, after doing some more reading, that level of usage is fine.

I had the wrong information and I corrected myself.


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yes, i understand that, I also said that after doing some more reading that means that the usage is fine.
> 
> i had the wrong information and i corrected myself.


Lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Lol


So in light of that it's Crossfire messing up the score then.


----------



## hamzta09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That usually indicates a CPU Bottleneck but try running it with a single card only and see what you get then.
> 
> Crossfire might not be working properly.


What are you talking about?

Bottleneck?
Afterburner's usage limit is 99%...


----------



## Sgt Bilko

Quote:


> Originally Posted by *hamzta09*
> 
> What are you talking about?
> 
> Bottleneck?
> Afterburners usage limit is 99%........


Read the posts under that one, i corrected myself later.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> What are you talking about?
> 
> Bottleneck?
> Afterburners usage limit is 99%........


What are the temps you are getting on your VRMs now? I am trying to find a way to put better cooling (maybe a mod) on the inner metal bracket that mounts over the RAM and VRMs.

Also, what is the highest stable RAM clock you are getting on that card? I seem to be able to manage 1700, but it fails here and there, and I have isolated the failures to that.


----------



## hamzta09

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> What are the temps you are getting on your VRMs now? I am trying to find a way to put a better cooling (maybe mod) on the inside metal bracket that mounts on the RAM and VRMs.
> 
> Also what is the highes stable ram timings that you are getting on that card.. I seem to be able to manage 1700 however fails here and there and I have isolated it to be that.


I haven't exactly overclocked either of them, other than adding +80 on the cores (1080).

If I raise the vRAM by just +50 I get lower frames in Unigine Heaven, lol.

VRM temps in Unigine Heaven Extreme reach 80C on the top card.


----------



## repo_man

I'm getting the weird flicker or stutter issue with my new HIS R9 270. I see that some manufacturers have offered an updated BIOS for this card, but I haven't found anything that says HIS has. Will any other BIOS update work (I'm shooting in the wind here, I know), or do I just need to send this card back to Newegg (I'm still in my 30-day period) and buy a 270 from a manufacturer that has offered a BIOS update, like Asus or MSI?

Edit: I'm on the current 13.12 drivers, and I'm downloading 13.10 right now to try. I'm getting this stutter at stock speeds (950/1400) and with any type of OC as well.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> I havent exactly overclocked either of them other than adding +80 on cores (1080)
> 
> If I raise vram just by +50 I get lower frames in unigine heaven lol.
> 
> VRM temps in Unigine Heaven extreme reaches 80c on the card at top.


Ok and thank you


----------



## Devildog83

Word is AMD will be releasing new drivers (13.35) with Mantle and TrueAudio support, along with a ton of fixes for the R9 and R7 series, and I hope some for CrossFire. Some may already know this, but I thought I would throw it out there. I do not have an ETA, just so you don't have to bother asking.


----------



## Widde

Quote:


> Originally Posted by *hamzta09*
> 
> So I ran Unigine Heaven on Extreme 8x AA, Ultra.
> 
> I get scores lower than a single 780.. what.
> 
> FPS: 65.2
> Score: 1641
> Min FPS: 8.3
> Max FPS: 119.1
> 
> I get lower scores than 2 7950s in crossfire.
> http://forums.overclockers.co.uk/showpost.php?p=25214820&postcount=1742
> 
> Raising GPU Core from 1k to 1080 gave me a 120 score increase.. wat.


Not 100% sure it's a fix, but try making sure that you have disabled ULPS; I have read that it sometimes causes issues with CrossFire.


----------



## Pis

Limit 99%? Not really. =3


----------



## Farih

Not a 270 or 280X, but I thought I'd show you all the power of a 260X too









Now I've got rank 1 in 3DMark 11 with both a 270X and a 260X









Seems I have won the silicon lottery with this 260X.
It does 1275/1800MHz on stock voltage and auto fan, and I only used CCC.
Wonder what it can do when you overvolt it...


----------



## Clexzor

Hey all, picked up an MSI 280X at Microcenter today for $340; seemed like a solid deal. It also seems to be a good overclocker: I'm at a solid 1250/1650 @ 1.25V, max temp 66C on air, ASIC 67%.

Quick question: is it safe to run up to 1.3V on it? I used the MSI Afterburner unlock trick to unlock the voltage, but I haven't touched the BIOS at all and don't plan to... I think she will do around 1300 @ 1.3V.

firestrike score 8718

http://www.3dmark.com/3dm/2302073


----------



## Devildog83

Quote:


> Originally Posted by *Clexzor*
> 
> Hey all picked up a MSI 280x at microcenter today for 340$ seemd like a solid deal...seems to also be a good overclocker im at a solid 1250/1650 @ 1.25v max temp 66c on air asic% 67%
> 
> quick question is it safe to run up to 1.3v on it?? I had to the msi afterburner unlock trick to unlock the voltage but haven't touched the bios at all and don't plan to....I think she will do around 1300 @ 1.3v
> 
> firestrike score 8718
> 
> http://www.3dmark.com/3dm/2302073


You have been added, looks like a nice card you got there. Do you have a pic for us?


----------



## repo_man

I can't handle the stutter anymore. I RMA'd my HIS 270 for a refund from Newegg and ordered an ASUS 270X. ASUS has actually released a BIOS update to correct the stutter issue. And hey, a little higher stock core clock; can't go wrong with that I guess, lol.


----------



## Farih

Never put up a picture for proof so here goes.

Here is the Rank 1 270x (atm)










Crappy tablet pic









Has anyone overvolted a 260x yet, or know somewhere they did?


----------



## Devildog83

Quote:


> Originally Posted by *repo_man*
> 
> I can't handle the stutter anymore. I RMAd my HIS 270 for a refund from Newegg and ordered an ASUS 270X. ASUS has actually released a bios update to correct the stutter issue. And hey, a little higher stock core clock, can't go wrong with that I guess, lol.


I have always looked at HIS as a budget brand. Maybe it's not fair, but that's how I have always seen them. Asus, EVGA, and XFX used to be tops, but I think Asus, PowerColor, and MSI are the most reliable brands now. EVGA and XFX seem to have slacked in their efforts until just recently, when XFX started to step up a bit more.


----------



## nubki11a

Quote:


> Originally Posted by *Devildog83*
> 
> I have always looked at HIS as a budget brand. Maybe it's not fair but that's how I have always seen them. Asus, EVGA, XFX used to be tops but I think Asus, Powercolor and MSI are the most reliable brands now. EVGA and XFX seem to have lacked in there effort until just recently where XFX has started to step up a bit more.


What do you think about Sapphire? I've heard a lot of mixed stories about them.


----------



## Devildog83

Quote:


> Originally Posted by *nubki11a*
> 
> What do you think about Sapphire? I've heard a lot of mixed stories about them.


Geez, how could I forget about them? They are very strong right now for overclocking cards from what I have seen, and they have come on very strong with their R9 line-up. I didn't really hear too much about them before, because I always thought of them as kinda low end too, but they changed my perception in a hurry. I think it was just that I didn't know them well.


----------



## ricko99

Anyone who has the ASUS R9 280X DCII TOP and is having artifact problems can go to the ASUS website and look for the new BIOS released on 26 January 2014. It solved all my artifact problems.


----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> Geez' how could I forget about them, they are very strong right now for overclocking cards from what I have seen. They have come on very strong with there R9 line-up. I didn't really hear too much about them before because I always thought of them as kinda low end too but they changed my perception in a hurry. I think it was that I just didn't know them well.


The thing with Sapphire is they make and sell a lot of budget cards, and people base most of their opinion of Sapphire on those.
If you want to spend some money you can get very good cards from them too.

Sapphire has been good for me for many years, be it cheap low end or the more pricey higher end.

I have had my hands on a LOT of Sapphire cards and it's the brand I sell the most, I think (next to Asus and MSI).
Their warranty/RMA handling is all right too; I have had far, far worse.


----------



## Jaffi

Quote:


> Originally Posted by *ricko99*
> 
> Anyone who has asus r9 280x dcII TOP and having artifacts problem can go to asus website and look for new BIOS released on 26 January 2014. It solves all my artifacts problem


Interesting:
Quote:


> R9280X-DC2T-3GD5 VBIOS update
> BIOS update to remove rare artifacting events during gaming, Please also update the Catalyst driver to 13.251 or above


Will flash later.


----------



## neurotix

Quote:


> Originally Posted by *Farih*
> 
> The thing with Sapphire is they make and sell alot of budget cards and people base most of Sapphire on those.
> If you want to spend some money you can get very good cards from them to.
> 
> Sapphire has been good for me for many years be it cheap low end or the more pricey higher end.
> 
> I have had my hands on ALOT of Sapphire cards and its the brand i sell the most i think (next to Asus and MSI)
> There warranty/RMA handling is allright to, i have had far and far worse.


Sapphire is all I'll buy if I'm buying an AMD card. I'm another person that will vouch for them.

My 270X Vapor-X and 290 Tri-X are both amazing overclockers and run very cool.

Their older generation cards were hit and miss and the coolers weren't as good. My 6870 from them only had one fan in the middle and the heatsink was badly designed.

I still have 2 4670s from them that I bought in 2009 and both work fine.

Their RMA department (Althon Micro) is amazing. We had a problem with a Sapphire 7970 Vapor-X and they got us a new card in 3 weeks over the holiday season. That card ended up having a missing capacitor and was crashing in my system, so we RMA'ed again and they sent us another within 2 weeks. Their tech support was friendly and easy to work with and went out of their way to troubleshoot and help us get our cards working. When that failed, they RMA'ed for us, and now we have a working card again. The 7970 Vapor-X had even had its cooler changed, and some of the thermal pads were torn/ripped; we put the stock cooler back on it and they covered it with no issues.

Considering I hear other people's horror stories about RMAing components and being out a piece of their system for 2-3 months, Sapphire's isn't bad at all. Waiting 2 weeks to get a card is really good.

The downside is that their warranty is only 2 years, but these things are usually obsolete and needing replacement in that time anyway.


----------



## nubki11a

Heh, that's really good to hear, expecting a 280X Vapor-X by the end of the week to replace my Sapphire HD6870


----------



## F3ERS 2 ASH3S

I'm thinking of picking up another xfx 280x when my taxes come back. Part of the reason is i already have one and it would be nice to have a matching set.


----------



## Clexzor

Quote:


> Originally Posted by *Devildog83*
> 
> You have been added, looks like a nice card you got there. Do you have a pic for us?


Sweet, thanks! I'll get a pic of my rig etc. soon. Here's a pic of GPU-Z, Afterburner, and 3DMark.









Also, I was able to go higher: 1275/1850 at roughly 1.275V and +50mV on the memory. Is this safe for 24/7? The card maxes out around 69°C and appears to be in the top 15 280Xs.

I also tried 1.3V, but it yielded no gain above that.

http://imgur.com/PuSKgoT


----------



## nubki11a

Anyone got any experience with the 280X Vapor-X? I'm worried that the backplate might touch my mobo. Anyone know how much further it sticks out than a normal graphics card?


----------



## repo_man

Quote:


> Originally Posted by *Devildog83*
> 
> I have always looked at HIS as a budget brand. Maybe it's not fair but that's how I have always seen them. Asus, EVGA, XFX used to be tops but I think Asus, Powercolor and MSI are the most reliable brands now. EVGA and XFX seem to have lacked in there effort until just recently where XFX has started to step up a bit more.


I wouldn't have an issue buying HIS again. This issue was a 270 problem, not really a HIS problem; they just aren't updating the BIOS like other companies are. I don't want people to think I was bashing HIS. I really liked the card other than that. It ran really cool and OCed really well.

That said, I agree ASUS and MSI are great brands. I used to like XFX a lot due to their warranty policy, but I was soured on them after I bought an XFX GTX 260 that had the cheapest cooler I've ever seen on a GPU. It hardly kept the card cool at stock clocks. I didn't have any issues with it; I was just shocked at how cheaply it was put together. Though I'm sure XFX might be like Sapphire, as you said: they make some cheap cards and some really nice cards, so it depends on what you get.


----------



## neurotix

Quote:


> Originally Posted by *nubki11a*
> 
> Anyone got any experience with the 280x vapor-x? Im worried that the backplate might touch my mobo. Anyone know how much further it sticks out than a normal gfx card?


I have a 7970 Vapor-X, I looked at the 280x Vapor-X and it looks similar except it has a backplate.

I don't think it will touch the motherboard at all. You should be fine.


----------



## hamzta09

Quote:


> Originally Posted by *neurotix*
> 
> I have a 7970 Vapor-X, I looked at the 280x Vapor-X and it looks similar except it has a backplate.


I wonder why a backplate would touch the motherboard; isn't it the same size as the PCB?


----------



## blueskybluesky1

Hello

A new VGA BIOS was released yesterday for ASUS Radeon R9 280X R9280X-DC2T-3GD5 - TOP Edition : https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5/#support

_"R9280X-DC2T-3GD5 VBIOS update
BIOS update to remove rare artifacting events during gaming. Please also update the Catalyst driver to 13.251 or above.
File Size: 1.19 MBytes"_

Anyone tried it yet? I have the R9 280X TOP from Asus, but the only problem with it was that a few times in Battlefield 4 the sky was flickering (artifacts). It recovered after a game restart. I'm still undecided about updating; I'm afraid it will make things worse.








If anybody had more issues in games and updated the BIOS, please share your feedback about the improvements, and let's hope there are no new bugs.









Thank you. Also, ASUS R9 280X TOP owners: if you want to share experiences with this card, I am open to any questions.


----------



## crazysoccerman

I just RMA'd my DC2T 280X. I'm pretty sure the problem is hardware-based, because the video will fail even in my motherboard's BIOS menu. I tried many different drivers and reverted to the older video BIOS version. 80% of the time I can't boot into Windows before the display goes out. Removing the video card and using integrated graphics resolves all issues.

I decided to RMA through Newegg rather than Asus. However, they are out of stock and selling at massively inflated prices, so who knows when I will get my replacement.

Regardless, as soon as I get my replacement I'm selling it. The failure, in less than 3 months, of the first video card I've ever owned has spooked me away from AMD and ASUS.

Edit: to the post above me, I wouldn't update the video BIOS. I can't be 100% sure that my reflashing of the old BIOS completely undid whatever the new BIOS did. I updated the video BIOS last week and started experiencing these issues in the past few days. Like I've said, reflashing the old BIOS didn't resolve the issue, but I can't guarantee that the new BIOS didn't cause the problem.


----------



## nubki11a

Quote:


> Originally Posted by *hamzta09*
> 
> I wonder why a backplate would touch the motherboard, isnt it the same size as the PCB?


A couple of reviews I checked mentioned it. They said that if you have a high-end mobo like an Asus Rampage, the heatsink will touch the backplate (apparently it sticks out a little, idk). I checked with my Extreme3 Gen3 with a Sapphire HD 6870 and it's already pretty close.


----------



## blueskybluesky1

Quote:


> Originally Posted by *crazysoccerman*
> 
> The failure in less than 3 months of the first video card I've ever owned has spooked me away from AMD and ASUS.


But it worked OK for the first 3 months of usage, and only after that did it start having problems?


----------



## cremelo

Quote:


> Originally Posted by *blueskybluesky1*
> 
> Hello
> 
> A new VGA BIOS was released yesterday for ASUS Radeon R9 280X R9280X-DC2T-3GD5 - TOP Edition : https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5/#support
> 
> _"R9280X-DC2T-3GD5 VBIOS update
> BIOS update to remove rare artifacting events during gaming. Please also update the Catalyst driver to 13.251 or above.
> File Size: 1.19 MBytes"_
> 
> Anyone tried it yet? I have the R9 280X TOP from Asus but the only problem with it was a few times in Battlefield 4 the sky was flickering (artifacts). But recovered after game restart. I am still not decided to update. I afraid it will ruin it more
> 
> 
> 
> 
> 
> 
> 
> 
> If somebody had more issues in games and updated the BIOS please share your feedback about improvements/or lets hope no other supplementary bugs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you. Also, ASUS R9 280X TOP owners if you want to share experience with this card I am open to any questions.


I plan on doing the update, but the problem is the warranty: since mine was imported, if I have to send it in, the store won't take much responsibility for returning it....
I'll wait to see if Asus says the warranty will still be 100% honored....


----------



## Jaffi

Quote:


> Originally Posted by *blueskybluesky1*
> 
> Hello
> 
> A new VGA BIOS was released yesterday for ASUS Radeon R9 280X R9280X-DC2T-3GD5 - TOP Edition : https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5/#support
> 
> _"R9280X-DC2T-3GD5 VBIOS update
> BIOS update to remove rare artifacting events during gaming. Please also update the Catalyst driver to 13.251 or above.
> File Size: 1.19 MBytes"_
> 
> Anyone tried it yet? I have the R9 280X TOP from Asus but the only problem with it was a few times in Battlefield 4 the sky was flickering (artifacts). But recovered after game restart. I am still not decided to update. I afraid it will ruin it more
> 
> 
> 
> 
> 
> 
> 
> 
> If somebody had more issues in games and updated the BIOS please share your feedback about improvements/or lets hope no other supplementary bugs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you. Also, ASUS R9 280X TOP owners if you want to share experience with this card I am open to any questions.


I updated through the latest version of GPU Tweak and had no problems at all. Running fine.


----------



## blueskybluesky1

Asus GPU Tweak freezes my computer if I scan for a "new BIOS"! I have to manually reset the PC with the hardware button.
So I will flash the new BIOS using the .exe file from the Asus site.

Before the BIOS update, did you have any issues that were solved by the new one? Do you play Battlefield 4, for example?


----------



## cremelo

Quote:


> Originally Posted by *blueskybluesky1*
> 
> Asus GPU Tweak freezes my computer if I scan for "new bios"! I have to manually reset the PC by hardware button.
> So I will flash the new bios using the .exe file from the Asus site.
> 
> Before BIOS update you had any issues that were solved in the new one? Do you play Battlefield 4 for example?


I got the same problem; I set Asus GPU Tweak to run as Admin and it worked.
 










Now I just need to test, but I'm at work.


----------



## Scott1541

Ordered myself a R9 270X earlier..... it should come tomorrow







I'm hoping it will be a significant upgrade from my GTX 460, which is showing its age now.


----------



## [CyGnus]

Scott1541, you will be happy; it's a good boost over a 460.


----------



## Scott1541

Quote:


> Originally Posted by *[CyGnus]*
> 
> Scott1541 you will be happy its a good boost over a 460


I was looking on AnandTech and it looks like it will give a nice performance boost.


----------



## [CyGnus]

It's about double or more the performance of a 460.


----------



## Devildog83

Quote:


> Originally Posted by *crazysoccerman*
> 
> I just RMA'd my DC2T 280x. I'm pretty sure the problem is hardware based because the video will fail even in my motherboard's bios menu. I tried many different drivers and reverted to the older video bios version. 80% of the time I can't boot into windows before the display goes out. Removing the video card and using integrated graphics resolves all issues.
> 
> I decided to RMA through newegg rather than asus. However, they are out of stock and selling at massively inflated prices so who knows when I will get my replacement.
> 
> Regardless, as soon as soon as I get my replacement I'm selling it. The failure in less than 3 months of the first video card I've ever owned has spooked me away from AMD and ASUS.
> 
> Edit: to the post above me, I wouldn't update the video bios. I cannot be 100% sure that my reflashing of the old bios completely undid whatever the new bios did. I updated the video bios last week and started experiencing these issues in the past few days. Like I've said, reflashing the old bios didn't resolve the issue, but I can't guarantee that the new bios didn't cause the problem.


You can't blame Asus for the card being bad if you flashed to a new BIOS and now the card's bad. It's never guaranteed that if you flash to a new BIOS your card will come out the other end OK. Any time you flash a BIOS, even on a motherboard, there is a certain amount of risk involved. You are more than welcome to be spooked off AMD and/or Asus, but that would spook me off flashing BIOSes if anything, not off the manufacturers. Since it's the first video card you have ever owned, it might have been prudent to skip the BIOS flashing, or to find an old card to try it on first, before you bricked your first card.


----------



## Devildog83

Quote:


> Originally Posted by *Scott1541*
> 
> I was looking on anandtech and it looks like it will give a nice performance boost


Depending on which one you get and how well it overclocks, it will be near, even with, or a bit better than a 760. Post a pic and your clocks as soon as you can, please.


----------



## Scott1541

Quote:


> Originally Posted by *Devildog83*
> 
> Depending on which one you get and how well it overclocks it will be near, even with or a bit better than a 760. Post a pick and your clocks as soon as you can please.


It's the MSI Gaming one. In the end it came down to the MSI Gaming vs. the Sapphire Vapor-X, because I could get them delivered next day from different retailers for more or less the same price. I've never had a Sapphire card before, so I went with the MSI one with BF4. If I get it tomorrow (won't be too happy if I don't) then I'll probably post pics/clocks in the evening some time; I have a fairly busy day at uni tomorrow.


----------



## [CyGnus]

Scott1541, you did OK; can't go wrong with MSI.









New Asus GPU Tweak 2.5.2: http://support.asus.com/download.aspx?SLanguage=en&m=gpu+tweak&os=30
Now the BIOS update can be done through the app.


----------



## Devildog83

Quote:


> Originally Posted by *Scott1541*
> 
> It's the MSI Gaming one. In the end it came down to the MSI gaming Vs the Sapphire Vapor-X because I could get them delivered next day from different retailers for more or less the same price. I've never had a Sapphire card before so I went with the MSI one with BF4. If I get it tomorrow (won't be too happy if I don't) then I'll probably post pics/clocks in the evening some time, I have a fairly busy day at uni tomorrow


Great to hear; if I seemed impatient I am sorry. Post your pics and clocks when it is convenient.


----------



## cremelo

Well, with the new GPU BIOS, after 20 minutes of play in BF4 (a little time on my lunch break) I had no problems, and none in Dirt 3 either; when I get off work I will test more and do some stress tests.
I received an email from Asus in which the seller informed me that if anything goes wrong the 2nd BIOS will be activated, and that if the problem is GPU Tweak or the BIOS I can send it for RMA, but if the update wasn't done through GPU Tweak there is no RMA.
How would they know whether or not I used GPU Tweak?


----------



## [CyGnus]

Just updated to the new BIOS; all is fine, no issues whatsoever.







I had some FPS drops in Skyrim; now it seems all is OK. I don't really play anything else at the moment, some BF4, but that didn't give me any issues.


----------



## ricko99

ASUS R9 280X DCII TOP here, factory clock and voltage. Tested the new BIOS with FurMark for about 30 min, AC4 for 1 hour, and BF4 for 1 hour; so far the artifact issue is gone.







(catalyst 13.25)


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *[CyGnus]*
> 
> Its about double or more of the performance of a 460


Quote:


> Originally Posted by *Scott1541*
> 
> I was looking on anandtech and it looks like it will give a nice performance boost


I went from dual 460s to a single 280X, so to put it simply: it will be a good bump for you, even going to the 270X.


----------



## crazysoccerman

Quote:


> Originally Posted by *Devildog83*
> 
> You can't blame Asus for the card being bad if you flashed to a new bios and now the cards bad. It's never guaranteed that if you flash to a new bios that your card will come out the other end OK. Any time you flash a bios, even on a motherboard, there is a certain amount of risk involved. You are more than welcome to be spooked from AMD and or Asus but that would spook me from flashing my bios if anything, not the manufacturers. Since it's the first video card you have ever owned it might have been prudent to let the bios flashing go and find an old one to try it on before you bricked your first card.


You make it sound like I was trying to hack into a mainframe or something. All I did was update the BIOS via Asus GPU Tweak.

And like I said, I'm not sure the new BIOS has anything to do with the problem. The random failures, even in the motherboard BIOS menu, seem more like a hardware issue.

And to answer an earlier question, the issue suddenly appeared. I was browsing the web when I suddenly got vertical stripes, and then the card stopped displaying altogether. Since then it has been difficult to change BIOS settings or boot into Windows without the display going out.


----------



## Jaffi

Quote:


> Originally Posted by *[CyGnus]*
> 
> Just updated to the new bios all is fine no issues what so ever
> 
> 
> 
> 
> 
> 
> 
> I had some FP drops in Skyrim now it seems all is OK dont really play anything else at the moment some BF4 but that did not gave me any issues.


Newest version is .41 bios, right?


----------



## [CyGnus]

Quote:


> Originally Posted by *Jaffi*
> 
> Newest version is .41 bios, right?


Yes


----------



## dmfree88

270X Giga 4GB in the mail now. Can't wait to get it!


----------



## Gabkicks

My 2 R9 280X DC2Ts that I ordered *A MONTH AGO* are supposed to ship tomorrow... hopefully they do.


----------



## Devildog83

Quote:


> Originally Posted by *crazysoccerman*
> 
> You make it sound like I was trying to hack into a mainframe or something. All I did was update the bios via asus gpu tweak.
> 
> And like I said, I'm not sure the new bios has anything to do with the problem. The random failures, even in the motherboard BIOS menu, seem more like a hardware issue.
> 
> And to answer an earlier question, the issue suddenly appeared. I was browsing the web when I suddenly got vertical stripes and then the card stopped displaying all together. Since then it has been difficult to change bios settings or boot into windows without the display going out.


You are right, I made it sound like it was absolutely a BIOS flash issue. Sorry about that. I would just hate to see you completely ditch AMD or Asus because of issues with one card. Even if it is hardware related and not the flashing, electronic hardware fails all of the time. Asus has as good a track record with AMD and Nvidia cards alike as any manufacturer out there. You will find, I am sure, that just as many Nvidia cards fail as AMD ones too. All I am saying is RMA the card and see what happens. If you still have issues then I could see bailing on them.


----------



## nubki11a

Quote:


> Originally Posted by *dmfree88*
> 
> 270x giga 4gb in the mail now. Cant wait to get it!


Quote:


> Originally Posted by *Gabkicks*
> 
> my 2 r9 280x DC2T i ordered *A MONTH AGO* are supposed to ship tomorrow... hopefully they do


Haha same here, should get mine Saturday!


----------



## Devildog83

Quote:


> Originally Posted by *dmfree88*
> 
> 270x giga 4gb in the mail now. Cant wait to get it!


Looking forward to having you here, dmfree88. You too, Gabkicks, you have more patience than I do. Congrats!! nubki11a, you too; soooo much patience.


----------



## dmfree88

Yeah, I'm still on top of the 7870, but I doubt the Giga will be a super good OCer like my Hawk. Maybe I'll get lucky though. This would be my first Gigabyte GPU. Hopefully it pairs well with the UD5.


----------



## crazysoccerman

Quote:


> Originally Posted by *Devildog83*


If I still have issues I will eat my foot. If I no longer have issues I will still stay away from ASUS and AMD, but I will suspect ASUS's board design/construction (specifically power delivery) is at fault, rather than AMD, who only manufactures the chip.

Regardless, I'll post the outcome here.


----------



## neurotix

Love this thread. You guys are awesome.

My 270X is now relocated to my sister's system, and my brother uses it for gaming.

Funny to see an 84% ASIC 270X @ 1250MHz on a $29 Biostar budget AM3 board with a Phenom II dual core in a $39 Raidmax case.

It really deserves better than that, it does. But I have something much better now.

It maxes out everything my brother has played... because the monitor is an old non-widescreen LCD at 1280x1024 xDDDD

I really think a 270 or 270X is probably the best card in its price bracket right now. I've even heard that miners really like them because they provide the best kHash-per-dollar-spent ratio.

Don't count out the 270. The 280X is great too. If I could, I think I'd get a Sapphire 280X Vapor-X... man, that backplate looks good. Not sure what I'd do with it besides bench it. And my girlfriend's rig already has a 7970 Vapor-X in it, but it has no backplate.
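The kHash-per-dollar comparison mentioned above is easy to sketch. Here's a minimal example; every hashrate and price in it is a made-up placeholder, not a real benchmark or market price:

```python
# Hypothetical kH/s-per-dollar ranking. All numbers below are
# illustrative placeholders, not real benchmarks or market prices.
def khash_per_dollar(khs: float, price_usd: float) -> float:
    """Scrypt hashrate earned per dollar of card cost."""
    return khs / price_usd

cards = {
    "R9 270":  (400.0, 180.0),   # (kH/s, USD) - assumed values
    "R9 270X": (440.0, 200.0),
    "R9 280X": (700.0, 400.0),   # mining-boom inflated price
}

# Sort cards by value for money, best first.
ranked = sorted(cards, key=lambda c: khash_per_dollar(*cards[c]), reverse=True)
for name in ranked:
    print(f"{name}: {khash_per_dollar(*cards[name]):.2f} kH/s per $")
```

With these placeholder numbers the plain 270 comes out on top, matching its "best kHash per dollar" reputation, but plug in your own local prices and measured hashrates.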


----------



## cremelo

After nearly 7 hours playing Battlefield 4 (which had the most artifacts) I had no problems. As soon as I get home from work I will start testing overclocks again, and hopefully this time there won't be any issues.


----------



## blueskybluesky1

Quote:


> Originally Posted by *blueskybluesky1*
> 
> Hello
> 
> A new VGA BIOS was released yesterday for ASUS Radeon R9 280X R9280X-DC2T-3GD5 - TOP Edition : https://www.asus.com/Graphics_Cards/R9280XDC2T3GD5/#support
> 
> _"R9280X-DC2T-3GD5 VBIOS update
> BIOS update to remove rare artifacting events during gaming. Please also update the Catalyst driver to 13.251 or above.
> File Size: 1.19 MBytes"_
> 
> Anyone tried it yet? I have the R9 280X TOP from Asus but the only problem with it was a few times in Battlefield 4 the sky was flickering (artifacts). But recovered after game restart. I am still not decided to update. I afraid it will ruin it more
> 
> 
> 
> 
> 
> 
> 
> 
> If somebody had more issues in games and updated the BIOS please share your feedback about improvements/or lets hope no other supplementary bugs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you. Also, ASUS R9 280X TOP owners if you want to share experience with this card I am open to any questions


I finally updated.







I used the .exe file from the Asus link. Strange: after you start the executable it directly starts to flash the BIOS without warning, unlike all other BIOS update procedures I've done on other cards or mobos. Anyway, I used this method because, as I said, on my rig Asus GPU Tweak freezes the system when searching for a new BIOS online.

The Battlefield 4 issues (artifacts) are all gone. Good job, Asus.









I will test more Blu-ray and MKV movies to see if I still have issues when using "Splash Player EX" on my LED TV through HDMI. There is a small flickering of contrast/brightness in semi-dark scenes sometimes. Note that GPU hardware acceleration is activated in the above-mentioned software.


----------



## Delphi

Okay, well I am having some Crossfire issues with my R9 270Xs. I won't get full usage in any game anymore. It was actually worse when I had the BIOSes overvolted; in FurMark I would get the same FPS with Crossfire disabled and enabled. Changing back to the stock BIOS fixed the FurMark issues. But Skyrim I would say is the worst; BioShock Infinite is not much better. I get much better FPS with Crossfire off than on in Skyrim. It never used to be like that. I have scrubbed the drivers tons of times using ATIman Uninstaller. Could it be that that program isn't doing the job correctly?

On a positive note, I got some really good settings in for mining; both cards are getting 470 kH/s.


----------



## Scott1541

My R9 270X is installed and running nicely, haven't had a chance to play any games yet though. Once I'm back from this next 2 hour lecture BF4 should be done downloading and I can play all night


----------



## mAs81

Hey guys,
I downloaded the 13.12 drivers/Catalyst version from AMD for my MSI Radeon R9 280X, and now my idle temps are about 4-5°C higher.
I haven't encountered any problems gaming though, even if I haven't had a long gaming session since I saw this.
Has anyone else got this?
Before:


Spoiler: Warning: Spoiler!






And now:


Spoiler: Warning: Spoiler!


----------



## rdr09

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys,
> I downloaded the 13.12 drivers/catalyst version from AMD for my R9 280X msi radeon,and now my idle temps are about 4-5c higher..
> I haven't encountered any problem gaming though..Even though I haven't had a long gaming session since i saw this..
> Has anyone else got this??
> Before:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> And now:
> 
> 
> Spoiler: Warning: Spoiler!


That's the reason I reverted back to 13.11. The .12 raised my 290's idle voltage just a tiny bit, along with the temp. 13.11 works well, so . . .


----------



## mAs81

Quote:


> Originally Posted by *rdr09*
> 
> that's the reason i reverted back to 13.11. the 12 raised my 290's idle voltage just a tiny bit along with the temp. 13.11 works well, so . . .


Did you notice any change while gaming? I thought about reverting back to 13.11, but I was waiting to see if it was just my card that had the problem...


----------



## Devildog83

More delays for BF4/W Mantle - http://www.extremetech.com/gaming/175429-amds-mantle-api-debut-postponed-again-as-battlefield-4-struggles-to-fix-bugs

Having said that, I am playing Battlefield 4 now and I have no issues with gameplay, even in X-Fire, which means the fixes for it are coming around, and that's more important to me. Someday soon we will know for sure what Mantle will bring; I hope it's not too long a wait though.


----------



## hamzta09

Quote:


> Originally Posted by *Devildog83*
> 
> More delays for BF4/W Mantle - http://www.extremetech.com/gaming/175429-amds-mantle-api-debut-postponed-again-as-battlefield-4-struggles-to-fix-bugs
> 
> Having said that, I am playing Battlefield 4 now and I have no issues even in X-Fire with game play which means the fixes to it are coming around and that's more important to me. Someday soon we will know for sure what Mantle will bring, I hope it's not too long a wait though.


What are people talking about?

http://www.sweclockers.com/nyhet/18236-amd-mantle-for-bf4-redo-innan-manadsskiftet


----------



## Scott1541

Here be my submission











GPU-Z


Spoiler: GPU-Z







It's the MSI R9 270X Gaming 2GB (the one that comes with BF4)


----------



## repo_man

Here are some photos of the new ASUS 270X up against my RMA'd HIS 270. The ASUS shipment was 4.8 lbs and the HIS was 2.2 lbs; clearly all the extra weight is in the heatpipes/cooler. Very impressed.









Also, one other question: I'm running ASUS' GPU Tweak program because I had some weird issues with Afterburner when I switched cards. Is there any way to get GPU Tweak to display the card temp in the system tray like Afterburner does?


----------



## Jaffi

No, not possible with GPU Tweak!


----------



## cremelo

Quote:


> Originally Posted by *repo_man*
> 
> Here's some photos of the new ASUS 270X up against my RMAd HIS 270. The ASUS shipment was 4.8lbs and the HIS was 2.2lbs. Clearly all the extra weight is in the heatpipes/cooler. Very impressed.
> 
> Also, one other question. I'm running ASUS' Gpu Tweak program because I had some weird issues with Afterburner when I switched cards. Is there any way to get Gpu Tweak to display the card temp in the system tray like Afterburner does?


No; I use MSI Afterburner to show temps in games and GPU Tweak to overclock :c


----------



## rdr09

Quote:


> Originally Posted by *mAs81*
> 
> Did you notice any change while gaming?I thought about reverting back to 13.11,but I was waiting to see if it was just my card that had that problem...


I never had a chance to use it in any game. Upon install I immediately checked my vcore, 'cause I noticed my temp went up by about 4°C. I had that happen to my 7970 on a previous (beta) driver, so I knew what was up.


----------



## RyanR

Hey, I was wondering if anyone knew what water blocks are compatible with the Gigabyte WindForce R9 270X, and whether the water block from the HD 7870/7850 is compatible, because I have read that it is but just want to be sure before I buy anything.

Thanks


----------



## crazysoccerman

In case anyone wanted to buy a 280x DC2T from Newegg, they're back in stock:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803

At the new price of $460. Insane.


----------



## Scott1541

Quote:


> Originally Posted by *crazysoccerman*
> 
> In case anyone wanted to buy a 280x DC2T from Newegg, they're back in stock:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803
> 
> At the new price of $460. Insane.


They are selling for pretty much the same price over here at the minute, and UK prices are normally a bit higher than US prices.


----------



## cremelo

Can someone with the Asus 280X DCU2 TOP give me some advice:
what maximum overclock did you reach on your 280X without changing the voltage?


----------



## mAs81

Well, I reverted back to Catalyst 13.11 (beta) and my idle temps are still the same (42-43°C)! What should I do? Do I take it down to the 13.9 that my card came with? I know the new BIOS that my MSI has results in more heat, but I don't think that's the issue here; after all, I haven't flashed it. Perhaps I'm being paranoid, but I'd really prefer to have my idle temps under 40°C. Has anybody else come across this problem?
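Rather than eyeballing screenshots, you can let GPU-Z log its sensors to a CSV file and compare average idle temps across driver versions with a short script. A minimal sketch; the column name and the sample logs here are assumptions, so match them to the header your log actually contains:

```python
# Compare average idle GPU temps from two sensor-log CSVs.
# The "GPU Temperature [C]" column name is an assumption - change it
# to whatever header your logging tool actually writes.
import csv
import io

def avg_temp(log_text: str, column: str = "GPU Temperature [C]") -> float:
    """Average of one temperature column over all logged samples."""
    rows = list(csv.DictReader(io.StringIO(log_text)))
    return sum(float(r[column]) for r in rows) / len(rows)

# Fabricated example logs for the two driver versions.
log_1311 = "Date,GPU Temperature [C]\n2014-01-28 10:00,38\n2014-01-28 10:01,38.5\n"
log_1312 = "Date,GPU Temperature [C]\n2014-01-29 10:00,42\n2014-01-29 10:01,43\n"

delta = avg_temp(log_1312) - avg_temp(log_1311)
print(f"Idle temp change: {delta:+.2f} C")
```

A consistent few-degree gap across many idle samples points at the driver (or a changed fan profile) rather than one noisy reading.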


----------



## dmfree88

Quote:


> Originally Posted by *mAs81*
> 
> Well I reverted back to Catalyst 13.11(beta) and still my idle temps are the same(42-43c)!What should i do??Do I take it down to 13.9 that my card came with?I know that the new Bios that my msi has,result in more heat,but I don't think that that's the issue here..After all I haven't flashed it.Perhaps I'm being paranoid here, but I'd really prefer to have my idle temps under 40c..Has anybody else came by this problem??


Are you sure the update didn't just revert your fan profile or change it?


----------



## [CyGnus]

Maybe your ambient temp changed... 5°C will not kill your card. Is the cooler clean (free of dust)?


----------



## mAs81

Quote:


> Originally Posted by *dmfree88*
> 
> Are you sure the update didnt just revert your fan profile or change it?


Well, when the card is idle I have my fans on auto, so about 20%. That didn't change. I don't mine bitcoins/litecoins, and I only OC it when I'm gaming.
Quote:


> Originally Posted by *[CyGnus]*
> 
> Maybe your ambient temp changed... 5ºc will not kill your card. Is the cooler clean (free of dust)?


I'm positive. This is a new system (I've had it for 4 months tops), and I cleaned it yesterday anyway, because I installed a new PSU.
I understand that the temps are low and won't burn my card, but still, why the sudden change?
Again, this is my card at idle:


Spoiler: Warning: Spoiler!


----------



## [CyGnus]

Drivers shouldn't affect temps; they never did for me. But strange things happen. Don't be too worried about it, but keep an eye on those temps.

What are your ambient temps?


----------



## mAs81

Quote:


> Originally Posted by *[CyGnus]*
> 
> Drivers should not affect temps, with me they never did but strange things happen dont be too worried about it but keep your eye on those temps.
> 
> What are your ambient temps?


They are about the same, nothing's changed, at least last time I checked. My other temps are the same, and better in fact. Together with the PSU I installed a new CPU cooler and fan controller. Anyway, thanks man, I'll keep an eye on my temps, and perhaps I'll do some heavy gaming on the weekend and see how it plays out.


----------



## staryoshi

I'd like to join the club









Picked up an Asus DCU II R9 270X. Haven't played with her too much yet, but she's a beauty - both in terms of performance and looks. You can't see it in the picture, but the heat pipes glisten with an enchanting purple hue because of my LED strip. Love it









Sorry for the crummy picture and disheveled cabling - my computer is still a WIP - I moved from a SST-PS07 to a 350D and it's getting purply all up in here.


----------



## bardacuda

Question:

Do some of the R9 270 models only have 1024 stream processors? Specifically the Gigabyte GV-R927OC-2GD?
I just ordered one, and after I confirmed payment I noticed that on Newegg it says 1024 stream processors instead of 1280!? Some models say 1280, others say 1024, yet they're all classified as R9 270s. ***?

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600481061&IsNodeId=1&name=Radeon%20R9%20270

Then according to the Gigabyte news at launch they say it has 1280 stream processors.

http://www.gigabyte.com/press-center/news-page.aspx?nid=1255

But the Guru3d news article at launch again says 1024 stream processors.

http://www.guru3d.com/news_story/gigabyte_radeon_r9_270_overclock_edition.html

Argh! WTH! Can't seem to find any confirmation one way or the other....maybe my Google-fu is weak. I hope I didn't just buy a card that's only 4/5ths what I thought I was buying. Please help meh.


----------



## jason387

Quote:


> Originally Posted by *bardacuda*
> 
> Question:
> 
> Do some of the 270 models only have 1024 stream processors? Specifically the Gigabyte GV-R927OC-2GD?
> I just ordered one and after I confirmed payment I noticed on newegg it says 1024 stream processors instead of 1280!? Some models say 1280, others say 1024, yet they're all classified as R9 270s. ***?
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600481061&IsNodeId=1&name=Radeon%20R9%20270
> 
> Then according to the Gigabyte news at launch they say it has 1280 stream processors.
> 
> http://www.gigabyte.com/press-center/news-page.aspx?nid=1255
> 
> But on the Guru3d news article at launch again it says 1024 stream processors
> 
> http://www.guru3d.com/news_story/gigabyte_radeon_r9_270_overclock_edition.html
> 
> Argh! WTH! Can't seem to find any confirmation one way or the other....maybe my Google-fu is weak. I hope I didn't just buy a card that's only 4/5ths what I thought I was buying. Please help meh.


I think that there are 2 models, the R9 270X and the R9 270X Boost.


----------



## bardacuda

Quote:


> Originally Posted by *jason387*
> 
> I think that there are 2 models, the R9 270X and the R9 270X Boost.


I'm talking about the R9 270 (non-X)


----------



## Devildog83

*staryoshi* has been added. Welcome!!! Nice purple people eater there. When you get your high overclocks, let me know and I will post them. Is that a Corsair H75? If so, how is it working for you?


----------



## Devildog83

Quote:


> Originally Posted by *bardacuda*
> 
> I'm talking about the R9 270 (non-X)


Very weird that the 2 reviews would list different shader counts for the very same card.

On the AMD website it says that both the 270 and 270X have 1280 shaders. I guess it's up to the manufacturers, but most of them should be full-spec. The 270 only uses one 6-pin connector instead of two and can't clock nearly as well, but it shouldn't be hugely different; kinda like a 7850 versus a 7870.
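For anyone weighing the shader-count question, here is a rough sketch of what it means for raw throughput. This is illustrative only: the ~925 MHz clock is an approximate reference value, and "2 ops per clock" is the usual rule of thumb for peak FP32 on GCN (one FMA per shader per clock), not a measured figure.

```python
# Rough peak FP32 throughput for a GCN card: shaders x 2 ops/clock x clock.
def gcn_gflops(shaders: int, clock_mhz: float) -> float:
    """Peak single-precision GFLOPS = shaders * 2 * clock in GHz."""
    return shaders * 2 * clock_mhz / 1000

full_270 = gcn_gflops(1280, 925)   # full-spec R9 270: 1280 shaders
cut_270 = gcn_gflops(1024, 925)    # the feared 1024-shader listing

print(full_270, cut_270, cut_270 / full_270)  # 2368.0 1894.4 0.8
```

The 0.8 ratio is exactly the "4/5ths" worry; real-game scaling would be smaller than that, since memory bandwidth and front-end are unchanged.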


----------



## staryoshi

Quote:


> Originally Posted by *Devildog83*
> 
> *staryoshi* has been added. Welcome !!! Nice purple people eater there. When you get your high overclocks let me know and I will post them. Is that a Corsair H75, if so how is it working for you?


It's a great little unit; however, mine is defective and creates excessive pump noise. I'm going to have to get in touch with Corsair in the near future (it's annoying because I just replaced an H100 that died on me after I switched cases). I can fold on my 3770K @ 4.2GHz and stay under 70C with a single intake fan at inaudible volume, so the performance is A-OK.

I have another AF140 coming in tomorrow, so it's going to get even purple-y-er


----------



## Devildog83

Quote:


> Originally Posted by *staryoshi*
> 
> It's a great little unit, however mine is defective and it creates excessive pump noise. I'm going to have to get in touch with Corsair in the near future (it's annoying because I just replaced a H100 that died on me after I switched cases). I can fold on my 3770K @ 4.2Ghz and stay under 70C with a single intake fan at inaudible volume, so the performance is A-OK.
> 
> I have another AF140 coming in tomorrow, so it's going to get even purple-y-er


That's kind of a bummer; Corsair will take care of you though. I bought my stepson one for his mITX build with a 2500K and he says it works fine so far. No overclocking or heavy usage there though.


----------



## staryoshi

Quote:


> Originally Posted by *Devildog83*
> 
> That's kind of a bummer, Corsair will take care of you though.I bought my stepson one for his MITX build with a 2500k and he says it works fine so far. No overclocking or heavy usage there though.


I've always had good experiences with Corsair support; I'd expect this to be no different. I wouldn't keep buying their products otherwise.


----------



## Tugz

Officially switched from my 2x Sapphire 7950s to 2x MSI R9 280Xs just today =)





3DMark Firestrike 280x's results
http://www.3dmark.com/fs/1629213

3DMark Firestrike 7950's results
http://www.3dmark.com/fs/1538618

Huge improvement/jump from my 7950s. These cards are dope!!!


----------



## GuestVeea

Hello all. Quick question: if I'm mining around 8 hrs/day on my R9 270X, how much will that decrease the life of my card? It doesn't get above 55°C, but it is under 95-100% load. Should I mine less? Or maybe get a 7770 or something cheap dedicated to mining?


----------



## hyp3rtraxx

Quote:


> Originally Posted by *Tugz*
> 
> Officially Switched from my 2 x Sapphire 7950's to 2 x MSI R9 280x just today =)
> 
> 
> 
> 
> 
> 3DMark Firestrike 280x's results
> http://www.3dmark.com/fs/1629213
> 
> 3DMark Firestrike 7950's results
> http://www.3dmark.com/fs/1538618
> 
> Huge improvement/jump from my 7950s. These cards are dope!!!


Off topic, but I just have to say I really like your rig! The case and cable management and such look really nice!


----------



## neurotix

I agree, that's a dope looking system.

The huge side panel window is cool. What case is that?


----------



## cremelo

Well, I find it strange that 3DMark did not report the overclock on my GPU...
http://www.3dmark.com/3dm/2333540


----------



## Tugz

Quote:


> Originally Posted by *hyp3rtraxx*
> 
> Offtopic, i just got to say i really like your rig! The case and cablemanagement and such looked really nice!


Thank you!
Quote:


> Originally Posted by *neurotix*
> 
> I agree, that's a dope looking system.
> 
> The huge side panel window is cool. What case is that?


Thanks, the case is a Corsair Carbide Air 540.


----------



## kersoz2003

AMD communicated that:

"Mantle performance for the AMD Radeon HD 7000/HD 8000 Series GPUs and AMD Radeon, R9 280X and R9 270X GPUs will be optimized for BattleField 4 in future AMD Catalyst releases. These products will see limited gains in BattleField 4 and AMD is currently investigating optimizations for them as they are the next highest priority products."


----------



## kersoz2003

Quote:


> Originally Posted by *kersoz2003*
> 
> AMD communicated that:
> 
> "Mantle performance for the AMD Radeon HD 7000/HD 8000 Series GPUs and AMD Radeon R9 280X and R9 270X GPUs will be optimized for Battlefield 4 in future AMD Catalyst releases. These products will still see gains in Battlefield 4 with the 14.1 driver, but AMD is currently investigating full optimizations for them as they are the next highest priority products."


----------



## cremelo

Quote:


> Originally Posted by *kersoz2003*
> 
> AMD communicated that:
> 
> "Mantle performance for the AMD Radeon HD 7000/HD 8000 Series GPUs and AMD Radeon, R9 280X and R9 270X GPUs will be optimized for BattleField 4 in future AMD Catalyst releases. These products will see limited gains in BattleField 4 and AMD is currently investigating optimizations for them as they are the next highest priority products."


Already updated my BF4; now I'm just waiting on the AMD driver.


----------



## Dortheleus

Hi gang, I'm thinking of buying an R9 270X to put in my next Mini-ITX build, and I was wondering if anyone knew of a full water block for the 270X with an 8" PCB?

Cheers,


----------



## Devildog83

*Tugz* has been added, welcome!! That is a very nice rig. Y'all should check out the new Corsair 760T coming out; full window and great for water cooling too. Very adjustable.

That sucks about Mantle for the 7000, 270X and 280X cards. I was really looking forward to seeing that jump. Since they are all Mantle-ready, I am sure AMD will figure something out.

https://www.youtube.com/watch?v=TSBIu-_ImOI


----------



## hamzta09

http://www.sweclockers.com/nyhet/18247-battlefield-4-uppdateras-med-mantle-dice-slapper-testresultat

Mantle, with test results from DICE.


----------



## nubki11a

Quote:


> Originally Posted by *kersoz2003*
> 
> AMD communicated that:
> 
> "Mantle performance for the AMD Radeon HD 7000/HD 8000 Series GPUs and AMD Radeon, R9 280X and R9 270X GPUs will be optimized for BattleField 4 in future AMD Catalyst releases. These products will see limited gains in BattleField 4 and AMD is currently investigating optimizations for them as they are the next highest priority products."


Limited gains, does that mean Mantle's performance boost is lower than expected?


----------



## kersoz2003

Quote:


> Originally Posted by *nubki11a*
> 
> Limited gains, does that mean Mantle's performance boost is lower than expected?


No.

Only on the first 14.1 driver will the HD 7900, 280X and 270X series see limited gains; with 14.2 or whatever comes next, they will get the full optimizations.


----------



## kersoz2003




----------



## Devildog83

Here is more info -

http://battlelog.battlefield.com/bf4/forum/threadview/2979150493815503479/

and this from frostbite team -

http://battlelog.battlefield.com/bf4/news/view/bf4-mantle-live/


----------



## crazysoccerman

"CPU: AMD FX-8350, 8 cores @ 4 GHz
GPU: AMD Radeon 7970 3 GB

64 players on the large Battlefield levels is really demanding of the CPU so this test case is primarily CPU-bound... 25.1% faster"

I'm amazed an 8-core CPU @ 4 GHz with a single 7970 can still be CPU-bound.

It sounds like the performance increase will be negligible unless your performance was bottlenecked by the CPU.


----------



## gibby1690

Quote:


> Originally Posted by *kersoz2003*
> 
> no..
> 
> only for the first 14.1 driver hd 7900 and 280x and 270x series will gain limited performance , but with 14.2 or whatever , It will gain full


What are these 14.1 and 14.2 you're talking about? I thought drivers were something you updated?


----------



## Devildog83

Quote:


> Originally Posted by *gibby1690*
> 
> what is this 14.1 and 14.2 your talking about? i though drivers were something you updated?


Well, kinda. Right now the highest driver you can get is 13.12; 14.1 will be coming very soon. You uninstall the current driver and install the newer one, so you are updating by changing to the newer version, more or less. It was recommended to me to use a driver sweeper to make sure nothing is left behind from the old driver when you install the new one.


----------



## neurotix

Since I don't play BF4, I'm more excited for Thief, which is supposed to use Mantle.

Also, for some reason I'm really excited for TrueAudio considering my 290 has it. Since I have a relatively nice pair of budget headphones I use for gaming (Turtle Beach z22) I'm excited to see if it works with them, and makes gaming more immersive.

Additionally, it seems like Mantle scales really well the better the system you have. 78 vs 121 fps is a huge improvement. I plan on getting another 290 eventually with mining money so anything that would improve fps in a Crossfire setup sounds excellent to me.

CUDA (which doesn't matter because AMD cards are still twice as strong in compute) and PhysX (which almost no games use) vs Mantle and TrueAudio.... I'll take the latter


----------



## hamzta09

https://twitter.com/AMDGaming/status/428958040046702592


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Since I don't play BF4, I'm more excited for Thief, which is supposed to use Mantle.
> 
> Also, for some reason I'm really excited for TrueAudio considering my 290 has it. Since I have a relatively nice pair of budget headphones I use for gaming (Turtle Beach z22) I'm excited to see if it works with them, and makes gaming more immersive.
> 
> Additionally, it seems like Mantle scales really well the better the system you have. 78 vs 121 fps is a huge improvement. I plan on getting another 290 eventually with mining money so anything that would improve fps in a Crossfire setup sounds excellent to me.
> 
> CUDA (which doesn't matter because AMD cards are still twice as strong in compute) and PhysX (which almost no games use) vs Mantle and TrueAudio.... I'll take the latter


One of the main things about TrueAudio is that it takes the audio load off of the CPU and moves it to the GPU, which helps CPU performance, especially in games that are heavily CPU-bound, along with giving a better audio experience.


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> One of the main things about TrueAudio is that it takes the audio load off of the CPU and moves it to the GPU, which helps CPU performance, especially in games that are heavily CPU-bound, along with giving a better audio experience.


That's awesome. I didn't know that. Makes me want it even more.


----------



## mtcn77

Quote:


> Originally Posted by *crazysoccerman*
> 
> "CPU: AMD FX-8350, 8 cores @ 4 GHz
> GPU: AMD Radeon 7970 3 GB
> 
> 64 players on the large Battlefield levels is really demanding of the CPU so this test case is primarily CPU-bound... 25.1% faster"
> 
> I'm amazed an 8 core cpu @ 4 ghz with a single 7970 can still be cpu bound.
> 
> It sounds like the performance increase will be negligible unless your performance was bottlenecked by the cpu.


It's the communication layer between the CPU and the GPU that causes the bottleneck, which is the reason Mantle had to be made. When the GPU has to sit and wait for the CPU to validate and submit each batch of work, cutting that per-submission overhead gives a real speedup that has nothing to do with the raw power of either chip.
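A toy model of that idea (all numbers below are made up for illustration, not measurements of any real API): the frame takes as long as the slower of the two sides, so shrinking per-draw-call CPU cost helps even with the same GPU.

```python
# Toy model of a CPU-bound frame: the CPU submits draw calls at some per-call
# cost while the GPU renders in parallel; the frame is gated by the slower side.
def frame_time_ms(draw_calls: int, cpu_cost_us: float, gpu_time_ms: float) -> float:
    cpu_time_ms = draw_calls * cpu_cost_us / 1000
    return max(cpu_time_ms, gpu_time_ms)

# Thick API: 5000 calls at 4 us each -> 20 ms of CPU work, the GPU idles.
thick = frame_time_ms(5000, 4.0, 12.0)
# Thinner, Mantle-style submission at 2 us per call -> GPU-bound at 12 ms.
thin = frame_time_ms(5000, 2.0, 12.0)

print(thick, thin)  # 20.0 12.0 -> a ~67% FPS gain with the same GPU
```

That is why DICE's CPU-bound 64-player case shows the largest gains: the GPU was never the limit there.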


----------



## Tugz

Quote:


> Originally Posted by *Devildog83*
> 
> *Tugz* has been added, Welcome!! That is a very nice rig. Ya'll should check out the new Corsair 760T coming out, full window and great for water cooling too. Very adjustable.
> 
> That sucks about Mantle for the 7000, 270x and 280x cards. I was really looking forward to seeing that jump. Since that are all Mantle ready I am sure they will figure something out.
> 
> https://www.youtube.com/watch?v=TSBIu-_ImOI


Thanks! That case looks beautiful! I might have to do another build with that case.


----------



## Devildog83

Quote:


> Originally Posted by *Tugz*
> 
> Thanks! That case looks beautiful! I might have to do another build with that case.


I am thinking of doing a 4770K build (I know that might make some of my AMD friends upset) in black and gold with one of the new Asus Z87 boards. I am already wondering how I am going to get gold lights.









Won't go to Nvidia graphics though; AMD there, all the way. But I want to try Mac OS and it's kinda tough on an AMD machine, plus I could pit the two against each other and see who comes out on top.

The 760T might be a good place to start.


----------



## gibby1690

Quote:


> Originally Posted by *Devildog83*
> 
> Well kinda, right now the highest driver you can get is 13.12, 14.1 will be coming very soon. You uninstall the current driver and install the newer so you are updating by changing to the newer one, more or less. It was recommended to me to get a driver sweeper to make sure you do not have anything left behind from the old driver when you install the new one.


Cool, cool, so it's nothing to do with the card then? The way I was reading it, it sounded like only certain R9 series cards were going to work with Mantle, and seeing as I will be buying an R9 series card soon, I thought there was something I was missing while looking at what kind of card to get.


----------



## Devildog83

Quote:


> Originally Posted by *gibby1690*
> 
> Cool, cool, so it's nothing to do with the card then? The way I was reading it, it sounded like only certain R9 series cards were going to work with Mantle, and seeing as I will be buying an R9 series card soon, I thought there was something I was missing while looking at what kind of card to get.


Mantle will work with any card that has GCN Architecture, including most 7000, R7 and R9 series cards. The big question will be how much of a boost on what cards. Looks like the higher end cards like the 290/290x will get the most boost for now.


----------



## Delphi

After doing many different things with my computer to try and stop ATI-related driver BSODs, I have narrowed the cause to BIOS modding. At least with the HIS ROM, it doesn't like it. I may try an Asus or MSI ROM to see how it goes, but as it stands I have had no luck increasing the voltage on the HIS ROM for these cards.


----------



## Jaydev16

Hey, does anyone know if I'll get a game bundled with the PowerColor Devil R9 270X? Seems like a very good value-for-money card.


----------



## Devildog83

Quote:


> Originally Posted by *Delphi*
> 
> After doing many different things with my computer to try and stop ati related driver BSOD I have narrowed the cause to bios modding. At least with the HIS rom it doesn't like it. I may try an Asus or MSI rom to see how it goes but as it stands I have had no luck with increasing the voltage on the HIS rom for these cards.


Good luck boss, BIOS modding scares the heck outa' me. I do not have the $ to go there and don't want to make PowerColor pay for something I did by RMA'ing a card. It seems to work great for a ton of folks, but I am shiverin' in my slippers.


----------



## Jaydev16

I just noticed the name of the thread starter and his name on the list of 270X owners. So, is the card good? And what about the bundled mousepad? And did you get any game with it when you bought it? Please answer.


----------



## Devildog83

Quote:


> Originally Posted by *Jaydev16*
> 
> I just noticed the name of the thread starter and his name on the list of 270X owners.So,is the card good?And what about the bundled mousepad?And did you get any game with it when you bought it?Please answer.


I love the Devil cards, they overclock pretty well, perform great and look awesome. The mouse pad is huge but I like it and no I did not get any games with it.


----------



## Jaydev16

Thanks! No games, but a free mousepad nonetheless; that kinda makes up for it, I guess. Yeah, it looks killer and offers something like HD 7950 performance at around half the cost.
PS: I'm planning to order it from Amazon to India. Any tips?


----------



## Devildog83

Quote:


> Originally Posted by *Jaydev16*
> 
> Thanks! No games, but a free mousepad nonetheless; that kinda makes up for it, I guess. Yeah, it looks killer and offers something like HD 7950 performance at around half the cost.
> PS: I'm planning to order it from Amazon to India. Any tips?


The 270X Devil should get very close to 1600 on the memory, but the core will most likely be limited to around 1230 or 1235. Kinda weird, as the 7870 Devil is the opposite: 1275 core and 1500 memory. But nonetheless it will perform great there. I keep the pair at 1200/1400 24/7 and clock them up a bunch when benching. Other than that, enjoy.

Hopefully the 14.1 drivers will be out by the time you get it so you wont have to update.


----------



## Jaydev16

What update?


----------



## Devildog83

This is a ton off topic girls and boys but I am getting extremely pumped up about my *Seahawks* in the SuperBowl. #gohawks #LOB


----------



## Devildog83

Quote:


> Originally Posted by *Jaydev16*
> 
> What update?


There is a new driver update coming out soon with Mantle support and HSA, plus some bug fixes. The 13.12 is good, but the new one should be better.


----------



## Krusher33

Quote:


> Originally Posted by *Devildog83*
> 
> This is a ton off topic girls and boys but I am getting extremely pumped up about my *Seahawks* in the SuperBowl. #gohawks #LOB


I put $65 down on the Seahawks because I feel they're the better well-rounded team. Denver's defense just doesn't seem to make the best adjustments at halftime, so Peyton and the O are having to work extra hard to keep their team ahead of the other team.

BTW, I'm just following the thread because I have 2 Dual-X 280Xs, but they're only mining at the moment. I don't feel like it's appropriate to join the club since they're not really being used for gaming.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Krusher33*
> 
> I put $65 down for the Seahawks because I feel they're a better well rounded team. Denver's defense just doesn't seem like they make the best adjustments at halftime and so Peyton and O is having to work extra hard to keep his team ahead the other team.
> 
> BTW I'm just following thread because I have 2 Dual X 280X's but they're only mining at the moment. I don't feel like it's appropriate to join the club since they're not really being used for gaming.


Mining, gaming... pegging GPU usage at 99% is all the same to me; one is work, the other is pleasure.


----------



## Devildog83

Quote:


> Originally Posted by *Krusher33*
> 
> I put $65 down for the Seahawks because I feel they're a better well rounded team. Denver's defense just doesn't seem like they make the best adjustments at halftime and so Peyton and O is having to work extra hard to keep his team ahead the other team.
> 
> BTW I'm just following thread because I have 2 Dual X 280X's but they're only mining at the moment. I don't feel like it's appropriate to join the club since they're not really being used for gaming.


It doesn't matter to me as long as you have them.

Good call. I think Manning will get some yards, but getting into the end zone on this Seattle D is a whole other matter. Beast Mode will roll, Wilson will run for 75, and Percy Harvin added to the mix will let Seattle hold the ball too long for Denver to score a lot of points. I see 31-20 Seattle.


----------



## staryoshi

Sorry guys, but without the home field my Broncos are going to make the Hawks very unhappy. Broncos have one of the best run D's and Wilson isn't a good enough passer/ doesn't have the weapons to out-gun Manning. By the second half Manning will start to pick apart your D.

Also, the Direct CU R9 270X has been nothing but a joy to work with. I can't recommend it enough


----------



## Devildog83

Quote:


> Originally Posted by *staryoshi*
> 
> Sorry guys, but without the home field my Broncos are going to make the Hawks very unhappy. Broncos have one of the best run D's and Wilson isn't a good enough passer/ doesn't have the weapons to out-gun Manning. By the second half Manning will start to pick apart your D.
> 
> Also, the Direct CU R9 270X has been nothing but a joy to work with. I can't recommend it enough


Here's the problem with that theory: if anyone thinks Manning is just going to go crazy all over Seattle's D, they are nuts. Just because they are not household names doesn't mean Wilson doesn't have weapons. Denver has not faced a running game like this, or a D anywhere near as good as Seattle's, all year, and Seattle has played better defenses in just the last 6 weeks than Denver has faced for most of the year. I think Denver is in for a huge surprise when they take the field. Watching film cannot prepare them for what is coming. Seattle will get pressure with 4 guys, which will put Manning where he doesn't like to be: uncomfortable in the pocket.


----------



## Civrock

Update by AMD Gaming on Facebook a minute ago...
Quote:


> Mantle Update: Our driver team is still working on putting the final touches on the Catalyst 14.1 Beta. We will keep you updated on our progress.


----------



## Devildog83

Quote:


> Originally Posted by *Civrock*
> 
> Update by AMD Gaming on Facebook a minute ago...


Coming soon to an AMD card near you, *More Power.*


----------



## hamzta09

Quote:


> Originally Posted by *Civrock*
> 
> Update by AMD Gaming on Facebook a minute ago...


They said that yesterday on Twitter.
Exact same phrasing. Don't trust Facebook; it's always behind.

Anyway, the Star Swarm benchmark is out on Steam now.


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> They said that yesterday on twitter.
> Exact phrasing, dont trust Facebook, its always behind.
> 
> Anyway the starswarm benchmark is out on steam now.


If you have tried the Starswarm benchmark could you post a pic of what it looks like? Thanks.


----------



## hamzta09

Quote:


> Originally Posted by *Devildog83*
> 
> If you have tried the Starswarm benchmark could you post a pic of what it looks like? Thanks.






Haven't tried it yet, no.

But this is how they list the perf/settings.
(Not mine)


Spoiler: Warning: Spoiler!



Version 0.95
01/31/2014 16:45
===========================================================

== Hardware Configuration =================================
GPU: AMD Radeon HD 7800 Series
CPU: AuthenticAMD
AMD Phenom(tm) II X4 955 Processor
Physical Cores: 4
Logical Cores: 4
Physical Memory: 8503504896
Allocatable Memory: 140737488224256
===========================================================

== Configuration ==========================================
API: DirectX
Scenario: ScenarioAttract.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Motion Blur Frame Time: 16
Motion Blur InterFrame Time: 2
Detailed Frame Info: Off
===========================================================

== Results ================================================
Test Duration: 120 Seconds
Total Frames: 2077

Average FPS: 17.29
Average Unit Count: 3527
Maximum Unit Count: 5319
Average Batches/MS: 531.89
Maximum Batches/MS: 844.14
Average Batch Count: 37927
Maximum Batch Count: 209881
===========================================================
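Two quick arithmetic checks on the results block above (figures taken straight from the log; the 17.29 average is the benchmark's own number):

```python
# Average FPS should be roughly total frames / test duration.
total_frames = 2077
nominal_duration_s = 120

print(round(total_frames / nominal_duration_s, 2))  # 17.31, close to the logged 17.29

# The logged 17.29 implies the timed run was a fraction of a second over 120 s.
implied_duration_s = total_frames / 17.29
print(round(implied_duration_s, 2))  # ~120.13 s
```

So the log is internally consistent; the tiny gap between 17.31 and 17.29 is just the benchmark timing slightly more than the nominal 120 seconds.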


----------



## staryoshi

I'll save my retort for another forum, lest we stray too far off topic









I'm pretty impressed with how well the 270X folds. I'm pulling about 60K PPD at the factory OC.


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> 
> 
> 
> 
> Havent tried it yet no.
> 
> But this is how they list the perf/settings.
> (Not mine)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Version 0.95
> 01/31/2014 16:45
> ===========================================================
> 
> == Hardware Configuration =================================
> GPU: AMD Radeon HD 7800 Series
> CPU: AuthenticAMD
> AMD Phenom(tm) II X4 955 Processor
> Physical Cores: 4
> Logical Cores: 4
> Physical Memory: 8503504896
> Allocatable Memory: 140737488224256
> ===========================================================
> 
> == Configuration ==========================================
> API: DirectX
> Scenario: ScenarioAttract.csv
> User Input: Disabled
> Resolution: 1920x1080
> Fullscreen: True
> GameCore Update: 16.6 ms
> Bloom Quality: High
> PointLight Quality: High
> ToneCurve Quality: High
> Glare Overdraw: 16
> Shading Samples: 64
> Shade Quality: Mid
> Motion Blur Frame Time: 16
> Motion Blur InterFrame Time: 2
> Detailed Frame Info: Off
> ===========================================================
> 
> == Results ================================================
> Test Duration: 120 Seconds
> Total Frames: 2077
> 
> Average FPS: 17.29
> Average Unit Count: 3527
> Maximum Unit Count: 5319
> Average Batches/MS: 531.89
> Maximum Batches/MS: 844.14
> Average Batch Count: 37927
> Maximum Batch Count: 209881
> ===========================================================


That looks like it will tax the heck out of your CPU.


----------



## hamzta09

Quote:


> Originally Posted by *Devildog83*
> 
> That looks like it will tax the heck out of your CPU.


Hence why Mantle is coming, to reduce that stress and put it on the GPU!


----------



## gibby1690

Quote:


> Originally Posted by *Devildog83*
> 
> Mantle will work with any card that has GCN Architecture, including most 7000, R7 and R9 series cards. The big question will be how much of a boost on what cards. Looks like the higher end cards like the 290/290x will get the most boost for now.


Yeah, I knew those cards would work; it must just have been the way I was reading it. It sounded to me like the cards had a driver built in as well, and seeing as I'd never seen this before it got me a bit confused, but all good now.

I'm hopefully going to get a 280X, not sure which kind yet, or if money's tight it will be the 270X Toxic, which seems to be a very good card for the price.

But preferably the 280X.


----------



## [CyGnus]

gibby1690, go for the 280X; the 270X is almost the same as your 7870, kind of a sidegrade.


----------



## Krusher33

Well then, here they are, mining away in my cave. They're both hashing at about 700-710 kH/s. Slot 0 is at about 74C and slot 1 is at 62C with fans at 75%. Somewhat loud, but nowhere near how loud my 290X was with its blower-type cooler at 65%.

Might add a 3rd later this year depending on how the market is.
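For anyone comparing rigs, the combined rate works out as follows. Per-card figures are from the post above; the third-card number is a hypothetical midpoint estimate, not a measurement.

```python
# Combined scrypt hashrate for the two Dual-X 280Xs (700-710 kH/s each).
per_card_khs = [700, 710]
total_khs = sum(per_card_khs)

print(total_khs, total_khs / 1000)  # 1410 kH/s, about 1.41 MH/s

# Hypothetical third card at the 705 kH/s midpoint of the observed range.
with_third = total_khs + 705
print(with_third)  # 2115 kH/s
```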


----------



## gibby1690

Quote:


> Originally Posted by *[CyGnus]*
> 
> gibby1690 go for the 280x the 270x is almost the same as your 7870 kind of side grade


Haha, I don't have the 7870 any more, so it's kind of an upgrade either way, lol. But yeah, the 280X is top of my wish list lol


----------



## hamzta09

Update (Jan 31): At the last minute AMD identified an installation issue in the Catalyst 14.1 driver that renders it unsuitable for distribution. This delays the entire launch as we haven't been able to run any tests with the beta software yet and public release is also expected to be delayed by at least a few more days. As soon as we can get our hands on the proofed driver, we'll be reporting back with our own benchmarks.


----------



## Krusher33

Waa waa waa.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Update (Jan 31): At the last minute AMD identified an installation issue in the Catalyst 14.1 driver that renders it unsuitable for distribution. This delays the entire launch as we haven't been able to run any tests with the beta software yet and public release is also expected to be delayed by at least a few more days. As soon as we can get our hands on the proofed driver, we'll be reporting back with our own benchmarks.


Boo to the news that is.


----------



## neurotix

Hey, this is why I didn't install the leaked Toshiba 13.35 driver. Yes, I know it doesn't have Mantle, and I don't know if it has TrueAudio either, but I didn't want to risk my system on a beta driver from a shifty source.

Give them time to fix it. It will be ready when it's ready. I'd rather wait and not have issues, like potential black screens or something, than get the driver now and have it cause problems.

I'm really excited too but I don't want a broken driver.


----------



## nubki11a

I'll be receiving my 280X tomorrow morning if everything goes according to plan. But I want to compare the 3DMark11 score of my current card to my new one. I'd also like to see how my 280X's OC will compare to others, which benchmark should I use for that and with what settings?


----------



## Devildog83

Quote:


> Originally Posted by *nubki11a*
> 
> I'll be receiving my 280X tomorrow morning if everything goes according to plan. But I want to compare the 3DMark11 score of my current card to my new one. I'd also like to see how my 280X's OC will compare to others, which benchmark should I use for that and with what settings?


Unigine Valley is a good one. You will want to set it to Extreme HD.


----------



## crazysoccerman

Litecoin needs an ASIC to crush the average joe miners so there will be less talk about mining and more about gaming.

Go trade forex if you want a risky investment and leave the video cards to the gamers (and those that fold for science).


----------



## nubki11a

Just received my 280X Vapor-X









More than doubled my FPS in Heaven, 3DMark11 and Furmark, all while staying under 65C









Gonna try overclocking it now; any ideas on how to start / what programs to use?

Also, should I just use the 13.12 drivers from the OP or install Sapphire's own drivers (from the CD)?

Pics:


----------



## [CyGnus]

Install the latest ones, 13.30, from www.guru3d.com


----------



## gibby1690

Quick question:

Is my XFX 550W PSU going to be able to power a 280X? Non-CrossFire.

I was checking out the 280X Toxic but then discovered that it has 2x 8-pin connectors; I only have 1x 8-pin and 1x 6-pin.

So I checked out the Vapor-X from the posts above, and from what I've seen that listed a 700W minimum PSU.

I also looked at the Gigabyte one, where it said "PSU Limitation 600W".

So am I going to be able to use one of these cards?

I presume that limitation is the max it can use, but I'm not sure.


----------



## [CyGnus]

gibby1690, yes, it will work fine; the XFX 550W can power a 290X no problem. About the connectors: if needed, use the bundled adapter to get 2x 8-pins.


----------



## gibby1690

Ah, does the Toxic come with an adapter to convert the connections?


----------



## [CyGnus]

It should; all my cards (Gigabyte/Asus) have the Molex-to-8-pin adapter included, so Sapphire will be no different.


----------



## gibby1690

I thought they were for connecting to an older system or something; the 7870 I had was 2x 6-pin, so I had no worries.

I probably won't go with the Toxic anyway, but it's good to know.


----------



## mAs81

Took some benchmarks today.
At stock 1020/1500 (max temps 70C, fan profile on auto):


Spoiler: Warning: Spoiler!


At 1150/1700 (max temps 74C, fan profile on auto):


Spoiler: Warning: Spoiler!


Even though my idle temps have risen (for some reason), I think my temps under load are pretty much okay, considering the fans are on auto.
Now, the results could be better, could be worse, I guess... but overall I think I'm okay. What do you think?


----------



## Devildog83

Quote:


> Originally Posted by *mAs81*
> 
> Took some benchmarks today..
> At stock 1020/1500(max temps 70c/fan profile auto)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> At 1150/1700(max temps 74c/fan profile on auto)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Even though my Idle temps have been raised(for some reason),I think that my temps under load are pretty much okay,considering the fans being on auto..
> Now the results,could be better,could be worse I guess..But overall I think I'm okay....What do you think??


Yes, I think you are fine.


----------



## Devildog83

With bated breath we have been waiting: *nubki11a has finally been added*







Welcome to the club. Congrats on the Vapor-X. I just posted the stock clocks according to Newegg. When you get it overclocked, post it and I will add it.


----------



## mAs81

Quote:


> Originally Posted by *Devildog83*
> 
> Yes, I think you are fine.


Thanks, man.
I reverted back to the 13.11 beta, but I think I'll go to 13.12 now since there's no change in temps. Doesn't hurt having the latest drivers.


----------



## Devildog83

Quote:


> Originally Posted by *mAs81*
> 
> Thanks man..
> I reverted back to 13.11 beta , but I think I'll go 13.12 now since there's no change in temps..Doesn't hurt having the latest drivers..


Any day now there will be an official release of the 14.1 drivers; they are a bit delayed but will be here eventually. If your card is running well, I would just leave it until then, because you will want the new one for sure. If you have a good reason to try something else, then go for it. Did you try reapplying the thermal paste yet?

One more question: are you using Afterburner, and if so, do you have "force constant voltage" selected? If so, I would un-check it and see if the idle temps go down.


----------



## mAs81

Quote:


> Originally Posted by *Devildog83*
> 
> Any day now there will be an official release of the 14.1 drivers, they are a bit delayed but will be here eventually. If your card is running well I would just leave it until then because you will want to get the new one for sure. If you have a good reason to try something else then go for it. Did you try reapplying the thermal paste yet?
> 
> One more question, are you using afterburner and if so do you have "force constant voltage selected"? If so I would un-check it and see if the idle temps go down.


Well, yes, I am using Afterburner and I have that option unchecked. As for the drivers, you're right; perhaps I'll try the 13.30 beta for now just to see what it does.
To tell you the truth, the only time my idle temps were below 40C was with the "stock" 13.9 drivers the card came with, now that I've double-checked. But then again, drivers don't affect temps, right?
Perhaps I'm being paranoid, I know. It's just that I have some bad memories from my last OC'ed system (LGA 775, Asus 5770) which was fried (the M/B fried, that is; my 5770 was always hot, so I was never worried







) This system is the first rig I built myself from scratch, so that is why I'm acting like a noob








As for the thermal paste, I'll leave it as a last resort since no major problems have occurred. Thanks again for your input, man; I appreciate it.


----------



## Devildog83

Quote:


> Originally Posted by *mAs81*
> 
> Well yes,I am using afterburner and I have that option unchecked..As for the drivers you're right..Perhaps I'll try the 13.30 beta for now just to see what it does..
> To tell you the truth,the only time my idle temps were below 40c were with the "stock" 13.9 drivers that the card came with,now that I double-checked..But then again drivers don't conflict with temps,right?
> Perhaps I'm being paranoid,I know..It's just that I have some bad memories from my last OC'ed system(LGA 775-Asus 5770)which was fried(the M/B fried that is,my 5770 was always hot,so I never was worried
> 
> 
> 
> 
> 
> 
> 
> ) This system is the first rig I did myself from scratch so that is why I'm acting like a noob
> 
> 
> 
> 
> 
> 
> 
> 
> As for the thermal paste I'll leave it as a last resort since no major problems have occurred..Thanks again for your input man,I appreciate it..


I still think it's just fine; mine are idling now at 30C or so, but I have the auto fan set to 40%. I can't hear them, so I don't mind them there. What do you use for monitoring? I use HWiNFO64; it does me well.

Notice here I do have a VRM temp on my second card that is 41C, but all else is low.


----------



## mAs81

Quote:


> Originally Posted by *Devildog83*
> 
> I still think it's just fine, mine are idling now at 30c or so but I have the auto fan set to 40%, I can't hear them so I don't mind them there. What do you use for a monitor. I use HWinfo64, does me well.
> Notice here I do have a VRM temp on my second card that is 41c but all else is low.


I use HWiNFO64 too









Spoiler: Warning: Spoiler!






I only adjust my fan speed when gaming. Around 50-55% is adequate for cooling my MSI and isn't noisy; any more than that, though, and I can hear it.


----------



## Devildog83

Quote:


> Originally Posted by *mAs81*
> 
> I use HWiNFO64 too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> I only arrange my fan speed when gaming..Around 50-55% is adequate for cooling my msi and isn't noisy..Any more than that though,and I can hear it..


Cool. It still does seem weird that the core temp is higher than the VRMs'. I know it's a pain, but if it were me I would replace the TIM. Those temps are not harmful at all; it would just bug the heck out of me.

I just noticed this: your voltage jumped to 1.185V in that shot as a max, was that at idle? Mine always stays at 0.8xx unless I put load on it. Could your voltage be jumping up there and causing the temps to rise a bit? Just a thought.


----------



## mAs81

Hmmm, I noticed it too now that you mention it. Yes, it is at idle.
Here's another one:


Spoiler: Warning: Spoiler!






Well, I did change my PSU the other day. Could that have anything to do with it?


----------



## Devildog83

Quote:


> Originally Posted by *mAs81*
> 
> Hmmm..I noticed it too now that you mention it..Yes it is on idle..
> Here's another one:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Well I did change my PSU the other day..Could that have anything to do with it??


I honestly do not know what would cause it; hopefully someone else might know and help out. Do you have graphics Overdrive enabled in CCC? If you do, try disabling it and see if that helps. It did cause some conflict with mine.


----------



## mAs81

Quote:


> Originally Posted by *Devildog83*
> 
> I honestly do not know what would cause it. Hopefully someone else might know and help out. Do you have the graphics overdrive enabled in the CCC? If you do try disabling it and see if that helps. I did cause some conflict with mine.


No, I don't have it enabled; I change my clocks only with Afterburner.
Hmm, it is quite baffling, isn't it? And it stays the same all the time.
I rebooted my system twice, and I still get the same number.


----------



## nubki11a

Haha, thanks Devil.
I downloaded the 13.30 driver from Guru3D. It worked fine in Crysis 3, where I had around 50 fps with everything on max. In League of Legends I only had around 80-90 fps, whilst the card should easily hold a constant 144 fps. Even my HD 6870 hovered between 90 and 144 fps... Is this a driver issue, or is something else wrong? Temps never exceed 65, so I've got quite some headroom for OCing, which I'll be doing tonight.


----------



## Bruska

I'm pretty new to this generation of graphics cards. I bought a Sapphire Vapor-X R9 270X and I've been trying to overclock it, but I can't go further than 1150 on the core and 1500 on the memory. I've seen people achieve a higher overclock with this card, even with the R9 270 (without the X).

My temps are all right; I don't know why I can't overclock it more.

Here's a pic:

(Sorry for the quality and for my English)


----------



## Devildog83

Quote:


> Originally Posted by *Bruska*
> 
> I'm pretty new with the new generation of graphics cards. I bought a Sapphire Vapor X r9 270x and I've been trying to overclock it, and I can't go further than 1150 core and 1500 on the memory. I've seen people that achived a higher overclock with this card, even with the r9 270 (without the x).
> 
> My temps are all right, I don't know why I can't overclock it more.
> 
> Here a pic:
> 
> 
> (Sorry for the quality and for my english)


Can you increase the core voltage? That will help some. What are you using for an overclocking tool?


----------



## Bruska

I have tried increasing it up to 1.3 and there's no difference. I am using GPU Tweak and I also tried Sapphire Trixx.

When I increase the voltage it does actually increase, because I can see it through the GPU-Z sensors and HWiNFO, going from 1.63 to 1.3


----------



## Farih

Quote:


> Originally Posted by *Bruska*
> 
> I have tried to increase it up to 1.3 and there's no difference. I am using GPU Tweet and I also tried with Sapphire Trixx.
> 
> When I increase the voltage it actually increase it because I can see it through GPU-z sensor and HWiNFO, from 1.63 to 1.3


Have you tried a lower frequency on the memory?
Always start clocking the core first, with a lower or untouched memory frequency.
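That ordering (find the core's limit first with memory untouched, then push the memory with the core result locked in) can be sketched as a simple search loop. This is purely illustrative: the stability callbacks stand in for a real stress test (Valley, Heaven, a game), and all the numbers are made up.

```python
def find_max_stable(start, step, is_stable, limit=2000):
    """Step a clock upward until the stability check fails, then keep the
    last good value. is_stable() stands in for a real stress-test run."""
    clock = start
    while clock + step <= limit and is_stable(clock + step):
        clock += step
    return clock

def tune(core_start, mem_start, core_stable, mem_stable):
    # Core first, with memory left at stock.
    core = find_max_stable(core_start, 10, core_stable)
    # Only then push memory, testing against the core result already found.
    mem = find_max_stable(mem_start, 25, lambda m: mem_stable(core, m))
    return core, mem

# Hypothetical card that happens to be stable up to 1180 core / 1550 mem:
core, mem = tune(1050, 1400,
                 lambda c: c <= 1180,
                 lambda c, m: m <= 1550)
print(core, mem)  # 1180 1550
```

In practice each `is_stable` call is a 10+ minute benchmark run, which is why it pays to keep the step sizes coarse at first and refine near the limit.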


----------



## mtcn77

Quote:


> Originally Posted by *Bruska*
> 
> I'm pretty new with the new generation of graphics cards. I bought a Sapphire Vapor X r9 270x and I've been trying to overclock it, and I can't go further than 1150 core and 1500 on the memory. I've seen people that achived a higher overclock with this card, even with the r9 270 (without the x).
> 
> My temps are all right, I don't know why I can't overclock it more.
> 
> Here a pic:
> 
> 
> (Sorry for the quality and for my english)


I am keen on reading LanOC reviews; they specifically test whether each step of the overclock results in positive gains. Based on that, I recommend you don't waste your time increasing the GDDR5 clock any further.


----------



## Devildog83

I finally got to Battlefield 4. At high settings I was well over 110 FPS, and with everything maxed out on ultra I was at about 65 average; the lowest I saw was 50 and the highest was over 100 for a second. I can live with that.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> I finally got to battlefield4 - at high settings I was well over 110 FPS and with everything maxed out on ultra I was at about 65 average, the lowest that I saw was 50 and the highest was up over 100 for a second. I can live with that.




So does that mean when the new drivers come out with mantle we expect to have nothing below 60fps at 1080p ultra?


----------



## nubki11a

Can anyone tell me what RadeonPro does exactly? Should I use it instead of CCC? Which settings should I use?


----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> I finally got to battlefield4 - at high settings I was well over 110 FPS and with everything maxed out on ultra I was at about 65 average, the lowest that I saw was 50 and the highest was up over 100 for a second. I can live with that.


Also with 4xMSAA?
Might get a second 270X, so it's good to know what to expect


----------



## Devildog83

Quote:


> Originally Posted by *Farih*
> 
> Also 4xMSAA ?
> Might get a second 270x and know what to expect


Yep, 4x; play was very smooth. I just ran at my normal 1200/1400 clocks and the everyday 4.7GHz on the 8350. Huge difference from BF3; I was at 150+ FPS with it maxed out on BF3.


----------



## dmfree88

Quote:


> Originally Posted by *mAs81*
> 
> No,I don't have it enabled..I change my clocks only with afterburner...
> Hmm..It is quite baffling,isn't it..And it stays the same all the time..
> I rebooted my system 2 times,and I still get the same number...


If you have multiple monitors it won't always downclock; generally the memory clock and voltage do not decrease. There are workarounds for this using 2D/3D clocks. Do you have multiple monitors?


----------



## Devildog83

Quote:


> Originally Posted by *nubki11a*
> 
> Can anyone tell me what RadeonPro does exactly? Should I use it instead of CCC? Which settings should I use?


I tried it and could not figure out how to use it.


----------



## nubki11a

Quote:


> Originally Posted by *Devildog83*
> 
> I tried it and could not figure out how to use it.


EDIT: Looks like something went wrong









Apparently RP is something you can use in conjunction with CCC; it adds more advanced features like VDC and some other stuff. Anyone here who uses it?


----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> Yep 4x, play was very smooth. I just ran at my normal 1200/1400 clocks and the everyday 4.7Ghx on the 8350. Huge difference from BF3. I was at 150+ FPS with it maxed out on BF3.


That's great!


Now I really want a second 270X, quick.
I can only play at High with 4xMSAA now (running 1250/1500MHz 24/7)


----------



## Themisseble

Guys, I need help.
I have a Gigabyte R9 270X Windforce x3 OC R2.0.

How can I unlock the voltage?
Stock voltage gives me only about 1175/1500 stable.


----------



## Farih

Quote:


> Originally Posted by *Themisseble*
> 
> Guys need help
> i have Gigabyte R9 270X Windforce x3 OC R2.0
> 
> How can i unlock voltage?
> Stock voltages gives me only about 1175/1500 stable


Up to now, only with a BIOS mod, I think.
You can use VBE7 to edit your BIOS.


----------



## MoNtHeR2

Hey guys,
will my 280X benefit from Mantle and the 14.1 drivers at release, or will it only be supported in the future?
I have an R9 280X Windforce and an FX-8350 @ 4.4GHz, and I have a low-GPU-usage problem in BF4 that makes some maps unplayable. I was excited about Mantle, thinking it would resolve my problem -_-


----------



## Themisseble

So I have to flash the BIOS... thanks.


----------



## Themisseble

Quote:


> Originally Posted by *MoNtHeR2*
> 
> hey guys
> will my 280x benefit from the mantle and 14.1 drivers at release ...or it will supported in the future ?
> i have r9 280x windforce and fx 8350 @4.4 cpu......and i have alow gpu usage problem in bf4 that makes some maps unplayable ....and i was exited about mantle thinking this will resolve my problem -_-


The R9 280X won't gain much (5-10%), but your CPU will get a decent boost (until your GPU becomes the bottleneck in BF4).


----------



## hamzta09

Quote:


> Originally Posted by *Farih*
> 
> Thats great
> 
> 
> 
> 
> 
> 
> 
> 
> Now i really want a second 270x quik.
> I can only play at High wth 4xMSAA now (running 1250/1500mhz 24/7)


But you have 2 7950s?


----------



## Bruska

Quote:


> Originally Posted by *Farih*
> 
> Have you tryed with a lower frequency on the memory ?
> Always start clocking the core first with a lower or untouched memory frequency.


Quote:


> Originally Posted by *Devildog83*
> 
> Can you increase the core voltage? That will help some. What are you using for an overclocking tool?


I also tried that, and the core still doesn't go further than 1150. Maybe the card just isn't capable of a good OC.


----------



## Devildog83

Quote:


> Originally Posted by *MoNtHeR2*
> 
> hey guys
> will my 280x benefit from the mantle and 14.1 drivers at release ...or it will supported in the future ?
> i have r9 280x windforce and fx 8350 @4.4 cpu......and i have alow gpu usage problem in bf4 that makes some maps unplayable ....and i was exited about mantle thinking this will resolve my problem -_-


By all accounts you should get up to a 15% increase in overall performance in BF4, because Mantle should free up the CPU some, and BF4 is a heavy load on your CPU.


----------



## Devildog83

Quote:


> Originally Posted by *Themisseble*
> 
> Guys need help
> i have Gigabyte R9 270X Windforce x3 OC R2.0
> 
> How can i unlock voltage?
> Stock voltages gives me only about 1175/1500 stable


I would start out by setting the memory at 1400 and clocking the core as high as it will go; then you can try increasing the memory from there. I think you get more performance from higher core clocks and not as much from memory. My 270X will clock near 1600 memory, and maybe more if I were to keep the core at 1200 or less, but I get worse performance that way.


----------



## nubki11a

Quote:


> Originally Posted by *Devildog83*
> 
> I would start out by setting the memory at 1400 and try and clock the core as high as it will go, then you can try increasing the memory from there. I think you get higher performance from higher core clocks and not as much from memory. My 270x will clock near 1600 memory and maybe more if I was to keep the core at 1200 or less but I get worse performance that way.


Yeah I noticed the same. I was able to increase the memory clock up to 1725, but my Furmark scores wouldn't increase at all lol.


----------



## [CyGnus]

14.1 is out, guys, and it's really good. I scored this with my 24/7 clocks:

http://www.3dmark.com/3dm11/7914082

P12927, not bad; about a 500-600 point bump from 13.30.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *[CyGnus]*
> 
> 14.1 out guys and they are really good i scored this with my 24/7 clocks
> 
> http://www.3dmark.com/3dm11/7914082
> 
> P12927 not bad, about 500/600 marks bump from the 13.30


downloading now, here is a link

http://support.amd.com/en-us/download/desktop?os=Windows%207%20-%2064

Here are my scores
http://www.3dmark.com/3dm11/7914658


----------



## mAs81

Quote:


> Originally Posted by *dmfree88*
> 
> If you have multiple monitors it wont always downclock. Generally mem clock and voltage do not decrease. Theres workarounds for this using 2d/3d clocks. Do u have multiple monitors?


Yes, I have a dual-monitor setup: one 22-inch, and the main one 24-inch.
So how can I use 2D/3D clocks for this?


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> 14.1 out guys and they are really good i scored this with my 24/7 clocks
> 
> http://www.3dmark.com/3dm11/7914082
> 
> P12927 not bad, about 500/600 marks bump from the 13.30


You've got tessellation off; I would like to see some scores back to back to compare. I will do this myself today, but I work this morning and the *Super Bowl* is later.


----------



## M1kuTheAwesome

Count me in with my new PowerColor Radeon R9 280X TurboDuo. Picked it up on Friday, and I already have a custom fan curve set for max performance and the GPU overclocked to 1155MHz.
A quick print-screen + Paint of AB:


Does the fact that Afterburner shows nothing about voltage mean that my card is voltage-locked? If so, that's a shame; it has plenty of thermal headroom for a slight voltage bump.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Count me in with my new Powercolor Radeon R9 280X TurboDuo. Picked it up on friday and already have a custom fan curve set for max performance, and GPU overclocked to 1155MHz.
> A quick print screen + paint of AB:
> 
> 
> Does the fact that Afterburner shows nothing about voltage mean that my card is voltage locked? If so, that's a shame. It has plenty of headroom in terms of heat for a slight voltage bump.


Yup, it does. You can always try altering the BIOS to see if you can unlock it that way.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Yup it does, You can always try to alter the BIOS to see if you can unlock it that way


I'll probably do it at some point just to try it, but not anytime soon. I don't wanna do something that risky on my 48-hour-old card.


----------



## Devildog83

*M1kuTheAwesome* has been added, Welcome!!


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Devildog83*
> 
> *M1kuTheAwesome* has been added, Welcome!!


Thank you, thank you.


----------



## mAs81

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Does the fact that Afterburner shows nothing about voltage mean that my card is voltage locked? If so, that's a shame. It has plenty of headroom in terms of heat for a slight voltage bump.


Have you tried unlocking it in the settings? Go to Settings > General and select "Unlock voltage control" and "Unlock voltage monitoring".
Restart Afterburner and check it out.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *mAs81*
> 
> Have you tried unlocking it in the settings? Try settings>General>and select unlock voltage control/monitoring.
> Restart afterburner and check it out..


Whoa. A reading of 1144 just popped up, and it seems it can be changed. I'm a first-timer at GPU overvolting, though. How much should I add at a time to avoid instantly overheating my card?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *mAs81*
> 
> Have you tried unlocking it in the settings? Try settings>General>and select unlock voltage control/monitoring.
> Restart afterburner and check it out..










wow I missed that one


----------



## Devildog83

New review for Mantle: http://wccftech.com/amd-mantle-api-performance-preview-radeon-r7-260x-radeon-r9-270x-radeon-r9-280x/

I don't think we are going to see much of an increase in 3DMark, 3DMark 11, or the Unigine benches, though, as they do not use Mantle as far as I know. I could be wrong.


----------



## mAs81

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Whoa. A reading of 1144 just popped up. And seems it can be changed. I'm a first timer at GPU overvolting though. How much should I add at a time to prevent instantly overheating my card?


That depends on the overclocking you're planning to do, and it's a hit-and-miss procedure, actually. Try changing your clocks without overvolting, and check stability and temps under heavy load. If the results are to your liking, keep it that way; if it crashes the game or benchmark, then try overvolting. At least that's what I've been doing so far. To tell you the truth, I haven't overvolted mine yet, but I'm playing games with upped clocks with no problem so far. Don't forget to set a fan profile too, to help your card with cooling.
This is my profile when I'm heavy gaming, by the way:


Spoiler: Warning: Spoiler!






Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> 
> 
> 
> 
> 
> 
> 
> wow I missed that one


Yeah, you and me both








I thought my card was voltage-locked for a long time too.
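The trial-and-error loop described above (test at the new clocks first, and only add voltage in small steps when a run fails, watching temps the whole time) amounts to something like this sketch. Everything here is hypothetical: `is_stable` and `gpu_temp` stand in for a real stress test and a real sensor readout, and the numbers are invented for illustration.

```python
def step_voltage_mv(v_start, v_max, dv, is_stable, gpu_temp, temp_limit=85):
    """Raise the core voltage in small increments (millivolts), returning the
    first setting that passes the stress test without breaching the temp limit.
    Returns None if no safe, stable setting exists in the allowed range."""
    v = v_start
    while v <= v_max:
        if gpu_temp(v) > temp_limit:
            return None            # too hot before it ever got stable: back off
        if is_stable(v):
            return v
        v += dv                    # small step, re-test, repeat
    return None                    # ran out of voltage headroom

# Hypothetical card: stable at >= 1200 mV for the chosen clocks,
# with temps climbing gently as voltage rises.
v = step_voltage_mv(1175, 1300, 13,
                    is_stable=lambda v: v >= 1200,
                    gpu_temp=lambda v: 60 + (v - 1100) // 5)
print(v)  # 1201
```

Small steps (10-15 mV) matter because each increment raises heat output; the temp check before each bump is what keeps the loop from cooking the card while hunting for stability.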


----------



## F3ERS 2 ASH3S

In Afterburner I am seeing "Aux Voltage". What is that, exactly? (It's locked for me.) I was unsure whether that is something I should check out and unlock.


----------



## mAs81

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> In afterburner I am seeing Aux Voltage. What is that exactly? (its locked for me) was unsure if that is something I should check out and unlock


No idea... mine is locked too.


----------



## ricko99

Updated to Catalyst 14.1 and it sucks big time (at least for me). It gave me artifacts, which were fixed by an Asus BIOS update. I also lost some points in Valley and Heaven. Reverting back to 13.12.


----------



## SkateZilla

Question:

Does anyone know which models of the R9 280X cards support 2x DVI + 1x HDMI Eyefinity?

I have an HD 7950 now, and the third-screen tearing caused by the DisplayPort adapter bugs me in flight sims.
Using DVI, HDMI->DVI, and MiniDP1->DVI for the 3 main screens, then MiniDP->Full DP->DVI for the 4th screen.
(Yes, I purposely bought the Full-DP-to-DVI adapter for the 4th screen just in case I upgraded to a card with only a full DP for the 4th output.)

I was looking at upgrading to an R9 290, but I'd rather spend less and CrossFire my HD 7950 as a secondary to an R9 280X and use the 280X's outputs
(2x DVI + 1x HDMI for the 3 main screens, 1x DP for exported gauges).

I was looking at the SAPPHIRE DUAL-X 100363BF4L Radeon R9 280X, as it has both DVI-I and DVI-D ports.


----------



## dmfree88

Quote:


> Originally Posted by *mAs81*
> 
> Yes I have a dual monitor setup..One 22 inches and the main one 24 inches..
> So how can I use 2D/3D clocks for this???


MSI Afterburner has settings for it. I'm on my phone so I can't check exactly how; it's been a while. But you save your gaming clocks to profile 1 and your 2D desktop clocks to profile 2, and then in the settings there should be an option to enable 2D/3D clocks. If you can't figure it out, let me know and I'll pull it up on my PC later and find the exact location/instructions.

Also, auxiliary voltage is the power limit for memory. Very few cards have access to it; my 7870 Hawk is the only card I've personally seen with aux control.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *dmfree88*
> 
> Msi afterburner has settings for it. Im on my phone so i cant check exactly how its been awhile. But you save your gaming clock to profile one and your 2d desktop clock on profile 2. Then in the settings there should be enable 2d/3d clocks. If u cant figure it out let me know ill pull it up on my pc later and find exact location/instructions.
> 
> Also auxillary voltage is the power limit for memory. Very few cards have access to this. My 7870 hawk is the only card ive seen personally with aux control


Very good to know. I am curious, as I can up the memory voltage but not the aux; guess I have some reading to do.


----------



## Archea47

As per the Mantle bug thread, the current drivers make BF4 with Mantle on CrossFire skip and pause unbearably after a period of time.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Archea47*
> 
> As per the Mantle bug thread, the current drivers make BF4 with mantle on crossfire after a period of time skip and pause unbearably


Is anyone reporting this to AMD? I submitted a screen-tearing bug; when overclocked, I get little squares.


----------



## mAs81

Quote:


> Originally Posted by *dmfree88*
> 
> Msi afterburner has settings for it. Im on my phone so i cant check exactly how its been awhile. But you save your gaming clock to profile one and your 2d desktop clock on profile 2. Then in the settings there should be enable 2d/3d clocks. If u cant figure it out let me know ill pull it up on my pc later and find exact location/instructions.
> 
> Also auxillary voltage is the power limit for memory. Very few cards have access to this. My 7870 hawk is the only card ive seen personally with aux control


Well, I did what you said: I saved my 2D desktop clocks on profile 2 (2D profile) and my gaming ones on profile 1 (3D profile) in Settings > Profiles. But I still can't enable 2D/3D clocks.
And I can't find any other settings either. Did I do something wrong, or is it just my card?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *mAs81*
> 
> Well,I did what you said,I saved my 2d desktop clock on profile 2(2D Profile),and my gaming one on profile(3D profile) in settings>profiles..But still I can't enable 2d/3d clocks.
> And I can't find any other settings either..Did I do something wrong,or is it just my card??


In Afterburner, go to Settings > Profiles. There you will have the option to assign the 2D profile and the 3D profile.


----------



## mAs81

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> In after burner go to settings > profiles There you will have the option to assign the 2d profile and the 3d profile


That's what I did, but there's no 2D/3D clock configuration to enable in the settings...









Spoiler: Warning: Spoiler!


----------



## dmfree88

http://gyazo.com/cc933bbb67987f5030b003e85bc130da

if only lol.. MSI AB newer beta lets me push to 2000 core 2000 mem.... Crazy...
Quote:


> Originally Posted by *mAs81*
> 
> That's what I did... But there's no enable 2D/3D clock configuration in the settings...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


I think you might have to reinstall Afterburner with the server component included. You may already have it; I can't remember exactly what it's called, but it's required to use 2D/3D clocks. I think it's the RivaTuner Statistics Server or something like that. It runs in the background and monitors which games pop up and when to switch to different clocks or settings. You can set clocks for any game, but if you just want one profile for desktop and one for gaming, you can enable the 2D/3D clock option. You have to use the server side to enable it, though.


----------



## mAs81

Quote:


> Originally Posted by *dmfree88*
> 
> http://gyazo.com/cc933bbb67987f5030b003e85bc130da
> 
> if only lol.. MSI AB newer beta lets me push to 2000 core 2000 mem.... Crazy...
> I think you might have to reinstall Afterburner with the server component included. You may already have it; I can't remember exactly what it's called, but it's required to use 2D/3D clocks. I think it's the RivaTuner Statistics Server or something like that. It runs in the background and monitors which games pop up and when to switch to different clocks or settings. You can set clocks for any game, but if you just want one profile for desktop and one for gaming, you can enable the 2D/3D clock option. You have to use the server side to enable it, though.


You mean this:


Spoiler: Warning: Spoiler!


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *mAs81*
> 
> You mean this:
> 
> 
> Spoiler: Warning: Spoiler!


That would be a win for you


----------



## mAs81

Quote:


> Originally Posted by *dmfree88*
> 
> http://gyazo.com/cc933bbb67987f5030b003e85bc130da
> If only lol.. MSI AB newer beta lets me push to 2000 core 2000 mem.... Crazy...


Are the Power Limit settings available only on the newer beta? They don't show up on my AB...
But they do show up in the CCC that came with the 14.1 drivers









Spoiler: Warning: Spoiler!






Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> That would be a win for you


Mm, I thought so too, so I checked it...
But I still can't quite grasp RivaTuner's function. Isn't it for displaying my FPS during gaming?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *mAs81*
> 
> Mm, I thought so too, so I checked it...
> But I still can't quite grasp RivaTuner's function. Isn't it for displaying my FPS during gaming?


Think of it as an extension that allows you to modify settings further, like AMD CCC and Nvidia's display settings.


----------



## mAs81

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Think of it as an extension that allows you to modify settings further, like AMD CCC and Nvidia's display settings.


Got it... Thanks man, I guess I'll have to look into it some more.









EDIT:
That's why I love OCN







http://www.overclock.net/a/how-to-use-rivatuner-afterburner-on-screen-display-and-more


----------



## JCH979

I went ahead and updated to the 14.1 and also updated my bios while I was at it.

Quick 3DMark11 run: http://www.3dmark.com/3dm11/7916469 and Firestrike: http://www.3dmark.com/fs/1650539


----------



## circadianmosfet

What is the recommended power supply to support two msi r9 280x oc?


----------



## uaedroid

Quote:


> Originally Posted by *circadianmosfet*
> 
> What is the recommended power supply to support two msi r9 280x oc?


Any 80+ certified PSU of 750W or above from reputable companies like Seasonic, Corsair, XFX, EVGA, etc.


----------



## mAs81

Well, here are my results with 14.1
at stock clocks 1020/1500:


Spoiler: Warning: Spoiler!






At my gaming clocks 1150/1620:


Spoiler: Warning: Spoiler!






My old results..


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *mAs81*
> 
> CCC 13.11 beta:
> At stock 1020/1500
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> At 1150/1700
> 
> 
> Spoiler: Warning: Spoiler!





Doing a little better








EDIT:
Plus, I noticed that my idle temps decreased by about 1-2°C, so I guess that's good.
They're still not under 40°C (as they were with 13.9), but it's an improvement, I guess...


----------



## Devildog83

What was that about Peyton was going to tear up the Hawks, what what. Like I said before, Denver had no idea of the storm that was coming. Go Hawks !!! Champs of SB48. #12's. #LOB

OK, I'm done.


----------



## Jaffi

Quote:


> Originally Posted by *Devildog83*
> 
> What was that about Peyton was going to tear up the Hawks, what what. Like I said before, Denver had no idea of the storm that was coming. Go Hawks !!! Champs of SB48. #12's. #LOB
> 
> OK, I'm done.


Imagine if the Broncos hadn't scored at all; they were close to a total disgrace... What a game it was


----------



## Devildog83

Quote:


> Originally Posted by *Jaffi*
> 
> What if broncos wouldn't have scored at all, they were close to a total disgrace... What a game it was


It actually was a disgrace, but I tried to tell everyone what was going to happen.


----------



## dmfree88

Quote:


> Originally Posted by *circadianmosfet*
> 
> What is the recommended power supply to support two msi r9 280x oc?


I highly recommend using this website:

http://www.tomshardware.com/reviews/power-supply-oem-manufacturer,2913.html

and making sure you get a Seasonic or Super Flower PSU.

Best bang for the buck is generally the Seasonic G series or the Rosewill Capstone (the Capstone is better IMO, and it's Super Flower).


----------



## nubki11a

Alright guys, I tried my hand at overclocking my 280X. TBH I'm a little disappointed with my chip, but I guess it's alright.








I OC'd the core to *1130*; 1150 and above gave me artifacts in Furmark, even when I increased the voltage to 1.3V (the maximum in TriXX; temps were still good during gaming though, max was upper 70s). I underclocked the memory to *1500* for now; I might lower it to 1450 later. Underclocking decreased my temps by about 4°C with no loss in Furmark scores (I got around 4080, is that good?). I have yet to test the difference in 3DMark and Heaven.

What might be interesting to note is that by increasing the Power Limit to +10, I increased my FPS in Furmark from 60 to 68; the downside is that power consumption went up about 30W. It didn't affect my temps.

How bad is it if you have small artifacts in Furmark? I couldn't raise my core clock to 1150 and run Furmark for more than 2 minutes without artifacts, even with the voltage at 1.3. Could/should I increase the voltage any more if my temps allow it?

EDIT: This was on 13.12 btw. Forgot to add my Heaven score:


----------



## Devildog83

Quote:


> Originally Posted by *nubki11a*
> 
> Alright guys, I tried my hand at overclocking my 280X. Tbh I'm a little disappointed with my chip, but I guess it's alright
> 
> 
> 
> 
> 
> 
> 
> 
> I OC'd the Core to *1130*, 1150 and above gave me artifacts in Furmark. Even when I increased the Voltage to 1.3V (maximum of TriXX, temps were still good during gaming though, max was upper 70's). I underclocked the Memory to *1500* for now, I might lower it to 1450 later. Underclocking decreased my temps by about 4C with no loss in Furmark scores (I got around 4080, is that good?) I have yet to test the difference in Mark and Heaven).
> 
> What might be interesting to note, is that by increasing the Power Limit to +10, I increased my FPS in Furmark from 60 to 68FPS, the downside of this is that the power consumption went up about 30W. It didn't affect my temps.
> 
> How bad is it if you have small artifacts in Furmark? I couldn't raise my Core clock to 1150 and run Furmark for more than 2 mins. without any artifacts. Even with the voltage at 1.3. Could/should I increase the voltage any more if my temps. allow it?
> 
> EDIT: This was on 13.12 btw. Forgot to add my Heaven score:


Furmark is overly intense in my opinion, but it will check for stability. It may cause some artifacts, but I would try a game like BF4 on ultra and see how that works. If you did not have crashes in Furmark you may be OK. It's just my personal opinion, though.


----------



## srSheepdog

I'm a noob to OCN, so please bear with me. I have a couple questions about my new MSI R9 270X Gaming, currently OC'd to 1175/1550.

First of all, I'm running it with a Phenom II X4 940, an MSI K9A2 Platinum mobo, and 8GB of 800MHz DDR2. The 940 is overclocked to 3.6GHz, but I'm concerned that the CPU is bottlenecking my 270X. Is bottlenecking going to be a game-to-game thing, since some games are more CPU-reliant than others? What is the best way to tell if I'm dealing with a bottleneck?

Second, if I don't have a bottleneck (and/or I move to a better CPU), I'd like to push this card a bit more. I found the below on the R9 290X thread, and I was wondering if it will work on Afterburner with ANY card, or just the R9 290 series?
Quote:


> How to give more volts on MSI Afterburner by OCN member sugarhell
> 
> ADD more volts to MSI AB Guide (Click to hide)
> 
> Source
> 
> Just use /wi4,30,8d,10 for 100mv. The offset step is 6.25 mV, and the value is given in hexadecimal: 0x10 is 16 in decimal, so 16*6.25 = 100 mV. For 50mv you need 8. For 200mv you need 20 (0x20 = 32 in decimal, so 32*6.25 = 200 mV).
> 
> The easy way to do changes:
> 
> Create a txt on desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save it as a .bat file. Every time you start this .bat file, MSI AB will start with +100mv
> 
> For 50mv: 8
> For 100mv:10
> For 125mv:14
> For 150mv:18
> For 175mv:1C
> For 200mv:20
> 
> I wouldn't go over this point because
> 1)You are close to leave the sweet spot of the ref pcb vrms efficiency
> 2)These commands add 200mv on top of the 100mv offset through AB gui.That means 300mv
> 
> By default /wi command apply to current gpu only. So if you have 2 or more gpus you must use /sg command. That means the command line is something like that
> ex:MsiAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10


Thanks for any help/insight you can provide!
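The hex arithmetic in the quoted guide can be sanity-checked with a quick sketch. This is just my own illustration of the 6.25 mV-per-unit conversion described above, not part of the guide itself:

```python
# Convert a desired voltage offset (in mV) to the hex value used in
# MSI Afterburner's /wi command, assuming the 6.25 mV step per unit
# described in the quoted guide.
def offset_to_hex(mv):
    units = mv / 6.25
    if units != int(units):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(int(units), "X")

# Reproduces the table in the quote: 50 mV -> 8, 100 mV -> 10, 200 mV -> 20
for mv in (50, 100, 125, 150, 175, 200):
    print(mv, "mV ->", offset_to_hex(mv))
```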


----------



## crazysoccerman

My RMA of my ASUS DC2T 280x went terribly.

They went out of stock between the time I shipped it back and the day they got it (today).

THEY DON'T DO RAIN CHECKS (even though it wouldn't have actually been a rain check -- just a freaking RMA).

So without any notification they switched the replacement to a refund. After I waited 10 minutes to talk to a rep, she told me I could just repurchase it when it came back in stock! (I bought it for $300.)
... As if she didn't know that wasn't possible.

1) That was the first underhanded and dirty trick.

Then I told them I wanted a replacement of a similar 280x. Any 280x, doesn't matter.

They wouldn't do it.

2) That was the second underhanded and dirty trick.

After more than 10 minutes of 'speaking with her supervisor', she told me the best they could do was guarantee to sell me the card at the price I bought it at AS LONG AS THERE WAS LESS THAN A $200 DIFFERENCE. I would have to constantly monitor the webpage, and then contact a customer service agent and explain my situation before they went out of stock, AS LONG AS IT WAS LESS THAN A $200 DIFFERENCE.

3) That was the third underhanded and dirty trick.

Not wanting to deal with the uncertainty of being completely screwed out of a card (likely possibility), I told her I just wanted my defective card back.

Now I'm sitting here waiting for the return of my defective card. Fantastic.

At least it got to experience warm California weather for a day.

THANKS NEWEGG

*Though I don't blame miners for Newegg's terrible policy, they deserve some blame of their own. Gamers should look down on them because they sacrifice the gaming community for a highly speculative attempt to make money.


----------



## Krusher33

Quote:


> Originally Posted by *Devildog83*
> 
> What was that about Peyton was going to tear up the Hawks, what what. Like I said before, Denver had no idea of the storm that was coming. Go Hawks !!! Champs of SB48. #12's. #LOB
> 
> OK, I'm done.


Yeah I was not expecting that. I did have money down on the seahawks but I was expecting either a close game, or Seahawks doing what Broncos did. Not the other way around. Congrats.

Back on topic. I got my EVGA G2 PSU today. I can take one of my 280X's off a juicebox that was making me nervous now. Quite happy about that.

Dunno why, but a lot of people I know with Dual-X 280X's mining are having temperature issues and I'm not. So far one is at 74°C and the other is at 64°C.

I will say this though, there's so much heat coming from the back of the top one, that it's heating up the heatsink closest to it on the Sabertooth board which is connected by a heatpipe to the VRM's. The VRM's gets warm and then the CPU gets warm... causing quite the fiasco. I may have to install YET another fan to blow over that area.


----------



## repo_man

Quote:


> Originally Posted by *Krusher33*
> 
> Yeah I was not expecting that. I did have money down on the seahawks but I was expecting either a close game, or Seahawks doing what Broncos did. Not the other way around. Congrats.
> 
> Back on topic. I got my EVGA G2 PSU today. I can take one of my 280X's off a juicebox that was making me nervous now. Quite happy about that.
> 
> Dunno why but a lot of people that I know that have the Dual X 280X's mining... they're having temperature issues and I'm not. So far 1 is at 74c and other is at 64c.
> 
> I will say this though, there's so much heat coming from the back of the top one, that it's heating up the heatsink closest to it on the Sabertooth board which is connected by a heatpipe to the VRM's. The VRM's gets warm and then the CPU gets warm... causing quite the fiasco. I may have to install YET another fan to blow over that area.


I have a small fan installed on the VRM sink on my sabertooth. Just felt warm one day when I shut the rig down, so I decided it needed one, lol. I'm not even mining.


----------



## neurotix

Quote:


> Originally Posted by *srSheepdog*
> 
> I'm a noob to OCN, so please bear with me. I have a couple questions about my new MSI R9 270X Gaming, currently OC'd to 1175/1550.
> 
> First of all, I'm running it with a Phenom II X4 940, an MSI K9A2 Platinum mobo, and 8GB of 800MHz DDR2. The 940 is overclocked to 3.6GHz, but I'm concerned that the CPU is bottlenecking my 270X. Is bottlenecking going to be a game-to-game thing, since some games are more CPU-reliant than others? What is the best way to tell if I'm dealing with a bottleneck?
> 
> Second, if I don't have a bottleneck (and/or I move to a better CPU), I'd like to push this card a bit more. I found the below on the R9 290X thread, and I was wondering if it will work on Afterburner with ANY card, or just the R9 290 series?
> Thanks for any help/insight you can provide!


A Phenom II x4 at 3.6ghz MIGHT bottleneck an R9 270X, but you're right, it's on a case by case basis. The best way to tell is if your CPU is at 50% or more and your GPU isn't at 100%. I'm not sure what you use to monitor it, but you might want to check out a program called Playclaw 5, which allows for FPS, CPU and GPU monitoring in games (it's free). Take the games you want to play, open Playclaw and play them and monitor your FPS as well as your GPU load. If your GPU isn't at a constant 99-100% or has low usage, you have a bottleneck in that game.
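That rule of thumb (CPU busy while the GPU sits below full load) is easy to write down. A minimal sketch, where the 50% / 95% thresholds are my own illustrative numbers rather than anything Playclaw defines:

```python
def likely_cpu_bottleneck(cpu_usage, gpu_usage,
                          cpu_threshold=50.0, gpu_threshold=95.0):
    """Return True if the CPU looks busy while the GPU is underutilized."""
    return cpu_usage >= cpu_threshold and gpu_usage < gpu_threshold

# Example readings you might see on an in-game overlay:
print(likely_cpu_bottleneck(cpu_usage=85, gpu_usage=70))  # CPU-bound scene
print(likely_cpu_bottleneck(cpu_usage=40, gpu_usage=99))  # GPU fully loaded
```

In practice you'd watch these numbers over a whole play session, since the bottleneck shifts scene by scene.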

Can you overclock your CPU any more? What is your CPU-Northbridge and HyperTransport at? Those settings should be bumped up from stock. On my old Phenom II Quads, on a decent motherboard I used to run them with 2600mhz CPU-NB. I believe stock HyperTransport is 2ghz, you might want to raise that to 2600mhz to match. You will most definitely need to increase voltage if your board allows it to overclock the CPUNB and HTT. Raising the CPUNB improves memory bandwidth. With that older memory, every bit helps. Getting your quad to 4ghz with a 2600mhz CPUNB will help reduce the bottleneck if there is one.

About the Afterburner thing, I have no idea if it will work. It's worth a try, but you need to remember that the R9 290 uses a voltage offset, while the 280X and 270X do not. With my 270X, in Trixx or Afterburner the max allowed voltage is 1.3v. (1.3v is more than enough for my golden 270X to do 1250mhz.) There's a modded version of Trixx we use on my girlfriend's 7970 that allows 1.381v and has memory voltage control, but that doesn't work with the 280X or 270X.

Hope this helps.


----------



## srSheepdog

Quote:


> Originally Posted by *neurotix*
> 
> A Phenom II x4 at 3.6ghz MIGHT bottleneck an R9 270X, but you're right, it's on a case by case basis. The best way to tell is if your CPU is at 50% or more and your GPU isn't at 100%. I'm not sure what you use to monitor it, but you might want to check out a program called Playclaw 5, which allows for FPS, CPU and GPU monitoring in games (it's free). Take the games you want to play, open Playclaw and play them and monitor your FPS as well as your GPU load. If your GPU isn't at a constant 99-100% or has low usage, you have a bottleneck in that game.
> 
> Can you overclock your CPU any more? What is your CPU-Northbridge and HyperTransport at? Those settings should be bumped up from stock. On my old Phenom II Quads, on a decent motherboard I used to run them with 2600mhz CPU-NB. I believe stock HyperTransport is 2ghz, you might want to raise that to 2600mhz to match. You will most definitely need to increase voltage if your board allows it to overclock the CPUNB and HTT. Raising the CPUNB improves memory bandwidth. With that older memory, every bit helps. Getting your quad to 4ghz with a 2600mhz CPUNB will help reduce the bottleneck if there is one.
> 
> About the Afterburner thing, I have no idea if it will work. It's worth a try, but you need to remember that the R9 290 uses a voltage offset, while the 280X and 270X do not. With my 270X, in Trixx or Afterburner the max allowed voltage is 1.3v. (1.3v is more than enough for my golden 270X to do 1250mhz.) There's a modded version of Trixx we use on my girlfriend's 7970 that allows 1.381v and has memory voltage control, but that doesn't work with the 280X or 270X.
> 
> Hope this helps.


Thank you very much for the response. I'm new to overclocking, so while I am adept at the simple bump-multiplier-and-voltage overclocking, I know nothing about OC'ing the CPU-NB. Guess it's time to do some more reading! I'm completely clueless about how to OC memory as well, but I'd definitely like to learn how.

In reviews of the 940, it seemed that there were seriously diminishing returns when OC'ing above 3.6ghz. I'm not too big on the idea of pushing too much higher, as my vcore is already at around 1.45v. Anything less than that and I'm not Prime95-stable.

I'll let you know about the Afterburner thing....VERY curious, as I definitely think this GPU has some headroom to work with.


----------



## neurotix

Quote:


> Originally Posted by *srSheepdog*
> 
> Thank you very much for the response. I'm new to overclocking, so while I am adept at the simple bump-multiplier-and-voltage overclocking, I know nothing about OC'ing the CPU-NB. Guess it's time to do some more reading! I'm completely clueless about how to OC memory as well, but I'd definitely like to learn how.
> 
> In reviews of the 940, it seemed that there were seriously diminishing returns when OC'ing above 3.6ghz. I'm not too big on the idea of pushing too much higher, as my vcore is already at around 1.45v. Anything less than that and I'm not Prime95-stable.
> 
> I'll let you know about the Afterburner thing....VERY curious, as I definitely think this GPU has some headroom to work with.


If it's an old kit of DDR2-800 it might not OC very well. OC'ing memory is mostly the same as OC'ing the CPU: raise the frequency, test if it's stable (use Prime95 blend and Memtest86), if it's not add voltage until stable, repeat until you find your highest clock. You can't just bump up the multiplier though, since I think DDR2-800 is the fastest DDR2 speed and the next step up is 1066, which is DDR3. So, you'll have to increase the front side bus (actually, the HT Reference Clock). Stock is 200. Every MHz you add to this will add MHz to your RAM at a certain ratio. Memory timings are also quite important, and lower timings will give you more bandwidth. http://en.wikipedia.org/wiki/Memory_timings The timing you really want to worry about is CAS latency, which is the first value. You'll have to find out the stock timings of your kit, and figure out if your BIOS lets you adjust them. Your best-case scenario will be to overclock the RAM by raising the HT Ref Clock, while also lowering the timings.
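The ratio arithmetic above can be sketched out numerically. This is a rough model of how Phenom II clocks derive from the HT reference clock; the multiplier and divider values are illustrative (a 940 at stock is roughly 200 MHz ref × 18), so check your own BIOS for the real ratios:

```python
# All Phenom II clocks scale off the HT reference clock (stock 200 MHz).
def derived_clocks(ref_mhz, cpu_mult, nb_mult, dram_ratio):
    return {
        "cpu_mhz": ref_mhz * cpu_mult,
        "cpu_nb_mhz": ref_mhz * nb_mult,
        # DDR2-800 runs a 400 MHz memory clock at ref 200 (ratio 2);
        # DDR transfers twice per clock, hence the final * 2.
        "dram_effective": ref_mhz * dram_ratio * 2,
    }

print(derived_clocks(200, 18, 10, 2))  # stock-ish: 3600 CPU, 2000 NB, DDR2-800
print(derived_clocks(220, 18, 10, 2))  # +10% ref clock drags RAM up to DDR2-880
```

This is why raising the reference clock overclocks the RAM "for free" but also pushes the CPU and CPU-NB, so everything has to be stability-tested together.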

Not sure about the reviews of the 940 and diminishing returns. I saw a big performance increase between 3.5ghz and 4ghz. If your voltage is that low for 3.6, you should still have some headroom. This depends on your cooling, what cooler are you using? If heat is under control and your cores are under 62C in Prime, you can definitely push further. This will only improve your performance. Even if you can't get stable at 4ghz, you should be able to do at least 3.8 @ 1.5v with good cooling. Once you add this in, and also overclock your CPU-NB and HTT, you should see a good improvement in performance. Where this will matter most is your minimum fps in gaming. Overall, your system will be much smoother in games. See, overclocking the CPU-NB not only improves RAM bandwidth, but also the speed of your L2 and L3 cache. These are loaded heavily during 3d games. I highly recommend pushing your cores higher if you can.

The best way to benchmark the performance difference, imo, is with AIDA64 cache and memory test. Do a before and after of your settings now and your settings once your overclock everything and you should notice an improvement. Good idea to use this software, and it even has a handy sidebar gadget and OSD functions.

It's an old guide and I never thought I'd post this again, but: http://www.overclockers.com/forums/showthread.php?t=596023 I used this guide when overclocking my Phenom IIs (I had 3) and it should help you a great deal. There's even a handy chart of the recommended voltage per CPU-NB multiplier.

Oh, when you get the time, you should fill out a rigbuilder and put your rig in your signature. It will help other people help you in the future. Click rigbuilder in the top right, fill it out, then go to "edit signature" under your profile, edit it, click "show stuff off in my signature" and add it from the dropdown box.

Good luck.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Krusher33*
> 
> Yeah I was not expecting that. I did have money down on the seahawks but I was expecting either a close game, or Seahawks doing what Broncos did. Not the other way around. Congrats.
> 
> Back on topic. I got my EVGA G2 PSU today. I can take one of my 280X's off a juicebox that was making me nervous now. Quite happy about that.
> 
> Dunno why but a lot of people that I know that have the Dual X 280X's mining... they're having temperature issues and I'm not. So far 1 is at 74c and other is at 64c.
> 
> I will say this though, there's so much heat coming from the back of the top one, that it's heating up the heatsink closest to it on the Sabertooth board which is connected by a heatpipe to the VRM's. The VRM's gets warm and then the CPU gets warm... causing quite the fiasco. I may have to install YET another fan to blow over that area.


I have a single one and will be getting another shortly. However, yeah, I noticed that as well. I think it has to do with how the cooler is set up; the back side near the motherboard has an exposed heatsink on the card. I have an 80mm fan in the corner between the NB and VRM heatsinks and that seems to do the trick. I really don't think a backplate in this case would make much of a difference


----------



## srSheepdog

Quote:


> Originally Posted by *neurotix*
> 
> If it's an old kit of DDR2-800 it might not OC very well. OC'ing memory is mostly the same as OCing the CPU- raise the frequency, test if it's stable (use Prime95 blend and Memtest86), if it's not add voltage until stable, repeat until you find your highest clock. You can't just bump up the multiplier though, since I think DDR2-800 is the fastest DDR2 speed and the next step up is 1066, which is DDR3. So, you'll have to increase the front side bus (actually, the HT Reference Clock). Stock is 200. Every mhz you add to this will add mhz to your RAM at a certain ratio. Memory timings are also quite important, and lower timings will give you more bandwidth. http://en.wikipedia.org/wiki/Memory_timings The timing you really want to worry about is CAS latency, which is the first value. You'll have to find out the stock timings of your kit, and figure out if your BIOS lets you adjust them. Your best case scenario will to be overclock the RAM through raising the HT Ref Clock, while also lowering the timings.
> 
> Not sure about the reviews of the 940 and diminishing returns. I saw a big performance increase between 3.5ghz and 4ghz. If your voltage is that low for 3.6, you should still have some headroom. This depends on your cooling, what cooler are you using? If heat is under control and your cores are under 62C in Prime, you can definitely push further. This will only improve your performance. Even if you can't get stable at 4ghz, you should be able to do at least 3.8 @ 1.5v with good cooling. Once you add this in, and also overclock your CPU-NB and HTT, you should see a good improvement in performance. Where this will matter most is your minimum fps in gaming. Overall, your system will be much smoother in games. See, overclocking the CPU-NB not only improves RAM bandwidth, but also the speed of your L2 and L3 cache. These are loaded heavily during 3d games. I highly recommend pushing your cores higher if you can.
> 
> The best way to benchmark the performance difference, imo, is with AIDA64 cache and memory test. Do a before and after of your settings now and your settings once your overclock everything and you should notice an improvement. Good idea to use this software, and it even has a handy sidebar gadget and OSD functions.
> 
> It's an old guide and I'd never thought I'd post this again, but: http://www.overclockers.com/forums/showthread.php?t=596023 I used this guide when overclocking my Phenom IIs (I had 3) and it should help you a great deal. There's even a handy chart of the recommended voltage per CPU-NB multiplier.
> 
> Oh, when you get the time, you should fill out a rigbuilder and put your rig in your signature. It will help other people help you in the future. Click rigbuilder in the top right, fill it out, then go to "edit signature" under your profile, edit it, click "show stuff off in my signature" and add it from the dropdown box.
> 
> Good luck.


Thank you very, very much!!!


----------



## neurotix

If you need any more help feel free to pm me.


----------



## dmfree88

So the 270x arrived today







. Forgot to take a pic before I put it in, but I will get one tomorrow. Great hasher at 495 kh/s. Still waiting to do some gaming/benchmarks, probably this weekend. I'm excited to put it through some DDO and to buy BF4 eventually, so I'm not the only loser without it, lol.


----------



## boyagin

I'm not an R9-series user myself, but please allow me to ask for your feedback.
I've been using AMD CPUs and GPUs ever since my first build. I'm thinking of getting a 280X Toxic as an upgrade within the next few days (Chinese New Year), but I have heard a lot of feedback (internet user reviews and friends) about artifacts and other issues with this card, which really put me on hold. I'm not going to consider a 290/780 or higher because I don't really need those high-end cards. I had a Sapphire card and I sent it for RMA and it took about a month









I'd like feedback from 280X Toxic users: how is your card doing lately?
FYI, the 280X Toxic is really cheap in my country compared to the GTX 770. I hope I can join in and become a member of the R9 series very soon.


----------



## Warl0rdPT

The Toxic is the top-end 280X, so I would say people who get it are usually OCers who try to push it even further, so artifacts are expected if you push it too hard. I'm almost sure it will work fine out of the box; if you OC it and get artifacts, just step your clocks back down.


----------



## Devildog83

Quote:


> Originally Posted by *srSheepdog*
> 
> Thank you very, very much!!!


Could you create a sig-rig? It would help those who want to help. If you wish to be added to the club just post a pic of your card and I will do so.


----------



## nubki11a

Is it possible to increase the voltage higher than the 1.3V that TriXX allows? I feel that my card can easily take more (the Furmark 1080 preset gives 73°C max and Heaven 70°C max).


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I have a single one and will be getting another shortly. However, yeah, I noticed that as well. I think it has to do with how the cooler is set up; the back side near the motherboard has an exposed heatsink on the card. I have an 80mm fan in the corner between the NB and VRM heatsinks and that seems to do the trick. I really don't think a backplate in this case would make much of a difference



I have this little guy, my NB temps are down ten degrees and it even keeps the socket temp at or below the core temp most of the time.


Spoiler: Warning: Spoiler!


----------



## Krusher33

I have a ram cooler for the main VRM heatsink but my issue is more to do with heat coming from the card itself. I'm going to set up another 120 fan blowing at the card's backside to quickly blow away the heat. Heck, it may even end up cooling down the card.

Makes me wonder if getting ek's backplate would help.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> I have this little guy, my NB temps are down ten degrees and it even keeps the socket temp at or below the core temp most of the time.
> 
> 
> Spoiler: Warning: Spoiler!


Yeah... I had two 80mm fans from a case that was 9 years old, so, well, they work; the LEDs are dead but the blades spin


----------



## Devildog83

Quote:


> Originally Posted by *Krusher33*
> 
> I have a ram cooler for the main VRM heatsink but my issue is more to do with heat coming from the card itself. I'm going to set up another 120 fan blowing at the card's backside to quickly blow away the heat. Heck, it may even end up cooling down the card.
> 
> Makes me wonder if getting ek's backplate would help.


Yes. I would think a backplate would help. They do dissipate heat. Mine do get warm but the area is not near as warm as it would be without it, plus they just plain look better. A lot of guys just stick a fan right on the backplate pointing at the NB for cooling and it seems to work well for them. Another choice would be to water-cool them.


----------



## Devildog83

Quote:


> Originally Posted by *dmfree88*
> 
> So the 270x arrived today
> 
> 
> 
> 
> 
> 
> 
> . Forgot to take a pic before i put it in but i will get one tomorrow. Great hasher at 495kh. Still waiting to do some gaming/benchmarks prob this weekend. Im excited to put it through some ddo and to buy bf4 eventually so im not the only loser without it lol.


Git-er-done.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> Yes, I would think a backplate would help. They do dissipate heat. Mine do get warm, but the area is nowhere near as warm as it would be without one, plus they just plain look better. A lot of guys just stick a fan right on the backplate pointing at the NB for cooling, and it seems to work well for them. Another choice would be to water-cool them.


This is why it won't help that much..

Look at the heatsink and the openings, then look at the airflow. This is an image of the side that connects to the motherboard; 70C of heat added to the only airflow the NB gets will raise the NB temps a lot.

http://www.overclock.net/g/i/1822283/a/1097229/official-amd-r9-280x-280-270x-270-owners-club/


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> This is why it won't help that much..
> 
> Look at the heatsink and the openings, then look at the airflow. This is an image of the side that connects to the motherboard; 70C of heat added to the only airflow the NB gets will raise the NB temps a lot.
> 
> http://www.overclock.net/g/i/1822283/a/1097229/official-amd-r9-280x-280-270x-270-owners-club/


I don't know about this, but here are 2 pics: in the 1st I stuck a piece of paper on the backplate under the heatsink, and in the 2nd at the top of the card. As you can see, the airflow is coming out the top and not the bottom, but there is still heat from the PCB and the backplate helps with that. It will not eliminate heat, but it will help. I turned the fans to over 80% for this test.


Spoiler: Warning: Spoiler!


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> I don't know about this, but here are 2 pics: in the 1st I stuck a piece of paper on the backplate under the heatsink, and in the 2nd at the top of the card. As you can see, the airflow is coming out the top and not the bottom, but there is still heat from the PCB and the backplate helps with that. It will not eliminate heat, but it will help. I turned the fans to over 80% for this test.
> 
> 
> Spoiler: Warning: Spoiler!


I'll test mine in a minute

For some reason I thought he had the same card as me, but I checked and they are not so nvm on my posts about it haha


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I'll test mine in a minute
> 
> For some reason I thought he had the same card as me, but I checked and they are not so nvm on my posts about it haha


I thought what you said made sense but then I thought that would be a ton of heat on the NB/VRM's and I did not see that at all in HWinfo64 or AI suite so I figured I would test it out. I love the fact that the air is more out of the top than the bottom because my case has great airflow and can get rid of the heat easily.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> I thought what you said made sense but then I thought that would be a ton of heat on the NB/VRM's and I did not see that at all in HWinfo64 or AI suite so I figured I would test it out. I love the fact that the air is more out of the top than the bottom because my case has great airflow and can get rid of the heat easily.


Does your card have open vents on the side that connects to the motherboard?

I tried to test then I realized I have too many fans and didn't feel like removing them


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Does your card have open vents on the side that connects to the motherboard?
> 
> I tried to test then I realized I have too many fans and didn't feel like removing them


Yes, it's pretty much the same as the top.


----------



## nubki11a

I'm finally done overclocking and the best I could get is:
*Core: 1150
Memory: 1600
Voltage: 1.275V*

Could've gone higher with the memory but it wasn't worth it. Temps stay well below 70C in Heaven and 3DMark11; too bad I couldn't increase the core clock any higher. Even with the voltage at 1.3 I couldn't reach 1160 without getting artifacts. Does this just mean that my card isn't a great overclocker, or is there still something I can do about it? After all, my temps are still pretty low. I did read somewhere that increasing the voltage any more won't help anyway though, is that true? ):

EDIT: Could I improve the clocks by modding the BIOS? Or is this just the max of what the silicon can handle?


----------



## staryoshi

I ordered another Asus DCU II R9 270X. It should be here on Friday







(I may do a quick write up on crossfire scaling when it comes)


----------



## neurotix

Quote:


> Originally Posted by *nubki11a*
> 
> I'm finally done overclocking and the best I could get is:
> *Core: 1150
> Memory: 1600
> Voltage: 1.275V*
> 
> Could've gone higher with the memory but wasn't worth it. Temps stay well below 70C in Heaven and 3DMark11; too bad I couldn't increase the core clock any higher. Even with the voltage at 1.3 I couldn't reach 1160 without getting artifacts. Does this just mean that my card isn't a great overclocker or is there still something I can do about it? After all, my temps are still pretty low. I did read somewhere that increasing the voltage anymore won't help anyway though, is that true? ):
> 
> EDIT: Could I improve the clocks by modding the BIOS? Or is this just the max. of what the silicon can handle?


If you artifact at 1160 with 1.3v then unfortunately your card tops out at 1150mhz. 1.3v should be more than enough for 1150mhz. Regarding your older post, I don't think there is a way to give more than 1.3v using Trixx or Afterburner with the R9 cards. A modded bios might work, if you lock the voltage at more than 1.3v. There used to be a version of Trixx with voltage up to 1.381v but it doesn't properly detect R9 cards.

My 270X will do 1250mhz @ 1.3v, but if I raise it to 1280mhz I eventually get total system lockups in games. It even passes Valley with no artifacts at 1300mhz, but will freeze up in any other game almost immediately instead. If you're giving your card 1.3v but it artifacts at 1160, then that's the max of your card.


----------



## nubki11a

Quote:


> Originally Posted by *neurotix*
> 
> If you artifact at 1160 with 1.3v then unfortunately your card tops out at 1150mhz. 1.3v should be more than enough for 1150mhz. Regarding your older post, I don't think there is a way to give more than 1.3v using Trixx or Afterburner with the R9 cards. A modded bios might work, if you lock the voltage at more than 1.3v. There used to be a version of Trixx with voltage up to 1.381v but it doesn't properly detect R9 cards.
> 
> My 270X will do 1250mhz @ 1.3v, but if I raise it to 1280mhz I eventually get total system lockups in games. It even passes Valley with no artifacts at 1300mhz, but will freeze up in any other game almost immediately instead. If you're giving your card 1.3v but it artifacts at 1160, then that's the max of your card.


Thanks for the confirmation! Too bad my card will only go that high :/ I'll look around for the TriXX version that lets me increase the voltage to 1.38V and see if it works. If it does, I'll post it here. How bad is it to permanently lock the card at 1.3V though? Doesn't really sound like it would be worth it


----------



## Devildog83

Quote:


> Originally Posted by *staryoshi*
> 
> I ordered another Asus DCU II R9 270X. It should be here on Friday
> 
> 
> 
> 
> 
> 
> 
> (I may do a quick write up on crossfire scaling when it comes)


That would be cool. I get around 185% of my 7870 alone, or at least that's how it seems.

3DMark11 - Graphics
7870 x 1 - 9353
X-Fire - 18651

3DMark - FireStrike - Graphics
7870 x 1 - 6421
X-Fire - 12630

Valley - FPS/Points
7870 x 1 - 37.5/1568
X-Fire - 68.0/2845


Spoiler: Warning: Spoiler!
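For anyone who wants to check scaling numbers like these themselves, here's a quick sketch (not from the thread; the scores are the ones quoted above, and the helper name is ours):

```python
def scaling_pct(single: float, xfire: float) -> float:
    """CrossFire score expressed as a percentage of the single-card score."""
    return xfire / single * 100

# Scores quoted in the post above: (single card, CrossFire).
scores = {
    "3DMark11 Graphics": (9353, 18651),
    "3DMark FireStrike Graphics": (6421, 12630),
    "Valley Points": (1568, 2845),
}

for name, (single, xfire) in scores.items():
    pct = scaling_pct(single, xfire)
    print(f"{name}: {pct:.0f}% of one card (+{pct - 100:.0f}% from the second GPU)")
```

With these scores the synthetics work out to roughly 181-199% of a single card, consistent with the ballpark quoted.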


----------



## repo_man

I don't know how many folders are here in this group, but if you do fold, seems like the 14.1 drivers have increased overall PPD for people (myself included).

Relevant thread here: http://www.overclock.net/t/1464599/amd-beta-14-1-drivers-tahiti-ppd-increase/0_20


----------



## nubki11a

A lil off-topic, but could anyone explain how folding works exactly? As far as I know, a lot of computers (a team?) are combined to form a supercomputer that aids research into diseases etc.?


----------



## Devildog83

Quote:


> Originally Posted by *nubki11a*
> 
> A lil off-topic, but could anyone explain how folding works exactly? As far as I know, a lot of computers (a team?) are combined to form a supercomputer that aids research into diseases etc.?


I don't know, I can't even fold a napkin.


----------



## repo_man

Quote:


> Originally Posted by *nubki11a*
> 
> A lil off-topic, but could anyone explain how folding works exactly? As far as I know, a lot of computers (a team?) are combined to form a supercomputer that aids research into diseases etc.?


Short answer: you run a program on your PC from Stanford which sends your rig a work unit, your rig then runs the simulation of the work unit, and then the program sends the results back to Stanford. Rinse and repeat.

Medium answer: various proteins "fold" on themselves. Sometimes these folds go wrong and can cause cancer and other bad effects. Knowing in what ways these proteins fold helps researchers in various ways, among them how to build better drugs or combat cancer. Stanford's Folding@home program has helped science in documented ways. You can research it online (there's probably a thread or two here in the folding forum)

Longer answer: you might find more info here. http://www.overclock.net/t/67463/folding-faq-everything-you-need-to-know-in-one-location-for-new-and-old-folders/0_20










Edit: For anyone wondering, I started folding because a lot of the time during the day I had my PC on but either wasn't using it or was hardly using it (word processing or something). I felt like running this program would let my PC do "something useful" since I had it on a lot. And it helps fight cancer. I'm all for that.

Now though, I've lost friends and family members to cancer. Folding to me is a way to connect with those loved ones, to fight for them, to know that one day what they went through won't have to be gone through by someone else's father, son, mother, etc. It's a way in which I feel like I can *do something* for a problem that leaves many people helpless and hopeless.


----------



## nubki11a

Haha Devildog









Thanks repo_man, I think I get it. Might look into it a bit more when I'm not as sleepy


----------



## repo_man

Quote:


> Originally Posted by *nubki11a*
> 
> Haha Devildog
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks repo_man, I think I get it. Might look into it a bit more when I'm not as sleepy


Sounds good man! See my above edit as well. You really should look into it. It's a great program and a great cause.


----------



## nitroxyl

Hi there fellow R9 owners









Just have a quick question for cryptocurrency miners:
Do the new 14.1 drivers work with CGMiner 3.7.2?

I've got a Gigabyte R9 270 btw.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nitroxyl*
> 
> Hi there fellow R9 owners
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just have a quick question for cryptocurrency miners:
> Do the new 14.1 drivers work with CGMiner 3.7.2?
> 
> I've got a Gigabyte R9 270 btw.


It works.. Side note: if anyone has an XFX 270X, there is a BIOS update for the older cards


----------



## neurotix

Quote:


> Originally Posted by *nubki11a*
> 
> Thanks for the confirmation! Too bad my card will only go that high :/ I'll look around for the TriXX version that lets me increase the voltage to 1.38V and see if it works. If it does, I'll post it here. How bad is it to permanently lock the card at 1.3V though? Doesn't really sound like it would be worth it


Attachment: SapphireTrixx4.4.0b-MOD.zip (3450k .zip file)


There's the Trixx mod that allows for 1.381v and memory voltage control.

I'm telling you though, you won't be able to use it with any R9 series card because 4.4.0 of Trixx was made before the R9 series came out, so it doesn't read clocks and voltage correctly.

You MIGHT be able to use it in conjunction with Afterburner or Catalyst Control Center (AMD Overdrive) to control clocks, but set the voltage in Trixx. I haven't tried so I'm not sure.

You are correct, there is probably no point in locking the voltage at 1.3v, because this will make the card run much hotter and you really shouldn't need that voltage for 1150mhz. Personally, I use 5 minutes of OCCT GPU test with shader complexity 7, max RAM, and error checking on. If you can run it for 5 minutes without it detecting any errors, you are probably okay. I would try lowering your voltage to 1.25v and running OCCT. You should also probably run Unigine Valley and look for artifacts. The lower voltage you need for your card, the cooler it will run, and the better.

As far as folding goes, it's a great cause, but I've been bitten by the mining bug. My current card can do 176k PPD in Folding@home, and would make about 4 million points a month. However, that's irrelevant when I can just as soon use it to make $220 a month in litecoins for my bank account. The plan is to save this money for 3 months and get another Sapphire R9 290 Tri-X for Crossfire. Additionally, I'll then be able to mine on BOTH cards and make $440 a month. As much as I support folding, I'd rather have the money to upgrade my rig.









I used to hate mining and miners for driving up the cost of these cards, but now that I understand how much money these things generate monthly, I guess I've joined the dark side/sold out.
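The payback math behind a plan like that is easy to sketch. This is illustrative only: the $220/month figure comes from the post above, while the card price is a hypothetical placeholder (mining income swings with network difficulty and coin price, so none of this is a prediction):

```python
# Rough payback math for funding a second card with mining income.
income_per_card = 220.0   # USD/month, the figure quoted in the post
card_price = 650.0        # USD, assumed price of a second card (hypothetical)

# Months of mining on one card needed to cover the second card.
months_to_save = card_price / income_per_card
print(f"Months to fund the second card: {months_to_save:.1f}")

# Once both cards mine, monthly income doubles.
print(f"Income with two cards: ${2 * income_per_card:.0f}/month")
```

At those numbers the second card pays for itself in about three months, which matches the plan described in the post.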


----------



## Jaffi

Quote:


> Originally Posted by *neurotix*
> 
> If you artifact at 1160 with 1.3v then unfortunately your card tops out at 1150mhz. 1.3v should be more than enough for 1150mhz. Regarding your older post, I don't think there is a way to give more than 1.3v using Trixx or Afterburner with the R9 cards. A modded bios might work, if you lock the voltage at more than 1.3v. There used to be a version of Trixx with voltage up to 1.381v but it doesn't properly detect R9 cards.
> 
> My 270X will do 1250mhz @ 1.3v, but if I raise it to 1280mhz I eventually get total system lockups in games. It even passes Valley with no artifacts at 1300mhz, but will freeze up in any other game almost immediately instead. If you're giving your card 1.3v but it artifacts at 1160, then that's the max of your card.


Also, artifacting is very temperature-sensitive with these cards. I see no artifacts at 1200 MHz if I stay below ~74°C; above that it starts artifacting (280X here). So I think if you want stable clocks for 24/7 usage, you have to figure out a clock/voltage/temp/fan curve. I myself want to stay with the stock fan curve, so I stick with my 1150 @ 1.2v, because anything between this and 1200 MHz is not worth the effort for me. You won't notice it outside benchmarking. And for benchmarking I can ramp my card up to 1230 MHz with 1.3v and fans at 100% anyway.


----------



## devilangel

GV-R928XOC-3GD (2.0) 1100/1500


----------



## Archea47

Congrats on the acquisition and Welcome to the Gigabyte 280x club devilangel!


----------



## repo_man

Just realized I'm not on the OP. I can haz add please?









Asus 270X DirectCU II TOP edition:


----------



## [CyGnus]

Here are some pics of my rig and 280X. Did some case cleaning today, and everyone loves pics, right?


----------



## Devildog83

*devilangel, Repo-Man and CyGnus* have been added. Welcome, and I am sorry if it was my fault that Repo-Man and CyGnus were not added earlier; I thought they were already members, I guess. Great cards, and yes we, or at least I, love lots of pics. If I don't have your clocks, could you please post your everyday clocks or overclocks, whichever you wish to have listed.


----------



## repo_man

Quote:


> Originally Posted by *Devildog83*
> 
> *devilangel, Repo-Man and CyGnus* have been added. Welcome, and I am sorry if it was my fault that Repo-Man and CyGnus were not added earlier; I thought they were already members, I guess. Great cards, and yes we, or at least I, love lots of pics. If I don't have your clocks, could you please post your everyday clocks or overclocks, whichever you wish to have listed.


I think I was or was meant to be added before because I first bought an HIS 270 but RMAd it for the Asus 270X. I'll be OCing today and will post clocks for sure!


----------



## staryoshi

Quote:


> Originally Posted by *repo_man*
> 
> I think I was or was meant to be added before because I first bought an HIS 270 but RMAd it for the Asus 270X. I'll be OCing today and will post clocks for sure!


Isn't the Asus R9 270X goooorgeous?







In my case the heat pipes and Asus logo on the support bracket reflect the light from my NZXT Hue for an amazing effect, too


----------



## [CyGnus]

Devildog83, I use my Asus at 1100/1600 1.17v 24/7. For benchmarks, the max I could do was this:

1235/1850 @ 1.3v (GPU-Z reads 1.252v, huge vdroop)

3DMark2K11: 12631 http://www.3dmark.com/3dm11/7648005 TESS ON
3DMark2K11: 14311 http://www.3dmark.com/3dm11/7656094 TESS OFF

Devildog83, I hope you don't mind, but I used my 'super powers' and edited myself in already


----------



## Hacker90

Hey add me too







XFX R9 280X DD Black Edition 1100/1550. Here are some pics


----------



## [CyGnus]

Hacker90 you are added


----------



## repo_man

Quote:


> Originally Posted by *staryoshi*
> 
> Isn't the Asus R9 270X goooorgeous?
> 
> 
> 
> 
> 
> 
> 
> In my case the heat pipes and Asus logo on the support bracket reflect the light from my NZXT Hue for an amazing effect, too


Oh yea, it's soooo pretty. I have a Phantom 820 and the integrated lights look a-mazing reflecting off those heatpipes.


----------



## Tiharo

Hello all! I would like to join this club of awesome! I just got 2 Gigabyte R9 270X 4GB OCs to replace my aging MSI GTX 580 Lightning Xtreme Edition. So much quieter and cooler!


Spoiler: Warning: Spoiler!








Currently mining FedoraCoin (TIPS) with it at over 1000 kh/s, with the CPU also!


----------



## mAs81

Hey guys, has anyone of you installed a backplate on their card?? I ordered a custom one from here:
http://www.coldzero.eu/
Haven't got it yet; will post pics when I do..








Just wondering if anyone has got one already..


----------



## [CyGnus]

Tiharo you are added









mAs81, they look good, but performance-wise people are getting less memory OC with them. Not that it matters much, since you gain pretty much nothing from a memory OC


----------



## Tiharo

Cool! I wish they had 270X back-plates though


----------



## Tiharo

Quote:


> Originally Posted by *Hacker90*
> 
> Hey add me too
> 
> 
> 
> 
> 
> 
> 
> XFX R9 280X DD Black Edition 1100/1550. Here are some pics


Absolutely beautiful card! XFX has some awesome cards and cooler designs!


----------



## mAs81

Quote:


> Originally Posted by *[CyGnus]*
> 
> Tiharo you are added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> mAs81, they look good, but performance-wise people are getting less memory OC with them. Not that it matters much, since you gain pretty much nothing from a memory OC


I won't be watercooling it, if that's what you mean.. It's just for looks, as they say, but how is it possible for it to reduce the memory OC?!?


----------



## [CyGnus]

mAs81, I don't have any idea, but the guys who live on benching are saying that. I am all for the looks hehehe


----------



## Krusher33

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys, has anyone of you installed a backplate on their card?? I ordered a custom one from here:
> http://www.coldzero.eu/
> Haven't got it yet; will post pics when I do..
> 
> 
> 
> 
> 
> 
> 
> 
> Just wondering if anyone has got one already..


I'm jealous of that site. I wish there was one in the US. My 280X's could use one.
Quote:


> Originally Posted by *[CyGnus]*
> 
> mAs81, I don't have any idea, but the guys who live on benching are saying that. I am all for the looks hehehe


Yeah, that's curious. What the hell are they on about?


----------



## mAs81

Quote:


> Originally Posted by *[CyGnus]*
> 
> mAs81, I don't have any idea, but the guys who live on benching are saying that. I am all for the looks hehehe


You and me both, man








From the little I've heard though, they might improve the airflow in the case or even lower temps, but I'm not sure.. I'll tell you first-hand when I get it!!
Quote:


> Originally Posted by *Krusher33*
> 
> I'm jealous of that site. I wish there was one in the US. My 280X's could use one.


Well, he is in Portugal; perhaps they ship to the US? I understand that they use DHL, so who knows.. There's a contact form if you're interested. They respond fast..!


----------



## Devildog83

Thanks CyGnus for updating in my absence. I love backplates because I hate the backs of PCBs. That's why I bought cards with backplates.


----------



## [CyGnus]

Devildog83, sorry for the hijack







but I didn't have anything to do, so I figured I could give a hand


----------



## staryoshi

Backplates can improve the aesthetic appeal of some cards, indeed. They also reduce the space between tight multi-GPU configs like mine







On cards of reasonable length, like the 270 series, a PCB support bracket is good enough for me. On gargantuan cards, however, a backplate can provide invaluable rigidity.

I could probably go on at length about backplates if I don't hit the brakes now


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Krusher33*
> 
> I'm jealous of that site. I wish there was one in the US. My 280X's could use one.
> Yeah, that's curious. What the hell are they on about?


Yes?

http://www.frozencpu.com/cat/l3/g57/c599/s1931/list/p1/EK_Products-EK_Blocks_-_VGA_ATI-EK_ATI_Backplates-Page1.html


----------



## Krusher33

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> I'm jealous of that site. I wish there was one in the US. My 280X's could use one.
> Yeah, that's curious. What the hell are they on about?
> 
> 
> 
> Yes?
> 
> http://www.frozencpu.com/cat/l3/g57/c599/s1931/list/p1/EK_Products-EK_Blocks_-_VGA_ATI-EK_ATI_Backplates-Page1.html

Does it work with stock cooling? I've got one on the back of my 290X with the EK's waterblock. The length is just right for it. Dunno if it's the right length and thread for stock coolers?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Krusher33*
> 
> Does it work with stock cooling? I've got one on the back of my 290X with the EK's waterblock. The length is just right for it. Dunno if it's the right length and thread for stock coolers?


I am not sure; I just saw them the other day. You can give them a call, though, as they are very helpful


----------



## Mr.N00bLaR

Does anyone know how I can get MSI Afterburner to allow core voltage and/or memory voltage control for Sapphire 280Xs? I know the core can be adjusted through Sapphire TriXX. Sapphire TriXX is incredibly terrible software in my past and present experience, much less reliable than MSI Afterburner.

EDIT: I shouldn't say terrible, but I don't like it. It is not particularly good if you want different configurations on the same model of GPU. I always have issues with power limits, voltages, and fan profiles not being saved or applied correctly. I don't like the GUI for TriXX either.

EDIT: I have already changed the config file to allow for unofficial overclocking support, this has not opened the voltage selections for me.


----------



## linuxfueled

Core 1202 / mem 1603. Purchased at Micro Center in Cambridge, MA, Nov 2013 for $299.99.



Link to FurMark score, 1080 preset ("4341"):

http://www.ozone3d.net/benchmarks/furmark_192_score.php?id=164371

Link to FurMark score, 720 preset ("6294"):

http://www.ozone3d.net/benchmarks/furmark_192_score.php?id=164373

Please add me to the club, thanks!


----------



## Devildog83

*linuxfueled* has been added, Welcome to the club. Good looking card. What does it look like in the rig?

Is that in the C70? Here is mine -


----------



## hamzta09

Is there an overclock thread for these cards?

I'm wondering what the average successful OC is on both core and mem, and what the safest 24/7 voltages are.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Is there an overclock thread for these cards?
> 
> I'm wondering what the average successful OC is on both core and mem, and what the safest 24/7 voltages are.


Max I got on mine is 1210 GPU and 1700 memory, or 1062 GPU and 1850 mem for mining


----------



## Hacker90

Quote:


> Originally Posted by *Tiharo*
> 
> Absolutely beautiful card! XFX has some awesome cards and cooler designs!


Thanks bud







I just love XFX cards.. I went to buy the Asus one, but when you put it in the case it looks ugly as hell. So I went with this


----------



## goldswimmerb

Did I just get a bad overclocker with my card, or are all the 270Xs poor overclockers? Mine can't OC more than 50MHz on any clock


----------



## Devildog83

Quote:


> Originally Posted by *goldswimmerb*
> 
> Did I just get a bad overclocker with my card, or are all the 270Xs poor overclockers? Mine can't OC more than 50MHz on any clock


I have mine at 1200/1400 everyday, my max overclock so far due to voltage lock is 1235/1590.


----------



## KauBoy

Sapphire Radeon R9 280X Vapor-X 1550/1600


----------



## JCH979

I really like how those backplates look for some reason


----------



## Devildog83

Quote:


> Originally Posted by *KauBoy*
> 
> 
> 
> Sapphire Radeon R9 280X Vapor-X 1550/1600


Are you sure you have 1550 on the core, or do you mean 1150/1600? I have added you, but I will hold off on the clocks pending confirmation. Welcome to the club.


----------



## <({D34TH})>

I'll soon be joining this group (ASUS R9 270 DCII OC), but I already feel like a pleb with all these 280X owners.


----------



## <({D34TH})>

Also, does anybody have any experience with how to get the BF4 promo code from Newegg? Does it come in the box or through email?


----------



## Archea47

Quote:


> Originally Posted by *<({D34TH})>*
> 
> Also, does anybody have any experience with how to get the BF4 promo code from Newegg? Does it come in the box or through email?


In my experience, your shipment from Newegg should have, separate from the video cards and loose in the box, AMD-branded postcard-size cards with BF4 art and a scratch-off code to redeem on AMD's website. They come from the retailer, not from the card manufacturer. The promo cards should also be separate line items in your Newegg invoice that get added automatically when you add the cards to your cart


----------



## <({D34TH})>

Quote:


> Originally Posted by *Archea47*
> 
> In my experience, your shipment from Newegg should have, separate from the video cards and loose in the box, AMD-branded postcard-size cards with BF4 art and a scratch-off code to redeem on AMD's website. They come from the retailer, not from the card manufacturer. *The promo cards should also be separate line items in your Newegg invoice that get added automatically when you add the cards to your cart*


Huh, so I guess that's where my issue starts. Guess I won't be getting BF4 then.


----------



## Archea47

Quote:


> Originally Posted by *<({D34TH})>*
> 
> Huh, so I guess that's where my issue starts. Guess I won't be getting BF4 then.


If it says on the item listing that it comes with BF4, I would certainly make a stink about it. I've gone through the same thing with TigerDirect when I got a 7950 there. My 280Xs from Newegg came with the codes without issue, but they did take a while to ship; I made a stink and they refunded my shipping no problem. I recall a number of people having the same issue with Newegg and finally getting their BF4 copy


----------



## SkateZilla

it's usually stapled to the invoice.


----------



## linuxfueled

It is the C70, great case.


----------



## mAs81

Quote:


> Originally Posted by *linuxfueled*
> 
> It is the C70, great case.


Niceee








I really love blue-themed rigs!!!! Cool controller too!!


----------



## Devildog83

I was thinking of stripping the entire case down and painting it this blue color in and out.


----------



## alex490

Hi guys,

I bought a Gigabyte R9 280X Dual-X OC 3GB (UEFI Edition) last week and I would like to overclock it.
Is 1.3V safe for a 24/7 overclock?
What is the maximum temperature I should not exceed?
If I use TriXX to OC my GPU, does it void the warranty?

Thanks a lot


----------



## Devildog83

Quote:


> Originally Posted by *alex490*
> 
> Hi guys,
> 
> I bought a Gigabyte R9 280X Dual-X OC 3GB (UEFI Edition) last week and I would like to overclock it.
> Is 1.3V safe for a 24/7 overclock?
> What is the maximum temperature I should not exceed?
> If I use TriXX to OC my GPU, does it void the warranty?
> 
> Thanks a lot


I would use Afterburner instead; neither will void the warranty. 1.3v is OK for overclocking and benches, but I would worry about it 24/7; 1.27v or less, I would say. I like to keep my cards at 70C or less under stress. You could go a bit higher, but staying in the 70s is probably a good idea.


----------



## goldswimmerb

What are you guys using to change the voltages? On my card at stock voltages I can't get a good overclock, but it runs pretty cool


----------



## crazysoccerman

Word to the wise:

Don't RMA your 280x to newegg.

If your card isn't in stock the exact day they receive it they will refund your money. Unfortunately my DC2T was out of stock the Monday they received it (they received more on Tuesday).

Chances are if you bought it a week before, the price has since increased. Newegg won't switch yours for another brand because they are so valuable. You will be forced to pay more to get another one.

I'm kicking myself for losing my 280x. I tried so hard to keep it... I called customer service five different times. Five days later I'm still waiting for a new gift card to replace the one I used to pay for my 280x. This turned out to be the worst case scenario I thought would never happen.

I've learned how terrible their customer service actually is. Each customer service agent will contradict the other. They are nice but have a poor understanding of their company's policies and will promise things they don't actually know if they can deliver. *TAKE NAMES.* Not taking names was a rookie mistake on my part.

Oh well...

Now I'm on the search for a used GTX 780 for $400. Let the hunt begin.

Newegg and cryptocurrency miners can go and... well....


----------



## Archea47

Quote:


> Originally Posted by *crazysoccerman*
> 
> I've learned how terrible their customer service actually is. Each customer service agent will contradict the other. They are nice but have a poor understanding of their company's policies and will promise things they don't actually know if they can deliver. *TAKE NAMES.* Not taking names was a rookie mistake on my part.


Yep, I would rather talk to someone on the phone but have learnt to use the online chat whenever possible because you can copy the chat log

Just thought I would report: after a few hours of playing BF4 on ultra (DX11, because Mantle in BF4 doesn't work well with CrossFire yet) with my crossfired Gigabyte R9 280X Rev2s, max temps:

GPU0: 65°C
GPU1: 59°C

That's with a UT60 radiator (the hottest in my loop) blowing at them, but with a fan profile so that at 65°C they get 80% fan

I currently have the F60 bios on the card, which it came with. I've been running the original 1100/1500 clocks. Is there a better BIOS for me to use? Not itching to overclock them at the moment, just as far as optimization goes
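A fan profile like the one described is just a temperature-to-duty curve. Here's a minimal interpolation sketch; only the 65°C/80% point comes from the post above, and the other breakpoints are made up for illustration:

```python
def fan_duty(temp_c: float,
             curve=((40, 30), (55, 50), (65, 80), (80, 100))) -> float:
    """Linearly interpolate fan duty (%) from a (temp_C, duty_%) curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum duty
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:            # interpolate within this segment
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # above the curve: maximum duty

print(fan_duty(65))  # prints 80.0, the breakpoint mentioned in the post
```

Software like Afterburner or TriXX does essentially this between the points you drag on their fan-curve editors.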


----------



## M1kuTheAwesome

Seems my PowerColor 280X is voltage-locked at 1200 mV. Even if I set it higher in AB, it stays at 1200. Not too much of an overvolt from the stock 1144.


----------



## dmfree88

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Seems my PowerColor 280X is voltage locked at 1200 mV. Even if I set it higher in AB it stays at 1200. Not too much of an overvolt from the stock 1144.


That's strange. At least they let you OC it some. Does PowerColor have a program that might work better or let you push it further?


----------



## MissStinkyBum

Having some serious issues with my MSI OC 280X: artifacting all over the place and drivers constantly crashing. I've tried the latest driver, beta drivers, and an older set of drivers with no improvement! Think it's time for an RMA!


----------



## mAs81

Quote:


> Originally Posted by *MissStinkyBum*
> 
> Having some serious issues with my MSI OC 280X: artifacting all over the place and drivers constantly crashing. I've tried the latest driver, beta drivers, and an older set of drivers with no improvement! Think it's time for an RMA!


Perhaps a BIOS update might help you:
http://www.techpowerup.com/vgabios/index.php?manufacturer=MSI&model=R9+280X
Choose the one you have and try to update it using this guide:
http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards
Hope that helps...
EDIT:
That will void your warranty though, so tread carefully!


----------



## devilangel

A BIOS update from F11 to F18 helped me. Or try it with another PSU; always 600W+.


----------



## Chubz727

With all the price inflation on R9 280X cards I bought myself a 270X. Pretty happy with it so far; I used to have a GTX 560 Ti 448.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Chubz727*
> 
> With all the price inflation for r9 280x cards i bought me a 270X Pretty happy with it so far I used to have a GTX 560 ti 448


Grats, welcome!

I broke down, succumbed to the super high prices, and bought another XFX R9 280X.


----------



## Devildog83

Quote:


> Originally Posted by *Chubz727*
> 
> With all the price inflation for r9 280x cards i bought me a 270X Pretty happy with it so far I used to have a GTX 560 ti 448


Sweet card, I would like to see a pic of it in your rig. I am adding you now, let me know what clocks you will be running. I got mine to 1235/1590.


----------



## Hacker90

Quote:


> Originally Posted by *Devildog83*
> 
> Sweet card, I would like to see a pic of it in your rig. I am adding you now, let me know what clocks you will be running. I got mine to 1235/1590.


It really is a great card... but it would have been better if they'd printed the "R9 270X" the other way around, so you can read it properly when you put it in the case.


----------



## repo_man

What's the easiest way to increase core voltage in the BIOS? Any how-tos/programs for this? I've tried editing the Afterburner config file but still don't have the voltage option. I currently have the card stable at 1200 core, and it will stress test at 1275, but then it fails in benchmarks. I think it might just need more voltage. Any tips/help?
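Since a few posts here go down the config-file route, this is roughly what the relevant edit in MSIAfterburner.cfg looks like. Treat it as a sketch: the section and key names below are recalled from Afterburner 3.x on AMD cards, so verify them against your own file before saving anything.

```ini
; MSIAfterburner.cfg (sketch; confirm section/key names in your own copy)
[ATIADLHAL]
; Accepting the EULA string enables "unofficial" (extended) overclocking limits
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1

[Settings]
; Same switches as Settings > General > "Unlock voltage control/monitoring"
UnlockVoltageControl = 1
UnlockVoltageMonitoring = 1
```

Restart Afterburner after saving. If the voltage slider still doesn't appear, the limit is likely in the card's BIOS or voltage controller rather than in the software.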


----------



## mAs81

Quote:


> Originally Posted by *repo_man*
> 
> What's the easiest way to increase core voltage in the BIOS? Any how-tos/programs for this? I've tried editing the Afterburner setup text file but still don't have the voltage option. I currently have the card at 1200 core stable, and it will stress test at 1275, but then it fails in benchmarks. I think it might just need more voltage. Any tips/help?


Did you try the setting in Afterburner?
Go Settings > General > Unlock voltage control/monitoring.


----------



## repo_man

Quote:


> Originally Posted by *mAs81*
> 
> Did you try the setting in Afterburner??
> Go settings>general>unlock voltage control/monitoring


Yea, I already tried that option as well as editing the config file, as mentioned before.


----------



## mAs81

Quote:


> Originally Posted by *repo_man*
> 
> Yea, I already tried that option as well as editing the config file, as mentioned before.


Hmm, then maybe you have to flash the BIOS of your card. Can't think of anything else to help you, sorry.

Here's a how-to if you don't already know:
http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards


----------



## repo_man

Quote:


> Originally Posted by *mAs81*
> 
> Hmm, then maybe you have to flash the BIOS of your card. Can't think of anything else to help you, sorry.
> 
> Here's a how-to if you don't already know:
> http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards


Thanks for the link. However, I know how to flash it. I'm looking for info on where/how to edit the BIOS itself. I remember reading in here somewhere of people editing the BIOS entry for voltage to increase it: that's what I'm looking for.

Edit: Dug through this thread some more and found some talk of using VBE7 to edit the BIOS. Will look for a how-to/tips guide for that program. Thanks!


----------



## mAs81

Quote:


> Originally Posted by *repo_man*
> 
> Dug through this thread some more and found some talk of using VBE7 to edit the BIOS. Will look for a how-to/tips guide for that program. Thanks!


Found this when I was looking around. Hope it helps. Good luck, repo:

http://www.techpowerup.com/forums/threads/vbe7-vbios-editor-for-radeon-hd-7000-series-cards.189089/


----------



## Devildog83

Quote:


> Originally Posted by *Hacker90*
> 
> It really is a great card... but it would have been better if they'd printed the "R9 270X" the other way around, so you can read it properly when you put it in the case.


Yes it's the same with the 7870 Devil - pics


Spoiler: Warning: Spoiler!


----------



## Hacker90

Quote:


> Originally Posted by *Devildog83*
> 
> Yes it's the same with the 7870 Devil - pics
> 
> 
> Spoiler: Warning: Spoiler!


Damn! man, you need to clean that **** right up!


----------



## Chubz727

I'm having all kinds of trouble with this 270X I bought: the driver constantly stops responding, and when I launch some games like Borderlands 2 I bluescreen. I had zero problems with my old NVIDIA card.


----------



## Hacker90

Quote:


> Originally Posted by *Chubz727*
> 
> I'm having all kinds of trouble with this 270X I bought: the driver constantly stops responding, and when I launch some games like Borderlands 2 I bluescreen. I had zero problems with my old NVIDIA card.


Is it an MSI 270X? Because one of my friends has the Hawk version, and the problem he's facing is that he can't install drivers above 13.9. He gets a black screen and has to restart, go into safe mode, uninstall them, and reinstall the old ones.

I guess the 270X has some weird problems.


----------



## Chubz727

Nah, it's the PowerColor Devil edition.


----------



## Hacker90

Quote:


> Originally Posted by *Chubz727*
> 
> nah its the powercolor devil edition


Ah! Then I guess it's universal. Every model of 270X has problems, I guess.


----------



## dmfree88

My Giga 270X has had no issues. Works better than my 7870 Hawk. Still haven't enjoyed it much, but no issues so far.


----------



## cookiesowns

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Seems my PowerColor 280X is voltage locked at 1200 mV. Even if I set it higher in AB it stays at 1200. Not too much of an overvolt from the stock 1144.


Which powercolor 280X do you have?

I have two of the TurboDuo ones and Afterburner won't even pick up on them. Sapphire TriXX did the trick though, and cgminer's vddc controls work just fine too.


----------



## Chubz727

Getting constant "display driver stopped responding" when browsing, and any game I launch, like Borderlands 2, Battlefield 4, or Assassin's Creed 4, will just blue screen, black screen, or white screen. Even low-graphics indie games play for a minute and the same thing happens. I did a fresh Windows install and ran Driver Sweeper; still the same issue. I may have a faulty card, so I'm gonna RMA it. The only thing I have to fall back on is a 9600 GT, which works flawlessly though.


----------



## Scott1541

I'm getting graphical glitches in windows on 14.1 when mining, but not at any other time. I just exit the miner console and bang, no more glitches until I start it again.


----------



## Recr3ational

Hey guys,
Recently I sent my 7950 in for RMA.
They lost it and offered me a PowerColor 280X TurboDuo.

I've got an EK 7970 acetal CSQ block.
On the cooling configurator it says it will fit.

I was wondering what the card is like?
Voltage locked? Any problems? I've read that it has heat issues, but as I'm going under water I don't see how that will affect me.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Hey guys,
> Recently I sent my 7950 for Rma,
> They lost it and offered me a Powercolour 280x turboduo.
> 
> I've got an EK 7970 acetal csq block.
> On cooling config it says the it will fit.
> 
> I was wondering what the card is like?
> Voltage locked? And problems? I've read that it has eat issues but as I'm going underwater i don't see if it will effect me.


I think it's volt-locked, but it will kick the crap out of your 7950, that's for sure. I haven't heard of any big issues with it; I would snap that up in a minute.


----------



## Devildog83

Quote:


> Originally Posted by *Chubz727*
> 
> Getting constant "display driver stopped responding" when browsing, and any game I launch, like Borderlands 2, Battlefield 4, or Assassin's Creed 4, will just blue screen, black screen, or white screen. Even low-graphics indie games play for a minute and the same thing happens. I did a fresh Windows install and ran Driver Sweeper; still the same issue. I may have a faulty card, so I'm gonna RMA it. The only thing I have to fall back on is a 9600 GT, which works flawlessly though.


Are you using 14.1 beta? I had issues with it and went back to 13.12 for now.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I think it's Volt locked but it will kick the crap out of your 7950 that's for sure. I haven't heard of any big issues with it, I would snap that up in a minute.


What's the OC like? Also, which PowerColor one do you have, the TurboDuo?


----------



## repo_man

Quote:


> Originally Posted by *Chubz727*
> 
> Getting constant "display driver stopped responding" when browsing, and any game I launch, like Borderlands 2, Battlefield 4, or Assassin's Creed 4, will just blue screen, black screen, or white screen. Even low-graphics indie games play for a minute and the same thing happens. I did a fresh Windows install and ran Driver Sweeper; still the same issue. I may have a faulty card, so I'm gonna RMA it. The only thing I have to fall back on is a 9600 GT, which works flawlessly though.


IIRC MSI has a BIOS update for the 270X. Have you tried that? Some of the 270/270Xs were having issues, but the BIOS update and 14.1 driver updates have fixed some of it.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> What's the OC like? Also which powercolour one do you have the TurboDuo?


I actually have a 270X Devil @ 1200/1400, which could probably match your 7950. There are many 280X owners in here who will be able to give you more detailed info, but it's far superior to the 7950 and even a tad better than most 7970s from what I have seen.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I actually have a 270x Devil @ 1200/1400 which could probably match your 7950. There are many 280x owners in here who will be able to give you more detailed info but it's far superior to the 7950 and even a tad better than most 7970's from what I have seen.


Okay, thanks. I went with it. Free upgrade!

Thanks again.


----------



## Devildog83

I finally tried litecoin mining and got 300 kH/s with just the 270X; it's not worth the trouble. After I ran it for a while and shut it down I started having issues with X-Fire: I only got 25% usage out of the 270X, which is the bottom card in the set-up. I thought it was the drivers, so I uninstalled and installed the 13.12 driver and it is working fine now, but I am wondering if cgminer may have done something to the driver. Could it be that cgminer doesn't like the 14.1 drivers? The system runs great on 13.12 and I am tempted to leave it there until 14.1 gets more updates, but I don't think I will be mining with my rig anymore.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Okay, thanks. I went with it. Free upgrade!
> 
> Thanks again.


Nice, pop back in when it comes with a pic and some clocks and I will add you to the club if you wish.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Nice, pop back in when it comes with a pic and some clocks and I will add you to the club if you wish.


Will do!
Oh btw, how's the backplate? Is it easy to take off? It doesn't look like screws from the pics.


----------



## hamzta09

Anyone here had a 780 Classy or Lightning?

Wondering how big of a difference there is between 280X CrossFire and a single 780.

I've been YouTubing etc. and haven't found any CrossFire vs. 780 results. Shame there are no benching sites that specialize in comparing CF/SLI vs. single cards (all modern ones).


----------



## Chubz727

Quote:


> Originally Posted by *Devildog83*
> 
> Are you using 14.1 beta? I had issues with it and went back to 13.12 for now.


I tried it with both 13.12 and 14.1; both do the same thing.


----------



## Chubz727

Quote:


> Originally Posted by *repo_man*
> 
> IIRC MSI has a bios update for the 270X. Have you tried that? Some of the 270/270Xs were having issues but the bios update/14.1 driver updates has fixed some of it.


Where can I get that BIOS update? Also, does doing that void the warranty?


----------



## hamzta09

Quote:


> Originally Posted by *Chubz727*
> 
> Where can I get that BIOS update? Also, does doing that void the warranty?


Yes.
You can just flash it back, though, if you ever need to RMA it.

Back up the old BIOS with GPU-Z first.


----------



## Chubz727

Quote:


> Originally Posted by *hamzta09*
> 
> Yes.
> You can just flash it back though if you need to maybe RMA it.
> 
> Back up old bios with GPU-Z


OK, I don't know how to flash a GPU BIOS, but I'm going to find out and do some research, lol. I see they have one: http://www.techpowerup.com/vgabios/149452/powercolor-r9270x-2048-131022.html


----------



## hamzta09

Quote:


> Originally Posted by *Chubz727*
> 
> OK, I don't know how to flash a GPU BIOS, but I'm going to find out and do some research, lol. I see they have one: http://www.techpowerup.com/vgabios/149452/powercolor-r9270x-2048-131022.html


http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards


----------



## Chubz727

Quote:


> Originally Posted by *hamzta09*
> 
> http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards


Thank you man appreciate it


----------



## <({D34TH})>

@Devildog83, add me on the list!

ASUS R9 270 DCII OC. Clocks are 1000/1500 for the time being.


----------



## Chubz727

Quote:


> Originally Posted by *Chubz727*
> 
> Thank you man appreciate it


Quote:


> Originally Posted by *hamzta09*
> 
> http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards


So I managed to flash it. The BIOS the card came with was 015.041.000.000.000000; the one I downloaded from http://www.techpowerup.com/vgabios/149452/powercolor-r9270x-2048-131022.html is 015.040.000.000.000000. Still doing the same thing: instant blue screen when loading into a BF4 map, and random display driver crashes. I think the video card is just messed up. I already requested an RMA from where I bought it. Since it's the weekend I don't think I'll get a response until Monday.


----------



## hamzta09

Quote:


> Originally Posted by *Chubz727*
> 
> So I managed to flash it. The BIOS the card came with was 015.041.000.000.000000; the one I downloaded from http://www.techpowerup.com/vgabios/149452/powercolor-r9270x-2048-131022.html is 015.040.000.000.000000. Still doing the same thing: instant blue screen when loading into a BF4 map, and random display driver crashes. I think the video card is just messed up. I already requested an RMA from where I bought it. Since it's the weekend I don't think I'll get a response until Monday.


Do you run crossfire?
Do you run 144hz?

If not then Idk what the problem might be.

Does it crash in other games?


----------



## Chubz727

Quote:


> Originally Posted by *hamzta09*
> 
> Do you run crossfire?
> Do you run 144hz?
> 
> If not then Idk what the problem might be.
> 
> Does it crash in other games?


No CrossFire, no 144Hz monitor, and it crashes on every game I tried. The rig is about a month old and I never had a problem with my NVIDIA card. 750W PSU, more than enough juice for it too.


----------



## repo_man

Quote:


> Originally Posted by *Chubz727*
> 
> Where can i get that bios update? also doing that does it void the warranty?


Check MSI's website. That's where I found the one for my Asus 270X. Also, the ASUS BIOS update is just an .exe that flashes the card all by itself, so you might not need to go through the hassle of manually flashing it.


----------



## Chubz727

Quote:


> Originally Posted by *repo_man*
> 
> Check MSI's werbsite. That's where I found the one for my Asus 270X. Also, the ASUS bios update is just an .exe that flashes the card all by itself. So you might not need to go through the hassle of manually flashing it.


Can I do that? Mine is a PowerColor; can I use an MSI BIOS? If so, what version do you have?


----------



## repo_man

Quote:


> Originally Posted by *Chubz727*
> 
> Can I do that? Mine is a PowerColor; can I use an MSI BIOS? If so, what version do you have?


Sorry, old nooblet moment. I was thinking you had an MSI card. I'm not sure if you can use other manufacturers' BIOS updates, though you can check PowerColor's website!

Also, have you tried reseating the GPU? I've solved the occasional GPU issue by physically reinstalling it in the case. The old engineers' adage: "First rule: give it a good whack." Lol.


----------



## hamzta09

Did you do a Driver Sweeper run in safe mode before installing the AMD drivers?


----------



## Chubz727

Yup, did all of that like 4 times, lol. I appreciate everyone trying to help; I just think I got a bad card. First time in years this has happened to me.


----------



## repo_man

Quote:


> Originally Posted by *Chubz727*
> 
> Yup, did all of that like 4 times, lol. I appreciate everyone trying to help; I just think I got a bad card. First time in years this has happened to me.


Sometimes that's just how it is. I originally had ordered a HIS 270, but it suffered from the 270/270X stutter/flicker, and HIS had no updates for it, so I had to RMA it. I ended up replacing it with the ASUS 270X I have now (mostly because ASUS had released a BIOS update) and I really like it so far. You might just have a bad one.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Will do!
> Oh btw, how's the backplate? Is it easy to take off? It doesn't look like screws from the pics.


The screws are on the other side; you have to take off the heatsink, I believe. I haven't tried it yet.


----------



## Devildog83

D34th has been added. Welcome.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> The screws are on the other side, you have to take off the heatsink I believe, I haven't tried it yet.


Thanks Devil, I got a feeling we're gonna be good mates. You're a legend, cheers!


----------



## <({D34TH})>

Quote:


> Originally Posted by *Devildog83*
> 
> D34th has been added. Welcome.


My GPU is simply 270, not 270X.


----------



## repo_man

Changed the thermal paste on my Asus 270X TOP today. Shaved about 4-5°C off the load temps depending on the WU. Was around 48°C and now it's been hovering around 43°C. Used some IC-7 I have laying around. I'll post some pics of the PCB/cooler if anyone is interested.


----------



## bartledoo

Do the Toxic cards use good thermal paste? I was thinking of replacing it with MX-4.


----------



## [CyGnus]

Quote:


> Originally Posted by *bartledoo*
> 
> Do the Toxic cards use good thermal paste? I was thinking of replacing it with MX-4.


Do it; normally there's too much stock paste anyway, so you can shave a few degrees with the swap. My 280X dropped 7°C and I used MX-4.


----------



## Scott1541

I'll probably end up changing the paste on my 270X at some point. I'm not in a hurry to do it though, because my temps are alright currently; I can get good temps at low fan speeds.

Changing the paste doesn't void the warranty on MSI cards, does it? I remember reading something about it being alright when I did it to my old 460, which was also MSI, but I don't know for sure.


----------



## dmfree88

Quote:


> Originally Posted by *Devildog83*
> 
> I finally tried litecoin mining and got 300 k/Hs with just the 270x, it's not worth the trouble. After I ran it for a while and shut it down I started having issues with X-Fire, I only got 25% usage out of the 270x which is the bottom card in the set-up. I thought it was the drivers and uninstalled and installed the 13.12 driver and I am working fine but I am wondering if cgminer may have done something to the driver. Could it be that cgminer doesn't like the 14.1 drivers? The system runs great on the 13.12 and I am tempted to leave it there until 14.1 get's more updates but I don't think I will be mining with my rig anymore.


14.1 creates bad bin files on Pitcairns. It's also a balancing act with clocks and settings. Use my guide in my sig and get Kalroth's miner set up exactly like this:

Thread concurrency 8193, GPU threads 2, xintensity 4, worksize 256, core clock 1155, mem clock 1490.

So far these settings have worked on all the 270Xs and 270s that have tried them.

For a 7870, all the exact same settings but core clock 1003 and mem clock 1344.

Your 270X should get almost 500 kH/s, while a 7870 tops out around 430 (for me). PM me if you need some help; you could be making about 7-10 bucks a day.
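As a sanity check on earnings figures like these, here's a rough sketch of expected scrypt-mining revenue at a given hashrate. The helper `daily_ltc` is my own, not from any guide in this thread, and the difficulty and price numbers are illustrative placeholders; real earnings also depend on pool fees and the payout scheme.

```python
# Rough expected-revenue sketch for scrypt (litecoin) mining.
# All market numbers below are illustrative placeholders, not live data.

def daily_ltc(hashrate_khs, difficulty, block_reward=50.0):
    """Expected LTC per day: your share of the network's block rewards.

    At difficulty D the network needs on average D * 2**32 hashes per block,
    so expected coins/day = (your hashes/day / hashes per block) * reward.
    """
    hashes_per_day = hashrate_khs * 1_000 * 86_400   # kH/s -> hashes/day
    hashes_per_block = difficulty * 2**32
    return hashes_per_day / hashes_per_block * block_reward

# ~500 kH/s (a tuned 270X per the post above), placeholder difficulty/price
coins_per_day = daily_ltc(500, difficulty=3_500)
usd_per_day = coins_per_day * 17.0
print(f"{coins_per_day:.3f} LTC/day, about ${usd_per_day:.2f}/day before fees")
```

Whether that lands anywhere near the quoted dollar range depends entirely on the difficulty and exchange rate on the day you run it, so plug in current numbers before trusting any estimate.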


----------



## Devildog83

Quote:


> Originally Posted by *<({D34TH})>*
> 
> My GPU is simply 270, not 270X.


It's fixed, sorry about that. It's old man syndrome!


----------



## Devildog83

Quote:


> Originally Posted by *dmfree88*
> 
> 14.1 creates bad bin files on pitcairns. Its also a balancing act with clocks and settings. Use my guide in sig get kalroths miner set up exactly the same as this:
> 
> Threadconcurrency 8193 gputhreads 2 xintensity 4 worksize 256 coreclock 1155 memclock 1490
> 
> So far these settings have worked on all 270x and 270 that have tried.
> 
> For a 7870 all exact same settings but core clock 1003 memclock 1344
> 
> Your 270x should get almost 500kh/s while a 7870 tops around 430(for me). Pm me if u need some help you could be makin about 7-10 bucks a day.


OK, I will. 1003/1344 seems very low; it's below stock clocks.


----------



## hamzta09

Anyone here got Hitman Absolution?

At the tennis court I get 35-40 fps even if I lower some settings; with my 680 I had a solid 62 fps (limiter).


----------



## Recr3ational

Quote:


> Originally Posted by *Scott1541*
> 
> I'll probably end up changing the paste on my 270X at some point. I'm not it a hurry to do it though because my temps are alright currently, I can get good temps at low fan speeds.
> 
> Changing the paste doesn't void the warranty on MSI cards does it? I remember reading something about it being alright when I did it to my old 460 which was also MSI but don't know for sure.


No it doesn't, not with MSI anyway. I re-did mine on both of my 7950s before sending them for RMA. It's fine.


----------



## bartledoo

add me to the 270x club!


----------



## cookiesowns

Quote:


> Originally Posted by *Devildog83*
> 
> I think it's Volt locked but it will kick the crap out of your 7950 that's for sure. I haven't heard of any big issues with it, I would snap that up in a minute.


No, the PowerColor TurboDuo is not volt locked! Both my cards are not volt locked, and with the latest AB beta 18 I can overvolt the memory to 1.7 V and the core to 1.3 V!


----------



## dmfree88

Quote:


> Originally Posted by *Devildog83*
> 
> OK I will. 1003/1344 seams very low. It's below stock clocks.


Mining is very strange; both memory and core have to follow a pattern. I get the same results at higher clocks; I could not get more out of OCing it further, and of course power is a factor as well. I'll update my guide later with more info, but fool around with it; I'm sure you can get over 400 and make a good profit.


----------



## Hacker90

Quote:


> Originally Posted by *bartledoo*
> 
> add me to the 270x club!


DAMN! That card is HUGE & AWESOME! Welcome to the club


----------



## drieg500

Got my 270X back in December, but it was just now that I was able to fiddle around with it.

Temps lowered by 6-7°C.

Got stable clocks at 1200/1550; haven't touched voltages yet. Would love to try if I can.

BTW, add me to the OP. Thanks.


----------



## Hacker90

I can't do that to mine.

There is a warning sticker on one of the screws.


----------



## drieg500

Quote:


> Originally Posted by *Hacker90*
> 
> I cant do that to mine
> 
> 
> 
> 
> 
> 
> 
> There is a warning sticker on one of the screws



What card and brand do you have?

In any case, you can try asking your brand's national and international Facebook pages. I have PMed both MSI Philippines (my country) and MSI Computers US about the small circle sticker above one of the screws. They (in essence) told me that my warranty would not be voided (although the sticker says "Warranty void if tampered", heh) before I went in and replaced the TIM on my HAWK.

I'm pretty happy with the results.


----------



## dabby91

Hey guys! Been lurking around here for a while now, but just realised I haven't posted anything yet! Anyway, I got myself the ASUS 280X TOP back before Christmas, and it has been brilliant in the likes of BioShock Infinite. Can't use it at its full speed at the moment, as it's only running at x8 instead of x16, but it's fast enough.







Mind adding me to the club?


----------



## Recr3ational

Quote:


> Originally Posted by *Hacker90*
> 
> I cant do that to mine
> 
> 
> 
> 
> 
> 
> 
> There is a warning sticker on one of the screws


Have you got the PowerColor?
Cos I'm getting the same one and I've seen the sticker too.


----------



## Devildog83

Quote:


> Originally Posted by *cookiesowns*
> 
> No. The powercolor TurboDUO is not volt locked! Both my cards are not volt locked, and with the latest AB beta 18, I can overvolt memory to 1.7 and core to 1.3V!


I wasn't sure, because the 270X is, I believe. Thanks.


----------



## Devildog83

*bartledoo & dabby91* have been added, welcome! Please post some clocks for the OP page!

bartledoo, I entered a Toxic because that's what it looks like. Is that correct?


----------



## M1kuTheAwesome

I tried the latest AB beta and TriXX, but the voltage on my 280X TurboDuo still won't go past 1200 mV. The programs let me set it higher, but under load it still won't go past 1200 in use... How did you guys get this card higher? Is my card faulty? Has PowerColor released any BIOS updates for it?


----------



## repo_man

Quote:


> Originally Posted by *Hacker90*
> 
> I cant do that to mine
> 
> 
> 
> 
> 
> 
> 
> There is a warning sticker on one of the screws


Do what I did: take a straight-blade razor and slowly slide it under the sticker. Then redo the paste as usual, and put the sticker back on top of the screw.


----------



## Devildog83

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> I tried latest AB Beta and TriXX, but the voltage on my 280X TurboDuos still won't go past 1200. The programs let me set it higher, but under load it still won't go past 1200 in use... How did you guys get this card higher? Is my card faulty? Has Powercolor launched any BIOS updates for it?


Did you try going into the settings for AB, selecting "extend official overclocking limits", and then rebooting? If you have Graphics OverDrive turned on in CCC, turn it off. Other than that I do not know.

AB should look like this in the settings, with ULPS disabled.


----------



## bartledoo

Quote:


> Originally Posted by *Devildog83*
> 
> *bartledoo & Daddy91* Have been added, WELCOME !! ---- Please post some clocks for the OP page !!
> 
> bartledoo I entered a Toxic because that's what it looks like. Is that correct?


Yes, that is right. I am trying to see what my stable OC is atm.


----------



## Devildog83

I think I know how to bench and keep the temps down - the view out of my man-cave this morning -


----------



## 2002dunx

Well, I went out for an HD 7870 GHz Edition and came back with a PowerColor R9 280X.

Looks like a ROG product in general, and it came with a pretty back-plate which interferes with the DDR clips on the Asus Impact VI...

Doh!

dunx

P.S. Reasonable price, so I'm happy... just need a new PC to put it in, or I take a Dremel to the clips.


----------



## cookiesowns

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> I tried latest AB Beta and TriXX, but the voltage on my 280X TurboDuos still won't go past 1200. The programs let me set it higher, but under load it still won't go past 1200 in use... How did you guys get this card higher? Is my card faulty? Has Powercolor launched any BIOS updates for it?


Hrm.

It appears that voltage control works for me in TriXX, but Afterburner beta 18 is only able to overvolt the memory, not the core, even though the sliders are there.

I like the PowerColor 280Xs; I just hate the cooler, since they don't move enough air, so my top card gets very toasty in CF. Nothing a large box fan won't fix when I'm mining.

Good news is the card doesn't seem to throttle even when it hits 85-90°C.


----------



## Hacker90

Quote:


> Originally Posted by *repo_man*
> 
> Do what I did: take a straight-blade razor and slowly slide it under the sticker. Then redo the paste as usual, and put the sticker back on top of the screw.


Lol, nice idea.

I will do that when my dealer gets the MX-4 in stock.

Can't rep u for some reason!
Quote:


> Originally Posted by *Recr3ational*
> 
> Have you got the powercolour?
> Cos im getting the same one and seen the sticker too.


Nope I have the XFX.
Quote:


> Originally Posted by *drieg500*
> 
> @
> 
> What card and brand do you have?
> 
> In any case, you can try asking you brand's Facebook national and international pages. I have PMed both MSI Philippines (my country) and MSI Computers US about the small circle sticker above one of the screws. They (in essence) told me that my warranty would not be voided (although the sticker says "Warranty void if tampered", heh) before I went in and replaced the TIM on my HAWK.
> 
> I'm pretty happy with the results.


XFX 280X, bud. I will ask them first; if they say yes, the warranty will be void, then I will just do what @repo_man said.


----------



## Recr3ational

Quote:


> Originally Posted by *cookiesowns*
> 
> Hrm.
> 
> It appears that voltage control works for me on TriXX, but afterburner beta 18 is only able to overvolt the Memory, not core, even though the sliders are there.
> 
> I like the Powercolor 280X's, just hate the cooler since they don't move enough air so my top card gets very toasty when in CF. Nothing a large box fan won't fix when I'm mining
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good news is the card doesn't seem to throttle even when it hits 85-90C.


Oh, that sucks. Hopefully mine works with Afterburner; I don't like TriXX that much.
What case have you got? Maybe you can attach some more fans near it to improve airflow.


----------



## repo_man

Quote:


> Originally Posted by *Hacker90*
> 
> Lol, nice Idea
> 
> 
> 
> 
> 
> 
> 
> I will do that when my dealer gets the MX-4 in stock
> 
> 
> 
> 
> 
> 
> 
> Cant rep u for some reason!


You can't rep me because I'm retired staff.

Staff doesn't have rep.


----------



## staryoshi

I'm going to be bailing on the club soon. I ordered an EVGA GTX 780 Dual ACX FTW for a steal of a deal, and it'll be here by Thursday.

Love this R9 270X, but it just can't handle 1440p, and two in CrossFire are too hot and loud in my mATX setup.

For 1080p setups, though, this Asus R9 270X is awesomesauce. Love the engineering behind the PCB circuitry and heatsink.


----------



## F3ERS 2 ASH3S

For those of you that missed the BF4 freebie, Origin has it + China Rising for $35.

Edit: maybe it was just a special deal for me... not sure, I saw it when I logged in.


----------



## Devildog83

Quote:


> Originally Posted by *staryoshi*
> 
> I'm going to be bailing on the club soon. I ordered an EVGA GTX 780 Dual ACX FTW for a steal of a deal and it'll be here by Thursday
> 
> 
> 
> 
> 
> 
> 
> Love this R9 270X, but it just can't handle 1440p and two in crossfire are too hot and loud in my mATX setup
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For 1080p setups, though, this Asus R9 270X is awesomesauce. Love the engineering behind the PCB circuitry and heatsink.


Sorry to see you go and good luck with the 780. Pop in and let us know how the card works for you.


----------



## staryoshi

I'll always have my eye on you guys


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *cookiesowns*
> 
> Hrm.
> 
> It appears that voltage control works for me on TriXX, but afterburner beta 18 is only able to overvolt the Memory, not core, even though the sliders are there.
> 
> I like the Powercolor 280X's, just hate the cooler since they don't move enough air so my top card gets very toasty when in CF. Nothing a large box fan won't fix when I'm mining
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good news is the card doesn't seem to throttle even when it hits 85-90C.


I like my card too. I have no issue with the cooler; I only have a single-card setup, and with a custom fan curve it won't even go past 65°C. Looks good too, but sadly I rarely get to see it 'cause I don't have a wall-mounted rig.







Would be nicer if I could bump the voltage a bit further, but even if that doesn't work out, I don't regret buying it.
Quote:


> Originally Posted by *Devildog83*
> 
> Did you try going into the settings for AB and selecting extend overclocking limits and then reboot? If you have graphics overdrive in CCC on turn it off. Other than that I do not know.
> 
> AB should look like this in the settings. DLPS disabled.


Tried that. Didn't help sadly. :/ Thanks for trying to help though.
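A custom fan curve like the one mentioned above is just a set of (temperature, fan %) points that tools like Afterburner or TriXX interpolate between. A minimal Python sketch, with made-up curve points rather than anything from the post:

```python
# Hypothetical fan curve points (temp in °C, fan duty in %).
# These are illustrative values, not the actual curve from the post.
CURVE = [(30, 20), (50, 35), (65, 60), (80, 85), (90, 100)]

def fan_speed(temp_c):
    """Linearly interpolate the fan duty for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(65))   # → 60.0, a defined curve point
```

Real tools add hysteresis and ramp limits on top, but the interpolation is the core of a custom curve.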


----------



## cookiesowns

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> I like my card too. I have no issue with the cooler; I only have a single-card setup, and with a custom fan curve it won't even go past 65°C. Looks good too, but sadly I rarely get to see it 'cause I don't have a wall-mounted rig.
> 
> 
> 
> 
> 
> 
> 
> Would be nicer if I could bump the voltage a bit further, but even if that doesn't work out, I don't regret buying it.
> Tried that. Didn't help sadly. :/ Thanks for trying to help though.


Use the latest Trixx. Trust me, it WILL work =)


----------



## drieg500

Didn't see any 270X HAWK users in the list. I guess I'm the first one??



BTW, add me to the list Devildog83.


----------



## Devildog83

Quote:


> Originally Posted by *drieg500*
> 
> Didn't see any 270X HAWK users in the list. I guess I'm the first one??
> 
> 
> 
> BTW, add me to the list Devildog83.


Gladly, what clocks are you running at?


----------



## drieg500

Quote:


> Originally Posted by *Devildog83*
> 
> Gladly, what clocks are you running at?


Currently at stock, but my stable OC profile was 1200/1500. Haven't tried touching the voltages yet.


----------



## Devildog83

*drieg500* has been added, welcome!!

I am very curious how well that card will do. I thought about getting one for a split second because a friend had a 7870 Hawk that was awesome, but alas, I had to pair up the Devils.


----------



## drieg500

Quote:


> Originally Posted by *Devildog83*
> 
> *drieg500* has been added, welcome!!
> 
> I am very curious how well that card will do. I thought about getting one for a split second because a friend had a 7870 Hawk that was awesome, but alas, I had to pair up the Devils.


I am pretty happy with the card.

But my choices were either the HAWK or Toxic. Only the HAWK was available at my store and I bought it straight up.

I'm also looking forward to maximizing its OC capabilities when I have some extra time.


----------



## rickyman0319

It's strange: when I tried to buy an R9 270X GPU and went to check the price, it went up $30 at checkout.


----------



## Roaches

Quote:


> Originally Posted by *rickyman0319*
> 
> It's strange: when I tried to buy an R9 270X GPU and went to check the price, it went up $30 at checkout.


Holy cow you're right! Not really fair at all!


----------



## goldswimmerb

Okay guys, so I finally got TriXX working, but I'm unsure about my overclock. I have an XFX 270X card and would like to know if it's safe to set the voltage to 1.3 V from 1.2 V. Also, my memory clock always stays at 1400 MHz; any idea how to fix this?


----------



## staryoshi

Sometimes NewEgg prices in the product list are not updated to reflect current pricing. It usually occurs the day a sale price ends.


----------



## F3ERS 2 ASH3S

Second XFX 280X on its way... should be here on Thursday. I succumbed to the high prices.


----------



## Archea47

Any rumors or clever adaptations of a full waterblock for the Gigabyte 280X Rev2 yet? Or do I have to break down and do a universal, chip-only solution?

Any good fan bracket options? The NZXT Kraken doesn't work for me because I would be integrating it with my loop, not using an AIO.

This would be for a dual-GPU setup (with the Giga 280X).

I don't really want to add the heat to my loop, but I would like my rig to start sponsoring its upgrades while I'm at work.


----------



## repo_man

Quote:


> Originally Posted by *cookiesowns*
> 
> Use the latest Trixx. Trust me, it WILL work =)


Saw your post and decided I'd give Trixx a try, after I had already tried all the tips for Afterburner/etc. Trixx indeed has the voltage control for my Asus 270X. +rep to you sir!







Onwards and upwards tomorrow then. I'm going to see if I can get this card above 1200 core stable.


----------



## neurotix

I told you to just use Trixx a week ago.


----------



## turbobooster

[img=http://s29.postimg.org/fiv6w8b0z/1660377_676211249088992_1028863316_n.jpg]


----------



## cremelo

I'm thinking about getting an MSI HD 7970 Lightning BE to do for my company r9 280x Asus DC2 or maybe wait a little longer and buy another 280x.....
Anyone know if the MSI HD 7970 Lightning BE has BiosMod to 280x?


----------



## turbobooster

Also is there a bios mod for the r9 270x toxic


----------



## Devildog83

Maybe the first post isn't clear enough, so I have moved the instructions for joining the club up under the LOGO on the 1st page to allow for more visibility. If you would like to join, please post *the make and model of your card, a pic, and whatever clocks (24/7 or overclocked) you wish to have posted*. Thanks guys!!!


----------



## dabby91

Quote:


> Originally Posted by *Devildog83*
> 
> *bartledoo & Daddy91* Have been added, WELCOME !! ---- Please post some clocks for the OP page !!
> 
> bartledoo I entered a Toxic because that's what it looks like. Is that correct?


Just running at stock clocks atm, was hoping to do some OC'ing when I get the time xD


----------



## turbobooster

This is my card, the Sapphire R9 270X Toxic. I would like it to go to 1.3 V and run 24/7 clocks.
The 24/7 clocks I would like are 1250/1650.

[img=http://s10.postimg.org/aa9omlxyh/Turbo_Wallpaper.jpg]
screenshot program


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *turbobooster*
> 
> This is my card, the Sapphire R9 270X Toxic. I would like it to go to 1.3 V and run 24/7 clocks.
> The 24/7 clocks I would like are 1250/1650.
> 
> [img=http://s10.postimg.org/aa9omlxyh/Turbo_Wallpaper.jpg]
> screenshot program


Y'know, there is an upload-image option in the post editor.


----------



## turbobooster

Sorry, just found it. Now to find somebody who can help me with the overclock, or a good BIOS mod. I'm on air.


----------



## Devildog83

I did it this time, but just telling me you are running at stock clocks doesn't help; I need to know what they are. If you don't tell me, I have to look them up, as I can't remember the stock clocks for all of these cards.

By the way, welcome to *turbobooster*, let's see what the Toxy can do!!!


----------



## cjc75

Greetings all, I'm in the market for a new GPU this year, courtesy of Uncle Sam who is sending me a nice Tax Return tomorrow... looking to replace my Radeon 6950 with something a lot beefier, and admittedly, I have been leaning towards the GTX 770 4GB cards....

However.

I just today received MicroCenter's monthly ad for their February sale deals, and I was stunned and absolutely shocked to find that they are selling the Asus DirectCU R9 280X and the MSI R9 280X cards, both at around $150-$170 LESS than what Newegg is selling them for!!

MicroCenter has the Asus at $349, marked down $50 from their regular price of $399, and the MSI on sale at $329, marked down $20 from its regular price of $349!! All the while, Newegg is selling both at $519!!

Of course, both my local MicroCenters are ALREADY sold out... but now I am wondering...

Either that is one helluva sale, or MicroCenter knows something. Is there a significant price drop coming from AMD, or perhaps a new R9 model about to be introduced to shake things up a little? Or is this just a heck of a sale deal and I should probably watch to see if they'll get more in stock? To my understanding, the R9 280X is on par with the GTX 770, and the only reason I was leaning towards the GTX is because the R9 costs a heck of a lot more; but if I can get one at the price MicroCenter is selling them at? Heck yea, I would stay with Radeon then!


----------



## repo_man

Quote:


> Originally Posted by *neurotix*
> 
> I told you to just use Trixx a week ago.


Yea...well...see what had happened was...







I forgot, lol. I'm old, remember?


----------



## cjc75

Quote:


> Originally Posted by *cjc75*
> 
> Greetings all, I'm in the market for a new GPU this year, courtesy of Uncle Sam who is sending me a nice Tax Return tomorrow... looking to replace my Radeon 6950 with something a lot beefier, and admittedly, I have been leaning towards the GTX 770 4GB cards....
> 
> However.
> 
> I just today received MicroCenter's monthly ad for their February sale deals, and I was stunned and absolutely shocked to find that they are selling the Asus DirectCU R9 280X and the MSI R9 280X cards, both at around $150-$170 LESS than what Newegg is selling them for!!
> 
> MicroCenter has the Asus at $349, marked down $50 from their regular price of $399, and the MSI on sale at $329, marked down $20 from its regular price of $349!! All the while, Newegg is selling both at $519!!
> 
> Of course, both my local MicroCenters are ALREADY sold out... but now I am wondering...
> 
> Either that is one helluva sale, or MicroCenter knows something. Is there a significant price drop coming from AMD, or perhaps a new R9 model about to be introduced to shake things up a little? Or is this just a heck of a sale deal and I should probably watch to see if they'll get more in stock? To my understanding, the R9 280X is on par with the GTX 770, and the only reason I was leaning towards the GTX is because the R9 costs a heck of a lot more; but if I can get one at the price MicroCenter is selling them at? Heck yea, I would stay with Radeon then!


Just discovered that Fry's is selling the Asus DirectCU R9 280X at $319!!

$200 less than Newegg!

What's going on here?


----------



## Archea47

Quote:


> Originally Posted by *cjc75*
> 
> Just discovered that Fry's is selling the Asus DirectCU R9 280X at $319!!
> 
> $200 less than Newegg!
> 
> What's going on here?


Well you got me all excited, but there aren't any in stock within 100 miles of Chicago according to their website


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *cjc75*
> 
> Just discovered that Fry's is selling the Asus DirectCU R9 280X at $319!!
> 
> $200 less than Newegg!
> 
> What's going on here?


If you buy from Fry's, make sure it's not a return. They like to rebox bad items.


----------



## Devildog83

I would buy all I could get my hands on if I was within 1,000 miles of a MicroCenter.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> I would buy all I could get my hands on if I was within 1,000 miles of a MicroCenter.


Don't feel bad, I live 15 miles away from one and they are sold out of anything above an R7 260.


----------



## tsm106

Fry's getting you all to click to their site? They never do such things...


----------



## cjc75

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> If you buy from Fry's, make sure it's not a return. They like to rebox bad items.


Yea, I had that exact experience with Fry's when I bought my 6950...

They had the Diamond model on sale, so I went and bought it, got home, opened the box, and was all excited... only to find it was the reference model. I got it all installed, and then discovered someone had ALREADY flashed its dual BIOS, and that it was a flawed card that ran hot at 90°C+!! Not only that, it kept locking up while playing Crysis 2! So I took it back a few days later, got it exchanged for another, also a Diamond, and got home and it was not used, but also was not the reference model... oh well. It still had the dual-BIOS option, so I kept it, and it's been running great ever since... but now I'm kinda wanting something better! lol

As for those R9 280Xs... yea, everyone local is sold out. I checked MicroCenter nationwide, and every store in the entire nation is sold out...

But it is their FEBRUARY deal, so it's a price they're supposed to have throughout the entire month, so I'm gonna watch for their next stock to arrive!


----------



## turbobooster

Hoping that somebody can help me with unlocking my card from the stock 1150/1500 to a decent 24/7 overclock. I know what the max is, at least in 3DMark 11.


----------



## JoeDirt

For anyone with a PowerColor 270, the BIOS seems to be hardware locked from flashing.


----------



## hamzta09

To those of you who run Crossfire with a 280X (or any card, really):

Which games have you specifically had problems with? And in which games did you find a noticeable improvement?
So far I can only find one game that uses Crossfire, and that's Battlefield 4. I've tried roughly 100+ games on Steam, and only a few (~5 in total) actually used the other GPU. Well, BioShock Infinite does use the other GPU, but that game ain't playable; when you move around, you randomly get freezes.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> To those of you who run Crossfire with a 280X (or any card, really):
> 
> Which games have you specifically had problems with? And in which games did you find a noticeable improvement?
> So far I can only find one game that uses Crossfire, and that's Battlefield 4. I've tried roughly 100+ games on Steam, and only a few (~5 in total) actually used the other GPU. Well, BioShock Infinite does use the other GPU, but that game ain't playable; when you move around, you randomly get freezes.


I'll check against Crysis 3 when my card arrives, may try Metro 2033 too... if you want, I can try SimCity.


----------



## hamzta09

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I'll check against Crysis 3 when my card arrives, may try Metro 2033 too... if you want, I can try SimCity.


lol, SimCity is CPU-bound like crazy; I get slowdowns even with just a single card in that game :/

But yeah, please post results.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> lol, SimCity is CPU-bound like crazy; I get slowdowns even with just a single card in that game :/
> 
> But yeah, please post results.


I almost want to say the slowdown is net-related... I really haven't had an issue with SimCity, then again it's not like it's graphically intensive.


----------



## cookiesowns

Ok,

Add me to the club. Looks like I'll be sticking with these 280X's =)

It's not my full bench/game stable clocks yet. Just some basic clock speeds with no voltage tuning.

R9 280X Powercolor TurboDUO OC "AXR9"


----------



## hamzta09

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I almost want to say the slowdown is net-related... I really haven't had an issue with SimCity, then again it's not like it's graphically intensive.


The slowdown is framerate, "the net" has nothing to do with framerate.

Zoom in and voila ~30fps.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> The slowdown is framerate, "the net" has nothing to do with framerate.
> 
> Zoom in and voila ~30fps.


Maybe I'll have to re-look. Never noticed anything below 60.


----------



## Devildog83

*cookiesowns* has been added. Welcome to the club!!!


----------



## Recr3ational

Don't add me to the club yet. Still don't know what clocks are stable yet.


----------



## Devildog83

Quote:


> Originally Posted by *hamzta09*
> 
> To those of you who run Crossfire with a 280X (or any card, really):
> 
> Which games have you specifically had problems with? And in which games did you find a noticeable improvement?
> So far I can only find one game that uses Crossfire, and that's Battlefield 4. I've tried roughly 100+ games on Steam, and only a few (~5 in total) actually used the other GPU. Well, BioShock Infinite does use the other GPU, but that game ain't playable; when you move around, you randomly get freezes.


I know I can run Crysis 3, BF3, and BF4 for sure. Also Rome: Total War and Minecraft. In Crysis 3 I keep V-Sync on and it runs smooth at 60 FPS; I get up to 100 FPS with 4xAA and V-Sync off with everything else at max, but the top card gets to 80°C, so I keep V-Sync on and it never heats up. BF3 runs at 160 FPS all maxed with V-Sync off, and BF4 I run at very high with V-Sync off and get 50 to 80 FPS.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> 
> 
> Don't add me to the club yet. Still don't know what clocks are stable yet.


If you don't know you better ax somebody. LOL


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> If you don't know you better ax somebody. LOL


Confused, AX?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Confused, AX?


Sorry it was a play on ebonics. Ax instead of ask.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Sorry it was a play on ebonics. Ax instead of ask.


May I be added to the club please sir.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> May I be added to the club please sir.


Sure, you said wait for clocks so I was waiting for clocks.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Sure, you said wait for clocks so I was waiting for clocks.


Good idea


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Good idea


Hey, I aim to please.


----------



## smoke2

Please, can you post your ASUS DCII 280X VRM1 and VRM2 temps under load?
The best would be with a Fractal R4 case, but I'd appreciate temps with another case too.


----------



## Recr3ational

Anyone got a GPU monitor program/gadget?
I used to use GPU Observer and All GPU Meter, but it won't show both of my GPUs.


----------



## [CyGnus]

Recr3ational, GPU Observer works; you just have to choose GPU 1, then open another GPU Observer and choose GPU 2.


----------



## Recr3ational

Quote:


> Originally Posted by *[CyGnus]*
> 
> Recr3ational GPU Observer works you just have to choose GPU 1 and open another GPU Observer and choose GPU 2


It's only displaying one GPU. I used to do what you said.
Scratch that, it isn't showing any.


----------



## Archea47

Quote:


> Originally Posted by *hamzta09*
> 
> To those of you who run Crossfire with a 280X (or any card, really):
> 
> Which games have you specifically had problems with? And in which games did you find a noticeable improvement?
> So far I can only find one game that uses Crossfire, and that's Battlefield 4. I've tried roughly 100+ games on Steam, and only a few (~5 in total) actually used the other GPU. Well, BioShock Infinite does use the other GPU, but that game ain't playable; when you move around, you randomly get freezes.


Skyrim uses equal amounts of GPU 1 & 2 for me, but there is some strange stuttering and occasional AI issues. Those sorts of Crossfire issues are documented around the net; frame-limiting the game to 59 FPS is supposed to solve it.
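The 59 fps cap workaround boils down to padding every frame out to a fixed time budget. A rough Python sketch of just the pacing math (real setups use RTSS or an in-game limiter; `pace_frame` here is a hypothetical helper, not a real API):

```python
import time

# Minimal sketch of a 59 fps frame limiter: each frame gets a fixed
# time budget, and whatever the frame didn't use is slept away.
TARGET_FPS = 59
FRAME_TIME = 1.0 / TARGET_FPS   # roughly a 16.95 ms budget per frame

def pace_frame(frame_start):
    """Sleep away whatever is left of this frame's time budget."""
    elapsed = time.perf_counter() - frame_start
    if elapsed < FRAME_TIME:
        time.sleep(FRAME_TIME - elapsed)

# Pad a trivial "frame" out to roughly one 59 fps frame time:
start = time.perf_counter()
pace_frame(start)
```

Capping just below the display refresh keeps the GPUs from racing ahead of the monitor, which is why it smooths out Crossfire frame pacing.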


----------



## HunnoPT

Hello all, I'm new to the site and signed up specially for this; maybe you guys can help me out. (I will struggle a bit to make my point, since English isn't my main language xD)

I bought a Gigabyte 270X WF 3X OC (GV-R927XOC-2GD/F1, revision 1.0) and later an MSI Gaming TF 270X (yet to see romm message).
The thing is that I wanted to flash a BIOS on both of them (same clocks, same fan profile, and more voltage/TDP unlocked).
The MSI is OK since it has the dual-BIOS switch in case anything doesn't work (already flashed with the AMAZING VBE7). I could change the clocks, fan profile, and TDP to 50%, but I can't change the VDDC; no matter what, even at +0.01 V, it always crashes when I load a 3D app. I only changed the boost VDDC, but still nothing. Q1: Is there any other setting in the PowerPlay tab that I should change, or can my card just not change its VDDC?

When it comes to the Gigabyte WF, well, I'm scared to flash a BIOS since it doesn't have any BIOS switch, so I figure there's only one of them.
Q2: So if the BIOS flash doesn't work out, can I use the iGPU on my 2500K and reflash the BIOS on the 270X from Windows? Will the program allow the flash? Will it recognise the card?

Those are my questions for the experts or anyone that could help. Thanks in advance to all!

Note that the Gigabyte is in an AMD system and my MSI in the i5 system, both at stock with turbo ON. I can't Crossfire them yet since my PSU sucks.

Some pics of my system... and cat, lolz:

http://imageshack.com/a/img809/4674/93f6.jpg

http://imageshack.com/a/img585/7489/88td.jpg

http://imageshack.com/a/img716/2217/pxrr.jpg


----------



## bardacuda

Hi. Just got this guy today, so I figure I'll join.

Make/model = Gigabyte R9 270/GV-R927OC-2GD

Pics:





Stock clocks still for now (975 core / 5600 mem).
Speaking of which... is there a good all-inclusive guide for overclocking Radeons? My last setup was two GTS 450s in SLI, and I've never owned a Radeon card before. In Afterburner I *was* able to change the voltage up to over 1.2 V and slide the core and memory as high as I could get stable. Now I've installed Afterburner and I have this power target that I've never used before, and it won't let me adjust the voltage or slide the core clock past 1050.

I understand there's some config file I can edit in Afterburner to unlock this? Anyone know the max temp for this chip? Recommended peak load temps? Do I need to flash the BIOS to play with voltage? Is that even a thing?

I looked around for a general Radeon overclocking guide, which I figured I would find stickied in the main video card section of the forums or with a quick bit of Google, but oddly it seems a little elusive so far.

P.S. I had some confusion while this was still shipping about the number of shaders, since Newegg and some other sites were reporting 1024 shaders for this and a couple of other models. I can confirm now that it has 1280 shaders, in case anyone else was having the same problem.
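On the config-file question: the file usually pointed to is `MSIAfterburner.cfg` next to the Afterburner executable. From memory (double-check against your Afterburner version before editing, and back the file up first), the commonly cited unlock looks something like this:

```ini
[ATIADLHAL]
; Accepting the unofficial-overclocking EULA in the config is what
; unlocks the extended slider ranges. The exact sentence Afterburner
; expects may differ between versions, so verify before relying on it.
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

With that set (plus "Extend official overclocking limits" in the UI on some versions), the core clock slider range opens up; voltage control still depends on the card's voltage controller.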


----------



## HunnoPT

You can enable unofficial overclocking in MSI's settings, down at the bottom. Voltage is locked for most R9 cards. Temps should be OK with that cooler; nothing to worry about if heavy loads like benches/gaming don't go past 75°C/167°F.

There is no rocket science to OCs, since each card is different. Try it yourself: increase the TDP (which is watts) by 20%+ and raise the core/mem MHz until you hit instability, then back off 10 MHz on each, and that's it. Use MSI Kombustor for OC testing; it's great.

You can unlock BIOS voltage by flashing a BIOS; some work, some don't, and since that card doesn't have a dual BIOS it's kinda risky...
Hope I helped!
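The bump-and-back-off routine described above can be written as a simple loop. A Python sketch, where `stress_test` is a hypothetical stand-in for a real burn-in run like Kombustor:

```python
# Sketch of the "raise until unstable, then back off" routine described
# above. stress_test is a hypothetical stand-in for a real burn-in run
# (e.g. MSI Kombustor); here it just reports whether a clock passed.
STEP = 10      # MHz per bump
BACKOFF = 10   # extra MHz of margin after the first failure

def find_stable_clock(start_mhz, stress_test):
    clock = start_mhz
    while stress_test(clock):
        clock += STEP
    # clock is now the first failing speed; step back to the last
    # passing clock and take off the safety margin.
    return clock - STEP - BACKOFF

# Pretend this particular card artifacts above 1250 MHz:
print(find_stable_clock(1150, lambda mhz: mhz <= 1250))   # → 1240
```

The same loop applies to memory clocks; just run the two searches separately so you know which one caused a failure.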


----------



## staryoshi

That PowerColor Turbo Duo is a good looking card (I didn't realize it came with a backplate, too). They've come a long way from their hideous red PCBs on high end cards. My last PowerColor card was a PCS+ HD 5870







(So ugly, but it stayed cool - http://www.newegg.com/Product/Product.aspx?Item=N82E16814131352)


----------



## bardacuda

Ok, is there a guide to flashing the BIOS? If it doesn't take, is it possible to flash it back to the original again?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Anyone got a GPU monitor program/gadget?
> I used to use GPU Observer and All GPU Meter, but it won't show both of my GPUs.



I would use HWiNFO64 and, if you want, Afterburner and GPU-Z also. I mostly just use HWiNFO64 because it's so detailed, and you can run a logging session that gives you a very detailed readout, which you can save, of all of your components, their voltages, and temps.


Spoiler: Warning: Spoiler!


----------



## Devildog83

*bardacuda* Has been added. Welcome !!! You are only the 4th 270 in the club, let's see some OC's.


----------



## Devildog83

Quote:


> Originally Posted by *bardacuda*
> 
> Ok is there a guide to flashing the BIOS? If it doesn't take is it possible to flash it back to the original again?


Just save the BIOS in GPU-Z and flash back if it doesn't take. I am too chicken to flash my BIOSes.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I would use HWiNFO64 and, if you want, Afterburner and GPU-Z also. I mostly just use HWiNFO64 because it's so detailed, and you can run a logging session that gives you a very detailed readout, which you can save, of all of your components, their voltages, and temps.
> 
> 
> Spoiler: Warning: Spoiler!


I meant on the desktop, so I can quickly glance over at the gadget instead of opening up CPUID or whatever.


----------



## Devildog83

http://powercolor.com/us/News.asp?id=1211 New R7 card from Powercolor.


----------



## Devildog83

I see, something like this that I use to monitor CPU and memory usage.


----------



## GTR Mclaren

1.206 V is the max voltage that TriXX shows me... is there a way to push more?


----------



## bardacuda

Quote:


> Originally Posted by *Devildog83*
> 
> Just save the bios in GPUZ and flashback if it doesn't take. I am too chicken to flash my bios's.


I don't see an option in GPU-Z to save a copy of the BIOS. Is there another/better tool I can use to copy the existing/flash a new BIOS? Assuming it doesn't take, how would I go about restoring the original? I have either onboard video or one of my old 450s I could use while I reflash.

EDIT: nvm the first part. I found the save BIOS button.


----------



## Devildog83

Quote:


> Originally Posted by *bardacuda*
> 
> I don't see an option in GPU-Z to save a copy of the BIOS. Is there another/better tool I can use to copy the existing/flash a new BIOS? Assuming it doesn't take, how would I go about restoring the original? I have both onboard video or one of my old 450s I could use while I reflash.
> 
> EDIT: nvm the first part. I found the save BIOS button.


Good, I thought I was hallucinating for a bit.


----------



## Shogon

My first pair of AMD cards evar, and sadly all they will ever do is just mine







.


----------



## bartledoo

1275/1575 might be as far as I push it :/ wanted to go to 1300 on the core.


----------



## bardacuda

Well without knowing how to unlock voltage control or the sliders in afterburner, or flash a new BIOS, this is the best I can do for now.



I clocked my CPU up to 3960MHz, turned off the heat, and opened up the door (my CPU takes a lot of extra voltage to go over 3.9GHz and it's about -12ºC right now here in eastern Canada...that's why I'm idling at 4º), and did a Valley run and this is what I got:



I was going to run 3DMark, but there has been an update, and now after installing it I get an error message when I try to run the benches. Tried uninstalling/rebooting/reinstalling/re-downloading, etc., and same issue. I did run it once before updating earlier, but that was also before I applied any overclocks to the card, and my CPU was on my everyday setting of 3.7GHz at the time. But here it is anyway:

http://www.3dmark.com/3dm/2427844



For comparison, here are my old Valley and 3DMark runs with my 450s:



http://www.3dmark.com/3dm/1495958



...dat min fps


----------



## HCS01

I picked up a PowerColor R9 270X Devil a little while back.


----------



## Farih

Quote:


> Originally Posted by *bartledoo*
> 
> 1275/1575 might be as far as i push it :/ wanted to go to 1300 on the core


C'mon, you got a Toxic!

Overvolt that bastard!









1300 MHz on the core is nothing for a 270X.








Quote:


> Originally Posted by *HCS01*
> 
> I picked up a PowerColor R9 270X Devil a little while back.


Grats !

Now overclock it past 1300 MHz too.


----------



## HunnoPT

Mr. Devildog, why wasn't the foreign guy added to the club?









Could anyone help answer my thread? I'm about to do some crazy flashing if you don't!


----------



## bartledoo

Quote:


> Originally Posted by *Farih*
> 
> C'mon you got a toxic !
> 
> Overvolt that bastard
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1300 MHz on the core is nothing for a 270X.
> 
> 
> 
> 
> 
> 
> 
> 
> Grats !
> 
> Now overclock it past 1300 MHz too.


Voltages are locked!







First AMD GPU; teach me how to OC!


----------



## drieg500

Quote:


> Originally Posted by *bartledoo*
> 
> Voltages are locked!
> 
> 
> 
> 
> 
> 
> 
> first amd gpu, teach me how to oc !


Download the latest Afterburner beta (3.0.0 beta 18) and check everything under the "Compatibility properties" except for low-level access.

BTW, trying to OC the HAWK today, but it's so damn hot in my country (the Philippines, btw), because it's nearing summer on our end, that I'm afraid I won't get accurate results given the warm ambient temps in my room.

Trying out 1215/1550 and now 1235/1560. Ran a lot of benchmarks and burn-in tests, although I can't conclude yet whether these will be my stable 24/7 gaming OC profile. Would love to push it to 1300 though.


----------



## F3ERS 2 ASH3S

$200 over MSRP now
http://www.newegg.com/Product/Product.aspx?Item=N82E16814150678


----------



## rdr09

Quote:


> Originally Posted by *drieg500*
> 
> Download the latest Afterburner beta (3.0.0 beta 18) and check everything under the "Compatibility properties" except for low-level access.
> 
> BTW, trying to OC the HAWK today, but its so damn hot in my country (Philippines btw) because its nearing summer on our end I'm afraid I won't get accurate results because of the ambient temps.
> 
> Trying out 1215/1550 and now 1235/1560. ran a lot of benchmarks and burn-in tests, although I can't conclude as of yet if these would be my stable 24/7 gaming OC profile. Would love to push it to 1300 though.


Set your OC during warm weather. Your 24/7 OC, that is.
Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> $200 over MSRP now
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150678


I got my 290 and the waterblock for $535 total. I will never buy from the Egg again, or at least I'll avoid them in the future.


----------



## Recr3ational

PowerColor R9 280X
1250/1600




I didn't like the backplate that came with it, so I made a custom one.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *rdr09*
> 
> Set your OC during warm weather. Your 24/7 OC, that is.
> I got my 290 and the waterblock for $535 total. I will never buy from the Egg again, or at least I'll avoid them in the future.


Yeah. I was just not fortunate enough to have the money to get in while it was cheap.


----------



## rdr09

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Yeah. I was just not fortunate enough to have the money to get in while it was cheap.


Have you seen the non-ref 290s selling for $540-560? Those may be a better buy unless you have to watercool. AMD is shooting itself in the foot.


----------



## Devildog83

Quote:


> Originally Posted by *HCS01*
> 
> I picked up a PowerColor R9 270X Devil a little while back.
> 
> 
> Spoiler: Warning: Spoiler!


Nice card, sir. Please post your clocks and I will add you to the club; even if you just run at stock, please let me know. I have found no way to unlock the voltage for this card. If you find a way, please let me know.


----------



## Devildog83

Quote:


> Originally Posted by *Shogon*
> 
> 
> 
> My first pair of AMD cards evar, and sadly all they will ever do is just mine
> 
> 
> 
> 
> 
> 
> 
> .


Edit


----------



## Devildog83

Overclocks for *bartledoo & barracuda* have been updated.

edit


----------



## Devildog83

*Recr3ational* has been added, welcome!!! Nice rig!!!


----------



## Devildog83

Quote:


> Originally Posted by *Devildog83*
> 
> Overclocks for *bartledoo & barracuda* have been updated.
> 
> *Once again please add clocks to you post if you wish to be added.*


It sounds weird, but I have found that 3DMark Vantage will fail before almost anything else, and it's what I use to make sure I am stable. Others have said this too.


----------



## Devildog83

I have to say I am sorry, folks; the OP did not say to post any clocks. This was my oversight. It has been changed, and I hope this will avoid further confusion. Anyone who asked to be added without clocks up to this point, I will add with stock clocks until I am requested to change them.









*HunnoPT, HCS01 & Shogon* have been added to the club. Welcome!!!


----------



## bardacuda

Quote:


> Originally Posted by *Devildog83*
> 
> Overclocks for *bartledoo & barracuda* have been updated.
> 
> edit


bar*d*acuda









So is there any way to unlock voltage control and the maximums on the sliders? Do I have to flash a new BIOS or no? If I do what are the risks? If I brick my card is it easy enough to reflash the original BIOS and have it back to normal again? I wanna crank this sucker up!


----------



## Devildog83

Quote:


> Originally Posted by *bardacuda*
> 
> bar*d*acuda
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So is there any way to unlock voltage control and the maximums on the sliders? Do I have to flash a new BIOS or no? If I do what are the risks? If I brick my card is it easy enough to reflash the original BIOS and have it back to normal again? I wanna crank this sucker up!


Edited, I honestly do not know about that card. Sorry.


----------



## Shogon

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> $200 over MSRP now
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150678


Try Fry's. The San Jose one here in Cali had 3 of those for $430 each.

I was going to get you those clocks, OP, but I'm glad I don't have to now







Too lazy to do anything with that PC other than let the mining commence.


----------



## bardacuda

Quote:


> Originally Posted by *Devildog83*
> 
> Edited, I honestly do not know about that card. Sorry.


Np....just asking anyone in general. Surely someone here can answer or can point me in the right direction at least. I would love you long time.









Otherwise I will have to go on an epic google quest







Fighting my way through endless hordes of random search results and forum posts.







And honestly I'm pretty lazy.


----------



## Archea47

Quote:


> Originally Posted by *Recr3ational*
> 
> PowerColour R9 280x
> 1250/1600
> 
> 
> 
> 
> I didn't like the backplate that came with it so i made a custom one.


Hey Recr3ational,

Are there blocks ready for your cards or did you have to customize/make one?

I would like to watercool my Gigabyte 280X and am willing to get a little creative


----------



## Recr3ational

Quote:


> Originally Posted by *Archea47*
> 
> Hey Recr3ational,
> 
> Are there blocks ready for your cards or did you have to customize/make one?
> 
> I would like to watercool my Gigabyte 280X and am willing to get a little creative


Yes sir, it's 7970 blocks. SOME (a very important "some") 280X cards natively fit the 7970 blocks.
Check here link


----------



## Farih

Quote:


> Originally Posted by *bartledoo*
> 
> Voltages are locked!
> 
> 
> 
> 
> 
> 
> 
> First AMD GPU, teach me how to OC!


1. Set your Toxic card on the second BIOS (blue light off)
2. Download GPU-Z and use it to save your GPU BIOS.
3. Download VBE7 and change the BIOS you have just saved. (set voltage and power target)
4. Download Atiwinflash and flash the edited BIOS to your GPU (command prompt >> atiwinflash -f -p 0 nameofyourrom). (Flash at your own risk!)
5. Restart PC and enjoy higher core voltage, power target and clock range.
6. Happy overclocking
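For reference, steps 2 and 4 at the command line might look roughly like this. This is a sketch only: the adapter index `0` and the ROM file names are placeholders, and atiwinflash can also save the backup itself with `-s` if you'd rather not use GPU-Z. Flash at your own risk.

```shell
REM Run from an elevated command prompt in the folder containing atiwinflash.exe.
REM Save a backup of the current BIOS from adapter 0 (GPU-Z works for this too):
atiwinflash -s 0 original.rom

REM After editing a copy of the ROM in VBE7, flash it back to adapter 0.
REM -f forces the flash past ID checks -- keep original.rom safe in case you
REM need to flash it back:
atiwinflash -f -p 0 edited.rom
```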









P.S:
This works for most AMD cards.
Quote:


> Originally Posted by *bardacuda*
> 
> Np....just asking anyone in general. Surely someone here can answer or can point me in the right direction at least. I would love you long time.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Otherwise I will have to go on an epic google quest
> 
> 
> 
> 
> 
> 
> 
> Fighting my way through endless hordes of random search results and forum posts.
> 
> 
> 
> 
> 
> 
> 
> And honestly I'm pretty lazy.


You could try the above. (Best if you have a dual BIOS, though.)


----------



## bardacuda

Quote:


> Originally Posted by *Farih*
> 
> 1. Set your Toxic card on the second BIOS (blue light off)
> 2. Download GPU-Z and use it to save your GPU BIOS.
> 3. Download VBE7 and change the BIOS you have just saved. (set voltage and power target)
> 4. Download Atiwinflash and flash the edited BIOS to your GPU (command prompt >> atiwinflash -f -p 0 nameofyourrom.) (Flash at your own risk !)
> 5. Restart PC and enjoy higher core voltage, power target and clock range.
> 6. Happy overclocking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S:
> This works for most AMD cards.
> You could try the above. (best if you have a dual bios though)


Thanks for the reply. Assuming I don't have dual BIOS (because I actually don't) what are the risks? Will I be able to reflash the original BIOS if I have to?


----------



## drieg500

Finally did something right before I go to sleep.









Got my HAWK stable at 1250/1575 at stock voltage (1.2 V) with 120% power limit.

Although it was weird that upon touching the voltages (even a little) either the screen would flicker or the system would crash, even with just my previous stable 1200/1500 overclock.

I'll just have to try again tomorrow with 1275/1600, hopefully with stable voltage increases.









BTW, Devildog83, update my clocks to 1250/1575. Also, you got my username spelled incorrectly.


----------



## thrgk

I have a few questions.

For a 290 that unlocked to a 290X, is it just as good as a real 290X?

What is the real difference between the 290 and 290X? Does one OC better?

Is the average overclock 1150 core? I thought I read somewhere that even on water these cards do not OC well.

For a 290X with a waterblock, is 56C on the core normal, or does that seem high?

Thanks


----------



## Devildog83

Quote:


> Originally Posted by *thrgk*
> 
> i have a few questions.
> 
> for a 290 that unlocked to 290x, is it just as good as a real 290x?
> 
> what is the real difference between 290 and 290x? does one OC better?
> 
> is the average overclock 1150core? as i thought i read somewhere even on water these cards do not OC well.
> 
> a 290x with a waterblock, is 56c on the core normal or seem high?
> 
> Thanks


Try here - http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/2490#post_21501193


----------



## Devildog83

Quote:


> Originally Posted by *drieg500*
> 
> 
> 
> finally did something right before I go to sleep.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> got my HAWK stable at 1250/1575 at stock voltage (1.2v) with 120% power limit.
> 
> although it was weird that upon touching the voltages (even for a little bit) either screen would flicker or the system will crash, even with just my previous stable 1200/1500 overclock.
> 
> I'll just have to try again tomorrow with 1275/1600, hopefully with stable voltage increases.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw, Devildog83, update my clocks to 1250/1575. also, you got my username spelled incorrectly.


Edited and updated.


----------



## hamzta09

Anyone know of any known symptoms of undervolting?

So far I've had the monitor randomly go black and then back to normal for ~1-2 seconds, even when playing; could be the cable, I know.
But I've also had the driver fail in Civ 5.


----------



## Recr3ational

Quote:


> Originally Posted by *hamzta09*
> 
> Anyone know of any known symptoms of undervolting?
> 
> So far Ive had monitor randomly going black and then back to normal for ~1-2 seconds, even when playing, could be cable I know.
> But Ive also had driver failing in Civ 5.


+1. I also want to know what consequences undervolting creates.


----------



## bajer29

Quote:


> Originally Posted by *hamzta09*
> 
> Anyone know of any known symptoms of undervolting?
> 
> So far Ive had monitor randomly going black and then back to normal for ~1-2 seconds, even when playing, could be cable I know.
> But Ive also had driver failing in Civ 5.


Quote:


> Originally Posted by *Recr3ational*
> 
> +1. I also want to know what consequences undervolting creates.


Doubly so... I get the pink grid pattern very briefly and a little bit of artifacting with a little bit of stutter. My clocks aren't ridiculous either. Maybe my card is overheating? I never get over 70C full load, though (63C average).

I use Sapphire Trixx to OC. Should I be forcing constant voltage? What about the voltage slider on the main OC screen? Do I do anything with voltage at all?


----------



## hamzta09

Quote:


> Originally Posted by *bajer29*
> 
> Doubly so... I get the pink grid pattern very briefly and a little bit of artifacting with a little bit of stutter. My clocks aren't ridiculous either. Maybe my card is overheating? I never get over 70C full load, though (63C average).
> 
> I use Sapphire Trixx to OC. Should I be forcing constant voltage? What about the voltage slider on the main OC screen? Do I do anything with voltage at all?


70C is a rather low temp for high load, so that's not the case.


----------



## bajer29

Quote:


> Originally Posted by *hamzta09*
> 
> 70C is a rather low temp for high load, so that's not the case.


That's a relief... I'm hoping it's not bad memory, then. Maybe it IS just undervolting.


----------



## Recr3ational

Quote:


> Originally Posted by *bajer29*
> 
> Doubly so... I get the pink grid pattern very briefly and a little bit of artifacting with a little bit of stutter. My clocks aren't ridiculous either. Maybe my card is overheating? I never get over 70C full load, though (63C average).
> 
> I use Sapphire Trixx to OC. Should I be forcing constant voltage? What about the voltage slider on the main OC screen? Do I do anything with voltage at all?


I get pink lines sometimes on the two side monitors. It goes away when I shake my mouse. ULPS? PowerPlay? I think PowerPlay is the main issue here.


----------



## mAs81

Well, at last, I think I'm going to keep these clocks... (for daily use I'll stick with the 1020/1500 stock clocks).
@1200/1700 this is what I got:


Spoiler: Warning: Spoiler!




With 1194 core voltage and 50% fan speed my temps were under 70C.










So, please Devildog, add me when you can...
On another note,
my backplate is on its way








Will post pics as soon as I install it (and buy a new camera)


----------



## Devildog83

*mAs81* has been added, Welcome!!! Nice Valley score.


----------



## mAs81

Quote:


> Originally Posted by *Devildog83*
> 
> *mAs81* has been added, Welcome!!! Nice Valley score.


Thanks man!! I'm very happy with those scores myself!!
Heads up though, I was first added by Durvelle27!! He's got me in the list with stock clocks (1050/1500, which aren't right for my MSI, by the way)


Spoiler: Warning: Spoiler!






If it's not too much to ask, please delete it? Thanks


----------



## cremelo

New OC











Valley test



Clock: 1070 MHz to 1170 MHz
GPU Voltage: 1.2 to 1.232
Memory Clock: 6400 MHz to 6500 MHz effective (1600 MHz to 1625 MHz)
Power Target: 120%
Fan Speed: 50% manual
Max Temp: 70°C
Ambient Temp: 20°C


----------



## Devildog83

Quote:


> Originally Posted by *mAs81*
> 
> Well , at last , I think I'm going to keep these clocks...(for daily use I'll stick with the 1020/1500 stock clocks)
> @1200/1700 this is what I got:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> with 1194 core voltage and @50% fan speed my temps were under 70c
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So,please Devildog add me when you can...
> On another note,
> my backplate is on its way
> 
> 
> 
> 
> 
> 
> 
> 
> Will post pics as soon as I install it(and buy a new camera
> 
> 
> 
> 
> 
> 
> 
> )


Done, thanks for pointing it out.


----------



## Devildog83

Since some were posting Valley runs I thought I would post this. Still can't beat it, but I was only at 4.8 GHz and 2000 memory.


----------



## mAs81

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Devildog83*
> 
> Since some were posting Valley runs I thought I would post this, still can't beat it but I was only at 4.8 Ghz and 2000 memory.





Nicee....


----------



## Devildog83

*cremelo* I updated your new overclock. Nice job.

Is it normal for Valley to not use all of the CPU cores? I noticed after my last run that only 3 of 8 cores were being utilized anywhere near 100%.


----------



## Dimestore55

Sorry about coming late to your party, but I figured you guys/gals would be best suited to answer my question.

Will a full-card waterblock for an R9 290X fit an R9 270X? Or...

Do you know of a waterblock for the 270X?


----------



## cjc75

Price Increase Alert!

Newegg just raised the price of the MSI R9 280x from $519 this morning, now up to $579!
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127759

Asus R9 280X price raised from $519 this morning, _now up to $599!_
http://www.newegg.com/Product/Product.aspx?Item=N82E16814121803

Sapphire Dual-X R9 280x Price raised to $579!
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202061


----------



## Recr3ational

I crossfired the 280X with my 7950; this is what I got after a quick overclock. Still not fine-tuned.


----------



## Devildog83

Newegg has gone nuts. I am glad I spent $440 for mine, which outperform any 280X, but I do wish I had bought a 290 @ $400 instead just because the value has increased so much. If you bought your 280X @ $300 when they came out, you cashed in there.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> I crossfired the 280X with my 7950; this is what I got after a quick overclock. Still not fine-tuned.
> 
> 
> Spoiler: Warning: Spoiler!


That's very nice!!


----------



## thrgk

$600 for a Sapphire 290X that's 6 weeks old, good deal?


----------



## Devildog83

Quote:


> Originally Posted by *thrgk*
> 
> 600 for a 290x sapphire 6 weeks old good deal?


They are running like $800 or more now so I would say YES. I did see an open box Powercolor ref. card for $700.


----------



## Recr3ational

Here's what I'm confused about: why is my first GPU always at max, even when it's idle? The clocks are at max 24/7. Any ideas?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Here's what I'm confused about: why is my first GPU always at max, even when it's idle? The clocks are at max 24/7. Any ideas?


Do you have graphics overdrive enabled in the CCC?


----------



## cremelo

Quote:


> Originally Posted by *Devildog83*
> 
> Since some were posting Valley runs I thought I would post this, still can't beat it but I was only at 4.8 Ghz and 2000 memory.


WOW, I'll still try to get to this score








Quote:


> Originally Posted by *Devildog83*
> 
> *cremelo* I updated your new overclock. Nice job.
> 
> Is it normal for Valley to not use all of the CPU cores? I noticed after my last run the only 3 cores of 8 were being utilized anywhere near 100%.


Thanks








This week I still want to reach the mark of 1200/1700 xD


----------



## Devildog83

Quote:


> Originally Posted by *cremelo*
> 
> WOW I'll still try to get this Score
> 
> 
> 
> 
> 
> 
> 
> 
> Tanks
> 
> 
> 
> 
> 
> 
> 
> 
> This week I still want to reach the mark of 1200/1700 xD


I have 2 GPUs; it's really not fair. 1200/1700 would be kickin', though.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Do you have graphics overdrive enabled in the CCC?


No I don't. Do I need it on?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> No i dont do i need it on?


No, what are you using for overclocking, Afterburner?


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> No, what are you using for overclocking, Afterburner?


trixx mate


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> trixx mate


Hmmm, I don't know Trixx too well. I have seen this before but I can't seem to remember the fix. Sorry.


----------



## thrgk

What's the best waterblock for a 290X? I know EK had some troubles before; I was looking at Heatkiller. If I do go EK, which is the best? Nickel?


----------



## Archea47

Quote:


> Originally Posted by *thrgk*
> 
> whats the best waterblock for 290x? i know Ek had some troubles before, was looking at heatkiller. If i do go EK, which is the best? Nickel?


This is the 270/280 club bud, not the 290. That said, nothing makes the Nickel perform better than the copper version.

Men and women, I'm so frustrated... don't get me wrong, my 280Xs have performed flawlessly at everything I throw at them. The plan was to get the Hawaii cards, until the reviewers said they were unstable due to VRM temps etc., and I didn't want to wait for aftermarket cooling, so I grabbed the Gigabyte 280Xs. Now... well beyond despising the "290 is unstable" crowd, I now have enough radiator to cool some GPUs, so I could have picked up the 290s, unlocked and watercooled. Instead I have two 280Xs that apparently will never have full-cover blocks.

I was thinking about selling the 280s, but I can't go without gaming until April or w/e, when there should be more 290s and perhaps the prices will fall. Maybe I should sell just one; I only actually need one. It's frustrating that I can't put them F/S here.

Any ideas or consolations would be appreciated


----------



## mAs81

Quote:


> Originally Posted by *Archea47*
> 
> I was thinking about selling the 280s but I can't go without gaming until April or w/e when there should be more 290s and perhaps the prices will fall. Maybe I should sell just one - I only actually need one. It's frustrating that I can't put them F/S here
> Any ideas or consolations would be appreciated


If you only need one, then sell it. That's what I'd do








And although it won't solve your current problem, +REP to get you closer to the OCN marketplace


----------



## mAs81

Quote:


> Originally Posted by *Recr3ational*
> 
> Heres what im confused about, Why is my first gpu always at max. Even when its idle. The clocks on max 24/7 any ideas?


Perhaps this might help??
http://www.tomshardware.co.uk/answers/id-1864321/gpu-stuck-maximum-load.html


----------



## Recr3ational

Quote:


> Originally Posted by *mAs81*
> 
> Perhaps this might help??
> http://www.tomshardware.co.uk/answers/id-1864321/gpu-stuck-maximum-load.html


Thanks for that mate, but that's not the issue. It was my first thought too. Driver issues, I think.

Edit: I think it's Trixx; changed to Afterburner and uninstalled Trixx, and all is fine now.

Edit: I found the problem. It's PowerPlay. If it's off, the clocks stay maxed; if it's on, my screen flickers haha. Lose-lose situation.


----------



## bardacuda

Quote:


> Originally Posted by *bajer29*
> 
> Doubly so... I get the pink grid pattern very briefly and a little bit of artifacting with a little bit of stutter. My clocks aren't ridiculous either. Maybe my card is overheating? I never get over 70C full load, though (63C average).
> 
> I use Sapphire Trixx to OC. Should I be forcing constant voltage? What about the voltage slider on the main OC screen? Do I do anything with voltage at all?


Undervolting reduces the capability of your card to maintain a given clock speed stably; therefore if you lower your voltage too much you may need to either a) lower the clock speed as well to become stable again, or b) maintain the clock speed and raise the voltage back up until you reach a stable state.


----------



## cremelo

Quote:


> Originally Posted by *Recr3ational*
> 
> Thanks for that mate but that's not the issue. It was my first thought too. Driver issues I think.
> 
> Edit: I think its Trixx, changed to Afterburner and uninstalled TRixx all is fine now.
> 
> Edit: I found the problem. Its Powerplay. IF its off. It goes up if its on. My screen flikers haha. Lose-Lose situation.


Let me ask you something: using the 280X + HD 7950, can you overclock the 7950 to get to the same level as the 280X?
I like your idea, and I have a 7950 with a friend...


----------



## Recr3ational

Quote:


> Originally Posted by *cremelo*
> 
> let me ask you something using the 280x + HD 7950 you can overclock the 7950 to get the same level of 280x?????
> I like your idea and I have a 7950 arrest with a friend ....


Well, yes, the 280X is basically a rebadged 7970 GHz Edition, so it's basically like having a 7950 and a 7970 in CrossFire. Obviously the 7950 is going to hit a wall before the 7970 does, but it will still work.
I'm planning to upgrade to another 280X when I have the cash; this is just a better way to wait and not lose much performance in the meanwhile.

Also (correct me if I'm wrong), couldn't you mod the 7950's BIOS with a 280X BIOS to match the performance even more closely? I was thinking of doing it but I have no idea how to mod a BIOS.

Best idea though is to leave the 280X at stock and overclock the 7950. At 1080p that's all you really need. Too much is pointless after all.


----------



## rdr09

Quote:


> Originally Posted by *cremelo*
> 
> let me ask you something using the 280x + HD 7950 you can overclock the 7950 to get the same level of 280x?????
> I like your idea and I have a 7950 arrest with a friend ....


1100 core on the 7950 should match the 280. I've crossfired a 7950 with a 7970 in the past and just let them run their own clocks. Did not even sync them. I only played C3, C2, and BF3, though.


----------



## bajer29

Quote:


> Originally Posted by *bardacuda*
> 
> Undervolting reduces the capability of your card to maintain a given clock speed stably, therefore if you lower your voltage too much you may need to either a) lower clock speed as well to become stable again, or b)maintain the clock speed and raise voltage back up until you reach a stable state.


Thanks, bardacuda. I was messing around with clocks and voltages last night. I raised the voltage to 1.3 V (the highest I can get with Trixx), turned on force constant voltage, and I still get artifacting and the pink grid pattern or flashing colors (pink and/or green). My clocks are still not stable. I've tried 1160/1600, 1175/1575, 1200/1600, etc., while tweaking voltages up ever so slightly to see if there is a difference. Still no dice. I think I might just have to give up and keep it at its stock clocks (1070/1550).

Also, I've been having weird problems with Far Cry 3 where my GPU usage will slowly drop to about 70% after playing for 15 minutes, and the FPS drops as well. I have dropped the settings down to DX9, no frame buffering or vsync, and no MSAA on ultra...

With the Mantle update in BF4 I get super CPU spikes in the middle of gameplay on stock clocks. Super annoying.

Other than that my card kills it in every other game (nearly 200 FPS in DS3 on ultra!). I don't get it.

I ran into a thread in another forum that states these settings in FC3 should work with a 7970. I'll give them a try tonight:

Quote:


> Just leave vsync off and set your graphics like this:
> 
> Details: Ultra
> MSAA: 4x
> Directx: 11
> Ambient occlusion: HDAO


*sigh*


----------



## hamzta09

Overclocking these cards, do I need to reboot when driver fails?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Overclocking these cards, do I need to reboot when driver fails?


A lot of times I do, as the overclocks seem to fail more easily when you don't.


----------



## Recr3ational

Quote:


> Originally Posted by *hamzta09*
> 
> Overclocking these cards, do I need to reboot when driver fails?


Usually no. If the driver that crashed recovers, then it's usually all right; it should restart by itself anyway.
In my experience it's usually either a crash and a recovery, or it hangs and I'll have to hard reboot.


----------



## DrClaw

Anybody here running multiple 270X or 270 configs? How is that going? I heard there are more issues when mining vs. using 280X and up.


----------



## bardacuda

Quote:


> Originally Posted by *bajer29*
> 
> Thanks, baracuda. I was messing around with clocks and voltages last night. I raised the voltage to 1.3V (highest I can get with trixx), turned on force constant voltage and I still get artifacting and the pink grid pattern or flashing colors (pink and/ or green). My clocks are still not stable. I've tried 1160/1600, 1175/1575, 1200/1600, etc. while tweaking voltages up every so slightly to see if there is a difference. Still no dice. I think I might just have to give up and keep it at its stock clocks (1070/1550).
> 
> Also, I've been having weird problems with FarCry 3 where my GPU usage will slowly drop to about 70% after playing for 15 minutes and the FPS drops as well. I have dropped the settings down to dx9 no frame buffer or vsync, and no MSAA on ultra...
> 
> With the mantle update in BF4 I get super CPU spikes in the middle of gameplay on stock clocks. Super annoying.
> 
> Other than that my card kills it in every other game (nearly 200 FPS in DS3 on ultra!). I don't get it.


Eventually you just hit a wall and no amount of extra voltage will help you. I think I would try setting memory at stock, because it already has fast memory and a wide 384-bit bus, so I doubt overclocking memory would help much as opposed to the core.

For the core even your low setting is already about 100 MHz higher than the stock boost clock, and over 200 over the 950 MHz regular clock speed...so it could be that you have just hit the wall. It's just luck of the draw where that wall is.

Not having owned one or knowing much about this particular card though I'm not sure if that is unusual or not. If basically every other user here is able to go above 1160 easily with 1.3V then maybe something is wrong with your card.

Try lowering the clocks in, say, 20 MHz intervals until it runs fine with no hangs or artifacts, just to find out what the upper limit is at that voltage. Then you can compare this number with other overclocked 280X owners to see if your card has an abnormally low ceiling. If so, you may have somewhat of a lemon. But unfortunately, unless it doesn't work at stock speed and voltage, that's not grounds to return it.
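That step-down search can be sketched as a simple loop. This is just an illustration; the `is_stable` callback is hypothetical and would, in practice, be a Valley or burn-in run at each clock:

```python
def find_stable_clock(start_mhz, floor_mhz, step=20, is_stable=None):
    """Walk the core clock down in fixed steps until a stability test passes.

    is_stable: callable taking a clock in MHz and returning True/False;
    in practice this would be a benchmark/burn-in run at that clock.
    """
    clock = start_mhz
    while clock >= floor_mhz:
        if is_stable(clock):
            return clock  # highest clock that passed the test
        clock -= step
    return None  # nothing was stable down to the floor


# Fake stability test for illustration: pretend anything <= 1140 MHz passes.
print(find_stable_clock(1200, 1000, is_stable=lambda c: c <= 1140))  # 1140
```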


----------



## bajer29

Quote:


> Originally Posted by *bardacuda*
> 
> Eventually you just hit a wall and no amount of extra voltage will help you. I think I would try setting memory at stock, because it already has fast memory and a wide 384-bit bus, so I doubt overclocking memory would help much as opposed to the core.
> 
> For the core even your low setting is already about 100 MHz higher than the stock boost clock, and over 200 over the 950 MHz regular clock speed...so it could be that you have just hit the wall. It's just luck of the draw where that wall is.
> 
> Not having owned one or knowing much about this particular card though I'm not sure if that is unusual or not. If basically every other user here is able to go above 1160 easily with 1.3V then maybe something is wrong with your card.
> 
> Try lowering the clocks in, say, 20MHz intervals, until it runs fine with no hangs or artifacts, just to find out what the upper limit is with that voltage. Then you can compare this number with other overclocked 280X owners to see if your card has an abnormally low ceiling. If so you may have a somewhat of lemon-y card. But unfortunately, unless it doesn't work at stock speed and voltage, it's not grounds to return it.


Thanks for the great in-depth reply. +1









I'm thinking I've hit the wall even at stock, because the more research I do on OCing this specific card, the more I find that they are basically factory OC'd to the very cusp of what the card can handle with its cooling and voltage capabilities.

EDIT: Besides, I think the memory clock is fine at stock, as you said. It seems the issues I'm experiencing with the colors and artifacts are GPU memory related. I'll come back with my findings, though. I'll see what a core-clock-only overclock will do.


----------



## hamzta09

Quote:


> Originally Posted by *bardacuda*
> 
> Eventually you just hit a wall and no amount of extra voltage will help you. I think I would try setting memory at stock, because it already has fast memory and a wide 384-bit bus, so I doubt overclocking memory would help much as opposed to the core.
> 
> For the core even your low setting is already about 100 MHz higher than the stock boost clock, and over 200 over the 950 MHz regular clock speed...so it could be that you have just hit the wall. It's just luck of the draw where that wall is.
> 
> Not having owned one or knowing much about this particular card though I'm not sure if that is unusual or not. If basically every other user here is able to go above 1160 easily with 1.3V then maybe something is wrong with your card.
> 
> Try lowering the clocks in, say, 20MHz intervals, until it runs fine with no hangs or artifacts, just to find out what the upper limit is with that voltage. Then you can compare this number with other overclocked 280X owners to see if your card has an abnormally low ceiling. If so you may have a somewhat of lemon-y card. But unfortunately, unless it doesn't work at stock speed and voltage, it's not grounds to return it.


If I raise my mem clock by +25 (1525) I get lower results in Valley bench


----------



## bajer29

Quote:


> Originally Posted by *hamzta09*
> 
> If I raise my mem clock by +25 (1525) I get lower results in Valley bench


What are your stock clocks?


----------



## hamzta09

Quote:


> Originally Posted by *bajer29*
> 
> What are you stock clocks?


For the mem it's rather obvious: 1500.
Core is 1000.


----------



## Devildog83

Quote:


> Originally Posted by *DrClaw*
> 
> anybody here running multiple 270x or 270 configs? how is that going? i heard their is more issues when mining vs using 280x and up


I have a 270X and a 7870. Mining did not go so well, but I was told you need the right balance between clocks and intensity to get anywhere.


----------



## Farih

DevilDog.....
Why aren't I on your list yet?

Those who asked me about GPU BIOS tweaking/overclocking: if you still have questions, ask me in a PM.
This thread goes too fast sometimes for this old man to keep up








Sorry if it looked like I ignored some of you.....


----------



## Devildog83

Quote:


> Originally Posted by *Farih*
> 
> DevilDog.....
> Why aren't I on your list yet?
> 
> Those of you who asked me about GPU BIOS tweaking/overclocking: if you still have questions, ask me in a PM.
> This thread moves too fast sometimes for this old man to keep up.
> 
> Sorry if it looked like I ignored some of you.....


You're not? You've got a Toxic 270X, right?


----------



## Devildog83

*Farih* has belatedly been added to the club.







Please let me know what clocks you wish to be posted and if you in fact have a Toxic. That's just what I remember.


----------



## pac08

I have two Sapphire R9 280X cards in my mining rig, and after getting a new motherboard (ASRock 970 Extreme4) I noticed something weird. If I try to, say, watch a video while mining, the card losing hashrate isn't the one the monitor is connected to but the second card. Same goes for Catalyst: in the Overdrive menu, the card shown is the second one. Does anyone know why this is happening?


----------



## Devildog83

Got a new high score in 3DMark 11, at over 1250/1460 GPU clocks in CrossFire.









http://www.3dmark.com/3dm11/7965869



It's No. 1 among valid results only.

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/3dm11/P/1541/765/500000?minScore=0&cpuName=AMD%20FX-8350&gpuName=AMD%20Radeon%20HD%207870


----------



## bardacuda

Quote:


> Originally Posted by *DrClaw*
> 
> anybody here running multiple 270x or 270 configs? how is that going? i heard their is more issues when mining vs using 280x and up


I just tried my hand at mining Doge... and with a few tweaks to the default settings in GUIMiner I get about 445 kH/s @ +20% power target and 1050/1500 clocks. Not too shibey.
Quote:


> Originally Posted by *Farih*
> 
> DevilDog.....
> Why aren't I on your list yet?
> 
> Those of you who asked me about GPU BIOS tweaking/overclocking: if you still have questions, ask me in a PM.
> This thread moves too fast sometimes for this old man to keep up.
> 
> Sorry if it looked like I ignored some of you.....


Np. I appreciate you taking the time to reply, and this probably isn't the best thread to ask that question, I know. It probably deserves its own thread for the level of back-and-forth and detail it would have to go into. Anywhooo.... PM's awaaaaaayyyyyy!!


----------



## bajer29

Quote:


> Originally Posted by *pac08*
> 
> I have two Sapphire R9 280X cards on my mining rig and after getting a new motherboard (Asrock 970 extreme4) i noticed something weird. If i try to let's say watch a video when mining, the card losing hashrate isn't the one the monitor is connected on but the second card. Same goes for catalyst. In the overdrive menu the card showing is the second one. Does anyone know why this happening?


You might want to ask this question in a mining thread...


----------



## crazysoccerman

I wish miners nothing but electrical shorts









However, calling a loser in his mother's basement a "miner" is an insult to actual miners.





----------



## theilya

Is there a way to overvolt a 270?

A custom BIOS?


----------



## bardacuda

Quote:


> Originally Posted by *theilya*
> 
> Is there a way to overvolt a 270?
> 
> A custom BIOS?


See below. I'm working on this very question myself right now.
Quote:


> Originally Posted by *Farih*
> 
> DevilDog.....
> Why aren't I on your list yet?
> 
> Those of you who asked me about GPU BIOS tweaking/overclocking: if you still have questions, ask me in a PM.
> This thread moves too fast sometimes for this old man to keep up.
> 
> Sorry if it looked like I ignored some of you.....


----------



## Archea47

Quote:


> Originally Posted by *crazysoccerman*
> 
> I wish miners nothing but electrical shorts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However, calling a loser in his mother's basement a "miner" is an insult to actual miners.


Well, I certainly don't appreciate the price hike on the cards as a shopper (though I did enjoy the high price I got for the 7950 I sold on eBay - it funded half my water cooling!)

The reality, though, is that the guy in his mother's basement makes money working an hour a day in climate-controlled comfort, while the other guy digging coal works long days busting hump, breathing toxic dust in an uncomfortable environment. To play devil's advocate: if they make the same money, I would say the guy in his mom's basement is the clear winner.


----------



## DrClaw

The real miner lives life in the fast lane, but that's just a harsh way to put things. I mine and I don't have a spot of acne on my face. I take care of myself, I work full time, I have a girlfriend, I go to the gym... you know, things city people do. Real miners make a decent living, but city people think life expectancy is at an all-time high; yeah, right.

If you want to know how people should eat and drink, this is probably one of the few places in the world to learn from:

http://www.theguardian.com/world/2013/may/31/ikaria-greece-longevity-secrets-age

Go watch it on YouTube and see how most of them eat, grow their own food, and drink real spring water. One thing to note: they live into their 80s and 90s, some are centenarians, and they are quite active.

We are going off topic now, so I'll leave it at that.


----------



## Devildog83

Miners should most likely watch their backs; the UN will undoubtedly start a carbon tax on cryptocurrency to try to deter folks from mining at all. Do you really think governments are going to let this relatively easy money-grabbing go on for too long? Especially the ecoterrorists at the UN, those treehuggin' granola heads. Nothing is free, at least not for very long; the powers that be will always see to that.

Is that too far off topic????


----------



## Devildog83

It has been suggested, because the 7800 club is not getting much traffic, that we combine the clubs. I have no idea if it's even possible, but I thought I would ask in here and see what y'all think. If you are against it, please say so.


----------



## [CyGnus]

I don't see any problem with that


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> I don't see any problem with that


That's 1 yes vote.


----------



## neurotix

I think we should probably keep them separate. There are just too many cards. It's confusing enough as it is with 270s, 270Xs, and 280Xs; a lot of the time people post overclocks and temps but never actually say WHICH model of card they have, and often they don't have a rigbuilder done either.

Adding 7870s, 7850s and 7870XTs in would make it even more confusing. The other thing is, there's a 7970 owner's club that's pretty active, but technically those are just 280Xs, yet nobody wants to merge that club into this one.

I won't complain if it happens, but I will probably be even more confused when people post things with no rigbuilder. My vote is no, but it's a half-hearted no.

Personally, I think it would be better to have a separate 7870/270X club and a 7970/280X club because that way Pitcairn and Tahiti are separate clubs. Not going to happen but it makes the most sense to me.


----------



## Devildog83

I see your point bud, that club has died off almost completely and I just hate to see that 7800 owners don't have anywhere to go and bounce thoughts off of someone. Personally I like a busy club but it could get confusing with too many cards covered. The best would likely be to have a Pitcairn club and a Tahiti club but who knows.


----------



## dmfree88

Make it happen, Devil.

I like that idea; it keeps the old and the new together and more relevant to each other.


----------



## KauBoy

1st of all thank you for adding me to the club. Much appreciated.
Quote:


> Originally Posted by *Devildog83*
> 
> Are you sure you have 1550 on the core or do you mean 1150/1600? I have added you but I will hold on the clocks for a conformation. Welcome to the club.


Yes sir, 1550 was a typo.








1070/1550 are the stock clocks of the Vapor-X, and 1150/1600 was my previous OC.
Just now I hit 1200/1600 MHz, stable. This card is amazing.


----------



## Devildog83

KauBoy, clocks are updated.

dmfree88,

I agree totally. I love the Vishera thread and it is crazy busy sometimes. To me the more the merrier, but it's not all up to me. We'll see how it plays out, and whether we will even be allowed to do it if most want to.


----------



## Archea47

Quote:


> Originally Posted by *neurotix*
> 
> I think we should probably keep them separate. There's just too many cards. It's confusing enough as it is with 270s, 270X, and 280X- a lot of the time people post overclocks and temps but never actually say WHICH model of card they have, and often they don't have a rigbuilder done either.
> 
> Adding 7870s, 7850s and 7870XTs in would make it even more confusing. The other thing is, there's a 7970 owner's club that's pretty active, but technically those are just 280Xs, yet nobody wants to merge that club into this one.
> 
> I won't complain if it happens, but I will probably be even more confused when people post things with no rigbuilder. My vote is no, but it's a half-hearted no.
> 
> Personally, I think it would be better to have a separate 7870/270X club and a 7970/280X club because that way Pitcairn and Tahiti are separate clubs. Not going to happen but it makes the most sense to me.


I agree with all of this

So apparently there is a lot of discussion in the 7970 club, while there's not a lot of 280X talk here.

On the other hand, there's not much discussion in the 78xx club, while there's lots of 270X talk here.

It only makes sense! Until, of course, you consider the work of splitting this 416-page thread in two.


----------



## KauBoy

Quote:


> Originally Posted by *Devildog83*
> 
> KauBoy, clocks are updated.


thank you. I love this thread!


----------



## crazysoccerman

Quote:


> Originally Posted by *dmfree88*
> 
> Make it happen devil
> 
> 
> 
> 
> 
> 
> 
> i like that idea keeps the old and the new together and more relevant to eachother.


Old and new are nothing more than marketing terms in this case. I agree with neurotix because he makes sense.


----------



## gh0stfac3killa

Yooooo, what is the deal with the prices on these 280Xs? Holy rusted metal, Batman!!!! I just looked on Newegg and saw the very same Gigabyte GPUs I bought for $299.99 are now $549.99. Man, what is the deal??? Those are crazy price hikes. I know they went up due to Bitcoin and Litecoin mining, but man, this has to be pure greed on the sellers' part. Going from $300 to $500-plus!!!!


----------



## theilya

Can anyone share a custom BIOS for the 270 with a voltage increase?


----------



## cjc75

Quote:


> Originally Posted by *gh0stfac3killa*
> 
> Yooooo, what is the deal with the prices on these 280Xs? Holy rusted metal, Batman!!!! I just looked on Newegg and saw the very same Gigabyte GPUs I bought for $299.99 are now $549.99. Man, what is the deal??? Those are crazy price hikes. I know they went up due to Bitcoin and Litecoin mining, but man, this has to be pure greed on the sellers' part. Going from $300 to $500-plus!!!!


That's not all... take a look at the prices on the MSI 280X and the Asus 280X...

They went up to $579 and $599!!

That's ridiculous... they're going up to $600 now! I had been seriously considering getting one, but now I am not sure... I mean, seriously, if my case could fit a 290X or GTX 780, I could still buy those for CHEAPER!


----------



## crazysoccerman

If you are a gamer and buy AMD at newegg's prices you're stupid TBH. Do you now see why I, as a gamer who cares about the PC gaming community, hate speculative neckbeards?

This is a crippling blow to affordable PC gaming, IMO.


----------



## neurotix

Quote:


> Originally Posted by *cjc75*
> 
> That's not all... take a look at the prices on the MSI 280X and the Asus 280X...
> 
> They went up to $579 and $599!!
> 
> That's ridiculous... they're going up to $600 now! I had been seriously considering getting one, but now I am not sure... I mean, seriously, if my case could fit a 290X or GTX 780, I could still buy those for CHEAPER!


Yeah, man I remember seeing 7950s on Newegg for $250 with a $20 off mail-in rebate back in... September I think? The 7970s were a little over $300 too.

Too bad I had no money then and couldn't pick up a card or two.


----------



## crazysoccerman

Quote:


> Originally Posted by *neurotix*
> 
> Yeah, man I remember seeing 7950s on Newegg for $250 with a $20 off mail-in rebate back in... September I think? The 7970s were a little over $300 too.
> 
> Too bad I had no money then and couldn't pick up a card or two.


I bought a 7950 for $180 last October. I bought a 280X for $295 last November. The price/performance wasn't as good as my 7950 and it didn't come with any free games, but I bought it anyway. LOL.


----------



## neurotix

Yep, exactly what I'm talking about.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *crazysoccerman*
> 
> If you are a gamer and buy AMD at newegg's prices you're stupid TBH. Do you now see why I, as a gamer who cares about the PC gaming community, hate speculative neckbeards?
> 
> This is a crippling blow to affordable PC gaming, IMO.


In all fairness, though, the markup is the market, not the manufacturer; MSRP is still $299.99. Due to the demand, the prices are up. Capitalism... someone is lining their pockets in green.
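For the Newegg figures quoted above ($299.99 MSRP, $549.99 street), the markup works out as roughly:

```python
msrp = 299.99    # manufacturer's suggested price
street = 549.99  # observed retail price during the mining boom
markup_pct = (street - msrp) / msrp * 100
print(f"{markup_pct:.1f}% over MSRP")  # 83.3% over MSRP
```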


----------



## gh0stfac3killa

Quote:


> Originally Posted by *cjc75*
> 
> That's not all... take a look at the prices on the MSI 280X and the Asus 280X...
> 
> They went up to $579 and $599!!
> 
> That's ridiculous... they're going up to $600 now! I had been seriously considering getting one, but now I am not sure... I mean, seriously, if my case could fit a 290X or GTX 780, I could still buy those for CHEAPER!


It's crazy to think they would jump up so high, considering the whole point of the R9 series was to give the gamer better performance at a cheaper price. I won't lie, I had originally ordered two XFX 280Xs as well, and at one point I had four GPUs, two Gigabytes and two XFX, for both my builds. Once all the Mantle craziness was going on with the setbacks, and it finally launched, I was really disappointed with how it worked out and how it launched, period. So I sent back my XFX GPUs and got an EVGA GTX 780 Ti SC; not going to lie, that single GPU smokes. Blazing fast, cool, and super quiet. I was really impressed. Sad to say, my Gigabyte R9s are in static bags sitting on top of my desk until I see better performance from Mantle. With the current prices, the recent drop on Nvidia GPUs, and the launch of the new Ti, I hate to say it, but AMD's hold on the top spot was very short-lived. Even in BF4 the single Ti beat my R9 280Xs down: CrossFire, non-CrossFire, Mantle, and DX. The Ti put up the numbers for me and was a lot smoother. Now I'm thinking of selling my R9s and getting another Ti.


----------



## gh0stfac3killa

Quote:


> Originally Posted by *crazysoccerman*
> 
> If you are a gamer and buy AMD at newegg's prices you're stupid TBH. Do you now see why I, as a gamer who cares about the PC gaming community, hate speculative neckbeards?
> 
> This is a crippling blow to affordable PC gaming, IMO.


In my situation I don't have much choice but to use the online retailers, as I'm deployed in Kuwait, and let me tell you, the prices downtown here are far, far, far worse than Newegg. I use Newegg as a reference point between them, Fry's, Amazon, eBay sometimes, and TigerDirect; they're the only ones that will ship to APO addresses. I miss my local shops from back home, but sadly I don't have that option anymore. If you saw the prices here in Kuwait, Newegg would look like a deal, lol. All in all, though, it is a huge blow to the budget gamer, which I am too. Money is tight all the way around; I'm just lucky I'm able to do builds out here for people wanting to get into PC gaming and work out trades. I do the builds in trade for, say, two of these or one of those. An example of what I mean: I went to a local PC shop downtown and saw a Titan in the window, the Asus Titan. I asked the guy how much for it, and he quoted what came to roughly 1,700 dollars after converting the exchange from KD to USD. I almost passed out, lol.


----------



## dmfree88

It sucks, but if you give in, it makes up for it. I started mining with a 7870 in December. My goal was to get a 280X before prices went crazy; then I got a 270X anyway with the profits. Now I may even get a 270, in the hope that they all pay for themselves. I hate that I helped make the price go up, but at the same time, as a poor dude, I was able to upgrade YEARS before expected. Who knows, I could be a millionaire someday because of Nutcoin. Even if I'm not, at least now I've got a gaming rig that can handle any game. The price might hurt, but if you give in and mine with the card, you will likely earn back the difference and then some. At least for now....

So think what you want, but after I get this last GPU I will be paying my power bill and running tri-fire Pitcairn, which in my case wouldn't have happened for a looong time and is extremely helpful for my family. Difficulty is on the rise, but I will be happy to quit if I don't make a profit, and I'll still have a sweet rig. Who knows, but I'm happy to be able to be here.


----------



## gh0stfac3killa

Oh, and what the heck is a speculative neckbeard??? I have never heard that term, just curious.


----------



## bajer29

Quote:


> Originally Posted by *KauBoy*
> 
> 1st of all thank you for adding me to the club. Much appreciated.
> yes sir.1550 was a typo
> 
> 
> 
> 
> 
> 
> 
> 
> 1070 and 1550 are the stock clocks of the vapor-x and 1150 and 1600 was my previous OC.
> *Just now I could hit 1200 and 1600 MHz and stable. This vga is amazing.*
> 
> 


Really? I've been having the hardest time keeping mine stable at 1200 core and stock VRAM. What voltage are you using?


----------



## KauBoy

Quote:


> Originally Posted by *bajer29*
> 
> Really? I've been having the hardest time keeping mine stable at 1200 core and stock VRAM. What voltage are you using?


Nope, didn't tweak up the voltages, just the core clock and memory clock.


----------



## bajer29

Quote:


> Originally Posted by *KauBoy*
> 
> nope. didn't tweak up the voltages
> 
> 
> 
> 
> 
> 
> 
> 
> just the core clock and memory clock


How many GPUs are in your system? What kind of testing have you done?

This makes me think something weird is going on with my card.


----------



## Devildog83

Quote:


> Originally Posted by *KauBoy*
> 
> nope. didn't tweak up the voltages
> 
> 
> 
> 
> 
> 
> 
> 
> just the core clock and memory clock


At the top right of this page, above your name, is a link to Rigbuilder. If you fill it out, it will always make it easier for folks to help.

Thanks, DD


----------



## F3ERS 2 ASH3S

Crazy things... I had one of my monitors partially unplug at the connector. It was my second monitor, and it still showed color correctly, but then boom, my primary monitor turned red... I plugged the secondary in all the way (this time tightening the screws) and boom, the primary monitor was correct again. What is up with that? Just weird to me.


----------



## Destrto

I have a quick question for the guys with R9's. I apologize if I have missed a post somewhere that answers this.

I'm trying to find out if there are any compatible full cover waterblocks for the 270X?


----------



## bajer29

Quote:


> Originally Posted by *Destrto*
> 
> I have a quick question for the guys with R9's. I apologize if I have missed a post somewhere that answers this.
> 
> I'm trying to find out if there are any compatible full cover waterblocks for the 270X?


Try EK's compatibility checker http://www.coolingconfigurator.com/

It really depends on what make and model of 270X you're looking at, to a point...

EDIT: Looks like they're saying that there is nothing compatible for R9 series cards. In another forum I saw that this could be used on a Radeon R9 270x Windforce.


----------



## Devildog83

Quote:


> Originally Posted by *Destrto*
> 
> I have a quick question for the guys with R9's. I apologize if I have missed a post somewhere that answers this.
> 
> I'm trying to find out if there are any compatible full cover waterblocks for the 270X?


I have not been able to find any myself except for the HIS card.


----------



## Devildog83

Alright folks I need some help from you,

Once again we are considering adding 7800-series cards to the club, because they are so close and the 7800 club is all but dead. I would like some input as to how we could change the heading. First of all, I don't think there is such a thing as an R9 280, so we can get rid of that. Should we just add 7800s to the name, or should we get creative? Does it matter to you? Do you love or hate the idea altogether?

I would be grateful for input from our members. Thanks DD


----------



## JCH979

I may not be the most active member but I still lurk about when I can







What neurotix has said pretty much sums up my thoughts on the matter. It wouldn't be the end of the world if the clubs joined imo.
Quote:


> Originally Posted by *neurotix*
> 
> I think we should probably keep them separate. There's just too many cards. It's confusing enough as it is with 270s, 270X, and 280X- a lot of the time people post overclocks and temps but never actually say WHICH model of card they have, and often they don't have a rigbuilder done either.
> 
> Adding 7870s, 7850s and 7870XTs in would make it even more confusing. The other thing is, there's a 7970 owner's club that's pretty active, but technically those are just 280Xs, yet nobody wants to merge that club into this one.
> 
> I won't complain if it happens, but I will probably be even more confused when people post things with no rigbuilder. My vote is no, but it's a half-hearted no.
> 
> Personally, I think it would be better to have a separate 7870/270X club and a 7970/280X club because that way Pitcairn and Tahiti are separate clubs. Not going to happen but it makes the most sense to me.


----------



## mAs81

Quote:


> Originally Posted by *neurotix*
> 
> Personally, I think it would be better to have a separate 7870/270X club and a 7970/280X club because that way Pitcairn and Tahiti are separate clubs. Not going to happen but it makes the most sense to me.


I agree with neurotix. It seems most reasonable to do it this way.


----------



## dmfree88

Why not make it full Tahiti plus Pitcairn? Bring all of AMD together. No reason to leave anyone out. Or, if two clubs, then a Tahiti club and a Pitcairn club, any card welcome. That allows new card releases to be part of the club, and new members (assuming Tahiti and Pitcairn stick around), but it also keeps the older, sometimes more knowledgeable (and/or experienced) folks around to help the new wave, even if it is a different card.


----------



## Zero_

Quote:


> Originally Posted by *KauBoy*
> 
> nope. didn't tweak up the voltages
> 
> 
> 
> 
> 
> 
> 
> 
> just the core clock and memory clock


I have the exact same card, from this exact same batch (with matching serials and all)! And I can't go over 1100MHz core at stock voltage.

Lucky pucker.


----------



## KauBoy

Quote:


> Originally Posted by *bajer29*
> 
> How many GPUs are in your system? What kind of testing have you done?
> 
> This makes me think something weird is going on with my card.


It's a single-GPU setup. I have run 3DMark 11 and Unigine Heaven after OCing with no issues, and I was playing BF4 and AC4: Black Flag after that. No artifacts; it's working fine for me.

Apparently one of my friends who has the very same GPU has not been able to get past the 1100/1600 point, since it starts artifacting.


----------



## KauBoy

Quote:


> Originally Posted by *Zero_*
> 
> I have the exact same card, from this exact same batch (with matching serials and all)! And I can't go over 1100MHz core at stock voltage.
> 
> Lucky pucker.


I just mentioned you, Zero.


----------



## Destrto

Thanks guys for the input. I'm about 1,000 pages behind in the Water Cooling club here, so I don't know if someone has tried any other waterblocks with the R9s. I've been looking around online, and EK's site only lists the universal blocks; I'd honestly rather not get one of those. But I guess these cards are still too new for the watercooling market to have anything available yet.


----------



## bajer29

Quote:


> Originally Posted by *Zero_*
> 
> I have the exact same card, from this exact same batch (with matching serials and all)! And I can't go over 1100MHz core at stock voltage.
> 
> Lucky pucker.


I'm with you, buddy.


----------



## Recr3ational

Quote:


> Originally Posted by *Destrto*
> 
> Thanks guys for the input. I'm about 1,000 pages behind in the Water Cooling club here, so I don't know if someone has tried any other waterblocks with the R9's. I've been looking around online, and EK's site only lists the universal blocks. I'd honestly rather not get one of those. But I guess these cards are still too new for the watercooling market to have anything available yet.


Which card have you got?
Edit: just saw you have the 270x


----------



## Destrto

Quote:


> Originally Posted by *Recr3ational*
> 
> Which card have you got?
> Edit: just saw you have the 270x


Yes, the 270X will be the card I plan to purchase (2 eventually). I am trying to cover my bases beforehand by gathering all my info on compatible water cooling products. Some people are saying a few of the 78xx series blocks are compatible, others I'm seeing say there are no blocks currently compatible.


----------



## bajer29

Quote:


> Originally Posted by *Destrto*
> 
> Yes, the 270X will be the card I plan to purchase (2 eventually). I am trying to cover my bases beforehand by gathering all my info on compatible water cooling products. Some people are saying a few of the 78xx series blocks are compatible, others I'm seeing say there are no blocks currently compatible.


I think he means what exact card are you looking at? Brand, model, style, etc.?


----------



## lombardsoup

Just bought a 270 due to the price gouging on the 270X. There was just no way I'd buy a 270X for close to $300; I can live without the slightly higher core clock. Now to wait for it to get here...

Just two more days of Intel onboard graphics: Mount & Blade Warband at 1024x768 with all IQ settings on low, lol.


----------



## Zero_

Quote:


> Originally Posted by *lombardsoup*
> 
> Just bought a 270 due to the price gouging with the 270x. There was just no way I'd buy a 270x for close to $300. Can live without a slightly higher core clock. Now to wait for it to get here...


Thankfully the prices around where I'm from have remained static, or even gone down. The 270 goes for $235 and the 270X for $265. Want me to ship you one?


----------



## lombardsoup

Quote:


> Originally Posted by *Zero_*
> 
> Thankfully the prices around where I'm from have remained static, gone down even. The 270 goes at $235 and the 270X at 265. Want me to ship you one?


Sri Lanka

D:


----------



## HunnoPT

Well... I'm from Portugal and it's even cheaper, lol. For example: Sapphire Radeon Dual-X R9 270X OC Boost 2GB = 186.00 EUR = 254.43 USD.

Not too shabby...

By the way, for those wondering whether Elpida is a good memory brand or not, I dunno about that, but here's the latest announcement from "OUR" memory maker:

To whom it may concern:

Yoshitaka Kinoshita, Trustee
Nobuaki Kobayashi, Trustee
Elpida Memory, Inc.

Notice of Company Name Change

Elpida Memory, Inc. announces that the company plans to change its name as follows with
effect from February 28, 2014.

1. New Company Name
Micron Memory Japan, Inc.
Note: The address and phone number for the company remain unchanged.

2. Effective Date of the Change
February 28, 2014

3. Subsidiary Name Change
A subsidiary of Elpida Memory, Inc. also plans to change its name as follows.


----------



## Destrto

Quote:


> Originally Posted by *bajer29*
> 
> I think he means what exact card are you looking at? Brand, model, style, etc.?


I will be looking to grab the Gigabyte R9 270X 4Gb. I believe the model number is GV-R927XOC-GD.


----------



## Recr3ational

Quote:


> Originally Posted by *Destrto*
> 
> I will be looking to grab the Gigabyte R9 270X 4Gb. I believe the model number is GV-R927XOC-GD.


Watch this space, I'll look around for you. Two secs.

Have you got a picture of the PCB, so I can cross-reference? I'm at work at the moment; will do it in a bit, unless someone else has done it.

Also, why do you need 4GB if you're buying two?


----------



## Devildog83

Quote:


> Originally Posted by *Destrto*
> 
> Yes, the 270X will be the card I plan to purchase (2 eventually). I am trying to cover my bases beforehand by gathering all my info on compatible water cooling products. Some people are saying a few of the 78xx series blocks are compatible, others I'm seeing say there are no blocks currently compatible.


EK has full-cover blocks, but only for the HIS cards. You could just grab a couple of those and you would be set.

http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1247


----------



## gibby1690

Hi people, I've been looking into buying an R9 280X. My original thought was gaming, but I was also looking into starting up mining; to be honest, though, I don't think that's going to happen.

Should I just go with a 270X Toxic and save myself some cash, or will the 280X still be more beneficial?

Games will be BF4, CoD Ghosts, FIFA 14, and GTA if it comes.

I've been looking at things all night, and it seems an i3 4320 does quite well in benchmarks with a 7970 installed.

Will I notice much performance difference coming from my i3 3240? Obviously apart from cache and stuff like that.


----------



## [CyGnus]

gibby1690, with the 280X you get 3GB of VRAM instead of 2GB, and a 384-bit memory bus instead of 256-bit. For 1080p I think it's the ideal card, or wait for the new R9 280 (a rebranded 7950) for better value.
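That bus-width difference translates directly into memory bandwidth. A quick back-of-the-envelope check, assuming the reference memory clocks (1500 MHz GDDR5, 6.0 Gbps effective, on the 280X; 1400 MHz, 5.6 Gbps effective, on the 270X):

```python
def memory_bandwidth_gbs(bus_width_bits, effective_gbps):
    # bytes per second = (bus width in bytes) * (transfers per pin per second)
    return bus_width_bits / 8 * effective_gbps

print(memory_bandwidth_gbs(384, 6.0))  # 288.0 GB/s (280X reference)
print(memory_bandwidth_gbs(256, 5.6))  # 179.2 GB/s (270X reference)
```

Overclocking the memory shifts these numbers, but the 50% wider bus is why the 280X pulls ahead at 1080p with high-resolution textures.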


----------



## gibby1690

Quote:


> Originally Posted by *[CyGnus]*
> 
> gibby1690, with the 280X you get 3GB of VRAM instead of 2GB, and a 384-bit memory bus instead of 256-bit. For 1080p I think it's the ideal card, or wait for the new R9 280 (a rebranded 7950) for better value.


I'm only asking as I see that the Toxic still maxes BF4, or am I tripping on that one?

When are the new cards out? I'm looking at buying next Friday.


----------



## [CyGnus]

What is the price difference between a 270X Toxic and a 280X in your country? It all comes down to price. The 270X Toxic is a very capable card, no doubts there, but if you can spend a bit more for the 280X, it's worth it. The GPU is the one component that makes the most difference in your PC, so choose wisely.


----------



## gibby1690

Quote:


> Originally Posted by *[CyGnus]*
> 
> What is the price difference from a 270X Toxic and a 280X in your country? All comes down to price the 270X Toxic is a very capable card no doubts there but if you can spend a bit more for the 280x its worth it
> 
> 
> 
> 
> 
> 
> 
> . The GPU is the one component that makes the most difference in your PC so choose wisely


For the Toxic it's £170, and the cheapest 280X is about £240.

I have only ever had a Gigabyte card; are there any brands I should stay away from? Whatever it is, it will probably be the cheapest good-brand 280X I can buy.

I am also a noob OCer, so a higher-clocked out-of-the-box card is my preferred choice. Or would a lower-clocked card give me more room to experiment?


----------



## JetSet Chilli

Hey folks

Probably not the best place to say hi, being a first-time poster, but hey ho.

Hi.

Bought myself the MSI 280X today, £256 including delivery. Should hopefully get it tomorrow. I look forward to chatting with you all about how to get the best out of this card. Do many of you have this particular card?


----------



## Destrto

Quote:


> Originally Posted by *Recr3ational*
> 
> Watch this space. I'll look around for you two secs.
> 
> Have you got a picture of the pcb? So I can cross reference. I'm at work at the moment . Will do it in abit. Unless someone else has done it .
> 
> Also why do you need 4Gb if you buying two?


Sorry mate, I'm on my phone out of town for a couple of hours. This will be part of an ultimate build I'm putting together, so I want the two 4GB cards, basically because the price difference is negligible between this Gigabyte model and some of the 2GB models.


----------



## Devildog83

Quote:


> Originally Posted by *JetSet Chilli*
> 
> Hey folks
> 
> Probably not the best place to say hi being a first time poster but hey ho.
> 
> Hi,
> 
> Bought myself the MSI 280x today £256 inc del. Should hopefully get it tomorrow. I look forward to chatting with you all about how to get the best of out of this card. Do many of you have this particular card?


Throw up the make, model, clocks and a pic when you are ready and you're in.


----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> *Farih* has belatedly been added to the club.
> 
> 
> 
> 
> 
> 
> 
> Please let me know what clocks you wish to have posted, and whether you do in fact have a Toxic. That's just what I remember.


Sorry, yes it is a Toxic.

Clocked @ 1362/1550mhz (rank 1 in 3DM11 and FS)


----------



## Devildog83

Quote:


> Originally Posted by *HunnoPT*
> 
> Well.. I'm from Portugal and it's even cheaper lol; for example: Sapphire Radeon DUAL-X R9 270X OC Boost 2GB = 186.00 EUR = 254.425 USD
> 
> Not too shabby...
> 
> Btw, for the ones wondering whether Elpida is a good memory brand or not, dunno about that, but here's the latest announcement from "OUR" memory maker:
> 
> To whom it may concern:
> 
> Yoshitaka Kinoshita, Trustee
> Nobuaki Kobayashi, Trustee
> Elpida Memory, Inc.
> 
> Notice of Company Name Change
> 
> Elpida Memory, Inc. announces that the company plans to change its name as follows with
> effect from February 28, 2014.
> 
> 1. New Company Name
> Micron Memory Japan, Inc.
> Note: The address and phone number for the company remain unchanged.
> 
> 2. Effective Date of the Change
> February 28, 2014
> 
> 3. Subsidiary Name Change
> A subsidiary of Elpida Memory, Inc. also plans to change its name as follows.


I have heard that cards that come with Hynix memory can get higher memory clocks. I am not sure it gives you much of a performance boost though, as in a lot of cases extreme memory overclocks even hurt performance. Both of my cards have Elpida 6 GHz memory chips and I think they do just fine. I can get the 270X to near 1600 on the memory, but when I go much past 1500 I start losing. I think the memory controller and the BIOS have a lot to do with it, as I have the same 6 GHz chips in my 7870 but it will only do around 1500 max. Elpidas are just fine in my book, but you will get some who will argue that.


----------



## Pesmerrga

Have you guys seen the new R7 265? Pretty much a 7850. Pitcairn with 1024 SPs. Supposed to be $149, which means Newegg will sell it for $249..









http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R7-265-2GB-Review-Pitcairn-takes-another-pass


----------



## Recr3ational

Can someone do me MASSIVE favour?

If I give you a copy of the BIOS from my 7950, can someone change some values for me? I've been checking what's stable.


----------



## arcticchill360

Any one here have 280x in trifire that can give me an idea on how well it performs against 290x xfire?


----------



## Recr3ational

Quote:


> Originally Posted by *arcticchill360*
> 
> Any one here have 280x in trifire that can give me an idea on how well it performs against 290x xfire?


Well, fewer cards = fewer problems. End of.
Even if tri-fire 280Xs outperform crossfired 290Xs (which I doubt),
I doubt you'll be able to use all three without issues: drivers, throttling, etc.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *gibby1690*
> 
> for the toxic its £170 and cheapest 280x is about £240.
> 
> i have only ever had a gygabyte card, is there any i should stay away from?
> 
> as it will probably be the cheapest good brand 280x i will buy
> 
> i am also a noob OCer so a higher clocked out the box card is my prefered choice,
> 
> or would a lower clocked card give me more room to experiment?


Well, you are talking about different chips too, not just clocks.

The 270 is based on the same chip as the 7800 series, and the 280 on the 7900 series; in fact the 280X is a rebrand of the GHz Edition 7970. With my one 280X I am getting 45-65 FPS in BF4. However, if I set my CPU overclock to 5 GHz, then I get 55-80 FPS


----------



## Devildog83

Quote:


> Originally Posted by *Farih*
> 
> Sorry, yes it is a Toxic.


Clocks updated. Very nice clocks. I am ranked no. 1 in 3DMark11 too with my same GPU (7870s or 270Xs) and CPU, valid results only.

http://www.3dmark.com/3dm11/7965869

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/3dm11/P/1541/765/500000?minScore=0&cpuName=AMD%20FX-8350&gpuName=AMD%20Radeon%20HD%207870


----------



## neurotix

Devildog, for some reason your combined score in 3dm11 is low.

Mine is always right around 8k.

Our physics score is roughly the same, yet your combined scores are 1000 pts lower.

Here's mine without tess: http://www.3dmark.com/3dm11/7967439
Here's with: http://www.3dmark.com/3dm11/7967400

In both of these my combined score is 8k.

Not sure what the cause or solution is. You had the same problem with Firestrike, and updating to SP1 didn't fix it for you, though it did for me.

I'll hit up Google and see what I can find.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Devildog, for some reason your combined score in 3dm11 is low.
> 
> Mine is always right around 8k.
> 
> Our physics score is roughly the same, yet your combined scores are 1000 pts lower.
> 
> Here's mine without tess: http://www.3dmark.com/3dm11/7967439
> Here's with: http://www.3dmark.com/3dm11/7967400
> 
> In both of these my combined score is 8k.
> 
> Not sure what the cause or solution is. You had the same problem with Firestrike but updating to SP1 didn't fix it, it did for me.
> 
> I'll hit up Google and see what I can find.


In X-Fire it is, I still don't know why.


----------



## kpo6969

Quote:


> Originally Posted by *Devildog83*
> 
> Alright folks I need some help from you,
> 
> Once again we are considering adding 7800 series cards to the club because they are so close and the 7800 club is all but dead. I would like some input as to how we could change the heading. *First of all I don't think there is a such a thing as an R9 280 so we can get rid of that,* should we just add 7800's to the name or should we get creative? Does it matter to you? Do you love or hate the idea altogether?
> 
> I would be grateful for input from our members. Thanks DD


*AMD Radeon R9 280 in the works ?*

http://www.guru3d.com/news_story/amd_radeon_r9_280_in_the_works.html


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Devildog, for some reason your combined score in 3dm11 is low.
> 
> Mine is always right around 8k.
> 
> Our physics score is roughly the same, yet your combined scores are 1000 pts lower.
> 
> Here's mine without tess: http://www.3dmark.com/3dm11/7967439
> Here's with: http://www.3dmark.com/3dm11/7967400
> 
> In both of these my combined score is 8k.
> 
> Not sure what the cause or solution is. You had the same problem with Firestrike but updating to SP1 didn't fix it, it did for me.
> 
> I'll hit up Google and see what I can find.


I am pretty sure that in 3DMark11 it's because I am running X-Fire: with both GPUs and 8 CPU cores running at near 100%, it causes some issues with the combined score. Firestrike is the one that concerns me most, because if I ran 11 at 5.0 GHz like you did I am sure I would be a lot closer to 8,000 combined, but in Firestrike I can get a much better physics and graphics score than someone else and the combined score is still crap, 2,000 or something. Check out the 7800 thread and you will see what I mean.


----------



## Devildog83

Quote:


> Originally Posted by *kpo6969*
> 
> *AMD Radeon R9 280 in the works ?*
> 
> http://www.guru3d.com/news_story/amd_radeon_r9_280_in_the_works.html


I won't be deleting the 280 then.

I think it would not be the best idea to roll the 7800 club into this one. Most seem to think it would not be the best thing to do, so for now we will leave it alone.


----------



## Recr3ational

I'm changing my 7950's bios with a 7970's tonight.
Which position is the switch supposed to be on, 1 or 2? I mean the switch on the card itself.


----------



## F3ERS 2 ASH3S

So I got my second card... Installed it then was getting restarts so I did a Windows restore point restore.. everything seems fine.. My Graphics score in 3d11 is 19063 on stock clocks.. http://www.3dmark.com/3dm11/7974302

I was trying to overclock both cards to see what I can get, but for some reason Afterburner does not want to let me adjust any of the voltages: everything is blanked out, but it does change when I set any voltage modification in Afterburner. HWiNFO was not picking up the other card either. I am trying to figure this out; it's odd. CCC picked it up and I can enable CrossFire for it.


----------



## Recr3ational

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> So I got my second card... Installed it then was getting restarts so I did a Windows restore point restore.. everything seems fine.. My Graphics score in 3d11 is 19063 on stock clocks.. http://www.3dmark.com/3dm11/7974302
> 
> Was trying to overclock both cards to see what I can get but for some reason after burner does not want to allow me to adjust any of the voltage.. everything is blanked out buy does change when I set any voltage modification in after burner HWinfo was not wanting to pick up the other card. I am trying to figure this out its odd.. . CCC picked it up and I can set crossfire for it..


ULPS?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Recr3ational*
> 
> ULPS?


I already have it disabled

I THINK I found it....

I decided to switch the cards around and took a look at the card versions. My first one is version 1.0; my new one is version 1.6.

Then I loaded up Afterburner, and badda bing: card 1 (version 1.6) has no voltage controls even after everything is set, while card 2 (v1.0) has unlocked voltage. Looking further in GPU-Z, there is a different BIOS version. Looks like they BIOS-locked the voltage on my new card....

Looks like I am flashing the BIOS on my card for sure now. I will report back once I have tested that


----------



## mAs81

Well I finally installed my backplate..
Although it needs new screws to fit properly under my cooler







I currently have half the screws on , but it sits on my card nice and tight..
Here's some pics for you guys!!


Spoiler: Warning: Spoiler!









Them guys at Coldzero did an awesome job!!!!


----------



## F3ERS 2 ASH3S

Update: Did the BIOS flash to the other card that I have and it is a no go.... still voltage locked


----------



## Devildog83

Quote:


> Originally Posted by *mAs81*
> 
> Well I finally installed my backplate..
> Although,it needs new screws to fit properly under my cooler
> 
> 
> 
> 
> 
> 
> 
> I currently have half the screws on , but it sits on my card nice and tight..
> Here's some pics for you guys!!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Them guys at Coldzero did an awesome job!!!!


Very sweet !!!


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Update: Did the BIOS flash to the other card that I have and it is a no go.... still voltage locked


I found out that there are 2 resistors and an IC missing on this card. Do you think there can be a voltage mod for it?



EDIT: confirmed through XFX that it is a known part of the revision change


----------



## Lifeshield

Looking at getting a R9 280X. Which would be best to buy?

http://www.scan.co.uk/products/3gb-sapphire-radeon-r9-280x-vapor-x-28nm-6200mhz-gddr5-gpu-950mhz-boost-1070mhz-2048-streams-dp-dvi-

Or

http://www.scan.co.uk/products/3gb-msi-radeon-radeon-r9-280x-gaming-3g-28nm-6000mhz-gddr5-gpu-1000mhz-boost-1050mhz-2048-streams-hd

Thanks in advance.


----------



## bajer29

Quote:


> Originally Posted by *Lifeshield*
> 
> Looking at getting a R9 280X. Which would be best to buy?
> 
> http://www.scan.co.uk/products/3gb-sapphire-radeon-r9-280x-vapor-x-28nm-6200mhz-gddr5-gpu-950mhz-boost-1070mhz-2048-streams-dp-dvi-
> 
> Or
> 
> http://www.scan.co.uk/products/3gb-msi-radeon-radeon-r9-280x-gaming-3g-28nm-6000mhz-gddr5-gpu-1000mhz-boost-1050mhz-2048-streams-hd
> 
> Thanks in advance.


I love my Sapphire 280X VX. The only thing is that the voltages are finicky from one card to the next. If you're not worried about OCing, go with the Sapphire since it already has higher clocks and good single-card performance (plus it's a wee bit cheaper). The MSI would probably be better for OCing but has a lower stock clock.


----------



## mAs81

Quote:


> Originally Posted by *Lifeshield*
> 
> Looking at getting a R9 280X. Which would be best to buy?
> 
> http://www.scan.co.uk/products/3gb-sapphire-radeon-r9-280x-vapor-x-28nm-6200mhz-gddr5-gpu-950mhz-boost-1070mhz-2048-streams-dp-dvi-
> 
> Or
> 
> http://www.scan.co.uk/products/3gb-msi-radeon-radeon-r9-280x-gaming-3g-28nm-6000mhz-gddr5-gpu-1000mhz-boost-1050mhz-2048-streams-hd
> 
> Thanks in advance.


I don't know about the Sapphire, but I can definitely recommend the MSI. It handles quite a bit of OC with good cooling (TF Frozr) and is voltage unlocked..
On the other hand, what I do know about the Sapphire is that it has better stock clocks...
http://www.tomshardware.com/reviews/radeon-r9-280x-third-party-round-up,3655-2.html


----------



## HunnoPT

Quote:


> Originally Posted by *Devildog83*
> 
> I have heard that cards that come with Hynix memory can get higher memory clocks. I am not sure that it gives you that much of a performance boost though as in a lot of cases extreme memory overclocks even hurt performance. Both of my cards are Elpida 6 Ghz memory chips and I think they do just fine. I can get the 270x to near 1600 an the memory but when I go much past 1500 I start losing. I think the memory controller ha s a lot to do with it and the bios as I have the same amount of 6Ghz chips as in my 7870 but it will only do around 1500 max. Elpida's are just fine in my book but you will get some who will argue that.


It's true; as far as I've seen it's pretty much the same. I can do the same overclock on the stock BIOS with both brands, as my Gigabyte has Elpida and my MSI has Hynix; both can do 1200/1600 stable with stock VDDC and power limit +20%. Both cards also have good power phases (the MSI TF has 4+2 and the Gigabyte WF has 6+1+1+1), and that's WAY more important for getting an extreme OC than the brand of the memory chip.

How can Elpida be bad when it's a real chip manufacturer, as opposed to a company that merely solders chips to circuit boards or slices up wafers? Elpida is NEC + Hitachi + Mitsubishi.
I get a bad impression when no real chip company can be identified from the DRAM chips though.









But still, as far as I've tested it's the same freakin' thing.. I'll get my new power supply in 3 days, then I can Xfire them and alter the BIOS; after that I'll stress the memory chips on both cards, watch for throttling issues, and try to end the myth for good


----------



## Devildog83

Quote:


> Originally Posted by *HunnoPT*
> 
> Its true, as far as i seen its pretty much the same . I can do the same overclock at stock bios with both brands as my gigabyte has Elpida and my MSi has Hynix, both can do 1200/1600 stable with stock vddc and powerlimit 20% , also the both cards have good power phases as the MSI TF hs 4+2 and the Gygabyte WF has 6+1+1+1 and thats WAY more important to get a extreme OC then the brand of the manufactor chip.
> 
> How can Elpida be bad when it's a real chip manufacturer, as opposed to a company that merely solders chips to circuit boards or slices up wafers? Elpida is NEC + Hitachi + Mitsubishi.
> I get a bad impression when no real chip company can be identified from the DRAM chips tho.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But still as far as i tested its the same freakin thing.. Ill get my new power supply in 3 days then i can Xfire them + alter bios, after that ill stress the memory chips on both cards and watch for throttle issues and try to end the myth for good


Looking forward to those results. What PSU do you have?


----------



## Farih

Quote:


> Originally Posted by *HunnoPT*
> 
> Its true, as far as i seen its pretty much the same . I can do the same overclock at stock bios with both brands as my gigabyte has Elpida and my MSi has Hynix, both can do 1200/1600 stable with stock vddc and powerlimit 20% , also the both cards have good power phases as the MSI TF hs 4+2 and the Gygabyte WF has 6+1+1+1 and thats WAY more important to get a extreme OC then the brand of the manufactor chip.
> 
> How can Elpida be bad when it's a real chip manufacturer, as opposed to a company that merely solders chips to circuit boards or slices up wafers? Elpida is NEC + Hitachi + Mitsubishi.
> I get a bad impression when no real chip company can be identified from the DRAM chips tho.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But still as far as i tested its the same freakin thing.. Ill get my new power supply in 3 days then i can Xfire them + alter bios, after that ill stress the memory chips on both cards and watch for throttle issues and try to end the myth for good


I can get higher than 1600 MHz on the VRAM.
I've had it over 1700 MHz too, if I remember properly.

The thing is, the higher I push the core clock, the lower I need to clock the VRAM to be stable.
@ 1362 MHz core I can't get the VRAM stable past 1565 MHz (but it benches faster/better at 1550 MHz).
@ 1250 MHz core I can set the VRAM stable at 1600 MHz 24/7.

It seems that a high core clock frequency has more impact than the make of the memory chips.

I have Hynix memory on this card btw, but I don't think things would have been much different if it was Elpida.


----------



## Devildog83

Quote:


> Originally Posted by *Farih*
> 
> I can get hiher then 1600 mhz on Vram.
> Had it over 1700mhz to if i can remember properly.
> 
> The thing is the higher i put the core clock the lower i needed to clock the Vram to be stable.
> @ 1362mhz core i cant get the Vram stable past 1565mhz (but bench faster/better at 1550mhz)
> @ 1250mhz core i can set Vram stable at 1600mhz 24/7.
> 
> Seems that high core clock frequency's has more impact then the make of the memory chips.
> 
> I have Hynix memory on this card btw but dont think things would have been much different if it was Elpida.


I will take the higher core any day; seriously, higher memory clocks don't seem to help much with most applications anyway.


----------



## Recr3ational

Guys, when I'm trying to access winflash via CMD, why does it say "The system cannot find the path specified"? Even though the path is right


----------



## GoLDii3

Quote:


> Originally Posted by *Recr3ational*
> 
> guys, when im trying to accesed winflash via CMD, why does it say "the system cannot find the path specified.? Even though it right


Why do you even want to use winflash?


----------



## HunnoPT

Quote:


> Originally Posted by *Farih*
> 
> I can get hiher then 1600 mhz on Vram.
> Had it over 1700mhz to if i can remember properly.
> 
> The thing is the higher i put the core clock the lower i needed to clock the Vram to be stable.
> @ 1362mhz core i cant get the Vram stable past 1565mhz (but bench faster/better at 1550mhz)
> @ 1250mhz core i can set Vram stable at 1600mhz 24/7.
> 
> Seems that high core clock frequency's has more impact then the make of the memory chips.
> 
> I have Hynix memory on this card btw but dont think things would have been much different if it was Elpida.


I'll give it a try, thanks dude








Quote:


> Originally Posted by *Devildog83*
> 
> Looking forward to those results. What PSU do you have?










Nox Urano Sli 600W







It's already starting to fail after 5 long years under stress lol (HD 6850, 560 Ti, 270X now; Core 2 Quad Q6600 [email protected], FX-6300 [email protected] 4.8GHz). Poor fellow, it's time to finally give her a rest. R.I.P.









I'll get a 1-year-old Nox Krypton 900W for 40 EUR (= 54 USD) and hope it will serve me as well as the old NOX did.


----------



## [CyGnus]

Those PSUs are really bad... If you only have one 270X, an XFX 550W Pro is all you need: a very solid PSU made by Seasonic, around 55€.
You can find them at Chiptec.net or Globaldata


----------



## gibby1690

Quote:


> Originally Posted by *[CyGnus]*
> 
> Those PSU's are really bad... if you only have one VGA 270X a XFX 550W Pro is all you need very solid PSU made by Seasonic its around 55€
> You can find them in Chiptec.net or in Globaldata


Quick question.... I have an XFX 550W PSU and I'm looking to buy an R9 280X next week. I'm not OCing my CPU, but I will probably have at least a minor clock on my GPU.

Is my PSU going to be able to power this card?


----------



## [CyGnus]

gibby1690, no problems at all. The XFX 550W can easily power an overclocked 290X with a 4770K at 5 GHz, and you would still have watts to spare
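To put rough numbers on that, here is an illustrative headroom check for gibby1690's planned build (no CPU overclock, only a minor GPU clock). None of these wattages come from the thread; they are assumed ballpark figures (a stock 280X's board power is around 250 W, a stock 4770K's TDP is 84 W):

```python
# Hypothetical worst-case DC load estimate for a single-280X system.
loads_w = {
    "R9 280X (stock-ish)": 250,   # assumed board power
    "CPU, no overclock":   125,   # assumed ceiling, well above a 4770K's 84 W TDP
    "board, drives, fans":  75,   # generous allowance
}
total_w = sum(loads_w.values())
print(total_w, 550 - total_w)  # ~450 W of load, ~100 W of headroom
```

Even with pessimistic per-component estimates, a good 550 W unit keeps headroom for a single card; CrossFire is a different story.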


----------



## gibby1690

Quote:


> Originally Posted by *[CyGnus]*
> 
> gibby1690 with no problems at all the XFX550w can power easily a 290X Overclocked with a 4770k 5GHz and you would still have watts to spare


lol klkl, it's just I saw that last post about the 270X, then looked it up online and saw people saying 650W was the minimum, and that any OC might throttle it (think that's what it's called), so I got a bit worried. Now just to start worrying about whether I can actually get a 280X, they're going quick lol


----------



## HunnoPT

Quote:


> Originally Posted by *[CyGnus]*
> 
> Those PSU's are really bad... if you only have one VGA 270X a XFX 550W Pro is all you need very solid PSU made by Seasonic its around 55€
> You can find them in Chiptec.net or in Globaldata


I have one of those in my Intel system (XFX Core Pro 550W); the Krypton 900W is for crossfiring two 270Xs in this system. Everyone says NOX is garbage, and most of them actually are, but..
Why is the Krypton bad? I've seen some reviews and it's quite good. Bad is my Urano 600 lol, but the Krypton seems a solid power supply with 80+ Bronze.

The actual efficiencies for this PSU are 84.16%, 86.65%, and 83.63%, for loads of 100%, 50%, and 20%.

What experience/reviews/feedback do you have to back up your statement? Share with us pls. If it is bad, I'll get another one for sure; I don't wanna hurt my components..







sorry
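As a side note, those efficiency figures relate the DC load delivered to the system to the draw at the wall; a quick sketch using the 50%-load number quoted above:

```python
# AC (wall) draw = DC load delivered to the system / efficiency at that load.
def wall_draw_w(dc_load_w, efficiency):
    return dc_load_w / efficiency

# Half load on a 900 W unit is 450 W DC; the quoted efficiency there is 86.65%.
print(round(wall_draw_w(450, 0.8665)))  # about 519 W pulled from the wall
```

The ~69 W difference is dissipated as heat inside the PSU, which is why efficiency matters for temperatures and fan noise as much as for the power bill.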


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> Those PSU's are really bad... if you only have one VGA 270X a XFX 550W Pro is all you need very solid PSU made by Seasonic its around 55€
> You can find them in Chiptec.net or in Globaldata


Yes, I run a 270X and a 7870 in CrossFire with a 660W Seasonic and have a bit of headroom. You could run one 270X with a CX430. Someone gave me a VX450W from Corsair, like brand new. I don't really know what to use it for, but I have it.
It must be an older Corsair because I have never seen one before. I love it; it's got one 4-pin PCI-E connector, so it's not much use, but it would be good with a new 8750K APU as long as you don't want to add a GPU.



Spoiler: Warning: Spoiler!


----------



## HunnoPT

Quote:


> Originally Posted by *Devildog83*
> 
> Yes, I run a 270x and a 7870 in Xifire with a 660w Seasonic and have a bit of headroom. You could run 1 270x with a CX430. Someone gave me a VX450w from corsair like brand new. Don't really know what to use it for but I have it.
> It must be an older Corsair because I have never seen it. I love it, it's got 1 4 pin PCI-E, not much use would be good with a new 8750K APU as long as you don't want to add a GPU.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice PSU ehe, I've seen some of those. Yeah, any Seasonic is good, since it's the best PSU brand out there on the market. Every Corsair (except the VS ones) is made by Seasonic, same as XFX and some OCZs, unlike most NOX units, which are made by CTW









I've actually seen a 750W XFX Pro Core holding an 813W system without fails, so.. amen on them.

I'll make a review of the Krypton when I get it, because it's already ordered.


----------



## Devildog83

Quote:


> Originally Posted by *HunnoPT*
> 
> Nice PSU ehe , ive seen some of those, yeah any seasonic is good, since its the best brand of PSU out there in the market, every Corsair (except VS ones) are made by Seasonic, same as XFX and some OCZ´s, unlike most NOX that are made by CTW
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ive actually seen a 750w XFX Pro core holding a 813w system without fails so..amen on them
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ill make a review on the Krypton when i get it, because its already ordered.


Actually, a lot of Corsair's PSUs are made by Channel Well and Flextronics now; they are supposed to make very good PSUs also. Here - http://whirlpool.net.au/wiki/psu_manufacturers


----------



## mAs81

If you're interested in finding a good PSU, this OCN member has done very thorough research!!! I know it helped me a lot when I was looking for a new PSU!!








http://www.overclock.net/t/1431929/psu-index-thread


----------



## HunnoPT

Quote:


> Originally Posted by *Devildog83*
> 
> Actually a lot of Corsairs PSU's are made by Channelwell and Flextronics now, they are supposed to make very good PSU's also. Here - http://whirlpool.net.au/wiki/psu_manufacturers


WOW, I didn't know that so many Corsair PSUs weren't Seasonic. It's also sad that here in Portugal we don't even have half the PSUs that appear there







We have the VS, CX, GS, TX, AX and only 2 or 3 of each model









Thanks for the info


----------



## HunnoPT

Quote:


> Originally Posted by *mAs81*
> 
> If you're interested in finding a good PSU,this OCN member has made a very thorough research!!!I know that it helped me a lot when I was looking for a new PSU!!
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1431929/psu-index-thread


Thanks, I'll take a look at some of those threads for sure









But my deal on the Krypton is already made and paid; I'm just waiting for it to arrive by mail, should be here next Monday. I'll test it under heavy loads in a crossfire system 24/7; if it doesn't suit my needs, I'll resell it and get a better one. I've got my eyes on a Corsair TX850 V2 at a fair price of 95 USD


----------



## mAs81

Quote:


> Originally Posted by *HunnoPT*
> 
> Tanks, ill take a look at some of that threads for sure
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But my deal on the Krypton is already made, paid just waiting to arrive by mail, should be here next monday, ill test it under huge loads in a xfire system for 24/7 , if it doesnt suits my needs, then reseel it and get a better one , got my eyes on a Corsair TX 850 v2 at a fair price of 95 USD


That's a good deal for $95. Over here, new, it's priced at about 134 EUR, which is about 183 USD.. If you get the chance, go for it...


----------



## F3ERS 2 ASH3S

EDIT: I fixed my issue le sigh.. looks like I need to go back to my memory and correct the instability


----------



## bartledoo

Quote:


> Originally Posted by *Farih*
> 
> 1. Set your Toxic card on the second BIOS (blue light off)
> 2. Download GPU-Z and use it to save your GPU BIOS.
> 3. Download VBE7 and change the BIOS you have just saved. (set voltage and power target)
> 4. Download Atiwinflash and flash the edited BIOS to your GPU (command prompt >> atiwinflash -f -p 0 nameofyourrom.) (Flash at your own risk !)
> 5. Restart PC and enjoy higher core voltage, power target and clock range.
> 6. Happy overclocking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S:
> This works for most AMD cards.
> You could try the above. (best if you have a dual bios though)


Do I just check "manual power limit adjustment"?


----------



## rickyman0319

Do you guys know of any company that makes a full-cover block for the R9 270/270X?


----------



## F3ERS 2 ASH3S

Does anyone have figures on how much backplates reduce temps?


----------



## [CyGnus]

Devildog83, the VX450W is quite a good PSU; it's the one with the green stickers, right? I had one, it's made by Seasonic, and the CXs are CWT.
HunnoPT, if you want to trust your system to a 40/50€ PSU, that is an option. I wouldn't; I'd rather get an XFX 650W for crossfiring those 270Xs


----------



## Recr3ational

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Who has figures of how much back plates reduce temps


It's unconfirmed. Some people find that it helps, some don't. Not worth the money, I reckon.
If you want it to look better, make one like I did. Cost £2


----------



## mAs81

My temps are about the same after installing my backplate.. Perhaps 1°C cooler at times, but I guess that must be down to ambient temps..
Nevertheless, I only have half the screws on; I have to find new ones that don't hit my cooler/RAM.. When I do, I'll get back to you..


----------



## F3ERS 2 ASH3S

OK thanks


----------



## Devildog83

Quote:


> Originally Posted by *rickyman0319*
> 
> do u guys know any company make that made full cover block for r9 270/270x ?


EK makes a full block but only for an HIS card.


----------



## Farih

Quote:


> Originally Posted by *bartledoo*
> 
> do i just check manual power limit adjustment?


You can set it to 150%; that's the same as the checkbox, I think.


----------



## DrClaw

Quote:


> Originally Posted by *Devildog83*
> 
> Actually a lot of Corsairs PSU's are made by Channelwell and Flextronics now, they are supposed to make very good PSU's also. Here - http://whirlpool.net.au/wiki/psu_manufacturers


I thought Seasonic was making their top-end PSUs, like the AX and HX


----------



## Devildog83

Quote:


> Originally Posted by *DrClaw*
> 
> i thought seasonic was making their top end psu like the ax and hx


They do make some of them just not all of them.


----------



## shilka

Quote:


> Originally Posted by *Devildog83*
> 
> They do make some of them just not all of them.


Use this one instead:

http://www.realhardtechx.com/index_archivos/Page5471.htm

The other one is not very good.

There are rumors about why Seasonic no longer works with Corsair


----------



## Recr3ational

I unlocked the shaders on my 7950. Now it's a fully functional 7970


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Recr3ational*
> 
> I unlocked the shaders on my 7950. Now it's a fully functional 7970


----------



## [CyGnus]

Recr3ational can you share some benchmarks please


----------



## Recr3ational

Quote:


> Originally Posted by *[CyGnus]*
> 
> Recr3ational can you share some benchmarks please


It's really not that much different from 2x 7950, both at stock CPU. Though I'm only benchmarking in Valley, as I don't have 3DMark.
Also getting artifacts. Changed the BIOS back


----------



## Tugz

Just switched my MSI 280Xs for XFX 280Xs. I personally think the XFX card is better looking, and it looks nicer in my case.

Don't ask why I switched. =P



Overclocked the cards.


They said these cards can't overclock more than 100 MHz. Seems they were wrong, as I pushed them higher. xD

My rig with MSI cards


----------



## taem

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Who has figures of how much back plates reduce temps


Can't remember the link, but I read 1-3°C. But that's only if the backplate is actually making contact for heat dissipation. On the last two cards I had with backplates, the 280X Vapor-X and now the PowerColor PCS+ 290, the backplate is purely cosmetic; it sits a mm or two away from the PCB. Same with the EVGA 780 backplates you can order; they are spaced away from the PCB.

I still love backplates though. IMHO any highish-end card that costs $300 or more ought to have one, and a $500 card absolutely should. But most don't, obviously.


----------



## Recr3ational

Poop. I broke my card. Bad flash, I think :\ unless there's another reason it's giving artifacts.


----------



## mAs81

Quote:


> Originally Posted by *Recr3ational*
> 
> Poop. I broke my card. Bad flash, I think :\ unless there's another reason it's giving artifacts.


Have you tried clocking down the core and memory speeds of the card?


----------



## Recr3ational

Quote:


> Originally Posted by *mAs81*
> 
> Have you tried clocking down the core and memory speeds of the card?


Yes sir. It's back to the stock BIOS. I think I damaged the memory.


----------



## mAs81

Quote:


> Originally Posted by *Recr3ational*
> 
> Yes sir. It's back to the stock BIOS. I think I damaged the memory.


Hmm, it looks that way... How about further underclocking your GPU RAM? It might save you from the artifacts, and you only need it in memory-intensive games anyway.


----------



## Recr3ational

Quote:


> Originally Posted by *mAs81*
> 
> Hmm, it looks that way... How about further underclocking your GPU RAM? It might save you from the artifacts, and you only need it in memory-intensive games anyway.


Yeah, tried it; it doesn't seem to do much. Sucks.


----------



## [CyGnus]

Memory OC does almost nothing in games; if you increase it by, let's say, 300MHz, it might give maybe 1 FPS. The core is the one that matters.


----------



## mAs81

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah, tried it; it doesn't seem to do much. Sucks.


Sorry mate... What are you going to do? Do you still have warranty? Have you thought about trying to RMA it?
If not (and you're going to throw it away), here's a crazy thought:
http://www.overclock.net/t/529271/bake-your-graphics-card-in-the-oven-fix-it-worked
Never done it myself, but desperate times...
Try putting it in another system nonetheless, to be sure that it is the BIOS' fault...


----------



## Recr3ational

Quote:


> Originally Posted by *mAs81*
> 
> Sorry mate... What are you going to do? Do you still have warranty? Have you thought about trying to RMA it?
> If not (and you're going to throw it away), here's a crazy thought:
> http://www.overclock.net/t/529271/bake-your-graphics-card-in-the-oven-fix-it-worked
> Never done it myself, but desperate times...
> Try putting it in another system nonetheless, to be sure that it is the BIOS' fault...


I'm just gonna buy another 280X; it was my fault. Haha. I'll use this one as a coaster or something.


----------



## cjc75

So question...

If I am just using a 24" LCD at 1080p (1920 x 1080), running through HDMI cable, and have no plans to upgrade beyond that at any time soon... no plans for a larger monitor, no plans for any additional monitors, etc...

Would I still benefit from having a 384bit Bus @ 3GB Vram, as opposed to a 256bit Bus @ 4GB Vram?


----------



## Shurtugal

Quote:


> Originally Posted by *cjc75*
> 
> So question...
> 
> If I am just using a 24" LCD at 1080p (1920 x 1080), running through HDMI cable, and have no plans to upgrade beyond that at any time soon... no plans for a larger monitor, no plans for any additional monitors, etc...
> 
> Would I still benefit from having a 384bit Bus @ 3GB Vram, as opposed to a 256bit Bus @ 4GB Vram?


Either of these will run 1080p just fine. I'm using a 280X at 1080p and haven't noticed any problems in any games at ultra; that said, I haven't actually measured my FPS. 4GB isn't really necessary with a 256-bit bus in my opinion, and 2GB will run 1080p just fine.
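For what it's worth, the raw bandwidth arithmetic backs this up: bus width matters more than capacity for throughput. A minimal sketch; the effective memory clocks below are typical GDDR5 speeds I'm assuming for illustration, not figures pulled from either exact card:

```python
def mem_bandwidth_gbs(bus_width_bits, effective_clock_gbps):
    """Peak memory bandwidth in GB/s: bits per transfer times
    transfers per second, divided by 8 bits per byte."""
    return bus_width_bits * effective_clock_gbps / 8

# 384-bit bus at an assumed 6.0 Gbps effective (280X-class): 288.0 GB/s
print(mem_bandwidth_gbs(384, 6.0))

# 256-bit bus at an assumed 5.6 Gbps effective (a 4 GB 256-bit card): 179.2 GB/s
print(mem_bandwidth_gbs(256, 5.6))
```

So at 1080p the wider bus gives you more headroom, while the extra VRAM on the 256-bit card mostly sits unused.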

To MY point: I was thinking of selling my Matrix on eBay or somewhere. How much do you guys think I could get for it, approximately? I'm just looking for an estimate, not to sell it right now. (The reason for this is to go ITX.)

Thanks guys!

(Also, AUD is preferred, but I can compare to similar US products)


----------



## mAs81

Quote:


> Originally Posted by *Shurtugal*
> 
> To MY point, I was thinking of selling my Matrix on ebay or somewhere, how much do you guys think I could get for it approximately? I'm just looking for an estimate, not to sell it right now. (The reason of this is to go ITX.)
> 
> Thanks guys!
> 
> (Also, AUD is preferred, but I can compare to similar US products)


Well, here in Greece it's priced at about 375 euros (about 560 AUD), and a quick search on Amazon and eBay turned up about 500 euros shipped and 600 USD shipped (about 760 and 665 AUD).
So, since it's practically new but used, you could get 400 to 500 AUD at least, perhaps more...
Good luck


----------



## Recr3ational

Hmm, this card is really weird. It doesn't show artifacts in Kombustor, FurMark, etc.; they only show up in games sometimes. It's confusing me.


----------



## Devildog83

I got a Titanfall beta key today and will spend a good portion of the day having some fun. I will post some results later today.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I got a Titanfall beta key today and will spend a good portion of the day having some fun. I will post some results later today.


I've been enjoying titanfall. Looking forward for release


----------



## gnemelf

I have to disagree; it's 10+ FPS per 150MHz for me.


----------



## Devildog83

OK, Titanfall is a decent game but the settings are a bit strange. I set everything to High with 2xAA and V-Sync on and WOW, colors everywhere. I turned off V-Sync and it played fine; it did not look the greatest but played smooth. I then turned on Fraps to see what the FPS was, and it's weird because it was stuck at 60 FPS just like V-Sync was on, and it started to stutter a bit and I had some artifacts. I lowered the settings to the recommended ones, which were very low for some reason, and no change; turned off AA, and no change. It's almost like changing settings means nothing. I will do some more testing, including with X-Fire off, and see what's up.


----------



## mAs81

Quote:


> Originally Posted by *gnemelf*
> 
> I have to disagree; it's 10+ FPS per 150MHz for me.


You mean when you OC your memory, right? I guess each card reacts differently to OCing..








Good results, by the way..
On another note, I have now reinstalled my backplate using new screws and it's final.. My temps are officially down about 1 to 2°C depending on ambient temperature. And I really love the way it looks too!


----------



## JCH979

I think it's capped at 60 FPS. I turned vsync off in-game and nothing happened; I turned it on and off in CCC but didn't notice a difference. Other than that, it's playing very smooth for me at maxed-out settings at 1080p with vsync off, on both stock and OC'd clocks, for the roughly two hours I played.


----------



## olympicwhite

Hi, I'm a noob here and I don't speak English very well, but I recently bought a 280X Toxic and I'm getting 2D flickering while using Chrome or working on the desktop, though everything runs fine while gaming. Has anybody had the same problem, or does anyone know a solution?


----------



## mAs81

Quote:


> Originally Posted by *olympicwhite*
> 
> Hi, I'm a noob here and I don't speak English very well, but I recently bought a 280X Toxic and I'm getting 2D flickering while using Chrome or working on the desktop, though everything runs fine while gaming. Has anybody had the same problem, or does anyone know a solution?


Do you have the latest drivers? CCC 14.1? Perhaps a driver reinstall or downgrade might help you...


----------



## rickyman0319

I am going to get a VisionTek R9 280X GPU. I am wondering how to overvolt and overclock it; how do I do that?

Anyway, does anyone have this card? If so, can I have your config info?

Thanks


----------



## olympicwhite

Quote:


> Originally Posted by *mAs81*
> 
> Do you have the latest drivers? CCC 14.1? Perhaps a driver reinstall or downgrade might help you...


I have 13.12 installed. I'll try the new drivers; hopefully they will solve the problem.


----------



## DrClaw

Quote:


> Originally Posted by *Tugz*
> 
> Just switched my MSI 280x's for XFX 280x's. Personally think the XFX card is better looking and looks nicer in my case.
> 
> Don't ask why i switched. =P
> 
> 
> 
> Overclocked the cards.
> 
> 
> They said these cards can't over clock more then 100mhz. Seems they were wrong as i pushed it higher. xD
> 
> My rig with MSI cards


Sweet system, but I can't see your power supply







Where the heck is it? lol


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *DrClaw*
> 
> Sweet system, but I can't see your power supply
> 
> 
> 
> 
> 
> 
> 
> Where the heck is it? lol


It's on the backside of the case.

http://www.newegg.com/Product/Product.aspx?Item=N82E16811139022


----------



## Devildog83

Quote:


> Originally Posted by *DrClaw*
> 
> Sweet system, but I can't see your power supply
> 
> 
> 
> 
> 
> 
> 
> Where the heck is it? lol


I think it might be an Air 540; the power supply is in the back.


----------



## Devildog83

I turned off X-Fire and played Titanfall and got the same 60 FPS with one 7870; gameplay was smooth and a lot better with one card. It sure didn't like X-Fire.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I turned off X-Fire and played Titanfall and got the same 60 FPS with one 7870; gameplay was smooth and a lot better with one card. It sure didn't like X-Fire.


Yeah, I thought that too.
I had artifacts; I thought it was my BIOS flash, but I think it was Titanfall.
It's a decent game though. Did you enjoy it as much as I did?


----------



## bardacuda

Quote:


> Originally Posted by *DrClaw*
> 
> Sweet system, but I can't see your power supply
> 
> 
> 
> 
> 
> 
> 
> Where the heck is it? lol


Hahaha I was just going to ask the same thing when I saw that pic but you beat me to it!


----------



## Tugz

Quote:


> Originally Posted by *bardacuda*
> 
> Hahaha I was just going to ask the same thing when I saw that pic but you beat me to it!


Lol, that's one of the reasons why I love this case. I'm seriously thinking of buying a white one to do another build.


----------



## smoke2

I need quick help!
I have the opportunity to buy a 280X.
One version is the Gigabyte Windforce OC 2.0 with the newer Windforce cooler; the second one is the ASUS DCII TOP.
Both are practically the same price.
There are very few reviews of the Gigabyte with the WF 2.0 cooler, but one older review mentioned the fans spinning at 76-83% speed under load.
Since then a new BIOS has appeared on Gigabyte's site, so maybe it helps to reduce fan speeds?

Which one would you choose, and which one is quieter?

Would be glad for any remarks.


----------



## dabby91

Quote:


> Originally Posted by *smoke2*
> 
> I need quick help!
> I have the opportunity to buy a 280X.
> One version is the Gigabyte Windforce OC 2.0 with the newer Windforce cooler; the second one is the ASUS DCII TOP.
> Both are practically the same price.
> There are very few reviews of the Gigabyte with the WF 2.0 cooler, but one older review mentioned the fans spinning at 76-83% speed under load.
> Since then a new BIOS has appeared on Gigabyte's site, so maybe it helps to reduce fan speeds?
> 
> Which one would you choose, and which one is quieter?
> 
> Would be glad for any remarks.


Not so sure about the Gigabyte, but I know it's voltage locked, if that's a deciding point for you. I own the ASUS myself, and the cooler keeps it really quiet and cool; well worth it!


----------



## [CyGnus]

smoke2, both cards can be voltage unlocked: the ASUS is out of the box, while for the Gigabyte you have to flash a custom BIOS. So in the end it all comes down to preference on the looks of each card.


----------



## smoke2

Quote:


> Originally Posted by *dabby91*
> 
> Not so sure about the Gigabyte, but I know it's voltage locked, if that's a deciding point for you. I own the ASUS myself, and the cooler keeps it really quiet and cool; well worth it!


The voltage lock isn't an issue for me; I will stay at the manufacturer's OC frequencies, but I'd prefer a cooler that's not too noisy, so the ASUS will probably be my favourite.
What about the Sapphire 280X Toxic's noise?
In some reviews it's noisy and in others not too much...


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah i thought that too,
> I had artifacts. I though it was my bios flash but i think it was titanfall.
> Its a decent game though. Did you enjoy it as much as i did?


Yeah, I thought the game was pretty cool. I just played the first section a few times, but what I played was fun. I am not really into the online (die five times every mission) kind of games, though.
I don't play online too much, because in a game like that you play team against team but everyone is really out for themselves; it's not really a team effort. You just go out there, run around killing anything you can, get killed a few times, then rinse and repeat. It was kinda cool as far as online play goes, though.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Yeah, I thought the game was pretty cool. I just played the first section a few times, but what I played was fun. I am not really into the online (die five times every mission) kind of games, though.
> I don't play online too much, because in a game like that you play team against team but everyone is really out for themselves; it's not really a team effort. You just go out there, run around killing anything you can, get killed a few times, then rinse and repeat. It was kinda cool as far as online play goes, though.


Well, I played with a fellow OCN'er and we played as a team. It was fun. We both ran together in our titans and shot people off each other's titans. I guess you have to play with your mates; I can see that playing by yourself could get boring after a while.


----------



## bajer29

Anyone know of an application that can check VRAM temps? I've been a little worried recently while testing out OCs. Since I've just gone back to stock settings, I thought I'd see what the backplate felt like at stock; it was very hot to the touch near where the VRAM is located on the card. I'd like to know what those temps are without having to rig my card with sensors or buy a surface thermometer. Any ideas?


----------



## [CyGnus]

bajer29, GPU-Z reads VRM temps, but not all cards have the sensor for it.


----------



## Devildog83

Quote:


> Originally Posted by *bajer29*
> 
> Anyone know of an application that can check VRAM temps? I've been a little worried recently while testing out OCs. Since I've just gone back to stock settings, I thought I'd see what the backplate felt like at stock; it was very hot to the touch near where the VRAM is located on the card. I'd like to know what those temps are without having to rig my card with sensors or buy a surface thermometer. Any ideas?


HWinfo64 too.


----------



## ricko99

Anyone else having artifacts in Titanfall like mine? ASUS R9 280X OC TOP


----------



## lombardsoup

It's been quite a while since I've OC'd a card. Just got my 270, and I have some questions:

What's the best software nowadays?
What's the safe voltage limit for my card?
OC core first then mem? Or leave mem alone?


----------



## Devildog83

I did, with V-Sync on; most of it went away with V-Sync off, but I had stutter issues caused by it not liking my X-Fire setup. I run at 60 FPS with X-Fire off and have very few issues. It's not perfect, but it is a beta.


----------



## Jaffi

Any chance of getting a Titanfall beta key? I did not sign up.


----------



## Devildog83

Quote:


> Originally Posted by *lombardsoup*
> 
> It's been quite a while since I've OC'd a card. Just got my 270, and I have some questions:
> 
> What's the best software nowadays?
> What's the safe voltage limit for my card?
> OC core first then mem? Or leave mem alone?


I would use Afterburner for an MSI card for sure. Yes, overclock the core first, then overclock the memory until you stop getting a performance boost from it or it becomes unstable, whichever comes first. The voltage limit depends on the card itself and what the BIOS and software will allow. I don't know about the 270, but the max I get out of my 270X is about 1.269v.
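The "raise it until it stops helping or gets unstable" loop described above can be sketched as a tiny search routine. This is just an illustration of the manual process, not any tool's real API: `is_stable` stands in for your actual stress test (Valley, 3DMark, a long game session), and all the numbers are made up:

```python
def find_stable_clock(base_mhz, step_mhz, limit_mhz, is_stable):
    """Raise the clock in steps while each new step passes the
    stability test, then return the last clock that passed.
    Mirrors the usual manual GPU overclocking loop."""
    clock = base_mhz
    while clock + step_mhz <= limit_mhz and is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Toy stand-in for a real stress test: pretend the card
# starts artifacting above 1180 MHz.
best = find_stable_clock(1050, 20, 1300, lambda mhz: mhz <= 1180)
print(best)
```

In practice you would run this once for the core (with memory at stock), then again for the memory with the core fixed at its stable value.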


----------



## dmfree88

I need a quality pair of 96mm fans for my 7870; anyone got any suggestions? I can't stand the noise anymore, but I don't want to give up performance. After adding the 270X, which is quieter with its 3 fans, it's time to fix the MSI.

It seems difficult to find anything that's not stock or that doesn't look cheap.


----------



## Northern Isnlad

I need a little help, as I can't decide: should I get myself a Sapphire R9 270X Vapor-X, or should I wait for the R9 280? Need your input, as I game at 1080p.


----------



## Hacker90

I would wait for the 280! Or find a 280X selling for $320-380.


----------



## Recr3ational

Quote:


> Originally Posted by *ricko99*
> 
> Anyone else having artifacts in Titanfall like mine? ASUS R9 280X OC TOP


Thank god it wasn't just me then. I think it's just the game.


----------



## JetSet Chilli

Some rather shoddy pics of my rig and my MSI 280x


----------



## JetSet Chilli

A bit gutted to discover that the MSIs are now voltage locked.









I don't think I can be bothered with flashing the BIOS, so I'll stick at stock volts and have a play around from there. 1100 off the bat seems pretty stable so far. Will test it in BF4.


----------



## JetSet Chilli

Hmm. Just tried uploading my BIOS to TechPowerUp and it says it has already been uploaded. Should I be worried that my card is a return? I was already suspicious, because the box was a little shoddy, and when I ordered it, it was actually out of stock; the site (Scan.co.uk) still doesn't have any stock!


----------



## nitrubbb

Proud owner of an MSI 270X Gaming 4G.

First AMD card for me.


----------



## mAs81

Quote:


> Originally Posted by *JetSet Chilli*
> 
> A bit gutted to discover that the MSIs are now voltage locked.
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think I can be bothered with flashing the BIOS, so I'll stick at stock volts and have a play around from there. 1100 off the bat seems pretty stable so far. Will test it in BF4.


How do you know it is locked for sure? Did you try unlocking the voltage via Afterburner? We have the same card; your stock clocks are 1020/1500, right? How do you have 1100? Is it the OC mode via the MSI app, or did you just turn it up with either AB or CCC?

Quote:


> Originally Posted by *JetSet Chilli*
> 
> Hmm. Just tried uploading my BIOS to TechPowerUp and it says it has already been uploaded. Should I be worried that my card is a return? I was already suspicious, because the box was a little shoddy, and when I ordered it, it was actually out of stock; the site (Scan.co.uk) still doesn't have any stock!


Did the stickers on the box seem opened? Or the bag that had the card in it?


----------



## Sozin

Anyone have issues with Afterburner/GPUZ recognizing their cards?





I restarted my computer, loaded up those two programs, and saw that nothing was being monitored. Fresh install of 13.12 drivers and six hours ago everything was being accurately monitored, so I have no idea what has gone wrong.


----------



## JetSet Chilli

Quote:


> Originally Posted by *mAs81*
> 
> How do you know it is locked for sure? Did you try unlocking the voltage via Afterburner? We have the same card; your stock clocks are 1020/1500, right? How do you have 1100? Is it the OC mode via the MSI app, or did you just turn it up with either AB or CCC?
> Did the stickers on the box seem opened? Or the bag that had the card in it?


Yeah, I overclocked it in CCC. I did the Afterburner config change to switch on unofficial overclocking, but that and changing the voltage settings in the app were to no avail.








As for my paranoia, it could be simply that: paranoia (having spent £250 for the first time on a GPU). The stickers and bag seemed OK; it was more that the box was a little beat up and scuffed, yet when it was delivered it was very well packaged.


----------



## Recr3ational

Quote:


> Originally Posted by *JetSet Chilli*
> 
> A bit gutted to discover that the MSIs are now voltage locked.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think I can be bothered with flashing the BIOS, so I'll stick at stock volts and have a play around from there. 1100 off the bat seems pretty stable so far. Will test it in BF4.


Wow, really? That sucks; I was going to get the MSI too.


----------



## mAs81

Quote:


> Originally Posted by *JetSet Chilli*
> 
> Yeah, I overclocked it in CCC. I did the Afterburner config change to switch on unofficial overclocking, but that and changing the voltage settings in the app were to no avail.
> 
> 
> 
> 
> 
> 
> 
> 
> As for my paranoia, it could be simply that: paranoia (having spent £250 for the first time on a GPU). The stickers and bag seemed OK; it was more that the box was a little beat up and scuffed, yet when it was delivered it was very well packaged.


So you can't unlock the voltage control through the AB settings?? Weird..















As for the card itself, I think you're OK..
EDIT:
Some others had that problem, but it seems that forcing constant voltage or reinstalling AB and the drivers fixed it for them:
https://forum-en.msi.com/index.php?topic=175356.0


----------



## lombardsoup

Quote:


> Originally Posted by *mAs81*
> 
> So you can't unlock the voltage control through the AB settings?? Weird..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for the card itself, I think you're OK..


Didn't work for me either, with those options checked AND the .cfg edited to allow them.


----------



## Devildog83

Quote:


> Originally Posted by *nitrubbb*
> 
> proud owner of MSI 270X Gaming 4G
> 
> first AMD card for me


Post a pic and throw in your clocks and I will add you.


----------



## Devildog83

*JetSet Chilli* has been added, welcome to the club. I hope you get the voltage issues resolved.


----------



## mAs81

Quote:


> Originally Posted by *lombardsoup*
> 
> Didn't work for me either, with those options checked AND the .cfg edited to allow them.


I edited my post,perhaps that will help you...


----------



## Arkanon

After having a dodgy GTX 660, I switched back to the red team. My results after some tinkering so far:




I'll post some links with benchmarks later; too bad that i5 750 is holding back the scores by a lot.


----------



## Devildog83

Quote:


> Originally Posted by *Arkanon*
> 
> After having a dodgy GTX 660, I switched back to the red team. My results after some tinkering so far:
> 
> I'll post some links with benchmarks later; too bad that i5 750 is holding back the scores by a lot.


It looks like a 270X; is that right, and what kind?


----------



## DrClaw

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> It's on the backside of the case.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16811139022


Wow, that's pretty damn cool. I saw the video on Newegg; you have so much space behind the case it's unreal, you can fit your whole hand back there lol. Musta been a quick and easy build with that case :thumb:


----------



## Arkanon

Quote:


> Originally Posted by *Devildog83*
> 
> It looks like a 270X; is that right, and what kind?


Yessir!
MSI R9 270x Gaming 2G
3DMark 11 stable @ 1300MHz core; haven't tried any further yet. Games start crashing at around 1280MHz core. Any tips and tricks to push it a bit further would be welcome.


----------



## JetSet Chilli

Thanks DevilDog


----------



## JetSet Chilli

I have tried the following to unlock the voltage in Afterburner, but still cannot change it.








Uninstalled CCC, the drivers, and Afterburner
Ran Driver Sweeper and cleaned up any leftover driver files
Reinstalled the driver only, plus Afterburner
Amended the AB cfg file to allow overclocking
Changed the settings in the Afterburner UI


----------



## Recr3ational

Quote:


> Originally Posted by *JetSet Chilli*
> 
> I have tried the following to unlock the voltage in Afterburner, but still cannot change it.
> 
> 
> 
> 
> 
> 
> 
> 
> Uninstalled CCC, the drivers, and Afterburner
> Ran Driver Sweeper and cleaned up any leftover driver files
> Reinstalled the driver only, plus Afterburner
> Amended the AB cfg file to allow overclocking
> Changed the settings in the Afterburner UI


Use TriXX. My 280X doesn't work with AB either for some reason, but it works with TriXX.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Use TriXX. My 280X doesn't work with AB either for some reason, but it works with TriXX.


My 7870 only unlocks with AB, and my 270X doesn't unlock at all. I don't understand why they don't just make them all adjustable, at least to a safe overvolt.


----------



## JetSet Chilli

Finally got there with turning on the voltage controls. It's now too late to play around with them though.








It looks like a change was made in a later version of AB (both the live and beta versions).

You have to add the following text (at your own risk, obviously) to your VEN_1002xxxxx.cfg file in the AB Profiles folder:
```
[Settings]
VDDC_IR3567B_Detection = 32h
VDDC_IR3567B_Output = 0
MVDDC_IR3567B_Detection = 32h
MVDDC_IR3567B_Output = 1
[I2C_BUS_06_DEV_32]
```

@Recr3ational - TriXX would only let me undervolt.

This thread has the information and more: http://www.overclock.net/t/1455914/for-those-with-locked-280x-msi-gaming-edition-cards


----------



## orlfman

My MSI R9 270X!


----------



## bajer29

Quote:


> Originally Posted by *[CyGnus]*
> 
> bajer29, GPU-Z reads VRM temps, but not all cards have the sensor for it.


Thank you... it works on my card. For some reason I never thought to look in GPU-Z for temps. Even better, I never thought to scroll down in the sensors tab.


----------



## cremelo

Quote:


> Originally Posted by *ricko99*
> 
> Anyone else having artifacts in Titanfall like mine? ASUS R9 280X OC TOP


I don't play Titanfall, but sometimes the artifacts come back in BF4...
I think it may be the drivers again... Have you updated the GPU's BIOS already?


----------



## cremelo

Quote:


> Originally Posted by *Sozin*
> 
> Anyone have issues with Afterburner/GPUZ recognizing their cards?
> 
> 
> 
> 
> 
> I restarted my computer, loaded up those two programs, and saw that nothing was being monitored. Fresh install of 13.12 drivers and six hours ago everything was being accurately monitored, so I have no idea what has gone wrong.


Download the beta version of MSI Afterburner


----------



## ricko99

Quote:


> Originally Posted by *cremelo*
> 
> I don't play Titanfall, but sometimes the artifacts come back in BF4...
> I think it may be the drivers again... Have you updated the GPU's BIOS already?


My driver is still 13.12 with the latest BIOS. 14.1 causes more artifacts in more games.


----------



## cremelo

Quote:


> Originally Posted by *ricko99*
> 
> My driver is still 13.12 with the latest BIOS. 14.1 causes more artifacts in more games.


Strange... Have you already tried increasing the power target to 110%?
I don't believe the new BIOS fixed everything; yesterday I noticed some artifacts in BF4, went back to stock, restarted my machine, and still saw artifacts in BF4...
But ONLY in BF4.


----------



## Devildog83

*orlfman* has been added. Welcome. What clocks are you running?


----------



## Sozin

Quote:


> Originally Posted by *cremelo*
> 
> Download the beta version of MSI Afterburner


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Sozin*


Have you gone into settings and unlocked voltage? Does the info read it?


----------



## Sozin

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Have you gone into settings and unlocked voltage? Does the info read it?


Yeah that box is checked, nothing changes.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Sozin*
> 
> Yeah that box is checked, nothing changes.


Booo...

When I checked my second xfx 280x I saw the same thing. I pulled it open and found they removed that IC.


----------



## Sozin

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Booo...
> 
> When I checked my second xfx 280x I saw the same thing. I pulled it open and found they removed that IC.


It's weird though. It works sometimes; like last night, I uninstalled 13.12 and installed the 14.1 beta driver and both Afterburner and HWmonitor were reading everything perfectly. I check it this morning and nothing is working.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Sozin*
> 
> It's weird though. It works sometimes; like last night, I uninstalled 13.12 and installed the 14.1 beta driver and both Afterburner and HWmonitor were reading everything perfectly. I check it this morning and nothing is working.


HMMMMM that may need some investigating.


----------



## Sozin

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> HMMMMM that may need some investigating.


Indeed. The card itself is perfectly fine, when I can read the temps and clocks it all appears to be fine, it's just weird that the monitoring is flaky.


----------



## Devildog83

Quote:


> Originally Posted by *Sozin*


At the very bottom here click apply overclocking at start-up and reboot and see if it stays with what you set it at. Can you show a clip of what the Graphics overdrive page looks like in CCC?


----------



## Sozin

Quote:


> Originally Posted by *Devildog83*
> 
> At the very bottom here click apply overclocking at start-up and reboot and see if it stays with what you set it at. Can you show a clip of what the Graphics overdrive page looks like in CCC?


The problem is I can't set it to anything; if I manually input a number it just reverts back to 0. I will get a screenshot of the Overdrive page after it's finished updating. I should also say that GPU-Z recognizes the card and shows all the info, but it also doesn't report any of the statuses.


----------



## Devildog83

I just have some info for anyone who might be interested. I ordered a sleeved 24-pin cable (to match my PCI-E cables) and LED strips for under the edges of my motherboard from modDIY.com, and it was a big mistake. I ordered them Jan 24th and still no word on if or when they are going to ship. I understand the Chinese New Year thing, but holy cow, this is crazy. My step-son ordered a full set of cables from them 6 weeks ago and they are lost and can't be found; they have offered to remake and reship them, but that will take weeks for him to get because they ship from China. Their cables are very nice and I needed to match, but this has been a huge mistake. I do not recommend ever getting anything from them.

They also can't be contacted except through Facebook anymore; they used to have a phone number you could call to order and get info about an order, but not now. This is no way to run a business IMHO, and I will make it a mission to let as many people know as I can.

End of rant.


----------



## Sozin

Updated to Windows 8.1, currently working...


----------



## Recr3ational

Have you guys tried TriXX at all? Cos all these issues with the R9 + Afterburner seem to be consistent.

I had the same issue and TriXX fixed it.


----------



## ricko99

Quote:


> Originally Posted by *cremelo*
> 
> Strange... Have you already tried increasing the power target to 110%?
> I don't believe the new BIOS fixed everything; yesterday I noticed some artifacts in BF4, went back to stock, restarted my machine, and still saw artifacts in BF4...
> But ONLY in BF4.


My power is maxed at 120%. My card is somehow picky about artifacts, as it shows them only in certain games like AC4 and BF4, as well as Titanfall; other games are fine, no problem. Oh yeah, sometimes I also get artifacts when using Chrome.


----------



## Devildog83

*Arkanon* has been added. Welcome!!

If anyone else has been missed post a pic and clock, GPUZ will do, and I will add you.


----------



## Devildog83

Quote:


> Originally Posted by *Sozin*
> 
> Updated to Windows 8.1, currently working...


Good deal, have to remember that. Were you running windows 8.0?

Do you want to be added to the club?


----------



## bartledoo

Can I get informed on what these abbreviations like AB and CCC are? Lol


----------



## Devildog83

Quote:


> Originally Posted by *bartledoo*
> 
> Can someone fill me in on what abbreviations like AB and CCC are? Lol


AB is Afterburner and CCC is Catalyst Control Center.


----------



## cremelo

Quote:


> Originally Posted by *ricko99*
> 
> My power is maxed at 120%. My card is somehow picky about artifacts: it shows them only in certain games, like AC4 and BF4 as well as Titanfall. Other games are fine, no problem. Oh yeah, sometimes I also get artifacts when using Chrome.


I was having the same problem as you.








Now that I have overclocked my 280X they stopped, but the artifacts sometimes come back in BF4 (again, the only game :c ). It looks like the Asus 280X has some problems; maybe a new BIOS (again) from Asus...


----------



## bartledoo

Quote:


> Originally Posted by *Devildog83*
> 
> AB is afterburner and CCC is Catalyst Control Center


Woooooow, I should have known that... brainfart I guess.


----------



## Sozin

Quote:


> Originally Posted by *Devildog83*
> 
> Good deal, have to remember that. Were you running windows 8.0?
> 
> Do you want to be added to the club?


Yes, just updated to 8.1 and glad to see it's still working.

Uh sure, but any idea why GPU-Z thinks I have a 7900 card?


----------



## Recr3ational

Quote:


> Originally Posted by *Sozin*
> 
> Yes, just updated to 8.1 and glad to see it's still working.
> 
> Uh sure, but any idea why GPU-Z thinks I have a 7900 card?


Crossfired?


----------



## Sozin

Quote:


> Originally Posted by *Recr3ational*
> 
> Crossfired?


Not yet; the second 280X should be here Thursday, tomorrow actually.

Actually even CCC thinks I have a 7900 card as well...uh what?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Sozin*
> 
> Not yet, the second 280X should be here Thursday tomorrow actually.
> 
> Actually even CCC thinks I have a 7900 card as well...uh what?


Hmmm... do you have the newest version of GPU-Z?
HWiNFO64 reads mine as a 7900 card but everything else reads it as a 280X.

YAY got my key after all


----------



## Sozin

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Hmmm,,, Do you have the newest version of GPUz?
> Hwinfo64 reads mine as a 7900 card but everything else reads it as a 280x


0.7.7


----------



## Devildog83

Quote:


> Originally Posted by *Sozin*
> 
> 0.7.7


It may be that programs need to be uninstalled, maybe sweep all info from them and reinstall. If these programs were installed before the new card try this and see. They all read the system info when they are installed. It also could be a windows 8 issue.


----------



## Sozin

Quote:


> Originally Posted by *Devildog83*
> 
> It may be that programs need to be uninstalled, maybe sweep all info from them and reinstall. If these programs were installed before the new card try this and see. They all read the system info when they are installed. It also could be a windows 8 issue.


Yeah, I'm going to do a fresh install of Windows 8.1 when I get that second card tomorrow and see what happens. I'll use the 13.12 drivers too.


----------



## bardacuda

Well I went ahead and tried flashing the BIOS on my Gigabyte R9 270. First with a Gigabyte 270X BIOS which I also modified to use 1.250V instead of the stock 1.206V. That didn't work and for about 10 minutes I thought I ruined my card. My system would not boot (and beeped furiously in protest on the first boot attempt), the fans on the card would not spin up, and nothing would show up on my monitor. Even with my monitor plugged in to another known good discrete card (one of my old GTS 450s) or on-board video it wouldn't boot and I got nothing but a black screen. After switching the cards around to different slots a few times I eventually did get it to boot, although the fans on the 270 still wouldn't spin up and it appeared lifeless. Luckily my system detected that I had a device in the slot my 270 was in, and also detected that it was in fact an R9 200 series card...so I was able to reflash the original BIOS. I was having a bit of a freak out for a while though lol.

Now that I knew I could recover from a botched flash, I went ahead and tried it again, this time modifying the original BIOS to use the same 1.250V. It would boot, but as soon as I ran anything that put the card into its performance state (Valley) and it tried to apply the 1.250 volts, my system would crash.

Apparently you can't change the voltage by modifying the BIOS (raising it, anyway; I haven't tried lowering it). I assume it just doesn't physically have the required circuitry. Anywhoo, it is working now at its 1.206 stock voltage, and I was able to adjust the maximum core frequency slider up as high as 1270 rather than the original 1050. It wouldn't go any higher than that, which is fine, because it doesn't seem to be stable with the core much higher than 1150 anyway. I didn't bother trying to adjust the maximum for the memory slider because I've already run into instabilities with memory at 1500, so I left that one alone.

Here's a valley run with the new OC:



EDIT:

My GPU-Z screenie at 1270/1500.


----------



## ARacoma9999

Hi, I've been lurking around for a little while (maybe not as much as I could have, but patience isn't my virtue at the moment haha) and I just wanted to ask a simple question: are there any backplates and cooling blocks already out that fit the MSI Radeon R9 270X 2GB Gaming Edition? Everyone tells me to compare against reference 7870 designs since it's basically the same card, but I can't seem to find a definite answer. If this has been answered already, I apologize. Any help is much appreciated!


----------



## neurotix

Bumping this thread in my list.


----------



## bardacuda

Quote:


> Originally Posted by *DrClaw*
> 
> Anybody here running multiple 270X or 270 configs? How is that going? I heard there are more issues when mining vs using a 280X and up.


Just an update to this: With further tweaking of the settings (and switching to the command line miner instead of guiminer) I am averaging over 465 khash/s and holding steady at 66ºC @ 62% fan speed with this Gigabyte card. It required flashing a modded BIOS to get the core clock but the mem clock is actually below stock. Running at stock voltage (1.206V) but I will probably try to tinker with this some more. Only other voltage I tried was 1.250 and that didn't take. It is able to run valley at 1155 core so I should be able to mine at 1120 at a lower voltage.

It is only the 1 card in my system right now, but I actually just ordered an ASUS 270 so I will update you after that arrives and I get it up and running.


----------



## gnemelf

Guess my max voltage is not quite 1.3V, but this version of the PowerColor is unlocked.


----------



## ignus1212

Hi, I've had a 280X on the 14.1 drivers since they were released, and I got a BSOD today. Does anyone have an idea what could be wrong, or has anyone experienced this before?


----------



## neurotix

http://www.overclock.net/a/common-bsod-error-code-list-for-overclocking

It says 7E is a corrupted file; run sfc /scannow and the other stuff in that link.

Might not be related, but it's worth a shot.


----------



## Grzesiek1010

Hi, I have a problem with my MSI R9 270X 2GB Gaming.

My card won't OC :/
I've tried everything: MSI Afterburner, Sapphire TriXX, AMD CCC. I want 1200/1550 but the card crashes; I added +20 power and it still crashes, on driver 13.12.
The card has the new BIOS with 13sb.

Any solution?


----------



## ignus1212

Thanks a lot. I didn't find a corrupted file, but I dug a bit into the STOP code when you pointed it out, and I guess it's a Windows 7 problem.

Hotfix for it that I found:

http://support.microsoft.com/kb/983615


----------



## [CyGnus]

bardacuda, you have to try different core clock values; my 280X mines better at 1050 vs 1100/1150. And you have to play a bit with your .bat file; finding the best values is kind of tricky.


----------



## cremelo

Anyone with an Asus R9 280X DC2 Top on max vcore??? How much OC?
I want to make 1200/6800 at 1.3V.


----------



## bardacuda

Quote:


> Originally Posted by *[CyGnus]*
> 
> bardacuda you have to try different core clock values my 280X mines better at 1050 vs 1100/1150 and you have to play a bit with your bat file finding the best values is kind of tricky


Trust me...I've played with it...and it didn't take me long to realize that there are so many possible combinations of clock speeds, thread concurrency, gpu threads, worksize, intensity, etc. that I could sit here till I'm an old man and never find the optimal config. This is just the best one I've found so far.







I've seen claims from people that they get ~500kh/s using a core freq of 970 and lower thread concurrency but didn't provide all of the parameters for me to test it...and also I think the memory was over 1500 for that config and I already get screw ups if I go over 1425. If you know of a better config for a 270/270X/7870 that uses core and mem speeds below 1120 and 1400 I'd be more than happy to try it out.


----------



## Devildog83

Quote:


> Originally Posted by *bardacuda*
> 
> Trust me...I've played with it...and it didn't take me long to realize that there are so many possible combinations of clock speeds, thread concurrency, gpu threads, worksize, intensity, etc. that I could sit here till I'm an old man and never find the optimal config. This is just the best one I've found so far.
> 
> 
> 
> 
> 
> 
> 
> I've seen claims from people that they get ~500kh/s using a core freq of 970 and lower thread concurrency but didn't provide all of the parameters for me to test it...and also I think the memory was over 1500 for that config and I already get screw ups if I go over 1425. If you know of a better config for a 270/270X/7870 that uses core and mem speeds below 1120 and 1400 I'd be more than happy to try it out.


I was told that our cards will perform best at 1050/1375. I will try and find the post.


----------



## KingT

I've recently bought a Powercolor R9 270X PCS+ (stock 1100/1425MHz), I use it strictly for mining.

Now I have a few questions:

1) At stock clocks (1100/1425MHz) it hashes really poorly, below 380 khash/s. If I lower the core and memory to 975/1250 I get ~410 khash/s.
The highest I was able to get was 450 khash/s with the core underclocked to 1070 and the memory at 1475MHz.

If *I leave the core at 975 or 1000* and bump the memory from 1250 to 1300 I get fewer khash/s, which really doesn't make any sense; moving up to 1350 I still get fewer khash/s than with the memory @ 1250MHz.
With the memory @ 1400 I get around the same khash/s as with 1250MHz on the memory.

I know the memory is stable up to 1500MHz and the core up to 1150MHz (tested in many games), but in mining a higher memory frequency still does not yield more khash/s.

2) Is there a way to gain voltage control on these cards? It has an International Rectifier IR3567B voltage controller that supports I2C for voltage monitoring/adjustment, but Afterburner does not let me change the voltage.

3) What cgminer 3.7.2 settings for intensity, tc, and worksize are you guys using? My best result is with I 18, tc 14336, worksize 256, g 1, and I get ~410 khash/s at 975/1250MHz.

CHEERS.


----------



## bardacuda

Quote:


> Originally Posted by *Grzesiek1010*
> 
> Hi, I have a problem with my MSI R9 270X 2GB Gaming.
> 
> My card won't OC :/
> I've tried everything: MSI Afterburner, Sapphire TriXX, AMD CCC. I want 1200/1550 but the card crashes; I added +20 power and it still crashes, on driver 13.12.
> The card has the new BIOS with 13sb.
> 
> Any solution?


What voltage are you using? Are you able to adjust the voltage? Have you tried lowering the memory clock? Try lowering the memory clock. What does 'have new BIOS with 13sb' mean?


----------



## Devildog83

Quote:


> Originally Posted by *KingT*
> 
> I've recently bought a Powercolor R9 270X PCS+ (stock 1100/1425MHz), I use it strictly for mining.
> 
> Now I have a few questions:
> 
> 1) At stock clocks (1100/1425MHz) it hashes really poorly, below 380 khash/s. If I lower the core and memory to 975/1250 I get ~410 khash/s.
> The highest I was able to get was 450 khash/s with the core underclocked to 1070 and the memory at 1475MHz.
> 
> If *I leave the core at 975 or 1000* and bump the memory from 1250 to 1300 I get fewer khash/s, which really doesn't make any sense; moving up to 1350 I still get fewer khash/s than with the memory @ 1250MHz.
> With the memory @ 1400 I get around the same khash/s as with 1250MHz on the memory.
> 
> I know the memory is stable up to 1500MHz and the core up to 1150MHz (tested in many games), but in mining a higher memory frequency still does not yield more khash/s.
> 
> 2) Is there a way to gain voltage control on these cards? It has an International Rectifier IR3567B voltage controller that supports I2C for voltage monitoring/adjustment, but Afterburner does not let me change the voltage.
> 
> 3) What cgminer 3.7.2 settings for intensity, tc, and worksize are you guys using? My best result is with I 18, tc 14336, worksize 256, g 1, and I get ~410 khash/s at 975/1250MHz.
> 
> CHEERS.


Have you tried dropping the intensity some?


----------



## KingT

I have tried everything, but this card seems to like certain core/mem ratios.

But it's absurd that moving the memory from 1250MHz to 1275, 1300, 1325, 1350, or 1375 gives me a ~50 khash/s drop; at 1400MHz I get 10 khash/s less than I get with 1250MHz.

CHEERS..


----------



## bardacuda

Quote:


> Originally Posted by *KingT*
> 
> I've recently bought a Powercolor R9 270X PCS+ (stock 1100/1425MHz), I use it strictly for mining.
> 
> Now I have a few questions:
> 
> 1) At stock clocks (1100/1425MHz) it hashes really poorly, below 380 khash/s. If I lower the core and memory to 975/1250 I get ~410 khash/s.
> The highest I was able to get was 450 khash/s with the core underclocked to 1070 and the memory at 1475MHz.
> 
> If *I leave the core at 975 or 1000* and bump the memory from 1250 to 1300 I get fewer khash/s, which really doesn't make any sense; moving up to 1350 I still get fewer khash/s than with the memory @ 1250MHz.
> With the memory @ 1400 I get around the same khash/s as with 1250MHz on the memory.
> 
> I know the memory is stable up to 1500MHz and the core up to 1150MHz (tested in many games), but in mining a higher memory frequency still does not yield more khash/s.
> 
> 2) Is there a way to gain voltage control on these cards? It has an International Rectifier IR3567B voltage controller that supports I2C for voltage monitoring/adjustment, but Afterburner does not let me change the voltage.
> 
> 3) What cgminer 3.7.2 settings for intensity, tc, and worksize are you guys using? My best result is with I 18, tc 14336, worksize 256, g 1, and I get ~410 khash/s at 975/1250MHz.
> 
> CHEERS.


Sry for multiple posts but I am just seeing this after posting my last..uhh..post.

I can tell you already that mining is frickin' weird when it comes to clocks. Higher clocks do not necessarily get you a higher hash rate. You have to find the *right* clocks. Having said that the best config I've found so far is using this command at the beginning of the batch file:

setx GPU_MAX_ALLOC_PERCENT 100

and then for flags:

-w 256 -v 1 -I 19 -g 1 -l 1 --gpu-engine 1120 --gpu-memclock 1250 --gpu-powertune 20 --thread-concurrency 11263

I've only been at this for a few days though so I'm sure there are better configs out there that I just have yet to find. If you find a better one please let me know.
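(If anyone wants to script sweeps over these settings, here's a rough Python sketch that just assembles a cgminer command line from a dict so you can generate different clock/concurrency combos quickly. The pool URL and worker name are placeholders, not a real pool; the flags are the ones from my batch file above.)

```python
# Build a cgminer scrypt command line from a settings dict so different
# clock/concurrency combos can be generated and compared quickly.
# The pool URL and credentials below are placeholders, not a real pool.

def build_cgminer_cmd(settings, pool="stratum+tcp://pool.example.com:3333",
                      user="worker", password="x"):
    # Map descriptive setting names to their cgminer flags.
    flags = {
        "worksize": "-w",
        "vectors": "-v",
        "intensity": "-I",
        "gpu_threads": "-g",
        "lookup_gap": "-l",
        "gpu_engine": "--gpu-engine",
        "gpu_memclock": "--gpu-memclock",
        "gpu_powertune": "--gpu-powertune",
        "thread_concurrency": "--thread-concurrency",
    }
    parts = ["cgminer", "--scrypt", "-o", pool, "-u", user, "-p", password]
    for key, flag in flags.items():
        if key in settings:
            parts += [flag, str(settings[key])]
    return " ".join(parts)

# The config from the batch file above.
config = {
    "worksize": 256, "vectors": 1, "intensity": 19, "gpu_threads": 1,
    "lookup_gap": 1, "gpu_engine": 1120, "gpu_memclock": 1250,
    "gpu_powertune": 20, "thread_concurrency": 11263,
}
print(build_cgminer_cmd(config))
```

Then you can loop over a list of configs and write each one out to its own .bat file instead of editing flags by hand.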


----------



## F3ERS 2 ASH3S

I have noticed that each card seems to behave slightly differently; I don't think there is a one-size-fits-all rule here with mining.


----------



## KingT

Yeah, but it's ridiculous that the card @ 1000/1250MHz gives exactly the same khash/s as @ 1000/1425MHz.

@bardacuda

A tc lower than 12288 causes HW errors on my card.

CHEERS..


----------



## bardacuda

Hmm...well I was running a config using 21000-something TC which was almost as good but I've since ditched it...I'll see if I can dig it up again


----------



## Grzesiek1010

Quote:


> Originally Posted by *bardacuda*
> 
> What voltage are you using? Are you able to adjust the voltage? Have you tried lowering the memory clock? Try lowering the memory clock. What does 'have new BIOS with 13sb' mean?




my setup


----------



## [CyGnus]

Try --gpu-powertune 0; the less you use here, the better. Try it.







I am doing this with my 280X. I will add a 270X to my rig, then I will build dedicated rigs with four 270Xs each.


----------



## bardacuda

Quote:


> Originally Posted by *KingT*
> 
> Yeah, but it's ridiculous that the card @ 1000/1250MHz gives exactly the same khash/s as @ 1000/1425MHz.
> 
> @bardacuda
> 
> tc lower than 12288 causes HW errors on my card.
> 
> CHEERS..


Ok so here's what I get with:

-w 256 -v 1 -I 19 -g 1 -l 1 --thread-concurrency 21568 --gpu-engine 1080 --gpu-memclock 1250 --gpu-powertune 0



If your card can sustain the clocks, try this:

http://www.reddit.com/r/dogemining/comments/1xhv01/r9_270_optimal_cgminer_config/

Quote:


> Originally Posted by *Grzesiek1010*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> my setup


What are the highest stable numbers you have found so far? Find that first and go from there. I'm thinking the core clocks might be okay, but I would try lowering the mem clock: either go back to stock 1400 and work your way up, or lower it to 1500 and, if it is still unstable, work your way down. Put your core clock back to stock while you find your stable mem clock. Once you do, leave it there and then work on finding your stable core clock. I'm not sure what the stock voltage is, so 'stock + 50mv' is also unhelpful. Still, I would think 1200 core would be stable, or close to it.

Quote:


> Originally Posted by *[CyGnus]*
> 
> Try --gpu-powertune 0; the less you use here, the better. Try it.
> 
> 
> 
> 
> 
> 
> 
> I am doing this with my 280X. I will add a 270X to my rig, then I will build dedicated rigs with four 270Xs each.


Good...I'll let you do the experimenting and steal your settings afterwards.


----------



## lombardsoup

What's the highest core clock people have been able to get with the 270 without voltage increases? Would rather not burn my card as my budget is extremely limited.


----------



## KingT

Quote:


> Originally Posted by *bardacuda*
> 
> Ok so here's what I get with:
> 
> -w 256 -v 1 -I 19 -g 1 -l 1 --thread-concurrency 21568 --gpu-engine 1080 --gpu-memclock 1250 --gpu-powertune 0


Sorry I only get 418 khash/s with those settings.

I get max 445 khash/s with 1070/1490MHz, I 19 and 21568 tc.

CHEERS..


----------



## Northern Isnlad

Yep. How about HIS? I'm getting the HIS 280X for 21,000.


----------



## ozzy1925

Guys, I bought a Sapphire R9 270X Toxic for mining today. Is it correct that the 270X Toxic doesn't show VRM temps? Also, I can't unlock voltage control under MSI Afterburner, but I see VDDC under the TriXX software; is that the voltage setting?


----------



## Devildog83

Quote:


> Originally Posted by *lombardsoup*
> 
> What's the highest core clock people have been able to get with the 270 without voltage increases? Would rather not burn my card as my budget is extremely limited.


I would recommend increasing by 10 MHz at a time and running something like Unigine Valley on HD/Extreme or 3DMark Vantage until it gets unstable, then backing off 10 or 15 MHz. Then you can increase the memory until you stop getting better results. Please keep an eye on the core temp as you do this, as you may have to adjust the fan profile to keep it cool. Increasing memory will at some point stop helping. Another note: a lot of cards will stay stable with lower memory and a bit higher core speeds and get better performance. Your card will let you know when it's too high; it will flicker or even freeze when unstable and under load. Again, keep a close eye on temps.
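That loop can be sketched in Python. The stability test here is mocked with an assumed limit; a real run would launch Valley or Vantage at each clock and watch for flicker or freezes, so treat this as a rough illustration of the procedure only.

```python
# Sketch of the "raise by 10 MHz, back off when unstable" approach.
# is_stable() is a stand-in for running a benchmark and watching for
# artifacts; here it's mocked with an assumed stability limit.

def find_max_core(start_mhz, step=10, backoff=15, limit=1150):
    def is_stable(mhz):
        # Placeholder: a real test would run Valley/Vantage at this clock
        # and report whether the run completed without artifacts.
        return mhz <= limit

    clock = start_mhz
    while is_stable(clock + step):
        clock += step          # raise the core 10 MHz at a time
    return clock - backoff     # back off once instability appears

print(find_max_core(1050))     # prints 1135 with the mocked 1150 limit
```

The same loop works for the memory clock afterwards; just keep the core fixed at its found value while you step the memory.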


----------



## bardacuda

@KingT did you try 24000, 1150, 1500 like in the link? I get pretty high rates with that but my card can't sustain the clocks. Also it seems there is a sweet spot for your core speed for a given memory speed. For example with the last settings I posted I got as high as 458kh/s by adjusting the core clock up to 1100. Any 1MHz change after that (1099 or 1101) and my rate dropped to 455 or less...with the higher clocks (1101+) giving me even more significant drops than the lower clocks (1099 and less).
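(If you want to automate that kind of fine-grained hunt, here's a toy Python sketch. The hash-rate function is mocked with an assumed peak at 1100, since the real number would come from cgminer's average rate, so it only illustrates the sweep logic.)

```python
# Brute-force sweep of core clocks around a center to find the
# hash-rate "sweet spot". measured_khs() is a mock with an assumed
# peak at 1100 MHz; in practice you'd read cgminer's average rate.

def measured_khs(core_mhz, peak=1100):
    # Sharp peak: the rate falls off faster above the peak than below,
    # mimicking the 1 MHz sensitivity described above.
    delta = core_mhz - peak
    penalty = 3 * delta if delta > 0 else -delta
    return 458 - penalty

def sweep(center, radius=10):
    clocks = range(center - radius, center + radius + 1)
    return max(clocks, key=measured_khs)   # clock with the best rate

best = sweep(1100)
print(best, measured_khs(best))   # 1100 458
```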
Quote:


> Originally Posted by *lombardsoup*
> 
> What's the highest core clock people have been able to get with the 270 without voltage increases? Would rather not burn my card as my budget is extremely limited.


I was able to do a Valley run at 1155/1500 with no voltage increase, but it wasn't exactly stable. My stock voltage is 1.206. I would guess somewhere around 1125 would be completely stable. I've already run into instabilities on the memory at 1450+ though, so I'm just going to leave that at 1400 or less for 24/7. Each card is different though.


----------



## bajer29

Quote:


> Originally Posted by *ozzy1925*
> 
> Guys, I bought a Sapphire R9 270X Toxic for mining today. Is it correct that the 270X Toxic doesn't show VRM temps? Also, I can't unlock voltage control under MSI Afterburner, but I see VDDC under the TriXX software; is that the voltage setting?


As I was reminded earlier, use GPUZ to check VRM temps. If it doesn't show up there the card probably doesn't support it.

Make a screen cap of the settings you're talking about in Trixx. I do believe that vddc is the slider for memory voltage.

Check this thread out: http://www.overclock.net/t/974626/what-is-vddc


----------



## ozzy1925

Quote:


> Originally Posted by *bajer29*
> 
> As I was reminded earlier, use GPUZ to check VRM temps. If it doesn't show up there the card probably doesn't support it.
> 
> Make a screen cap of the settings you're talking about in Trixx. I do believe that vddc is the slider for memory voltage.
> 
> Check this thread out: http://www.overclock.net/t/974626/what-is-vddc


here is the screenshot:

I bought it for mining. Should I return this card and get another brand for mining?


----------



## lombardsoup

Quote:


> Originally Posted by *bardacuda*
> 
> I was able to do a Valley run at 1155/1500 with no voltage increase, but it wasn't exactly stable. My stock voltage is 1.206. I would guess somewhere around 1125 would be completely stable. I've already ran into instabilities on the memory at 1450+ though, so I'm just going to leave that at 1400 or less for 24/7. Each card is different though.


HWMonitor and GPUZ are reporting 1.188v stock for my card. Haven't tried 1155 core yet but 1100 is so far giving me no issues. Will probably leave memory alone as I'm not seeing huge gains like I have with core oc.


----------



## bajer29

Quote:


> Originally Posted by *ozzy1925*
> 
> here is the screenshot:
> 
> I bought it for mining. Should I return this card and get another brand for mining?


If "GPU voltage" was shown instead or in addition to "VDDC voltage" slider, it would mean GPU core voltage is unlocked. Unfortunately it doesn't look that way. Try scrolling up or down in the Trixx overclocking window. Maybe it's just not shown in your screenshot.

I know nothing about mining so I can't tell you whether to keep it or not.


----------



## Devildog83

Quote:


> Originally Posted by *ozzy1925*
> 
> here is the screenshot:
> 
> I bought it for mining. Should I return this card and get another brand for mining?


Quote:


> Originally Posted by *ozzy1925*
> 
> Guys, I bought a Sapphire R9 270X Toxic for mining today. Is it correct that the 270X Toxic doesn't show VRM temps? Also, I can't unlock voltage control under MSI Afterburner, but I see VDDC under the TriXX software; is that the voltage setting?


You don't need overclocking for mining. In fact most of the time the best results are from underclocking.


----------



## ozzy1925

Quote:


> Originally Posted by *bajer29*
> 
> If "GPU voltage" was shown instead or in addition to "VDDC voltage" slider, it would mean GPU core voltage is unlocked. Unfortunately it doesn't look that way. Try scrolling up or down in the Trixx overclocking window. Maybe it's just not shown in your screenshot.
> 
> I know nothing about mining so I can't tell you whether to keep it or not.


It goes down to 875 and maxes out at 1219.


----------



## bardacuda

@ozzy1925 The fact that you can adjust the voltage down, and slide core and mem both up or down, without modding the BIOS would make that a great card for mining IMHO. You will probably end up turning the voltage and core down to get your best results, and memory prob won't move far if at all. BTW I'm 99.5% sure that voltage slider is for core. If you have adjustable memory voltage and not core voltage that's the weirdest card I've ever seen before. Normally you'd have to get out the soldering iron and the circuit diagrams and do some hard modding to be able to adjust the memory voltage.


----------



## ozzy1925

Quote:


> Originally Posted by *Devildog83*
> 
> You don't need overclocking for mining. In fact most of the time the best results are from underclocking.


Quote:


> Originally Posted by *bardacuda*
> 
> @ozzy1925 The fact that you can adjust the voltage down, and slide core and mem both up or down, without modding the BIOS would make that a great card for mining IMHO. You will probably end up turning the voltage and core down to get your best results, and memory prob won't move far if at all. BTW I'm 99.5% sure that voltage slider is for core. If you have adjustable memory voltage and not core voltage that's the weirdest card I've ever seen before. Normally you'd have to get out the soldering iron and the circuit diagrams and do some hard modding to be able to adjust the memory voltage.


I really appreciate your help. I really love this card (also 2x 290 Tri-X OC on the way) and don't want to return it.


----------



## ARacoma9999

Quote:


> Originally Posted by *Grzesiek1010*
> 
> Hi, I have a problem with my MSI R9 270X 2GB Gaming.
> 
> My card won't OC :/
> I've tried everything: MSI Afterburner, Sapphire TriXX, AMD CCC. I want 1200/1550 but the card crashes; I added +20 power and it still crashes, on driver 13.12.
> The card has the new BIOS with 13sb.
> 
> Any solution?


Try using MSI Afterburner to OC as opposed to CCC. There's a 3.0 beta you can download from guru3d.com; I've been using it for a while. The max you can OC this card without crashes is 1175/1400, I believe. I have the same card. Also, there's a new version of CCC out, 14.1; go and update that. But as for OCing, use MSI Afterburner only.


----------



## F3ERS 2 ASH3S

So to test Mantle in BF4 I have to turn off CrossFire, correct?


----------



## [CyGnus]

Any idea on the R9 280 release date? Prices?


----------



## Archea47

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> So to test Mantle in BF4 I have to turn off CrossFire, correct?


Currently, yep. Do the usual procedure of installing the 14.1 drivers, turn off crossfire and under Video in BF4 change DX11 to Mantle (and restart BF4)

It _works_ with crossfire enabled but after a short amount of time it starts to hang for a second every 5-30 seconds


----------



## Devildog83

Quote:


> Originally Posted by *[CyGnus]*
> 
> Any idea on the R9 280 release date? Prices?


Don't know yet.

Got a new high in 3DMark11. If I had the guts to push 5.0 GHz on the CPU I might get P14,000+.



http://www.3dmark.com/3dm11/8000214

#1 again

www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/3dm11/P/1541/765/500000?minScore=0&cpuName=AMD%20FX-8350&gpuName=AMD%20Radeon%20HD%207870


----------



## F3ERS 2 ASH3S

Ah thank you


----------



## bardacuda

Quote:


> Originally Posted by *Devildog83*
> 
> Don't know yet.
> 
> Got a new high in 3DMark11 - If I had the guts to push 5.0 Ghz on the CPU I might get P14,000 +.
> 
> www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/3dm11/P/1541/765/500000?minScore=0&cpuName=AMD%20FX-8350&gpuName=AMD%20Radeon%20HD%207870
> 
> #1 again


Nice! Wow holy 8350s! I take it 3DMark11 favors physical core count. Scrolled through the whole list of the first 1000 scores and didn't see 1 Intel on there. Most of it was a blur though lol


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *bardacuda*
> 
> Nice! Wow holy 8350s! I take it 3DMark11 favors physical core count. Scrolled through the whole list of the first 1000 scores and didn't see 1 Intel on there. Most of it was a blur though lol


Incorrect assumption; Intels fare better on the physics due to hyperthreading.

That list is just similar scores grouped together; Intel physics is normally about 2k higher.


----------



## Devildog83

Quote:


> Originally Posted by *bardacuda*
> 
> Nice! Wow holy 8350s! I take it 3DMark11 favors physical core count. Scrolled through the whole list of the first 1000 scores and didn't see 1 Intel on there. Most of it was a blur though lol


It was 8350's only.


----------



## bardacuda

Ahh okay makes sense now lol









A durrrrrrr it has the search parameters at the top of the page. I must be goin' blind hahaha. Sittin here playin' with my hardware too much apparently. hahahah


----------



## Devildog83

This is the highest score for any CPU with 2x 7870's - Notice I beat his graphics score by 1,200 points. AMD doesn't really do physics as well as Intel so I don't pay that much attention to it.


----------



## [CyGnus]

What RAM do you have? Fast RAM will bump those physics scores; try a 2400 C9 or 2600 C10 kit.







My physics score with 2400MHz RAM and a 4.6GHz 4770K is almost 13k.
Score: http://www.3dmark.com/3dm11/7656094


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> This is the highest score for any CPU with 2x 7870's - Notice I beat his graphics score by 1,200 points. AMD doesn't really do physics as well as Intel so I don't pay that much attention to it.


Superb score. You are nearing my dual 280x score


----------



## bajer29

Quote:


> Originally Posted by *Archea47*
> 
> Currently, yep. Do the usual procedure of installing the 14.1 drivers, turn off crossfire and under Video in BF4 change DX11 to Mantle (and restart BF4)
> 
> It _works_ with crossfire enabled but after a short amount of time it starts to hang for a second every 5-30 seconds


It hangs with single card for me... Runs flawlessly with DX11 enabled instead.


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Superb score. You are nearing my dual 280x score


That score is not mine; it's the highest of any CPU, a 4770K. I have TridentX 2400 running at 2200+; I think it's CAS 12 though. By the way, I got over a 20,000 graphics score with tessellation off.


----------



## bardacuda

Wow, how annoying... I can't seem to figure out how to sort search results by subscore (graphics). This guy has a 17747 graphics score, but there's still an almost 700 point difference. He only had an 1185 core clock.

http://www.3dmark.com/3dm11/7413792

I've never used 3DMark11, only "3DMark" (13?) with the ice/fire/cloud benches, and even that is broken for me right now. When I get my other 270 I'll have to try and beat you







But I'm sure I won't since I can only do about 1150 core.


----------



## Devildog83

I don't like running 3DMark because it runs all 3 tests, at least until I buy the pro version and can run them separately. The last time I did, I got nearly 13,000 graphics, but again the combined and physics scores can't match the very expensive Intel chips.


----------



## bardacuda

Just noticed my ASUS 270 shipped today and will be here tomorrow. Anybody know about running CrossFire on a board with an nForce chipset? I guess it's possible with hacked drivers or something? Any info would be appreciated. I just spent over $500 on video cards, so I don't think I'll be upgrading my platform too soon. Not to mention I'm not interested in doing so until I see some Excavator and Skylake benchmarks.


----------



## orlfman

Quote:


> Originally Posted by *bardacuda*
> 
> Just noticed my ASUS 270 shipped today and will be here tomorrow. Anybody know about running crossfire on a board with an nForce chipset? I guess it's possible with hacked drivers or something? Any info would be appreciated. I just spent over $500 on video cards so I don't think I'll be upgrading my platform too soon. Not too mention I'm not interested in doing so until I see some Excavator and Skylake benchmarks.


You'll most likely have to get a new motherboard that supports CrossFire. There doesn't appear to be any way to enable CrossFire on your NF980-G65; it seems to be SLI-only, with no driver support for CrossFire.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *orlfman*
> 
> You'll most likely have to get a new motherboard that supports crossfire.... There doesn't appear to be anyway to enable crossfire on your NF980-G65... It seems to be SLI only with no driver support for crossfire..


^this

I had to upgrade from an 880 chipset to a 990 chipset when I wanted to do SLI due to the licensing that nVidia had. I believe it's the same the other way around.


----------



## dmfree88

Quote:


> Originally Posted by *KingT*
> 
> I've recently bought a Powercolor R9 270X PCS+ (stock 1100/1425MHz), I use it strictly for mining.
> 
> Now I have a few questions:
> 
> 1) At stock clocks of 1100/1425MHz it hashes really poorly, below 380 khash/s. If I lower core and memory speed to 975/1250 I get ~410 khash/s.
> The highest I was able to get was 450 khash/s with the core underclocked to 1070 and the memory at 1475MHz.
> 
> If *I leave the core at 975 or 1000* and up the memory from 1250 to 1300 I get fewer khash/s, which really doesn't make any sense; moving up to 1350 I still get fewer khash/s than with memory @ 1250MHz.
> With memory @ 1400 I get around the same khash/s as with 1250MHz on the memory.
> 
> I know that memory is stable up to 1500MHz and core 1150MHz, tested it in many games but still in mining high memory freq does not yield me more khash/s.
> 
> 2) Is there a way to gain voltage control on these cards? It has an International Rectifier IR3567B voltage controller that supports I2C for voltage monitoring/adjustment, but Afterburner does not let me change the voltage.
> 
> 3) What settings for cgminer 3.7.2 (intensity, tc, worksize) are you guys using? My best result is with I 18, tc 14336, worksize 256, g 1, and I get ~410 khash/s with 975/1250MHz.
> 
> CHEERS.


Sorry for the late reply, but check my guide in my sig with Kalroth's build; you should get about 490 kh/s. Using xintensity 4, gpu-threads 2, tc 8193 and the clocks posted in my guide, I'm at 505 kh/s.
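Those flags map onto a cgminer command line roughly like this. This is only a sketch: the pool URL and worker credentials are placeholders, and the clocks here are example values, not the exact ones from my guide, so tune everything for your own card.

```shell
# Sketch of a cgminer 3.7.2 scrypt launch with the settings quoted above:
# xintensity 4, gpu-threads 2, thread-concurrency 8193.
# stratum+tcp://pool.example.com:3333 and worker.1/x are placeholders.
cgminer --scrypt \
  -o stratum+tcp://pool.example.com:3333 -u worker.1 -p x \
  --xintensity 4 --gpu-threads 2 --thread-concurrency 8193 \
  --gpu-engine 1050 --gpu-memclock 1450 --worksize 256
```

Delete the old .bin kernel files after changing thread-concurrency so cgminer rebuilds them.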

If you really want to make money right now, check us.trademybit.com; registration is open again because there's now a scrypt-N profit pool. If you use my other guide for vertminer you can get 245 kh/s in scrypt-N and currently make much more profit off of Panda and VTC. I'm not at my PC, but google "thekev dmfree88 guide" and my bitcointalk post comes up and will get you started. It's a different miner, but it may be the next step for most coins, as it's ASIC-resistant.

Either way, xintensity or rawintensity is the way to go. I will be updating my guide soon with more info about the latest release of sgminer, as well as the latest tips I've learned.

About your clocks though, it is true that lower clocks sometimes work better. The card is at full power constantly, so that's not really a bad thing... sometimes increasing the core lets you raise the memory further; they usually have to be tinkered with together. It can even be the opposite: lower the core and the memory can go up. It's a pain, but lower power with similar hashrates is better. Better to be at 990 core / 1360 mem at 440 kh/s than 1250/1500 at 450 kh/s (just random examples); the extra is likely costing more than it's making.

Random side thought: would using the optimum mining clocks show improvements elsewhere, like gaming? Maybe not higher fps, due to the lower clocks, but maybe smoother? Fewer artifacts, less tearing? Just a thought.


----------



## KingT

Quote:


> Originally Posted by *dmfree88*
> 
> Sorry late reply but check my guide in sig with kalroths you should get about 490kh/s. Using xintensity 4 gpu threads 2 tc 8193 and the clocks posted in my guide im at 505kh/s
> 
> If u really want to make money right now check us.trademybit.com reg is open again necause theres now a scrypt-n profit pool. If you use my other guide for vertminer you can get 245kh in scrypt-n and make much more profit currently off of panda and vtc. Im not at my pc but google "thekev dmfree88 guide" and my bitcointalk post comes up and will get u started there. Its a different miner but may be the next step for most coins as its asic resistant.
> 
> Either way xintensity or rawintensity is the way to go. Will be updating my guide soon with more info about the latest release of sgminer aswell as latest tips ive learned.
> 
> About your clocks tho it is true lower clocks sometimes work better. Its on full power constantly so thats not really a bad thing... sometimes increasing core allows you to increase mem further they both have to be tinkered same time usually. It can even be opposite lower core and mem can go up.. Its a pain but lower power with similar hashrates is better. Better to be at 990 core 1360 mem at 440kh rather then 1250/1500 at 450kh (just random examples).. likely costing more then its making extra.
> 
> Random side thought: would using the optimum mining clocks show improvements elsewhere like gaming? Maybe not higher fps due to lower clocks usually but maybe smoother? Less artifacts/tearing? Just a thought.


Thanks for your effort, but I have already tried that cgminer mod. It doesn't get me much higher khash/s (around 20 khash/s extra), and for me it's not that consistent: sometimes it starts great, sometimes it starts poorly (low khash/s).

Rep+

CHEERS..


----------



## bardacuda

Quote:


> Originally Posted by *dmfree88*
> 
> Sorry late reply but check my guide in sig with kalroths you should get about 490kh/s. Using xintensity 4 gpu threads 2 tc 8193 and the clocks posted in my guide im at 505kh/s


Nice! Tried the exact same settings (except worksize, because multiples of 48 give me very low rates) and clocks and was also getting over 500 kh/s. Those clocks aren't sustainable for me though, but with my clocks of 1100/1250 I am getting over 475 kh/s. Best config I've found so far. Also, it doesn't make my display all choppy and artifact-y, and I can still browse and stuff almost like normal. +REP here too


----------



## bajer29

I take it I'm the only one getting annoyed by all the talk about mining in this thread... DevilDog, should this be the place to talk about mining? A lot of the questions are OT as far as the hardware goes, and there's a lot of mining talk flooding this owners' club. I feel like more of these questions could be answered correctly and more quickly in an official R9 2XX Mining Questions thread.

Just a thought.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *bajer29*
> 
> I take it I'm the only one getting annoyed of all the talk about mining in this thread... DevilDog, should this be the place to talk about mining? A lot of the questions are OT as far as the hardware goes, but there's a lot of mining talk flooding this owner's club. I feel like more of these questions could be answered correctly and quicker in an official R9 2XX Mining Questions Thread.
> 
> Just a thought.


Throw the link out there









===================================================================================

Man, so Mantle drops my highest frames but steadies them. A lot smoother gameplay in BF4. I am happy, THANK YOU AMD.

My KDR went up just because of it.


----------



## Recr3ational

Quote:


> Originally Posted by *bajer29*
> 
> I take it I'm the only one getting annoyed of all the talk about mining in this thread... DevilDog, should this be the place to talk about mining? A lot of the questions are OT as far as the hardware goes, but there's a lot of mining talk flooding this owner's club. I feel like more of these questions could be answered correctly and quicker in an official R9 2XX Mining Questions Thread.
> 
> Just a thought.


Me too. I think it's just because I have no idea what the hell they're on about, so I have nothing to talk about lol.
I don't mind OT, I'd just like it if the topics were varied.


----------



## bajer29

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Throw the link out there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ===================================================================================
> 
> Man so Mantle drops my highest frams but steadies it, A lot smoother game play in BF4 I am happy THANK YOU AMD
> 
> KDR went up just because of it.


Haha, there is no link. It's more or less a suggestion to get a thread started. Since I have no interest at all in mining, and never intend to mine, I shouldn't be the one to start it.


----------



## Devildog83

Quote:


> Originally Posted by *bajer29*
> 
> I take it I'm the only one getting annoyed of all the talk about mining in this thread... DevilDog, should this be the place to talk about mining? A lot of the questions are OT as far as the hardware goes, but there's a lot of mining talk flooding this owner's club. I feel like more of these questions could be answered correctly and quicker in an official R9 2XX Mining Questions Thread.
> 
> Just a thought.


I understand; I don't mine either, although I did try it. The thing is that this is a thread that covers R9 cards, and topics like mining are directly related to the cards we cover here. Like it or not, the R9 cards are the best for mining, and that makes it relevant; plus, it affects the cost of these cards, which is a huge issue. The members should be able to discuss anything related to the cards we cover, not just what we want them to.

Thank you for your input though.

DD


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I understand, I don't mine either although I did try it. The thing is that this is a thread that covers R9 cards, topics like mining are directly related to the the cards we have in this thread. Like it or not the R9 cards are the best for mining and it makes it relevant, plus it affects the cost of these cards which is a huge issue. The members should be able to discuss anything related to the cards we cover and not just what we want them too.
> 
> Thank you for your input though.
> 
> DD


That's the only thing I care about, tbh. I don't mind miners; they do what they want. It's just the pricing that affects me.


----------



## Devildog83

What I would really like to hear about is how HSA is working out with the new AMD APU's and the R7 and R9 cards. If anyone has any info on that it would be nice. Thanks.


----------



## dmfree88

Quote:


> Originally Posted by *KingT*
> 
> Thanx for your effort but I have already tried that cgminer mod, doesn't get me a much higher khash/s ( around 20 khash/s extra) but for me it's not that constant, sometimes it starts great, sometimes it starts poor (get low khash/s).
> 
> Rep+
> 
> CHEERS..


Are you on the 14.1 drivers? They tend to bring down hashrates. I recommend using 13.11 or 13.12, especially with Pitcairn. If you do switch, don't forget to delete the bin files so it creates new ones.

I must say scrypt is actually far less consistent than scrypt-N. Vertminer still has a few bugs but is much more consistent, and it should bring on a wave of new coins for GPU only (leaving ASICs behind again).

Anyways, hope you get the best out of it; you just gotta find the sweet spots. TC 10241 may help with those bad starts. Also, every time a new bin is created or TC is changed it can royally screw up one of my cards, then after a PC restart it works fine... it's all a little buggy, it seems.

Good luck though, hope you get some high hash, high dollars.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> What I would really like to hear about is how HSA is working out with the new AMD APU's and the R7 and R9 cards. If anyone has any info on that it would be nice. Thanks.


I have a build coming for a buddy at work that I am putting together. It will be a 7850K with an XFX R9 270X, so we will see at least something. My time with it will be short-lived, so if you want any benches you gotta let me know soon.


----------



## bardacuda

Ahahaha. I thought it seemed quiet in here lol. I don't think there are really any programs yet that take advantage of HSA, but I haven't looked into it since Kaveri was released. There might be some now. I do recall hearing that Java was going to be overhauled to take advantage of HSA. That could be sort of big, I think, because Java programs are already pretty... slow, but I guess it's hard to optimize because it's cross-platform? iunno. Then again, when I think of Java I think of silly browser games. Any important program would be written specifically for Windows or Linux or what have you, and for today's common hardware, wouldn't it?

I'm rooting for more HSA and HUMA adoption because I wanna see some crazy optimizations for Excavator. It would be cool to see AMD get back to a point where they could compete in the high end and put the pressure on Intel. Not saying that I think Excavator itself will compete, but if it is successful it might help them in getting back to that point.


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I have a build coming for a buddy at work that I am putting it together, It will be a 7850K with a XFX R9 270x so we will see at least something. My time will be short lived so if you want any benches gotta let me know soon.


Sure, it would be good to know what advantages HSA and all give the APUs. I guess if there are no applications that can take advantage of it yet, there would be no reason to do benches except to see how it compares to, say, a 4300 with a Mantle-ready GPU.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> Sure that would be good to know what advantages the HSA and all gives the APU's. Bench's would be great.


PM me a list of what you want to see so that I can prepare for it in advance. I'm a bit tired at the moment, so I will most likely lose this post.


----------



## lombardsoup

Just got artifacts with 1100 core clock, haven't touched power limit yet. As I've never actually used it before, will keeping this increased long term do any damage to my card?

Still can't unlock core voltage btw


----------



## neurotix

Quote:


> Originally Posted by *lombardsoup*
> 
> Just got artifacts with 1100 core clock, haven't touched power limit yet. As I've never actually used it before, will keeping this increased long term do any damage to my card?
> 
> Still can't unlock core voltage btw


Upping the power limit will do no long term damage to your card as it's only a software thing, from my understanding.

Basically, this affects how much wattage the card draws under load, but it's all in software. Increasing it affects the stability of your overclock. What actually makes the card draw more wattage is the VDDC, or voltage control. Even then, with 1.3V to the card you will still be within the TDP of the card. The wattage limit is actually physical and is governed by the two PCI-E power connectors: a 6-pin is 75W and an 8-pin is 150W, for 225W total. The system can exceed this, and some cards can use up to 300W if you raise the voltage enough. I've heard that with the 290X and a certain BIOS allowing 2V you can make your card draw a potential of ~800W under load, but this far exceeds the spec of the PCI-E connectors, so I don't know how true it is. Whether your R9 270 uses one or two 6-pin connectors will mostly determine the wattage your card draws. (My 270X uses two 6-pin connectors for a 150W TDP.) This is also influenced by the voltage.

Long story short: nobody really knows how Power Limit works. For better overclocks set it to 50% (or 20% if that's your max) and forget about it. I set the power limit on my 270X (in another rig) to 50%, it's been that way for 6 months and no problems so far.
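For reference, the connector arithmetic in one place. These are PCI-E spec maximums, not what the card actually pulls; the slot's 75 W comes from the PCI-E spec itself.

```shell
# PCI-E spec power budget for a card with one 6-pin and one 8-pin
# connector. The slot itself is specced for up to 75 W on top of
# whatever the auxiliary connectors supply.
slot=75
sixpin=75
eightpin=150
echo $((sixpin + eightpin))          # connectors only: 225
echo $((slot + sixpin + eightpin))   # connectors + slot: 300
```

A two-6-pin card like my 270X would budget 75 + 75 + 75 = 225 W the same way.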


----------



## Majentrix

Why is overclocking so limited on the 270?
Did AMD not want people to OC their cards to higher clock rates than that of the 270x?


----------



## Archea47

Quote:


> Originally Posted by *neurotix*
> 
> The wattage limit is actually physical and is governed by the two PCI-E power connectors. 6 pin is 75W and 8 pin is 150W for 225W total


+ the ~75W supplied by the motherboard PCIE slot, if I recall correctly


----------



## Dimestore55

Does anyone with a 270 or 270X card out there know if this will fit on it? I have tried several card manufacturers' websites and can't seem to find any information about the heatsink bolt pattern.


----------



## Archea47

I would check out EKWB's coolingconfigurator.com, Dimestore55. It will list whether their universal VGA cooler fits your card, and then you can extrapolate from EKWB's measurements of their product. I think it would be safe to assume it fits, but... "trust, but verify"


----------



## Dimestore55

Archea47

Thanks for the tip. EK makes a similar block with an almost identical bolt pattern, and they say it fits, so I'll just take a leap of faith (I already bought the Raystorm).









EDIT: ...but not the card yet.


----------



## Archea47

Quote:


> Originally Posted by *Dimestore55*
> 
> Archea47
> 
> Thanks for the tip. EK makes a similar block with almost an identical bolt pattern and they say it fits so I'll just take a leap of faith (I already bought the Raystorm).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: ...but not the card yet.


Happy to help - I'm considering doing the same with my Gigabyte 280Xs

And on that note... I just had my first artifact. I'm viewing this thread in Firefox, and there was an approximately 800x600 pixel black square with jagged green lines through it when I went to make this reply. I scrolled down and back up and it went away.


----------



## bardacuda

Can I join the club...again?









Seems the ASUS can be under- and over-volted with a modded BIOS, and it clocks much better. The cooler is definitely way worse though, and can't handle a lot of extra voltage under load... although the fans stay at a fairly low RPM and quiet if the load doesn't put the temp over ~60ºC-ish, which is about where it seems to hover after a short mining test run, so it should end up being perfect for a dedicated mining card, I'm hopin'. Over-volted in Valley, the temp just seems to run away and never levels off.







Had to give it a break between each run.

Sadly there will be no crossfire benching (or gaming)







*sniffle*

EDIT:
Quote:


> Originally Posted by *Majentrix*
> 
> Why is overclocking so limited on the 270?
> Did AMD not want people to OC their cards to higher clock rates than that of the 270x?


This is what I am thinking. I wish I could put the cooler and the dual 6-pin connectors from my Gigabyte card onto my ASUS card and have the best of both worlds, but it seems you just can't win. Gotta get a 270X if you want all the perks in one card, apparently. Maybe a Sapphire 270 could be over-volted *and* overclocked *and* stay frosty? I dunno. Anyone have one? I'm curious now


----------



## dmfree88

I have a Giga Windforce 270; at the max clocks AB allows (like 1150/1500 is the max, I think) it tops out at 65 degrees. Not sure if it will over- or undervolt with a modded BIOS, but I'm sure it would. My 270X Giga WF with the 3rd fan sure makes a difference though. Much cooler; haven't tried to push it too far though


----------



## bardacuda

Yeah, I have the same one. With the stock BIOS it would only allow 1050/1500 max clocks. With a modded BIOS it allows up to 1270 max on the core and at least 1600 on the memory (haven't bothered trying to make the mem slider go higher), but it doesn't allow any voltage changes at all. Just did some further testing with the ASUS, and this thing sucks. It overheats way too easily. I'm wondering if I got a dud. I under-volted it to 1.200 (stock was 1.215) and it will go over 90ºC if I set it to mine for more than a few minutes. I'd like to return it and get another Gigabyte, but the site I bought it from doesn't have Gigabytes, and all the other sites have inflated prices.


----------



## ozzy1925

Quote:


> Originally Posted by *dmfree88*
> 
> I have a giga windforce 270 max clocks ab allows(like 1150/1500 is max i think) tops at 65 degrees not sure if it will over or undervolt with modded bios im sure it would. My 270x giga wf with the 3rd fan sure makes a difference tho. Much cooler havent tried to push it too far tho


My Sapphire R9 270X Toxic seems voltage-locked. Is it possible to unlock it with a different BIOS?


----------



## dmfree88

Maybe it just needs thermal paste? Sometimes the factory sucks at applying it properly. I've heard something about ASUS having a new BIOS release for some cards; have you checked the ASUS site? Don't change the paste if it voids your warranty... but 90 is super hot.

Not sure, ozzy; maybe try Sapphire TriXX? There are modified versions out there that might work. If not, I'm sure you could just edit your BIOS. My Giga is locked too; I will play with it more soon.


----------



## bardacuda

Well that's what I'm thinking, but yeah, I have to see about the warranty and the return policy before taking the cooler off.

The thing with the BIOS is that it's set to a stock TDP of 113 watts, so if you don't adjust the power limit it will throttle under load. I'm thinking they just changed that number in the new BIOS; I can do that myself anyway. The fact that the TDP is set so low makes me think ASUS knew this cooler wouldn't cut it, so they're relying on software throttling to keep thermals in check... but then you're not getting all the performance that you could, and it's still running too hot anyway. The Gigabyte's stock TDP is set at 155W and it runs waaayyyyy cooler, even with a slightly higher load voltage.

Hopefully it is just a poor TIM application and is easily fixed.

Good luck with the voltage on the Gigabyte. Seriously. Maybe yours has different circuitry and you'll have more success.


----------



## dmfree88

I also noticed that when my 7870 is in a different PCI-E slot it has no voltage control. So it might be something with my mobo. Don't feel like taking it apart again; that's become a pain.


----------



## DrClaw

Isn't it time they dropped prices? Pretty sure miners, and anyone who wants to mine, have already bought the cards they want; the max I have seen in one mining rig is like 6 GPUs, so what the heck.
Where I live, the same computer store has had the same 290s and 280Xs for almost a month now and no one has bought them; they still have their 290X Tri-X at over 700 dollars.

Lousy vendors


----------



## bardacuda

Newegg is ridiculous. The same Gigabyte I just bought a few weeks ago is up to $290 CAD now, from $220. All of the 270s on NCIX.ca are at least $230. Got this ASUS off of canadacomputers.com for $215, and even that's $35 above MSRP. They have some Club3D ones for $200, but I'm not sure about that brand. I guess people building farming rigs just figure it's only a few more weeks until the card pays for itself, so it's still worth it to buy in the long run.


----------



## goldswimmerb

Quote:


> Originally Posted by *neurotix*
> 
> Upping the power limit will do no long term damage to your card as it's only a software thing, from my understanding.
> 
> Basically, this affects how much wattage the card draws under load, but it's all in software. Increasing it affects stability of your overclock. What actually makes the card draw more wattage is the VDDC or voltage control. Even then, with 1.3v to the card you will still be within the TDP of the card. The wattage limit is actually physical and is governed by the two PCI-E power connectors. 6 pin is 75W and 8 pin is 150W for 225W total. The system can exceed this and some cards can use up to 300W if you raise the voltage enough. I've heard that with the 290X and a certain bios allowing 2V you can make your card draw a potential of ~800W under load, but this far exceeds the spec of the PCI-E connectors so I don't know how true it is. Depending on if your R9 270 uses one or two 6-pin connectors will mostly determine the wattage your card draws. (My 270X uses two 6-pin connectors for 150W TDP) This is also influenced by the voltage.
> 
> Long story short: nobody really knows how Power Limit works. For better overclocks set it to 50% (or 20% if that's your max) and forget about it. I set the power limit on my 270X (in another rig) to 50%, it's been that way for 6 months and no problems so far.


remember that the pcie slot also can deliver 70w


----------



## goldswimmerb

I must have gotten an awful overclocker... my card throws a bluescreen or artifacts when the core is set to more than 1100MHz or the memory to more than 1400. My card is a 270X, BTW.


----------



## Devildog83

Quote:


> Originally Posted by *goldswimmerb*
> 
> remember that the pcie slot also can deliver 70w


My motherboard actually has a 4-pin molex connector to give more power to the PCI-E slots, so I am sure that if I needed to, I could push whatever I needed through them.


----------



## dmfree88

That's nuts. I want a molex for my PCI-E slots.


----------



## orlfman

Quote:


> Originally Posted by *Devildog83*
> 
> My motherboard actually has a 4 pin molex connector to give more power to the PIC-E slots so I am sure if I needed I could push what ever I needed thru them.


My old Asus A8N32-SLI Deluxe from 2005 had a 4-pin molex for additional power. All I had running in it was a single 6800GT, but I remember making sure I had a molex connected to it to be "safe" lol.


----------



## DrClaw

Quote:


> Originally Posted by *goldswimmerb*
> 
> remember that the pcie slot also can deliver 70w


It's 75 watts.


----------



## Delphi

Are these temps safe for R9 270x's? GPU 1, and GPU 2 are my R9's. GPU 3 is my HD 6870 strictly for mining.


----------



## Devildog83

Quote:


> Originally Posted by *orlfman*
> 
> My old asus a8n32-sli deluxe from 2005 had a 4 pin molex for additional power. All I had running in it was a single 6800gt but remember making sure I had a molex connected into it to be "safe" lol.


It was initially made for 3 and 4 card set-ups, but with the power draw of the newer cards I would use it if I were running 2 x 290Xs. I don't need it for my set-up now though.


----------



## KingT

Quote:


> Originally Posted by *Delphi*
> 
> Are these temps safe for R9 270x's? GPU 1, and GPU 2 are my R9's. GPU 3 is my HD 6870 strictly for mining.


You're safe; those temps are not too high. The Pitcairn chip ran at 78C on the reference AMD R9 270X card, so why would you worry about yours?

http://www.techpowerup.com/reviews/AMD/R9_270X/30.html

CHEERS..


----------



## DeviousAddict

Quote:


> Originally Posted by *Devildog83*
> 
> My motherboard actually has a 4 pin molex connector to give more power to the PIC-E slots so I am sure if I needed I could push what ever I needed thru them.


So does mine. Can't see me needing to use it though.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *DeviousAddict*
> 
> So does mine. Can't see me needing to use it though.


Some boards, like the MSI 990FXA-GD65, use the molex connector separately from the main power to the board but still require it to function. In cases like that it's not really extra power.


----------



## vieuxchnock

Wrong place to post, sorry


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Some boards like the msi 990fxa gd65 use the money connector to separate from the main power to the board but still requires it to function. Which in cases like that it's not really extra power


As you most likely already know, this board has an 8-pin and a 4-pin CPU power connector, just in case you want to try to get 7 or 8 GHz out of your chip.







I'll never do it, and most likely won't do Quad- or Tri-Fire either, but it is built for it, that's for sure.


----------



## dmfree88

I've still got an Ultra-ATX Intel AND AMD nForce4 SLI mobo. Wonder what it can handle for cards; it sure is a beast


----------



## Devildog83

I thought about starting an Intel build someday soon with the new ROG board that has the NB/VRM heatsink with ports for watercooling built into it. The Maximus VI:

http://www.newegg.com/Product/Product.aspx?Item=N82E16813132038


----------



## kersoz2003

*Sapphire Vapor-X R9 280X BIOS FLASH GUIDE*

As far as I know, many of you have problems (VRM, heat, performance drops, throttling) with this card. I had those same issues before flashing my card's BIOS with a modded one. This BIOS and this guide are going to help you overcome all these problems. This card runs perfectly, but the 0.41 BIOS is BAD. So here we are going to flash the card with a GOOD one, v44 (actually based on the 0.39 version, which is the non-problematic BIOS version). All you need to do is read on:

*1- PREPARATION STEP:*

In this step you need to find your card's BIOS version and its VRAM vendor. You can find the BIOS version with GPU-Z:



and the VRAM vendor with this tool:

http://s3.dosya.tc/server18/ZxxnFC/MemoryInfo1005.zip.html

in the software :










it will say either "Elpida" or "Hynix".

So mine is Elpida; find out what yours is.

*2- FLASH PREPARATION STEP*

In this step you need to download ATIWINFLASH tool :

http://www.techpowerup.com/downloads/2311/ati-winflash-2-6-7/

Once you have it, extract all the files to a folder. After this step you need to get the modded BIOS for your card (using the info you gathered, especially the VRAM vendor):

*Here are all the modded BIOSes for the Sapphire R9 280X:*

The best BIOSes for the Sapphire 280X Vapor-X:

*http://www.upload.gen.tr/d.php/www/eTyTg/Sapphire_E21004_V44_K2_NT_AGR.rom.html (Elpida) (which I find best for Elpida and currently use)*

and others :

https://www.dropbox.com/s/vgfvp1peost0bxd/Sapphire_VXOCBG6A_U42_AGR.zip (Elpida)
https://www.dropbox.com/s/eag6qwjr8kxsg27/2104XTLBG6A-V42_AGR.zip (Elpida)
https://www.dropbox.com/s/uge8yz8n88ms4cr/2104XTLAFR-V45_AGR.zip (Hynix)
https://www.dropbox.com/s/5kvgyof6plquq8h/2104XTLAFR-V46_AGR.zip (Hynix)
https://www.dropbox.com/s/tpg7l5o7ifu342a/Sapphire_E251_T03_AGR.zip (Hynix)
https://www.dropbox.com/s/7sq0gczfnt2xca7/Sapphire_E21004_LU_Hynix_AGR.zip (Hynix)
https://www.dropbox.com/s/fwxlc0fldzsr9zx/Sapphire_E21004_LU_Elpida_AGR.zip (Elpida)

You need to download the one which suits your card and its VRAM.

*3- FLASHING THE BIOS*

This is the final step. Go to the folder where you extracted ATIWINFLASH, open it, and put the BIOS you downloaded (after extracting it from the zip, of course) in the same folder. Now, this step is important: YOU NEED TO RUN ATIWINFLASH.exe AS ADMINISTRATOR:



After the software opens you will see this:



You need to click the LOAD IMAGE box first and find the BIOS you put in the ATIWINFLASH folder. Then click the PROGRAM box to flash the BIOS... TADAAAAAAAA, it flashes it and asks you to restart (DO IT!!). After you restart, your brand-new GOOD BIOS and your super sexy, well-performing card are ready to use
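If you prefer the command line, the same ATIWinflash tool can be driven from an elevated command prompt. This is only a sketch: the adapter index 0 assumes a single card, and the file names are examples, so adjust them for your own setup. Always save a backup of your current BIOS before programming anything.

```shell
# Run from an elevated command prompt in the ATIWINFLASH folder.
# Save the current BIOS of adapter 0 to backup.rom first:
atiwinflash -s 0 backup.rom
# Then force-program the modded BIOS onto adapter 0:
atiwinflash -f -p 0 Sapphire_E21004_V44_K2_NT_AGR.rom
```

If the flash goes wrong, the same `-p` command with your backup.rom restores the original BIOS.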









I hope you get the help you need









Prepared by: kersoz2003 (THEKNIGHT)

P.S. This guide can be applied to other brands and series too; just ask me for further details.


----------



## Archea47

kersoz2003 - great guide! Thank you Sir


----------



## Recr3ational

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kersoz2003*
> 
> *Sapphire Vapor-X R9 280X BIOS FLASH GUIDE*
> 
> As far as I know you all have some problems ( vrm , heat , performance drop , throttling) with this card . I had those same issues before flashing my card's bios with a modded bios . This bios and this guide is goingto to help you overcome all these problems. This card runs perfect but 0.41 bios is BAD ..SO here we gonna flash the card with a GOOD one which is v44 ( actually based on the 0.39 version which is the non-problematic bios version) . So all you need to do is to read below :
> 
> *1- PREPERATION STEP :*
> 
> In this step you need to learn your card's bios number and card's vram vendor. You can learn bios number with gpu-z such as :
> 
> 
> 
> and learn bios vram vendor with a software :
> 
> http://s3.dosya.tc/server18/ZxxnFC/MemoryInfo1005.zip.html
> 
> in the software :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> it will say "Elpida" or "Hynix".
>
> Mine is Elpida; find out what yours is.
> 
> *2- FLASH PREPARATION STEP*
> 
> In this step you need to download ATIWINFLASH tool :
> 
> http://www.techpowerup.com/downloads/2311/ati-winflash-2-6-7/
> 
> After you get it, extract all the files to a folder. Then you need to get the modded BIOS for your card (using the info you gathered, especially the VRAM vendor).
> 
> *here is all the modded bios for Sapphire R9-280X :*
> 
> Sapphire 280x Vaporx best bioses:
> 
> *http://www.upload.gen.tr/d.php/www/eTyTg/Sapphire_E21004_V44_K2_NT_AGR.rom.html (Elpida) (which I find best for Elpida and use currently)*
> 
> and others :
> 
> https://www.dropbox.com/s/vgfvp1peost0bxd/Sapphire_VXOCBG6A_U42_AGR.zip (Elpida)
> https://www.dropbox.com/s/eag6qwjr8kxsg27/2104XTLBG6A-V42_AGR.zip (Elpida)
> https://www.dropbox.com/s/uge8yz8n88ms4cr/2104XTLAFR-V45_AGR.zip (Hynix)
> https://www.dropbox.com/s/5kvgyof6plquq8h/2104XTLAFR-V46_AGR.zip (Hynix)
> https://www.dropbox.com/s/tpg7l5o7ifu342a/Sapphire_E251_T03_AGR.zip (Hynix)
> https://www.dropbox.com/s/7sq0gczfnt2xca7/Sapphire_E21004_LU_Hynix_AGR.zip (Hynix)
> https://www.dropbox.com/s/fwxlc0fldzsr9zx/Sapphire_E21004_LU_Elpida_AGR.zip
> 
> Download the one that matches your card and its VRAM vendor.
> 
> *3- FLASHING THE BIOS*
> 
> This is the final step. Go to the folder where you extracted ATIWinflash, open it, and put the BIOS you downloaded (extract it from the zip first) in the same folder. Now, this step is important: YOU NEED TO RUN ATIWINFLASH.exe AS ADMINISTRATOR:
> 
> 
> 
> After the software opens you will see something like this:
> 
> 
> 
> Click the LOAD IMAGE button first and find the BIOS you put in the extracted ATIWinflash folder. Then click the PROGRAM button to flash the BIOS... TADAAAAAAAA! It flashes the card and asks you to restart (DO IT!!). After the restart, your card is running its brand-new GOOD BIOS and is super sexy, performant, and ready to use.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope you get the help you need
> 
> 
> 
> 
> 
> 
> 
> 
> 
> prepared by : kersoz2003 (THEKNIGHT)
> 
> P.S. This guide can be applied to other brands and series too; just ask me for further details.






Wait, I'm confused. What are you flashing? A 280X?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> 
> Wait, I'm confused. What are you flashing? A 280X?


It has 2048 stream processors, a 384-bit interface, and 3 GB of memory, so yes, the Sapphire 280X Vapor-X.
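As a sanity check on those numbers, the 384-bit bus and GDDR5's quad data rate pin down the card's peak memory bandwidth (the 1500 MHz stock memory clock is my assumption for a reference 280X):

```python
# Peak memory bandwidth = bus width in bytes x effective data rate.
# GDDR5 moves 4 bits per pin per memory-clock cycle, so 1500 MHz
# works out to a 6.0 Gbps effective rate per pin.
bus_width_bits = 384
mem_clock_mhz = 1500                         # assumed stock R9 280X memory clock
effective_gbps = mem_clock_mhz * 4 / 1000    # GDDR5: quad data rate

bandwidth_gb_s = (bus_width_bits / 8) * effective_gbps
print(bandwidth_gb_s)   # -> 288.0 GB/s
```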


----------



## bardacuda

Yo Dog, can I bug ya to update my Gigabyte clocks to my valley stable clocks? Post w/ screen cap is here:

http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/4300_100#post_21808230

Same with my ASUS...I would rather have valley stable clocks posted than GPU-Z clocks, 'cause it gives more realistic expectations for anyone using the list to figure out what normal achievable clocks might be. Post is here:

http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/4400_100#post_21823364

Thanks brah!


----------



## Devildog83

Quote:


> Originally Posted by *bardacuda*
> 
> Yo Dog, can I bug ya to update my Gigabyte clocks to my valley stable clocks? Post w/ screen cap is here:
> 
> http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/4300_100#post_21808230
> 
> Same with my ASUS...I would rather have valley stable clocks posted than GPU-Z clocks, 'cause it gives more realistic expectations for anyone using the list to figure out what normal achievable clocks might be. Post is here:
> 
> http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/4400_100#post_21823364
> 
> Thanks brah!


DONE!


----------



## hamzta09

Seems Elpida is garbage and no one can overclock the mem much.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Seems Elpida is garbage and no one can overclock the mem much.


Yarp, it has been that way for years.

Hey, with your dual cards, what does the top card's temp get to when gaming? I now have 2 XFX 280Xs, so I would like to compare.


----------



## hamzta09

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Yarp, it has been that way for years.
>
> Hey, with your dual cards, what does the top card's temp get to when gaming? I now have 2 XFX 280Xs, so I would like to compare.


Top card roughly 10c higher than bottom card.
BF4 session 75c on top and 65 or so on bottom.

VRM on top is about 10-20c higher than bottom.
Roughly 80-90c BF4 or something.


----------



## Megadrone

I got these black blotches in Far Cry 3, always showing on the road, and they flicker when moving. http://i.imgur.com/rZff3ax.png

Are they artifacts? I reduced the clocks to stock settings and they still show.

Also, what are the preferred Catalyst settings? I have them on default.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hamzta09*
> 
> Top card roughly 10c higher than bottom card.
> BF4 session 75c on top and 65 or so on bottom.
> 
> VRM on top is about 10-20c higher than bottom.
> Roughly 80-90c BF4 or something.


Thanks


----------



## rdr09

Quote:


> Originally Posted by *hamzta09*
> 
> Top card roughly 10c higher than bottom card.
> BF4 session 75c on top and 65 or so on bottom.
> 
> VRM on top is about 10-20c higher than bottom.
> Roughly 80-90c BF4 or something.


hamz, how is your i5 holding up in BF4 MP with the 280s?


----------



## chropose

Hello guys, I have a question. Are cheaters allowed? I mean, I was a 7970-er, but now, after a BIOS flash, I'm a legit R9 280X-er.








Quote:


> Originally Posted by *hamzta09*
> 
> Seems Elpida is garbage and no one can overclock the mem much.


I think it's a known fact. I mean, with 1.6 V on the mem I can bench at 1950 MHz. Well, with artifacts, but it still finishes the bench. Not that raising the voltage to 1.7 V clears the artifacts, though.


----------



## jamor

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah, I thought that too.
> I had artifacts. I thought it was my BIOS flash, but I think it was Titanfall.
> It's a decent game though. Did you enjoy it as much as I did?


It's not the game, it's the card. Not sure why these ASUS 280Xs all have artifacts.


----------



## Devildog83

I don't know about the 280Xs, but I get a 250 MHz memory OC out of my 7870 if needed and 200 MHz out of my 270X if needed; both have Elpida chips.

I don't run them that high normally because it just doesn't improve performance much the way the core clocks do. Core clocks make a much bigger difference than memory clocks, at least in my experience.

GPU-Z


Spoiler: Warning: Spoiler!


----------



## bardacuda

Heh I was asking for Valley-stable clocks (1155/1500 Gigabyte, 1220/1550 ASUS) but OK!


----------



## bajer29

Quote:


> Originally Posted by *Devildog83*
> 
> I don't know about the 280Xs, but I get a 250 MHz memory OC out of my 7870 if needed and 200 MHz out of my 270X if needed; both have Elpida chips.
>
> I don't run them that high normally because it just doesn't improve performance much the way the core clocks do. Core clocks make a much bigger difference than memory clocks, at least in my experience.
> 
> GPU-Z
> 
> 
> Spoiler: Warning: Spoiler!


That's impressive to me... I'm done trying to push my 280X past the factory OC. I think the Sapphire factory OC is pretty much where the card hits the OC wall. The artifacting and hot VRAM are freaking me out a bit.


----------



## hamzta09

Quote:


> Originally Posted by *rdr09*
> 
> hamz, how is your i5 holding up in BF4 MP with the 280s?


Was running fine, mostly 100-120 fps on ultra with 4x MSAA and HBAO.
It can dip to 80 fps during smoke effects. CPU usage was mostly at 90-99% constantly.


----------



## rdr09

Quote:


> Originally Posted by *hamzta09*
> 
> Was running fine, mostly 100-120 fps on ultra 4x msaa with hbao.
> Can dip during smoke effects to 80fps. CPU usage was mostly 90-99% constantly.


I don't know my fps 'cause I never bothered to measure, but my i7 @ 4.5 with HT off is handling a single 290 (slower than your 280s) just fine. BF3, though, is different. During huge explosions on a 64-player map it can get laggy. That was with the 7950/7970 CrossFire.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Devildog83*
> 
> I don't know about the 280Xs, but I get a 250 MHz memory OC out of my 7870 if needed and 200 MHz out of my 270X if needed; both have Elpida chips.
>
> I don't run them that high normally because it just doesn't improve performance much the way the core clocks do. Core clocks make a much bigger difference than memory clocks, at least in my experience.
> 
> GPU-Z
> 
> 
> Spoiler: Warning: Spoiler!


My 280X with Elpida chips can do about the same mem OC. For benching I can even run +280 MHz (stock mem volts). Don't know about gaming though; I haven't gone more than +200 there. Don't see the point. Memory clocks only seem to help 3DMark 11 for me.

Oh yeah, after uninstalling CCC and going back to 13.12 I got voltage tweaking working in TriXX. I was already losing hope and thought my BIOS was wonky or something. Going to school tomorrow is gonna be hard; I'd much rather stay home and OC my GPU.


----------



## orlfman

I downloaded GPU-Z just to look at the specs of my MSI R9 270X 4GB Gaming and noticed it's reporting that my card's memory is Samsung. Kinda surprised, considering that so many sites and other people are reporting Elpida.


----------



## neurotix

Quote:


> Originally Posted by *orlfman*
> 
> 
> 
> I downloaded GPU-Z just to look at the specs of my MSI R9 270X 4GB Gaming and noticed it's reporting that my card's memory is Samsung. Kinda surprised, considering that so many sites and other people are reporting Elpida.


From what I've heard, you can either get Elpida (common), Hynix (common) or Samsung (rare).

I've never seen a card with Samsung and I very rarely see Samsung cards from other users.

Apparently, Samsung is supposed to overclock very well. Let us know how high you manage to get the RAM clock.


----------



## nitrubbb

I also have Samsung memory. As I'm new to the AMD side, I have a question: in the AMD feature manager, do I need to disable "gaming quality preference" or set it to "high performance"?


----------



## diggiddi

Quote:


> Originally Posted by *nitrubbb*
> 
> I also have Samsung memory. As I'm new to the AMD side, I have a question: in the AMD feature manager, do I need to disable "gaming quality preference" or set it to "high performance"?


What card do you have?


----------



## nitrubbb

MSI R9 270X 4G


----------



## kersoz2003

Guys, good news: I installed 14.2 and there's no stuttering in BF4, and I get 10-15 more fps than with DirectX.









Good job; Mantle is ready now, I guess. And a new page has opened for the Nvidia fanboys.


----------



## orlfman

Quote:


> Originally Posted by *nitrubbb*
> 
> MSI R9 270X 4G


Seems like the 4GB MSI 270X version ships with Samsung RAM then. I set mine to high performance.


----------



## DiceAir

OK, so I've tested games like Crysis 3 and BF4 and I get 100% CPU usage. I'm running 2x R9 280X and Windows 8.1 with the latest 14.2 drivers. I tried Mantle, but that just stutters. Seeing as games are starting to make use of hyperthreading, will I see a performance increase going from an i5-2500K @ 4.7 GHz to an i7-4770K @ 4.0-4.4 GHz? BTW, I tried everything, from stopping all tasks and having only the game running, to unparking cores and running Timer Resolution. I also tried running at stock speeds in the BIOS. Oh, and I have 16 GB of Corsair Vengeance 1600 MHz RAM, so I'm sure that's not the issue. I also tried earlier drivers like 13.12 and even earlier beta ones; same issue. I think because the game doesn't use GPU PhysX, my CPU has to do all the work. I just wish AMD could also have PhysX on their GPUs so you could offload that to the GPU and get better performance.

So if I upgrade, would the following be fine?

i7-4770K
Asus Z87-A (I think this is the cheapest decent motherboard I can find)

BTW, I'm from South Africa and we don't get lots of hardware here like in the USA.


----------



## bajer29

Quote:


> Originally Posted by *DiceAir*
> 
> Ok so I've tested gmaes like crysis 3 and bf4 and I get 100% cpu usage. I'm running 2x r9-280x and windows 8.1 with latest 14.2 drivers. I tried mantle but that just stutters. Seeing as games are starting to make use of hyperthreating will i see a perfomance indcrease going from i5-2500k @ 4.7GHz to i7-4770k @ 4.0-4.4GHz BTW I tried everything. from stopping all tasks and have only the game running to unparking cores and running timer resolution. also tried running at stock speed on bios. Oh and I have 16GB of corsair vengance 1600MHz ram so I'm sure that's not the issue. I also tried earleier drivers like 13.12 and even earlier beta ones same issue. I think due to the game not using gpu physx my cpu has to do all the work. Just wish AMD could also have physx on their gpu's so that you can offload that to gpu and get better performance.
> 
> So if I upgrade should the following upgrade be fine
> 
> i7-4770k
> Asus z87-a (I think this is the cheapest decent motherboard I can find)
> 
> BTw I'm from South Africa and we don't get lot's of hardware here like in the USA


I don't think you'll see a noticeable difference. It's an issue with the drivers and the game. Try Thief if you have it and let us know what the results are.

On another note: I've had issues with stuttering too since 14.1, except I have a single 280X that gives me an FPS hiccup every 15-30 seconds (only with the Mantle API). Like a complete loss of 5-10 frames. It makes the game unplayable for me.

Try going back to DX11 and the stutter should go away. Mantle doesn't seem to be fully optimized yet.


----------



## Recr3ational

I'm on 14.2 with one 280X and one 7950 with a 7970 BIOS. I haven't had any stuttering, though I've only played BF4 recently to test out my hardware. Also, guys: Eyefinity users, what fps are you getting? 'Cos I'm getting 100+ all the time. I feel like it's lying to me...


----------



## orlfman

Quote:


> Originally Posted by *DiceAir*
> 
> Ok so I've tested gmaes like crysis 3 and bf4 and I get 100% cpu usage. I'm running 2x r9-280x and windows 8.1 with latest 14.2 drivers. I tried mantle but that just stutters. Seeing as games are starting to make use of hyperthreating will i see a perfomance indcrease going from i5-2500k @ 4.7GHz to i7-4770k @ 4.0-4.4GHz BTW I tried everything. from stopping all tasks and have only the game running to unparking cores and running timer resolution. also tried running at stock speed on bios. Oh and I have 16GB of corsair vengance 1600MHz ram so I'm sure that's not the issue. I also tried earleier drivers like 13.12 and even earlier beta ones same issue. I think due to the game not using gpu physx my cpu has to do all the work. Just wish AMD could also have physx on their gpu's so that you can offload that to gpu and get better performance.
> 
> So if I upgrade should the following upgrade be fine
> 
> i7-4770k
> Asus z87-a (I think this is the cheapest decent motherboard I can find)
> 
> BTw I'm from South Africa and we don't get lot's of hardware here like in the USA


A 4770K isn't much faster than a 2500K. A 2500K is more than enough, even more so if you have it overclocked to 4.7 GHz like you said. If you really want hyperthreading, you might as well pick up a 2600K or a 3770K cheap.

It's mostly a driver issue. It's going to be a while before things get settled out. BF4 in general is just having a lot of issues.


----------



## DiceAir

Quote:


> Originally Posted by *orlfman*
> 
> A 4770K isn't much faster than a 2500K. A 2500K is more than enough, even more so if you have it overclocked to 4.7 GHz like you said. If you really want hyperthreading, you might as well pick up a 2600K or a 3770K cheap.
>
> It's mostly a driver issue. It's going to be a while before things get settled out. BF4 in general is just having a lot of issues.


Do you have anything to back that up? I've been reading up on the net and getting mixed results. Some say it will help and some say don't upgrade. Some say get faster RAM, and some say I should rather wait for Mantle, but that could be a while. Some also say, like you, to go 2600K or 3770K, but then there are people telling me I'd be upgrading to a dead socket.


----------



## bajer29

Quote:


> Originally Posted by *DiceAir*
> 
> Do you have anything to back that up? I've been reading up on the net and getting mixed results. Some say it will help and some say don't upgrade. Some say get faster RAM, and some say I should rather wait for Mantle, but that could be a while. Some also say, like you, to go 2600K or 3770K, but then there are people telling me I'd be upgrading to a dead socket.


You can throw all the money you want at getting a game to work the way you want, but for a broken game like BF4, it's just not worth it until everything is optimized. Just be patient. Nearly everyone is having issues with the game (including me).

I have a decent rig with more than enough oomph for a "next-gen" game to work flawlessly. It's in the devs' hands at this point.


----------



## DiceAir

Quote:


> Originally Posted by *bajer29*
> 
> You can throw all the money you want to get a game to work the way you want it, but for a broken game like BF4, it's just not worth it until everything is optimized. Just be patient. Nearly everyone is having issues with the game (including me).
> 
> I have a decent rig with more than enough oomph for a "next-gen" game to work flawlessly. It's in the devs' hands at this point.


I must say I agree with you. I don't see why my i5-2500K @ 4.7 GHz would bottleneck 2x R9 280X; it should be enough. I think games like Crysis 3 and BF4 are badly optimized: they try to make use of 8 cores but don't even think about people with i5s or even i3s. This is bad, really bad, of them. I think they tried to make the engine too advanced and they failed. Yes, you get massive fps in that engine, but not efficient usage; too much CPU usage and not enough GPU usage. Maybe the engine is bugged and isn't feeding instructions to the GPU correctly. Maybe I should wait and see in about 5 months; if they haven't fixed it by then, or if many newer games lag, I'll upgrade then. For now it's not unplayable, it's just that I want to get more use out of my cards to keep my fps more stable at 120 Hz. But by that time Mantle should be 80-90% working.

Most people I've spoken to say "hey, I don't get this issue, my GPU usage is low," but they don't have this extreme system I have. They have 1920x1080 60 Hz screens and run fairly mediocre graphics cards, so that's why they won't see my CPU utilization.


----------



## DiceAir

Quote:


> Originally Posted by *bajer29*
> 
> I don't think you'll see a noticeable difference. It's an issue with the drivers and the game. Try Thief if you have it and let us know what the results are.
> 
> On another note; I've had issues with stuttering too since 14.1 except I have a single 280x that will give me a FPS hiccup every 15-30 seconds (only with Mantle API). Like complete loss of 5-10 frames. Makes the game unplayable for me.
> 
> Try going back to DX11 and the stutter should go away. Mantle doesn't seem to be fully optimized yet.


OK, so I tested Thief. I noticed that I can get about 95% usage on my graphics cards, my CPU usage is then 80%, and my fps is 120 at 2560x1440 @ 120 Hz. I must say this game is very smooth on my system, and Mantle should help me tons in it. You know, I'm not too worried about the game itself, mostly the performance. I'm starting to think my resolution and refresh rate need CPU horsepower. I will check some other games too; Tomb Raider 2013 should also be a good test. It seems like AMD graphics cards require a fast CPU, but hey, these cards are way faster than my old GTX 570 SLI cards.


----------



## slick2500




----------



## Grzesiek1010

Hey

what are your OC results with MSI R9 270X 2GB GAMING ?
I use new AMD Catalyst 14.2 with MSI Afterburner 3.0.0 Beta 18

My settings:

Core Voltage : +100
Power Limit : +20%
Core Clock : 1200
Memory Clock : 1600

And it works great.








In BF4, after 3 hours of playing, the max temp was 71°C with the fan at 22%.


----------



## radier

1235/1565 MHz for some benchmarks.


----------



## bajer29

Quote:


> Originally Posted by *DiceAir*
> 
> OK, so I tested Thief. I noticed that I can get about 95% usage on my graphics cards, my CPU usage is then 80%, and my fps is 120 at 2560x1440 @ 120 Hz. I must say this game is very smooth on my system, and Mantle should help me tons in it. You know, I'm not too worried about the game itself, mostly the performance. I'm starting to think my resolution and refresh rate need CPU horsepower. I will check some other games too; Tomb Raider 2013 should also be a good test. It seems like AMD graphics cards require a fast CPU, but hey, these cards are way faster than my old GTX 570 SLI cards.


Thief should have Mantle support, but I do not believe Tomb Raider does. I was asking about performance with the use of Mantle in Thief.

Anyway, I was just clarifying in my previous post that Mantle will be buggy in BF4. There's no way around it, and it would be silly to buy new hardware to replace the already-new hardware you have for a game that isn't fully optimized. Your hardware isn't the issue; it's the lack of support and optimization by the devs for current hardware. Their software is glitchy. Wait for patches and drivers to catch up.


----------



## kzone75

I'll try to OC it when I have more time.


----------



## Pionir

Gigabyte GA R9 270 OC 2GB (4 pcs)


Spoiler: Warning: Spoiler!










ASUS R9 270X DC2T 2GB (4 pcs)


Spoiler: Warning: Spoiler!


----------



## Devildog83

*kzone & Pionir* have been added to the club, welcome. Nice cards guys.


----------



## DiceAir

Quote:


> Originally Posted by *bajer29*
> 
> Thief should have Mantle support, but I do not believe Tomb Raider does. I was asking about performance with the use of Mantle in Thief.
> 
> Anyway I was just clarifying in my previous post that Mantle will be buggy in BF4. There's no way around it and it would be silly to buy new hardware to replace the already new hardware you have for a game that isn't fully optimized. Your hardware isn't the issue, it's the lack of support and optimization by the devs for current hardware. Their software is glitchy. Wait for patches and drivers to catch up.


I kinda already ordered my parts, so we'll see how things go. I would actually like the extra cores so that I can maybe record some gameplay too. The thing is, prices are going up, and in fear I bought the 4770K knowing that new games might actually utilize more cores and HT. I hope I'm not disappointed in my CPU. I just need to replace the thermal paste on my H100i for the new CPU and hope for a good chip, because I was lucky with my i5-2500K.


----------



## bajer29

Quote:


> Originally Posted by *DiceAir*
> 
> I kinda already ordered my parts, so we'll see how things go. I would actually like the extra cores so that I can maybe record some gameplay too. The thing is, prices are going up, and in fear I bought the 4770K knowing that new games might actually utilize more cores and HT. I hope I'm not disappointed in my CPU. I just need to replace the thermal paste on my H100i for the new CPU and hope for a good chip, because I was lucky with my i5-2500K.


You can record easily with PlayClaw 5. You can buy it from the Steam store. They've optimized the program to run well and not hog resources while gaming. I'd look into that.

I still don't think you need the extra cores/threads.


----------



## bardacuda

Quote:


> Originally Posted by *DiceAir*
> 
> The thing is, prices are going up, and in fear I bought the 4770K knowing that new games might actually utilize more cores and HT


The 2500K and 4770K are both quad-core parts, so you won't have any more cores. You will get hyperthreading though, because it's an i7 instead of an i5.
The main improvement with Haswell over Sandy Bridge is power efficiency, but the cores are also slightly faster. They may not clock as high, though, because of the TIM used between the CPU and heat spreader instead of solder.


----------



## Recr3ational

Quote:


> Originally Posted by *bardacuda*
> 
> The 2500K and 4770K are both quad-core parts, so you won't have any more cores. You will get hyperthreading though, because it's an i7 instead of an i5.
> The main improvement with Haswell over Sandy Bridge is power efficiency, but the cores are also slightly faster. They may not clock as high, though, because of the TIM used between the CPU and heat spreader instead of solder.


Don't forget about the heat!


----------



## hamzta09

Quote:


> Originally Posted by *slick2500*


Dat cable management.


----------



## dabby91

Quote:


> Originally Posted by *hamzta09*
> 
> Dat cable management.


You'll have a heart attack if you see my rig then! Damn short PSU cables -.-


----------



## ricko99

Anyone seeing increased temperatures with Catalyst 14.2 beta 1.3?


----------



## bajer29

Quote:


> Originally Posted by *ricko99*
> 
> Anyone seeing increased temperatures with Catalyst 14.2 beta 1.3?


I've noticed higher temps. As much as +5C at times since 14.2 update.


----------



## Recr3ational

Quote:


> Originally Posted by *bajer29*
> 
> I've noticed higher temps. As much as +5C at times since 14.2 update.


Aye. Mine's gone from 20-ish idle to about 24-25°C-ish. Though it's only the idle temps; my load temps are roughly the same. Thing is, it could just be the ambient temp...


----------



## bajer29

Quote:


> Originally Posted by *Recr3ational*
> 
> Aye. Mine's gone from 20-ish idle to about 24-25°C-ish. Though it's only the idle temps; my load temps are roughly the same. Thing is, it could just be the ambient temp...


Yeah I was thinking ambient too. Been getting -9C outside so we turned the heat up a bit yesterday. Heater's been on full blast lately.


----------



## Recr3ational

Quote:


> Originally Posted by *bajer29*
> 
> Yeah I was thinking ambient too. Been getting -9C outside so we turned the heat up a bit yesterday. Heater's been on full blast lately.


Minus NINE C? Jesus. I thought it was cold in the UK! Haha, yeah, it's probably the ambient temps. To be honest with you guys, I don't know why we all care so much about temps. It's way within safe limits.


----------



## bardacuda

Quote:


> Originally Posted by *Recr3ational*
> 
> Minus NINE C? Jesus. I thought it was cold in the UK!


HAHAHAHAHAHAHAHAHAHA!!! That is all.....


----------



## Recr3ational

Quote:


> Originally Posted by *bardacuda*
> 
> HAHAHAHAHAHAHAHAHAHA!!! That is all.....


Right then.


----------



## bajer29

Quote:


> Originally Posted by *Recr3ational*
> 
> Right then.


He's Canadian. Canada is cold. He laughs at our whining about temps.









Pssshh... Canadians


----------



## Recr3ational

Quote:


> Originally Posted by *bajer29*
> 
> He's Canadian. Canada is cold. He laughs at our whining about temps.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Pssshh... Canadians


Right then. Well, I'm Asian, so cold is a major no for me, whatever the number is.


----------



## bartledoo

Anyone else's overclocking settings not working after the 14.2 Catalyst update? When I apply my OC profile from AB nothing happens; I think it's at reference settings. 1100/1500 w/ 1.175 VDDC.


----------



## Arkanon

Any reports on overclocks being adversely affected by upping the TDP (W) in VBE?
I seem to run into a problem once I touch the TDP adjustment and run the same overclock after flashing the BIOS with the new TDP values. Without doing so, BF4 runs flawlessly @ 1280/1500, but once flashed with higher values, the game freezes instantly with the same OC upon getting into the game. Anyone else have experience on this matter?

On a sidenote:
http://www.3dmark.com/3dm11/7990823
3dm11 bench @ 1275/1500
Too bad the i5-750 is holding back the combined score a bit, due to being relatively weak at physics calculations.


----------



## Devildog83

Quote:


> Originally Posted by *bartledoo*
> 
> Anyone else's overclocking settings not working after the 14.2 Catalyst update? When I apply my OC profile from AB nothing happens; I think it's at reference settings. 1100/1500 w/ 1.175 VDDC.


I had to uninstall and reinstall afterburner to get mine to work right, once I did that it was flawless.


----------



## Devildog83

Got a new high in Valley but I did a 3DMark11 run too and it was lower on everything with the same clocks but new drivers.

My old high on Valley was 68 Fps - 2845

With new drivers -


----------



## kzone75

Has anyone played BF4 with a A10-6800K and 270X using mantle? Can't find any benches..


----------



## cremelo

Well, I installed Catalyst 14.2 and look what happened:



Now I'm waiting for the RMA response, and I'm picking up a 280X Toxic by the end of today =/


----------



## Devildog83

Quote:


> Originally Posted by *cremelo*
> 
> Well, I installed Catalyst 14.2 and look what happened:
>
>
>
> Now I'm waiting for the RMA response, and I'm picking up a 280X Toxic by the end of today =/


Nothing against ASUS, but the Toxic seems to be a beast from what I hear.


----------



## cremelo

Quote:


> Originally Posted by *Devildog83*
> 
> Nothing against Asus but the Toxic seems to be a beast from what I here.


I would pick the Vapor-X, but the Toxic is cheaper where I live.


----------



## Devildog83

Quote:


> Originally Posted by *cremelo*
> 
> I would pick the Vapor-X but Toxic is cheaper here where I live


The Toxic is better.


----------



## cremelo

Quote:


> Originally Posted by *Devildog83*
> 
> The Toxic is better.


I just hope it has no problems; if it does, I'll have to switch to Nvidia.


----------



## jamor

Would a single 6-pin-to-8-pin adapter work for the R9 280X, or do you need to use two 6-pins to one 8-pin?


----------



## Recr3ational

Quote:


> Originally Posted by *jamor*
> 
> Would a single 6-pin-to-8-pin adapter work for the R9 280X, or do you need to use two 6-pins to one 8-pin?


Mine uses one 6-pin and one 8-pin.

What's with my second GPU always being at 100% usage? 14.2 is causing a lot of issues.


----------



## Tugz

Quote:


> Originally Posted by *Recr3ational*
> 
> Mine uses one 6-pin and one 8-pin.
>
> What's with my second GPU always being at 100% usage? 14.2 is causing a lot of issues.


I get that as well. Common issue. Just disable CrossFire then re-enable it. That should fix it, rather than restarting all the time.


----------



## smoke2

I own a new ASUS DCII 280X TOP.
I've been testing the card with the Heaven 4.0 benchmark and some objects look blurry.
For example, the windows on the building, but mostly the blue flag, which looks as if it continues into a paler blue stripe the width of the flag.

I've never run this benchmark before and I was an Nvidia guy, so I'm asking: is this normal?


----------



## Devildog83

Quote:


> Originally Posted by *smoke2*
> 
> I own a new ASUS DCII 280X TOP.
> I've been testing the card with the Heaven 4.0 benchmark and some objects look blurry.
> For example, the windows on the building, but mostly the blue flag, which looks as if it continues into a paler blue stripe the width of the flag.
>
> I've never run this benchmark before and I was an Nvidia guy, so I'm asking: is this normal?


That would most likely have to do with the settings in CCC and your clocks, along with the settings you run Heaven at.


----------



## Devildog83

NewEgg is coming to the UK next month.


----------



## Northern Isnlad

I have a Sapphire R9 270X Vapor-X. The problem I'm facing is that whenever I'm playing a game, if I look carefully and closely I can see continuous blurry WAVES on the screen. What is that problem about? Do I need to send it in for replacement, as mine is just one week old? Please guide me.


----------



## neurotix

I have that same card and have never seen a problem like that.

You might need to RMA it.


----------



## smoke2

Quote:


> Originally Posted by *Northern Isnlad*
> 
> I have a Sapphire R9 270X Vapor-X. The problem I'm facing is that whenever I'm playing a game, if I look carefully and closely I can see continuous blurry waves on the screen. What is that problem about? Do I need to send it for replacement, as mine is just 1 week old? Please guide me.


I have maybe the same problem.
Try running the Heaven 4.0 benchmark: in the second scene the nearest rope is blurred.
Also, please look at the blue flag on the pylon: do you see a blue stripe going from the flag?


----------



## bardacuda

Quote:


> Originally Posted by *Northern Isnlad*
> 
> i have a sapphire r9 270x vapor-x, the problem i'm facing is whenever i'm playing a game if i look carefully and closely i can see continuous blurry WAVES in the screen, what is that problem about? do i need to send it for replacement as mine is just 1 week old. please guide me.


I think that might actually be your monitor. Has something to do with poor backlighting or something. You can really see it if you look at a screen that's all black. Had the same problem with a monitor before.

Did you notice it before with your old video card? If you have onboard video or another video card you can try, try running off that and see if the problem persists. If it does, it is probably the monitor.


----------



## Recr3ational

Hmm.

I think one of my cards is broken. My second gpu is always at max. So I try to reenable crossfire and it crashes.

After boot it says that there's no amd software on this computer.

It can't be driver issues as I've tried multiple drivers and it comes out with the same issues.


----------



## Tugz

Quote:


> Originally Posted by *Recr3ational*
> 
> Hmm.
> 
> I think one of my cards is broken. My second gpu is always at max. So I try to reenable crossfire and it crashes.
> 
> After boot it says that there's no amd software on this computer.
> 
> It can't be driver issues as I've tried multiple drivers and it comes out with the same issues.


Does your other card work when running as a single card? When I first got my two R9s one of my cards literally died on me, so I returned it and got another one.


----------



## Tugz

double post my bad


----------



## Recr3ational

Quote:


> Originally Posted by *Tugz*
> 
> does your other card work when running a single card? when i first got my 2 r9s one of my cards literally died on me so i returned it and got another on


Yeah man, I think so. When it works, it works.
It's quite hard for me to test single cards as I'm underwater using acrylic tubing. After I reinstall CCC it works perfectly fine, but after I restart the problem occurs again.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah man I think so. When it works it works.
> It's quite hard for me to test single cards as I'm underwater using acrylic tubing. After I reinstall CCC it works fine perfectly fine but after I restart the problem occurs


Have you tried a new X-Fire bridge?


----------



## Mackem

I STILL get artefacts in League of Legends with my 280X DirectCUII TOP no matter what drivers I use. It's really annoying..


----------



## G2O415

Quote:


> Originally Posted by *Mackem*
> 
> I STILL get artefacts in League of Legends with my 280X DirectCUII TOP no matter what drivers I use. It's really annoying..


I've been having this problem from the start with my 270X; it's the only game where I get artifacts.


----------



## nitroxyl

Quote:


> Originally Posted by *Mackem*
> 
> I STILL get artefacts in League of Legends with my 280X DirectCUII TOP no matter what drivers I use. It's really annoying..


I believe it's Riot's graphics team integrating some newer shadow effects into the game that's causing our GPUs to use a whole lot of power to render them (maybe coded inefficiently?).
I'm on a Gigabyte R9 270 non-X and I'm getting at least one artifact flash in the middle of my screen each game. My FPS also dips to 50 randomly when not in graphics-intensive teamfights, i.e. while laning/jungling. I'm normally at a steady 60 FPS with V-Sync turned on and settings maxed out, save for Shadows turned off.

With that said, there have been a ton of widely known FPS issues since Patch 4.0. I just hope it gets fixed soon.


----------



## smoke2

Quote:


> Originally Posted by *bardacuda*
> 
> I think that might actually be your monitor. Has something to do with poor backlighting or something. You can really see it if you look at a screen that's all black. Had the same problem with a monitor before.
> 
> Did you notice it before with your old video card? If you have onboard video or another video card you can try, try running off that and see if the problem persists. If it does, it is probably the monitor.


I tried running Heaven 4.0 on the Intel iGPU and I can also see those effects, so it's probably the monitor?
I own an EIZO S1931.


----------



## dartuil

Hello, I think my 7950 will die soon. What GPU should I buy? I don't want to spend a lot of money; it's just to play BF4 and some flight sims.
http://www.ldlc.com/fiche/PB00156553.html
http://www.ldlc.com/fiche/PB00154999.html
http://www.ldlc.com/fiche/PB00158240.html


----------



## Archea47

Quote:


> Originally Posted by *kzone75*
> 
> Has anyone played BF4 with a A10-6800K and 270X using mantle? Can't find any benches..


My girl's box runs a 6800K and a 2*6*0X. I don't think you want to CrossFire with the 6800K's iGP, as that will downclock your 270X; you'll get better performance running just the 270X by itself. FYI, I get 50-60 FPS on Medium in BF4 with the 6800K @ 4.5GHz and a 260X.
Quote:


> Originally Posted by *Recr3ational*
> 
> Mine is one 6 pin and one 8 pin.
> 
> What's with my second gpu always at 100%usage. 14.2 is causing a lot of issues.


I have this problem with my 280xs in crossfire every time I update drivers. I have to use regedit to change all EnableULPS values from 1 to 0
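For anyone chasing the same fix: the EnableUlps values live under the Windows display adapter class keys in the registry. A sketch of the tweak as a .reg fragment; the class GUID below is the standard display adapter class, but the numbered subkey varies per machine, so treat this as an illustration rather than an official AMD procedure, and back up the registry first:

```
; Illustrative .reg fragment only - the numbered subkey (0000, 0001, ...)
; differs per system, so check each one for an existing EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Setting the value back to 1 re-enables ULPS (the second card powering down at idle).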


----------



## Recr3ational

Quote:


> Originally Posted by *Archea47*
> 
> I have this problem with my 280xs in crossfire every time I update drivers. I have to use regedit to change all EnableULPS values from 1 to 0


Yeah I've done that. Still get problems. Don't know what it is to be honest with you. I've given up on it all..


----------



## kzone75

Quote:


> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kzone75*
> 
> Has anyone played BF4 with a A10-6800K and 270X using mantle? Can't find any benches..
> 
> 
> 
> My Girl's Box runs a 6800K and a 2*6*0X. I don't think you want to crossfire with the 6800k's IGP as that will downclock your 270X - you'll get better performance running just the 270. FYI I get 50-60FPS on Medium with the 6800K @ 4.5GHz on a 260X in BF4
Click to expand...

Thanks. No crossfiring with the IGP is intended.


----------



## bardacuda

Quote:


> Originally Posted by *smoke2*
> 
> I own a new ASUS DCII 280X TOP.
> I've been testing the card with the Heaven 4.0 benchmark and some objects look blurry.
> For example, the window on the building, but mostly the blue flag, which looks as if it continues into a paler blue stripe the width of the flag.
> 
> I've never run this benchmark before and was an nVidia guy, so I'm asking: is this normal?


Quote:


> Originally Posted by *smoke2*
> 
> I have maybe the same problem.
> Try running the Heaven 4.0 benchmark: in the second scene the nearest rope is blurred.
> Also, please look at the blue flag on the pylon: do you see a blue stripe going from the flag?


Quote:


> Originally Posted by *smoke2*
> 
> I tried running Heaven 4.0 on the Intel iGPU and I can also see those effects, so it's probably the monitor?
> I own an EIZO S1931.


In your case I'm not sure if it's anything. It could be just a normal part of the benchmark. I haven't run Heaven before, so I don't know whether that is normal, but I know that in Valley many things are blurry because it has a kind of "lens focusing" effect, as though it was filmed with an older-style video camera. When it is "focused" on far-away objects, nearby objects appear blurry, and vice versa: when it's "focused" on nearby objects, far-away objects get blurry. I'm thinking it's probably similar with Heaven. This is a lot different from having ripples, or waves, of light moving up your screen.

Maybe if I get a chance I'll get Heaven and run it and let you know if I see the same blurry parts... or maybe some people here have run Heaven before and can tell you if that's normal.

That said, if it isn't normal and there actually is something wrong, the fact that you get the same issue with onboard video would _suggest_ the problem doesn't have anything to do with your video card, although I'm not sure what it would be.


----------



## Northern Isnlad

Well, previously I had an HD 7770 and the problem didn't occur. One of my friends told me that since I'm using a DVI-to-VGA adapter (I don't have a DVI cord), that might be the reason, and he advised me to grab an HDMI cable and run with that. He faced the same problem with his 760 and now he's doing great. I just have to try out that solution.


----------



## Devildog83

*smoke2*, the blurring in Heaven is intentional and happens no matter what you run or do. It's that way for me every time I run it, no matter what card or clocks.

*Northern Island*, I think a good HDMI cable will help you. I can't be 100% sure, but I believe it will. Make sure you get a good one, as a cheap one can give you issues.


----------



## Northern Isnlad

An HDMI cable, that's what I'm after. Can you recommend some? What should I look for in an HDMI cable? Damn, it's Sunday here.


----------



## hamzta09

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah man I think so. When it works it works.
> It's quite hard for me to test single cards as I'm underwater using acrylic tubing. After I reinstall CCC it works fine perfectly fine but after I restart the problem occurs


Just plug monitor into other card and unplug xfire.


----------



## Devildog83

Quote:


> Originally Posted by *Northern Isnlad*
> 
> HDMi cable that's what i'm after. can you recommend some? what should i look for in a HDMi cable? damn it's Sunday here


You just need a good high-speed cable, say 15+ Gb/s of bandwidth.


----------



## KingT

I've got a PowerColor R9 270X PCS+ card, clocked @ 1100/1425; it has almost identical performance to an Asus HD7950 DC2 TOP V2 (900/1250MHz), for half the price.

I paid 175 Euros for the PCS+ R9 270X card; not bad.

The card is cool and quiet, running at 65°C max in games with very quiet operation.

I've got both GPUs hashing, undervolted: 980 kHash/s total, pulling 410W from the wall.

Here are some pics:


Spoiler: Warning: Spoiler!

CHEERS..


----------



## Archea47

As a data point:

With my crossfired Gigabyte R9 280X Rev2s, the max temperature I hit in BF4 on the largest maps with one 1080p monitor is 66°C on the top card and 61°C on the bottom card. That's with a fan profile where at 66°C the fan is at 78%. Max temps were the same on both the beta 14.1 and 14.2 drivers.

Still the same problems with Mantle on 14.2 as on 14.1, despite the patch notes: less frequent hanging, but when it does hang it's a whole lot worse, taking a long time to subside.
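A custom fan profile like the one above is just interpolation between temperature/duty set points. A minimal sketch in Python; only the 66°C/78% pair comes from the post, the other points are made up for illustration:

```python
# Linear interpolation between (temperature in C, fan duty in %) set points.
# Only the (66, 78) point comes from the post; the rest are invented.
FAN_CURVE = [(40, 30), (55, 50), (66, 78), (80, 100)]

def fan_duty(temp_c):
    """Return fan duty (%) for a temperature, clamped to the curve's ends."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    if temp_c >= FAN_CURVE[-1][0]:
        return FAN_CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(66))  # 78.0
```

Most vendor tools (Afterburner, TriXX) apply essentially this kind of point-to-point curve.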


----------



## mikemykeMB

Final OC for the XFX 270X: 1175/1500.


----------



## Devildog83

Your clocks have been updated.


----------



## mikemykeMB

Cool, thanks. The numbers might change later; I'm fooling around with max temps before downclocking the core clock.


----------



## cremelo

Quote:


> Originally Posted by *Mackem*
> 
> I STILL get artefacts in League of Legends with my 280X DirectCUII TOP no matter what drivers I use. It's really annoying..


Have you already updated the BIOS? If yes and you still get artefacts, send it in for RMA.


----------



## dmfree88

Quote:


> Originally Posted by *dartuil*
> 
> Hello , I think my 7950 will die soon what Gpu should I buy? I dont want to spend a lot of money just to play bf4 , and some sim planes.
> http://www.ldlc.com/fiche/PB00156553.html
> http://www.ldlc.com/fiche/PB00154999.html
> http://www.ldlc.com/fiche/PB00158240.html


I'd recommend the Toxic for performance. The Gigabyte is great too; if you need it for BF4, there's no reason it shouldn't do fine. If you don't OC much, the Gigabyte 4GB version is usually the best bang for the buck.


----------



## dartuil

Quote:


> Originally Posted by *dmfree88*
> 
> Id recommnd the toxic for performance. Giga is great tho too if you need bf4 no reason it shouldnt do fine. if u dont oc it much the giga 4gb version is usually best bang for buck


Thank you


----------



## Spade616

Just got this 2GB HIS 270X IceQ X2 and thought I'd share it (I was supposed to get just a 260X or a 750 Ti, but you can guess what happened at the store lol). Sexy heatsink, and the stock clocks are pretty fast at 1140/1400. I stuck with my 9800GTX+ for way too long lol.


----------



## neurotix

Nice looking card


----------



## Devildog83

*Spade616* has been added to the club, Welcome ! That is a very nice looking card.


----------



## Spade616

Quote:


> Originally Posted by *Devildog83*
> 
> *Spade616* has been added to the club, Welcome ! That is a very nice looking card.


Thanks man! Yeah, this is definitely the nicest looking card I've owned. (I mostly used low-midrange cards in the past.)


----------



## JCH979

What happens when one gets black screens? I'm not sure if I'm getting them, but yesterday I was playing a game and the TV lost video signal. I unplugged the HDMI cable and plugged it back in, and then all was fine; no errors or crashes, so I'm not sure what that was all about.


----------



## Devildog83

Quote:


> Originally Posted by *JCH979*
> 
> What happens when one gets black screens? I'm not sure if I'm getting them but yesterday I was playing a game and the tv lost video signal, I unplugged the hdmi and plugged it back in then all was fine, no errors or crashes so not sure what that was all about.


It could be just Windows, do you use sleep mode? It could have also been a driver glitch, if it happens again you could also try replacing the HDMI cable.
It may even turn out to be the HDMI plug in the Video card but if it doesn't happen again I would not be worried about it.


----------



## JCH979

I don't use sleep mode. It may be the cable is loose or something as you say. I suppose I'll leave things be for now unless it happens again then I'll switch from the 14.1 drivers I'm still using and I should have a spare cable somewhere around here just in case. I haven't had any issues since I got the card back in November so hopefully it's not a big deal..


----------



## Devildog83

Quote:


> Originally Posted by *JCH979*
> 
> I don't use sleep mode. It may be the cable is loose or something as you say. I suppose I'll leave things be for now unless it happens again then I'll switch from the 14.1 drivers I'm still using and I should have a spare cable somewhere around here just in case. I haven't had any issues since I got the card back in November so hopefully it's not a big deal..


I hear ya; I would double-check the cable and the HDMI connections first thing if it happens again. Try the 14.2 drivers anyhow; I had issues with 14.1 and all is running smooth with 14.2. I have about 50 HDMI cables; if I could, I would give you a new one.


----------



## JCH979

I haven't had any issue with 14.1 in Windows, in the few benches I ran (pretty similar results to 13.12, if not better), or in the few games I've gotten a chance to play, but I'll update sometime soon when I'm bored. I'm pretty sure there's gotta be at least one HDMI cable in one of these drawers, but thanks anyway.

Actually, I'll just go ahead and update drivers now; I don't have anything to do for the next few hours lol.


----------



## Devildog83

Quote:


> Originally Posted by *JCH979*
> 
> I haven't had any issue with 14.1 in windows, the few times I ran benches (pretty similar results to 13.12 if not better), or in the few games I've gotten a chance to play but I'll update sometime soon when I'm bored. I'm pretty sure there's gotta be at least one hdmi cable in one of these drawers but thanks anyway
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Actually I'll just go ahead and update drivers now, I don't have anything to do for the next few hours lol.


LOL, it only takes me about 15 minutes to remove the old drivers, sweep, and update to the new ones.

The 14.2 solved my X-Fire issues and some of the flickering in web browsers.


----------



## JCH979

I didn't mean it was gonna take me hours to update









Anyways, I updated to 14.2 a while ago and replaced the HDMI cable. The connector felt a bit loose in the card's HDMI port, so I found another cable that had a better fit. I've never seen flickering with the 14.1 drivers, but I do remember seeing it with 13.12, oddly enough.


----------



## Arkanon

Quote:


> Originally Posted by *Arkanon*
> 
> Any reports on overclocks being adversely affected by upping the TDP (W) in VBE?
> I seem to be running into a problem once i touch the TDP adjustment and running the same overclock after flashing the bios with the new TDP values. Without doing so BF4 runs flawlessly @ 1280/1500 but once flashed with higher values the game, with the same OC freezes instantly upon getting into the game. Anyone else having experience on this matter?
> 
> On a sidenote:
> http://www.3dmark.com/3dm11/7990823
> 3dm11 bench @ 1275/1500
> Too bad that i5-750 is holding back the combined score a bit due to being relatively crappy at physics calculations.


Anyone?


----------



## Devildog83

Quote:


> Originally Posted by *JCH979*
> 
> I didn't mean it was gonna take me hours to update
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyways I updated to 14.2 a while ago and replaced the hdmi cable. The connector felt a bit loose in the cards hdmi port so found another cable that had a better fit. I never seen flickering with the 14.1 drivers but I do remember seeing it with the 13.12 oddly enough.


Excellent!!


----------



## bajer29

Quote:


> Originally Posted by *Arkanon*
> 
> Anyone?


You're not being ignored... I just don't know a whole lot about GPU OCing, or maybe I don't understand the question. Sorry :/


----------



## HunnoPT

Hi guys, it's been a while since I posted my promised values in Valley and 3DMark 2013 Fire Strike Extreme, so here they are. The system is:

Fatality Killer 990fx
Fx 6300 Black Box [email protected] Mhz 1.52v
Corsair Vengeance 1600mhz 8GB (managed to reduce timings to 9-8-8-20-30 1T 1.6v)
Nox Krypton 900W 80+ Bronze
AMD MSI 270x Gaming Edition 2GB + AMD Gigabyte 270x WF OC 2GB


Screenshot from the PC:

As soon as we crack the voltage mod on those cards, I think I can do much better... still, it's better than a single 290X's score lol :thumb:


----------



## neurotix

Quote:


> Originally Posted by *Arkanon*
> 
> Anyone?


Are you talking about the power limit or the voltage adjustment?


----------



## Arkanon

Quote:


> Originally Posted by *neurotix*
> 
> Are you talking about the power limit or the voltage adjustment?


The power limit in VBE



I've marked the thing I was talking about; sorry for the confusion. Whenever I raise those values my card becomes unstable with my overclock of 1275/1500 and games hang almost instantly. As far as I know, raising the power limiter (or TDP, or whatever it's called exactly) should give me some more overclocking headroom, since the card would be allowed to draw more power... or am I wrong?


----------



## CravinR1

3x Sapphire Dual-X 280x mining at 1080/1500


----------



## Devildog83

*CravinR1* has been added, now I know where all of the GPU's are going.


----------



## neurotix

Quote:


> Originally Posted by *Arkanon*
> 
> The power limit in VBE
> 
> 
> 
> Marked the thing i was talking about, sorry for the confusion. Whenever i raise those values my card becomes instable with my overclock of 1275/1500 and games hang nearly instantly. As far as I know, raising the power limiter or TDP or whatever it's called exactly should give me some more overclocking headroom due to the fact the card is allowed to draw more power or am I wrong?


I just made a post a week or two ago about this. I'll try and sum it up.

The power limit doesn't actually do anything or affect the TDP of your card, because it's all in software. The power draw of the card is determined by its power connectors (and, as someone corrected me, the PCI-E slot). I don't know what card you're using, but generally a 270X takes two 6-pins and a 280X takes an 8-pin and a 6-pin. An 8-pin power connector provides 150W and a 6-pin provides 75W, in addition to the 75W from the PCI-E slot. This means a 270X has 150W from the power connectors and 75W from the PCI-E slot, for 225W. A 280X has 225W from the power connectors and 75W from the slot, for 300W total. Of course, the card can draw more power under load; my R9 290 has been shown to draw as much as 400W under load with a high VDDC voltage and a 50% power limit.

Anyway, the point is that nobody really knows for sure what the power limit does, because regardless of what you set it to, the card's power draw is determined by the connectors and the slot. This determines its TDP. Raising the power limit influences stability while overclocked, but most importantly it reduces clock throttling under load. The best way to test this is to overclock, raise the voltage with a power limit of 0%, and run something like Unigine Valley with a GPU-Z sensor log. If you see in the sensor log that your card can't maintain the clocks you set and it downclocks, then you know you need to raise the power limit. Raising it to 20% should be more than enough, though newer cards can go to 50% by default. There is nothing wrong with maxing out the slider and running the card with a 50% power limit, because all this really does is tell the software and drivers not to throttle the card under load. It's all done in software, from what many people here have said. So it still doesn't really affect the TDP of the card or its power draw at all, other than that the card will draw more wattage over a prolonged period since it's not downclocking or throttling.

Really, I have no solution to your problem and neither does anyone else here, other than to tell you that if it ain't broke don't fix it. If you can run your overclock fine without messing with the power tune, but it crashes when you edit the powertune, then just DON'T EDIT THE POWERTUNE and you'll have no problem with your overclock.

I have no idea what card you have and neither does anyone else because you don't have a rigbuilder and you haven't stated it. This makes it hard to help you. Click rigbuilder in the top left, fill out your system specs, then go to your profile -> edit signature -> show off stuff in my signature -> select the rig in the drop down box.

Hope this helps.
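The connector arithmetic in the post above can be written out directly; a quick sketch using the spec figures quoted (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin):

```python
# Spec power per source: the PCI-E slot itself plus each auxiliary connector.
CONNECTOR_WATTS = {"6-pin": 75, "8-pin": 150}
SLOT_WATTS = 75

def power_budget(connectors):
    """Total spec board power: slot plus all auxiliary connectors, in watts."""
    return SLOT_WATTS + sum(CONNECTOR_WATTS[c] for c in connectors)

print(power_budget(["6-pin", "6-pin"]))  # typical 270X -> 225
print(power_budget(["8-pin", "6-pin"]))  # typical 280X -> 300
```

These are specification budgets, not measured draw; as the post notes, a heavily overvolted card can pull more at the wall.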


----------



## Arkanon

Thanks for the help and the in depth explaining. Guess i'll have to settle for current clocks then, which are far from being bad anyway.


----------



## CravinR1

Quote:


> Originally Posted by *Devildog83*
> 
> *CravinR1* has been added, now I know where all of the GPU's are going.


You added my cards but not my name


----------



## Devildog83

Quote:


> Originally Posted by *CravinR1*
> 
> You added my cards but not my name


Fixed, it's old man syndrome.


----------



## jamor

Sorry guys i sold the 280x R9 for $100 profit because of the artifacts and horrible drivers.

Got an EVGA 770 and never looking back.

It feels so good to have a working card again.


----------



## Northern Isnlad

I have an R9 270X slotted into a PCIe 2.0 slot, so why on earth is GPU-Z showing the card in a PCIe 1.1 slot? I've attached images.


----------



## rdr09

Quote:


> Originally Posted by *Northern Isnlad*
> 
> i have a r9 270x sloted into a PCIE 2.0 slot then why on earth gpu-z is showing the card to be in PCIE 1.1 slot? i've attached images.


Load it. Click that "?", start rendering, and watch it change.


----------



## Northern Isnlad

could not understand what you are trying to say...


----------



## rdr09

Quote:


> Originally Posted by *Northern Isnlad*
> 
> could not understand what you are trying to say...


see the ? beside the Bus interface in GPUZ? Click it and start the render program.

Edit: watch your temps, though; it can load your GPU like BF4 does.


----------



## Northern Isnlad

ohh.k that test. sure will do right away!!!!


----------



## rdr09

Quote:


> Originally Posted by *Northern Isnlad*
> 
> ohh.k that test. sure will do right away!!!!


open another GPUZ and go to the Sensor tab to see temps. you may have to use the slider to see the vrm temps if your card has sensors for them.


----------



## neurotix

If you click the ? then click start render test you'll see the bus interface change from 1.1 to 2.0.

Basically, your card only runs at the full speed and bandwidth under load. The rest of the time it goes into PCI-E 1.1 to conserve power. Mine even goes to PCI-E x1 1.1 when idle, then under load it goes to PCI-E x16 2.0.


----------



## Northern Isnlad

thanks for the help


----------



## Pionir

I also have problems with the 13.12 driver.

Boot takes longer, OS start-up, etc... and sometimes the GPU fans stop spinning after reaching the desktop (single GPU, GA R9 27OC).

My second card has no power connected but is plugged into the motherboard; CFX is disabled.


----------



## Recr3ational

I might have bought another 280X....


----------



## bajer29

Quote:


> Originally Posted by *Recr3ational*
> 
> I might of bought another 280x....


Oops! Gee golly, how did that happen


----------



## Recr3ational

Quote:


> Originally Posted by *bajer29*
> 
> Oops! Gee golly, how did that happen


I was browsing eBay, and somehow my fingers typed R9 280X and pressed Buy.
So I had to press Pay.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> I was browsing ebay, and somehow my fingers typed R9 280x and pressed buy.
> So i had to press pay.


I hate it when that happens ! OK, not really but my wife does.


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> I hate it when that happens ! OK, not really but my wife does.


+1


----------



## M1kuTheAwesome

Has anyone else had driver crashes with 13.12? I had one that looked exactly like the ones caused by OC, but I had already gone back to stock hours before because of crashes. How does 14.2 work? Should I try that instead?


----------



## Devildog83

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Has anyone else had driver crashes with 13.12? I had one that looked excactly like the ones caused by OC, but I had already gone back to stock because of crashes hours before. How does 14.2 work, should I try that instead?


I was OK with 13.12, had issues with 14.1 but 14.2 has worked well for me.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I was OK with 13.12, had issues with 14.1 but 14.2 has worked well for me.


Same here. I had issues with all three but 14.2 is better.


----------



## Recr3ational

Can someone confirm the max voltage for the PowerColor 280X? Anyone with the same card?


----------



## Northern Isnlad

I'm playing BF4 with Mantle, so could you please tell me a way to check FPS, as FRAPS doesn't work with Mantle?


----------



## Recr3ational

Quote:


> Originally Posted by *Northern Isnlad*
> 
> I'm playing BF:4 with mantle so could please tell me a way to check FPS as FRAPS not working in mantle condition?


Afterburner's in-game display thing?


----------



## Northern Isnlad

Sorry, I couldn't decipher what you were trying to say, but anyway, I got the FPS thingy taken care of. Now the problem is that throughout the mission the FPS doesn't go below 50, but suddenly, for a second, it drops to 13 FPS and comes back, and it's a little laggy when videos start. Why is this happening? Do I need to tinker with Catalyst Control Center? Please advise; I have the latest Origin updates for BF4.


----------



## bartledoo

Quote:


> Originally Posted by *Northern Isnlad*
> 
> I'm playing BF:4 with mantle so could please tell me a way to check FPS as FRAPS not working in mantle condition?


Type perfoverlay.drawfps 1 into the console; it should show in the top-right corner, I think.


----------



## Devildog83

Off topic but since nobody is posting, Newegg sent me a new Corsair toy to play with (OK review) -


Spoiler: Warning: Spoiler!


----------



## neurotix

Nice looking mouse.


----------



## Pionir

Finally everything works fine

CFX=OC [email protected] [email protected]=2x270X

2014-03-08 21:38:44 - Crysis3 sp @ 19" 1440x900

Frames: 1753 - Time: 33369ms - Avg: 52.534 - Min: 32 - Max: 101
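As a sanity check, the FRAPS average above is just frames divided by elapsed time:

```python
# FRAPS line: Frames: 1753 - Time: 33369ms - Avg: 52.534
frames, time_ms = 1753, 33369
avg_fps = frames / (time_ms / 1000)
print(round(avg_fps, 3))  # 52.534
```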


----------



## Recr3ational

Quote:


> Originally Posted by *Pionir*
> 
> Finally everything works fine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CFX=OC [email protected] [email protected]=2x270X
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2014-03-08 21:38:44 - Crysis3 sp @ 19" 1440x900
> 
> 
> 
> 
> 
> 
> 
> 
> Frames: 1753 - Time: 33369ms - Avg: 52.534 - Min: 32 - Max: 101


No voltage control?


----------



## Pionir

Hmmm...Gigabyte GA-R9-27OC-2GB rev 1.0....

I do not need more power; it is stable from 975 to 1050 MHz without adding 0-20% more power.

The problem is that I cannot go beyond the locked core clock of 1050 MHz.

There is voltage control;



EDIT. I forgot ;


----------



## Recr3ational

Quote:


> Originally Posted by *Pionir*
> 
> Hmmm...Gigabyte GA-R9-27OC-2GB rev 1.0....
> 
> I do not need more power, it is stable from 975 to 1050 MHz without the addition of 0-20% more energy.
> 
> The problem is that I can not go more with Core Clock-locked @1050 MHz.
> 
> There is a voltage control ;
> 
> 
> 
> EDIT. I forgot ;


Have you tried Sapphire TriXX?
Maybe it will give you extra headroom?


----------



## Northern Isnlad

In the memory tab of GPU-Z it shows GDDR5 and Hynix. Does that mean my card got its memory from Hynix, or is that just a generic example of a memory manufacturer?


----------



## alancsalt

Quote:


> Originally Posted by *Northern Isnlad*
> 
> in the memory tab of gpu-z it shows ddr5 and hynix.. does it mean my card got memory from hynix or it's just a mere example of memory manufacturer?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


That means your cards have Hynix ram.


----------



## Pionir

@Recr3ational, no, I have not tried TriXX.

I tried Gigabyte OC Guru II, Asus Tweak, MSI Afterburner...

What do you think about a BIOS update for the Gigabyte R9 270X 2GB?

Gigabyte BIOS 270X = 015.039.000.001.000000
I have = 015.040.000.003.000000

http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=Gigabyte&model=R9+270X&interface=PCI-E&memType=GDDR5&memSize=2048&did=1002-6810-1458-2272

http://www.gigabyte.com/WebPage/40/imgaes/Easy%20Boost_ENG.pdf
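To see which of the two BIOS strings above is newer, compare them field by field as integers (a quick sketch):

```python
# Split a dotted BIOS version string into integer fields; Python tuples
# compare element by element, which matches "newer version" intuition here.
def parse_version(version):
    return tuple(int(field) for field in version.split("."))

gigabyte_site = "015.039.000.001.000000"
installed     = "015.040.000.003.000000"
print(parse_version(installed) > parse_version(gigabyte_site))  # True
```

Parsing to integers matters: comparing the raw strings would work here only by accident of the zero padding.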


----------



## Devildog83

Quote:


> Originally Posted by *Pionir*
> 
> @Recr3ational, no I have not tried Trixx
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I tried with Gigabyte OC Guru II, Asus Tweak, MSI Afterburner...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you think about the BIOS update to Gigabyte R9 270X 2GB
> 
> 
> 
> 
> 
> 
> 
> ?
> 
> Gigabyte BIOS 270X = 015.039.000.001.000000
> I have = 015.040.000.003.000000
> 
> http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=Gigabyte&model=R9+270X&interface=PCI-E&memType=GDDR5&memSize=2048&did=1002-6810-1458-2272
> 
> http://www.gigabyte.com/WebPage/40/imgaes/Easy%20Boost_ENG.pdf


It looks like the one you have is newer.


----------



## Northern Isnlad

Quote:


> Originally Posted by *alancsalt*
> 
> That means your cards have Hynix ram.


wow!! I've heard good things about Hynix memory.. thanks for the answer.


----------



## Majentrix

Is it possible to flash the BIOS of a 270 to that of a 270x?


----------



## Pionir

@Devildog83, thanks









@Northern Isnlad, it also depends on the memory controller's limits, the BIOS version, and the VRM;
Quote:


> ASUS 270X = DIGI+ VRM with 8-phase Super Alloy Power delivers precise digital power for superior efficiency, reliability, and performance.


@Majentrix, I'm not sure; it is risky... but Sapphire TriXX offers greater OC:







Edit: ohhh, I found the publication dates of the BIOSes:
http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=Gigabyte&model=R9+270&interface=PCI-E&memType=GDDR5&memSize=2048

2013-*10*-28 01:01:00 = 270 OC
2013-*09*-27 08:11:00 = 270X

@Devildog83


----------



## mAs81

Quote:


> Originally Posted by *alancsalt*
> 
> That means your cards have Hynix ram.


Hmm, mine says Elpida. I've heard a lot of rumors about that brand.. Apparently it filed for bankruptcy and got bought by Micron around mid-2013..
And for some reason 290/290X miners don't like having Elpida memory on their cards.. As I see it, there's only a small difference versus the other chips (Hynix/Samsung), and mostly in benchmarks rather than in gaming..


----------



## Majentrix

Thanks for that, Pionir! All the other vendors' OC software was limited to 1050 on the core; TriXX is, for all intents and purposes, unlimited.
After some quick benchmarking I've found that Firestrike is stable at 1175, though I'm going to keep it at 1125 just to be on the safe side.
My card (Gigabyte) doesn't actively cool the memory, so I'm going to leave that alone.

I'm thinking the other vendors have limited overclocking on the 270 so that it can't surpass the 270X in performance while being cheaper.


----------



## Devildog83

Quote:


> Originally Posted by *Majentrix*
> 
> Thanks for that Pionir! All the other vendor OCing software was limited to 1050 on the core, TriXX is for all intents and purposes unlimited.
> After some quick benchmarking I've found that Firestrike is stable on 1175, though I'm going to keep it at 1125 just to be on the safe side.
> My card (Gigabyte) doesn't actively cool the memory so I'm going to leave that alone.
> 
> I'm thinking that the other software vendors have limited overclocking on the 270 so that it can't surpass the 270x on performance while being cheaper.


I don't think it could surpass the 270X anyway; there's a reason it's less expensive. Not that it's a bad card or anything, but you're not going to see a 7850 that outperforms a 7870 either. You may get a freak instance, I suppose, in either case, but it would be an anomaly.


----------



## bartledoo

Anyone with the toxic cards able to unlock the voltages? I tried flashing a bios, but it didn't unlock it.


----------



## Northern Isnlad

Well, I'm very new to GPU overclocking, so I would like to know whether it is safe to overclock both the memory and core clocks of my Sapphire Dual-X R9 270X, or should I strictly stick to overclocking the core clock and leave the memory at the default 5600 MHz? Please help.


----------



## mAs81

Quote:


> Originally Posted by *Northern Isnlad*
> 
> well i'm very new to GPU overclocking so i would like to know whether it is safe to overclock both the memory and core clocks of my sapphire DUAL X R9 270X!!! or should i strictly stick to overclocking the core clock and leav the memory to deafult 5600 mHZ? please help.
> 
> 


Tweaking the clocks is safe most of the time; overvolting your card might cause problems.. Follow _this_ guide and, as always, watch your temps..


----------



## Northern Isnlad

Quote:


> Originally Posted by *mAs81*
> 
> Tweaking the clocks is safe most of the times, overvolting your card might cause problems..Follow _this_ guide and,as always,watch your temps..


I want to know: between the memory and core clock, which influences FPS the most? And increasing which one results in more heat for the card?


----------



## mAs81

Quote:


> Originally Posted by *Northern Isnlad*
> 
> i want to know between memory and core clock which influences the FPS most? increasing which one also results in more heat for the card?


A higher GPU core clock means more processing power, while a higher memory clock means more bandwidth for large textures.
So first push your core clock to a stable OC, then turn up the memory as high as you can without stability issues.
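The "core first, then memory" procedure above can be sketched as a simple stepping loop. This is only an illustration; `is_stable` is a stand-in for whatever real stability test you use (a benchmark loop plus watching for artifacts and driver resets), and the step sizes are arbitrary.

```python
def find_max_clock(base_mhz, step_mhz, limit_mhz, is_stable):
    """Raise the clock one step at a time while the stability check
    still passes; return the highest passing clock."""
    clock = base_mhz
    while clock + step_mhz <= limit_mhz and is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Example: a 270X core starting from the stock 1050 MHz in 25 MHz steps,
# with a stand-in stability predicate for a card that fails above 1175.
core = find_max_clock(1050, 25, 1300, lambda mhz: mhz <= 1175)
```

Once the core result is settled, the same loop would be repeated for the memory clock while the core stays at its new value.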


----------



## Northern Isnlad

Quote:


> Originally Posted by *mAs81*
> 
> Tweaking the clocks is safe most of the times, overvolting your card might cause problems..Follow _this_ guide and,as always,watch your temps..


Quote:


> Originally Posted by *mAs81*
> 
> Higher gpu core clock means more processing power,and memory will result in more bandwidth for huge textures.
> So firstly push your core clock to a stable OC and then turn up the memory as high you can without having stability issues.


Well, my GPU hits a max of 75°C at 1100 on the core and 1450 on the memory. Is that good? Should I stick with it or back it down a bit?


----------



## mAs81

Quote:


> Originally Posted by *Northern Isnlad*
> 
> Well, my GPU hits a max of 75°C at 1100 on the core and 1450 on the memory. Is that good? Should I stick with it or back it down a bit?


It's okay, I guess, but it never hurts to be on the safe side.. Set a more aggressive custom fan profile if you're okay with the noise.. I presume you have fan control on auto, right?


----------



## Recr3ational

Why isn't my second gpu being used?
It's in crossfire but not being used in games or benchmarks


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Why isn't my second gpu being used?
> It's in crossfire but not being used in games or benchmarks


Do you have ULPS disabled? Is graphics OverDrive enabled? If you are running TriXX or AB or something, OverDrive should be disabled.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Do you have ULPS disabled, is graphics overdrive enabled. If you are running Trixx or AB or something it should be disabled.


Yeah I have, and I sorted it. Had to reinstall 14.2 (again)


----------



## mAs81

For some reason I can't seem to keep my current OC (1200/1600) stable in gaming anymore.
(Could it be the backplate? The 14.2 drivers?) Anyone having the same problem?


----------



## runelotus

Can I please join this club

this is my new HIS R9 270X IceQ X2 Turbo Boost Clock 1140/1400


----------



## Northern Isnlad

Quote:


> Originally Posted by *mAs81*
> 
> It's okay i guess but it's never bad to be on the safe side..Make a higher custom fan mode if you're okay with the noise..I presume you have fan control on auto , right?


Yes, you're right. So how do I make a custom profile, huh?


----------



## mAs81

Quote:


> Originally Posted by *Northern Isnlad*
> 
> yes you are right so how to make it custom huh?


Either use OverDrive in Catalyst Control Center:

or any other tuner program you prefer.


----------



## Devildog83

Quote:


> Originally Posted by *runelotus*
> 
> Can I please join this club
> 
> this is my new HIS R9 270X IceQ X2 Turbo Boost Clock 1140/1400


You're in, welcome to the club.


----------



## Devildog83

I have been off for most of the day tearing down my PC and taking photos for a build log. If anyone is interested, take a look and leave a comment. Giving me crap is encouraged.

http://www.overclock.net/t/1472883/the-devils-own#post_21921919


----------



## Recr3ational

Looks good. A bit red for my liking, but good nevertheless.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Looks good, a bit red for my liking but good never the less










I did join the RED club. Thanks


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> 
> 
> 
> 
> 
> 
> 
> I did join the RED club. Thanks


You should colour that blue cathode box thing, though.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> colour that blue cathodes box thing though.


It's hidden by the side panel, when I switch to the new case I will hide it away so it can't be seen.


----------



## runelotus

Quote:


> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *runelotus*
> 
> Can I please join this club
> 
> this is my new HIS R9 270X IceQ X2 Turbo Boost Clock 1140/1400
> 
> Your in, welcome to the club.
> 
> 
> 
> Thanks


----------



## Unknownm

http://www.3dmark.com/3dm/2631226

For some reason, each card had a defective fan. I removed each one, which caused higher temps (derp). Added a 120mm fan blowing air out and everything is well now.


----------



## Pionir

I do not think they are defective.

My card spins from 1700 to 2500 RPM = extreme airflow; it can be felt under your hand (more and faster than the Asus).

You have the Corsair 330R *Quiet* mid-tower case; are there any openings on the side panel?

The Gigabyte card disperses the air; there is no directional flow...


----------



## Unknownm

Quote:


> Originally Posted by *Pionir*
> 
> I do not think they are defective.
> 
> My card is spinning from 1700 to 2500 rpm = extreme airflow, can be felt under the hand (more and faster then Asus).
> 
> You have the Corsair 330R *Quiet* Mid-Tower Case, are there any openings on the side panel ?
> 
> Gigabyte card disperses the air, there is no directional flow...


They are. That one feels much harder to spin than all the rest, and it makes a noise; when I set 100% it doesn't spin fully like the others.

I have the side panel off; I want to cut a fan mount into it and hook up 2x 80mm fans to blow air out.


----------



## Arkanon

Gave my 270x a new home.


----------



## rdr09

Quote:


> Originally Posted by *Unknownm*
> 
> http://www.3dmark.com/3dm/2631226
> 
> for some reason, each card had a defective fan. Removed each one which caused higher temps (derp). Added 120mm fan blowing air out and everything is well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


you need 2 bridges?


----------



## Majentrix

No, but it looks cooler!


----------



## rdr09

Quote:


> Originally Posted by *Majentrix*
> 
> No, but it looks cooler!


True, it does look better. Check this out . . .

http://www.overclock.net/t/1464340/guide-mod-your-crossfire-sli-bridge-many-pics-links


----------



## cremelo

Well, my new 280X has arrived








I only had time to test it to check there was no problem (as you can see in the photo, the plastic isn't even removed) :c

Comparison between the burned Asus R9 280X DC2 Top and the new Sapphire R9 280X Toxic














When I get home I'll put up photos of GPU-Z.


----------



## Devildog83

*cremelo* I have added your new card, very sweet. I will post your X-fire clocks once you have them.


----------



## Majentrix

Sapphire have really stepped up their game this gen; the Toxic cards are some of the highest-clocked and best-binned around, not to mention the cooler's probably the best on the market at the moment.
The box art is much better than previous generations' too.


----------



## cremelo

Quote:


> Originally Posted by *Devildog83*
> 
> *cremelo* I have added your new card, very sweet. I will post your X-fire clocks once you have them.


Thanks











Here is GPU-Z... I'll keep it at stock for a while to make sure she is 100%.


----------



## Unknownm

Quote:


> Originally Posted by *rdr09*
> 
> you need 2 bridges?


You need 2 bridges or else drivers won't enable CF
Quote:


> Originally Posted by *Majentrix*
> 
> No, but it looks cooler!


You need 2 bridges or else drivers won't enable CF

It even says so in AMD CCC:

"Both bridge interconnects must be attached".


----------



## rdr09

Quote:


> Originally Posted by *Unknownm*
> 
> You need 2 bridges or else drivers won't enable CF
> You need 2 bridges or else drivers won't enable CF
> 
> It even says it on AMD CCC
> 
> "Both bridge interconnects must be attached".


How is tri or quad CF gonna work then? They got rid of bridges on the 290s and added them on the 280s.


----------



## Devildog83

Quote:


> Originally Posted by *cremelo*
> 
> Tanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is GPU-Z.... when'll keep in stock to make sure she is 100%


Got you covered,


----------



## Unknownm

Quote:


> Originally Posted by *rdr09*
> 
> how is tri or quad gonna work? they got rid of bridges in the 290s and added on the 280s.


The question is why you would. If you are going overkill, just get 3x R9 290X instead.

I have no idea why, but the drivers won't enable CF unless both are connected.


----------



## M1kuTheAwesome

Would appreciate a clocks update in the OP.

My current highest clocks. The core is at its limit; the memory might clock higher (I can bench, but not game, at 1780), but I don't have the time to try it out: I now have a brand new FX-8350 to play around with. Maybe in the summer I'll try flashing a custom BIOS as well for higher voltage.


----------



## Majentrix

Add me to the club with this awful photo!

Gigabyte GV-R927OC-2GD, stock is 975/1400; I have mine running at 1130/1400 stable. I can benchmark at 1160, get halfway through Firestrike at 1175, and BSOD at 1200.
That radiator you see in the background will eventually be the thing cooling this card, once my waterblock and memory heatsinks arrive tomorrow. It may sound pointless and expensive, but when you already have a CPU loop and EOL EK waterblocks are $30, it's hard to say no.


----------



## rdr09

Quote:


> Originally Posted by *Unknownm*
> 
> question is why would you. If you are going overkill just get 3x r9 290x instead.
> 
> I have no idea why but drivers won't enable CF unless both are connected.


All that matters is that it's working. I crossfired a 7950 and 7970 in the past and only used one bridge, but it was installed differently; your bridges seem to be installed the other way around.


----------



## cremelo

Quote:


> Originally Posted by *Devildog83*
> 
> Got you covered,


Devil, on the list I am not using the Asus; the Asus chip burned.









Btw, anyone with a 280X Toxic?? Yesterday after exiting games she showed some artifacts in Chrome








Using Catalyst 13.12....


----------



## Recr3ational

Quote:


> Originally Posted by *rdr09*
> 
> all that matters is it is working. i crossfired 7950 and 7970 in the past and i only used one bridge but was installed differently. your bridges seem to be installed the other way around.


Yeah, I have a 7950 (with a 7970 BIOS) currently in crossfire with a 280X using a single bridge; works fine. My second 280X is coming this week though, so I'll see if it needs 2 bridges.

@M1kuTheAwesome
What's the ASIC quality on your PowerColor?
Also, what is the max voltage your card allows?


----------



## Devildog83

*cremelo* the Asus is pththhththth! Gone.

*M1kuTheAwesome* outstanding clocks have been updated.


----------



## Devildog83

*Majentrix* has been added. Welcome!!


----------



## Devildog83

Huh?


----------



## smoke2

I own an ASUS DCII TOP 280X, and ASUS posted a new BIOS on their support site with this description:
"BIOS update to remove rare artifacting events during gaming. Please also update the Catalyst driver to 13.251 or above."
I was going to flash it immediately, but then decided to ask ASUS tech support whether the new BIOS affects cooling.

Their reply surprised me: "Please don't update your BIOS if your card doesn't have any problems. It's a very serious intervention into the graphics card, and if it is not necessary, it's better to avoid it."

I run it in a Fractal R4 case and my max temps are up to 80 degrees, but it's not particularly hot in Europe right now.
I've read that the VRAM chips (my card has SK Hynix) overheat on this card, so does the card have the potential to end up with burned VRAM?
I don't understand it very well, and now I'm unsure what to expect and whether to do anything.


----------



## cremelo

Quote:


> Originally Posted by *smoke2*
> 
> I'm owning ASUS DCII TOP 280X and ASUS posted new BIOS on their support site, where is description:
> "BIOS update to remove rare artifacting events during gaming, Please also update the Catalyst driver to 13.251 or above"
> I was wondering to flash it immediately, but then make a decision to ask on ASUS tech. support if new BIOS is affecting cooling.
> 
> Their reply surprised me: "Please, don't update your BIOS if your card don't have any problem. It's very serious interference into graphic card and if it is not necessary, it's better to avoid it.
> 
> I'm operate it in Fractal R4 case and my max. temps are to 80 degrees, but in Europe is relatively not much hot now.
> I was reading some VRAM chips (my card have SK Hynix) are overheating on this card, so the card have a potential to have burned VRAM?
> I don't understand it much good and now I'm disappointed what to expect and if something to do


I did the BIOS update on my old Asus R9 280X DC2 Top and it worked fine; I never saw artifacts after the update. Asus support says not to do it so you don't lose the warranty....
But my old Asus burned while updating the driver to 14.2


----------



## smoke2

Quote:


> Originally Posted by *cremelo*
> 
> I make BiosUpdate on my old Asus R9 280X DC2 Top and worked fine and I never saw artifacts after update, Asus suport say to dont make to u dont lost the warranty....
> But my old Asus burned updating driver for 14.2


And what about before the update? Did you see artifacts?
Do you think it's better to update my current BIOS even though I haven't had any problems with artifacts yet?


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Recr3ational*
> 
> What's the ASIC on your power colour?
> Also what is the max voltage your card allows?


TBH I had to google what ASIC is.








If you mean ASIC quality, GPU-Z says it's 64.9%
1300mV is max, that's also what I run to achieve these clocks.


----------



## Recr3ational

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> TBH I had to google what ASIC is.
> 
> 
> 
> 
> 
> 
> 
> 
> If you mean ASIC quality, GPU-Z says it's 64.9%
> 1300mV is max, that's also what I run to achieve these clocks.


Hmm, I was hoping yours had more, cos mine's the same.
I want moar!


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Recr3ational*
> 
> Hmm i was hoping that your had more, cos mines the same.
> I want moar!


I'm with you there. But then, for overclockers, nothing but infinite is enough.


----------



## Arkanon

I wouldn't pay too much attention to the whole ASIC quality report. I had a GTX 660 with an ASIC quality of over 85%, yet it didn't overclock one bit. My current 270X has an ASIC quality of 76% and overclocks like a dream, so personally I don't see any value in the ASIC quality measurement. Just saying.


----------



## Recr3ational

Quote:


> Originally Posted by *Arkanon*
> 
> I wouldn't pay too much attention to that whole ASIC quality report. I had a gtx 660 with a ASIC quality of over 85%, yet it didn't overclock for one bit. Current 270x has a ASIC quality of 76% and overclocks like a dream, so for me personally i don't see any line in the measurement of ASIC quality. Just saying.


I don't either; I use it for temps.
Scratch that, I don't know what I was on about. Sorry, tired.


----------



## cremelo

Quote:


> Originally Posted by *smoke2*
> 
> And what before update? Did you see the artifacts?
> Do you think it's better to update my current BIOS despite I haven't a problem with artifacts yet?


No artifacts....
I can only tell you to do the BIOS update if you think it's better '-' For me the BIOS update was fine; the bad part is the card burned later when I installed the 14.2 beta =/


----------



## Unknownm

Quote:


> Originally Posted by *rdr09*
> 
> all that matters is it is working. i crossfired 7950 and 7970 in the past and *i only used one bridge but was installed differently. your bridges seem to be installed the other way around.*


lol, it doesn't matter which way it goes. You can even have each one going the opposite direction and CF still works, so I'm pretty sure bridge placement does not matter. Personally I'd rather use one bridge, so that if anything were to happen to it I could install the other, though CF bridges are 5-10 dollars after shipping.

If there are driver hacks or another method of using just one bridge, let me know.

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah I have a 7950 (with 7970 bios) currently crossfire with a 280x with a single bridge works fine. Though my second 280x is coming this week. I'll see if it needs 2 bridges.
> 
> @M1kuYheAwesome
> What's the ASIC on your power colour?
> Also what is the max voltage your card allows?


Also, let me know if you have to use both bridge connectors.


----------



## Majentrix

Is it possible to raise the voltage of the 270 above 1206 mV at all?


----------



## smoke2

Quote:


> Originally Posted by *cremelo*
> 
> No artifacts....
> I can tell u to make BiosUpdate... i make if u tink is better '-' For me is better to make biosupdate, bad they burned latter i install the 14.2 Beta =/


Was your card overclocked?
What was the max temperature under load?
Did you notice different fan RPMs with the new BIOS, or any change in VRAM or GPU frequencies?


----------



## cremelo

Quote:


> Originally Posted by *smoke2*
> 
> Have your card overclocked?
> How about max. temperature in load?
> Didn't you noticed different fan RPM's with new BIOS or change of VRAM or GPU frequencies?


Before it burned, the answers were:
- Overclocked to 1170 MHz/6500 at Power Limit 120%, Vcore 1.27
- Max temp was 74°C
- No, because I always ran the fan at 50%


----------



## smoke2

Quote:


> Originally Posted by *cremelo*
> 
> Before it burns the answers are:
> - Already overclocked to 1170Mhz/6500 on Power Limit 120% Vcore 1.27
> - Max temp. is 74ºC
> - No because I always used the fan at 50%


I have a silenced Fractal R4 with 2 front 140mm fans and one 140mm exhaust, and get temperatures up to 78°C.
I must say it's not very hot outside, and the fans run at about 1540 RPM.
I'm on the stock factory-overclocked frequencies...


----------



## cremelo

Quote:


> Originally Posted by *smoke2*
> 
> I have Fractal R4 silenced with 2 front 140mm fans and one 140mm as an outtake and have temperatures to 78ºC.
> Must to say outside is not much hot and fans have about 1540 rpms.
> Im on stock overclocked frequncies...


At which ambient temperature?
Here, at an ambient temperature of 18°C it stays at 71-72°C, and at an ambient of 27°C it's around 72-74°C....

PS: Have you tried increasing the power limit to 120% and the Vcore to 1.21?

I sent my old Asus R9 280X DC2 Top to the store for RMA








It will take maybe a month for it to get here







I'm even thinking about running CrossFireX with it, or trying to sell it and getting a Vapor-X


----------



## Recr3ational

Quote:


> Originally Posted by *cremelo*
> 
> which ambient temperature????
> Here on Ambient temperature on 18ºC stay on 71ºC~72ºC and Ambient Temperature on 27ºC around 72ºC~74ºC....
> 
> PS: tried to increase the power limit to 120%? and Vcore to 1.21?
> 
> I sent my old Asus R9 280X DC2 Top to RMA Store
> 
> 
> 
> 
> 
> 
> 
> 
> Within 1 month maybe she gets here
> 
> 
> 
> 
> 
> 
> 
> even think about doing it with CrossFireX or try to sell it and get a Vapor-X


Underwater. The 280x underwater owns. Just saying


----------



## smoke2

Quote:


> Originally Posted by *cremelo*
> 
> which ambient temperature????
> Here on Ambient temperature on 18ºC stay on 71ºC~72ºC and Ambient Temperature on 27ºC around 72ºC~74ºC....
> 
> PS: tried to increase the power limit to 120%? and Vcore to 1.21?
> 
> I sent my old Asus R9 280X DC2 Top to RMA Store
> 
> 
> 
> 
> 
> 
> 
> 
> Within 1 month maybe she gets here
> 
> 
> 
> 
> 
> 
> 
> even think about doing it with CrossFireX or try to sell it and get a Vapor-X


It's also about 18°C here and I get up to 78°C, but my case is silenced, so yours probably has a better chance of pulling in fresh air.

No, I didn't overclock it.

What was wrong with your RMA'd old ASUS DC2 TOP?


----------



## cremelo

Quote:


> Originally Posted by *smoke2*
> 
> Here is also to 18ºC and I have to 78ºC, but my case is silenced, so your probably have better possibilities to suck the fresh air.
> 
> No, I didn't overclocked it.
> 
> What was bad with your RMAed old ASUS DC2 TOP?


I updated the driver to 14.2 and it stopped working with any driver installed; connected without a driver it works, but there's no information about it, as if it didn't exist. I sent it to a friend and he said the same, that it was the chip. After a whole day of trying to fix it somehow, the fan wouldn't even turn on... Dead GPU =/

And the worst part: it was imported, so it will take a month for a new one to arrive.









But my Toxic now only reaches 68°C with the fan on auto








Quote:


> Originally Posted by *Recr3ational*
> 
> Underwater. The 280x underwater owns. Just saying


In my country, only if I import one =/


----------



## Majentrix

Can I get a clock update?
I've lost the little heatsinks I was going to use for the memory, so I have a big fan pointed at the chips until I can get some more.


----------



## bardacuda

Quote:


> Originally Posted by *Majentrix*
> 
> Is it possible to raise the voltage of the 270 above 1206 at all?


Quote:


> Originally Posted by *Majentrix*
> 
> Can I get a clock update?
> I've lost the little heatsinks I was going to use for the memory, so I have a big fan pointed at the chips until I can get some more.


I haven't been able to change the voltage on the Gigabyte, either up or down, even with a modded BIOS.
I had an ASUS 270 though, and that one was changeable through BIOS editing. I had to send it back because the cooler was defective, and I'm still waiting on the replacement.








So to answer your question, I guess it depends on the board maker. For the Gigabyte 270 (rev. 1.0 at least) the answer appears to be 'no'.
Too bad though, cuz with those frosty temps it's begging for more juice! :/


----------



## Devildog83

*Majentrix* , clocks are updated.

Small update to my build.


----------



## Majentrix

It's a shame really; were it not for the voltage lock I'd be getting 1200 MHz and over easily.
Between you and me, I bought the Gigabyte version only because it had two six-pin power connectors. Later I found out the cooler was awful (that tiny heatsink really doesn't cut it for how much heat the card can put out), and now I find out the card is voltage locked! Looks like I should've bought the Asus model, hah.


----------



## cremelo

Quote:


> Originally Posted by *Devildog83*
> 
> *Majentrix* , clocks are updated.
> 
> Small update to my build.


LoL
What is your case?


----------



## Recr3ational

Go underwater!
Nice rig dude!


----------



## DiceAir

Hi all

I have a simple issue. I have 2x R9 280X Club3D cards and I'm playing at 120 Hz 2560x1440, so as you can see my cards are expected to get warm, but I feel it was getting a bit too warm for my taste.

All my fans are spinning at 100% except for my H100i fans. CPU temps are still fine, but my GPU went up to 90°C and I feel that's a bit hot. I have the following fans in my case:

Back: the normal AF140 that came with the case
Front: 3x Cougar Vortex PWM fans
Top: Corsair H100i radiator with stock fans in pull.

So is there any way of getting more air between my cards, or pulling more air out? Should I change the fans for better ones, or would that be a waste of money? Or should I be fine running my GPU at 90°C as long as it's not for super long periods of time? It's only a few games where it gets that hot, so not a super big deal I think; I just want to get my temps a bit lower if I can.

I was thinking about making a hole in the side so that I can add another 120-140mm fan there to cool my GPUs, but I'm not so skilled at that and I don't know anyone who can do it for me, as I'm in South Africa. So please suggest what I should do, or whether there is anything I can do to keep my temps lower.

Oh, before I forget, my ambient temps = 23-25°C

Thanks


----------



## Recr3ational

Best solution is probably to add a 120 in the side panel.


----------



## Devildog83

Quote:


> I'm sitting with a simple issue. I have 2x R9-280x club3d cards. So I'm playing @ 120hz 2560x1440 so as you can see my cards should get warm but I feel it was getting a bit to warm for my taste.
> 
> Having all my fans spinning at 100% except for my h100i fans. Cpu temps is still fine but my GPU went up to 90C and i feel that's a bit hot. I have the following fans in my case
> 
> Back; normal af140 that came with the fan
> Front 3x Cougar vortex PWM fans
> Top Corsair H100i radiator with stock fans as pull.
> 
> So is there any way of getting more air between my cards or puling out more air. Should i change the fans for better ones or will i waste my money or should i still be fine running my GPU on 90C as long as it's not for super long periods of rime? It's only like a few games where it will get that hot so not a super big deal i think just want to get my temps a bit lower if can.
> 
> i was thinking about making a whole in the side so that I can add another 120-140mm fan there to cool my gpu's but i'm not so skilled with that and i don't know of anyone that can do it for me as i'm in South Africa. So please suggest me what I should do or if there is anything I can do to keep my temps lower.
> 
> Oh before i forget my ambient temps = 23-25C
> 
> Thanks


Are the GPU fans running at 100% too? If not, you might have to adjust your fan profile higher to compensate. I have a profile that ramps my fans to 60% @ 60°C and 75% @ 70°C. I sit at 40% fan at idle and I can barely hear them. It might be a tad loud, but mine never really get above 70°C in gaming, except occasionally in BF4.
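That kind of profile (40% at idle, 60% at 60°C, 75% at 70°C) amounts to a piecewise-linear fan curve. The points here come from the post; the linear interpolation between them, and treating 40°C as "idle", are my assumptions for the sketch, not any vendor tool's exact behaviour.

```python
# (temperature °C, fan duty %) points of the curve described above
FAN_CURVE = [(40, 40), (60, 60), (70, 75)]

def fan_speed(temp_c):
    """Return fan duty (%) for a GPU temperature, clamped at the curve ends."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    if temp_c >= FAN_CURVE[-1][0]:
        return FAN_CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

Adding or moving points (for example pushing the 70°C point to 100%) is how you trade noise for temperature headroom.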


----------



## DiceAir

Quote:


> Originally Posted by *Recr3ational*
> 
> Best solution is probably to add a 120 in the side panel.


I can try removing one of my HDDs at the bottom and adding the 140mm fan that came with my case to help with the cooling. I was thinking of changing the thermal paste, but I'm scared of doing that and making things worse, which is why I'll first try a safer and cheaper solution.

Do you guys think 90°C is fine? It's not doing that all the time in gaming; BF4 will bring it to about 80-84°C, but Tomb Raider and Crysis 3 will let it go to 90°C.


----------



## Devildog83

Quote:


> Originally Posted by *DiceAir*
> 
> i can try removing one of my hdd at the bottom and add a 140mm fan that came with my case to help with the cooling. i was thinking of changing the Thermal paste but I'm scared doing that and make things worse and that's why i will first try doing a safer and cheap solution to fix it.
> 
> Do you guys think 90C is fine. it's not doing that all the time in gaming. BF4 will bring it to about 80-84C but tomb Raider and Crysis 3 will let it go 90C


I know the R9 290/X cards can go there, but I'm not sure if it's OK for the 280X.


----------



## DiceAir

Quote:


> Originally Posted by *Devildog83*
> 
> Are the fans GPU fans running at 100% too. If not you might have to adjust your fan profile higher to compensate. I have a profile that ramps my fans to 60% @ 60c and 75% @ 70c. I sit at 40% fan at idle and I can barely hear them, It might be a tad loud but mine never really get above 70c in gaming except occasionally in BF4.


Yip, fans at 100%. Now I'm not sure whether the graphics card fans are intake or exhaust; does anyone know?


----------



## Recr3ational

Actually, if you can change the thermal paste, do it.

I did that with my old MSI 7950 and dropped 10°C with IC Diamond. MSI put way too much on there.


----------



## DiceAir

I also noticed that when I change my voltage, my GPU doesn't revert back to idle voltage; it stays at my new voltage.

I lowered the voltage to 1.250 and played about 1 hr of Battlefield; the highest temp was 75°C.


----------



## Recr3ational

Quote:


> Originally Posted by *DiceAir*
> 
> I also noticed when I change my voltages my gpu doesn't revert back to idle voltages and stay on my new voltage.
> 
> I lowered the voltage to 1.250 and played about 1hr of battlefield and highest temps was 75C


75°C is decent; I don't think you have to worry about it. Also, what do you mean about the voltages exactly?


----------



## DiceAir

Quote:


> Originally Posted by *Recr3ational*
> 
> 75c is decent? I don't think you have to worry about it. Also what do you mean with the voltages exactly?


I mean that after changing it to 1.250V, my voltage during desktop use is still 1.250V and doesn't drop to 0.950V, so it's still good. I'll test setting the voltage lower, but I think 1.250V is OK. I also asked about temps in the Air 540 thread, and someone there told me I should make my top fans intakes rather than exhausts; I should get lower CPU and GPU temps.


----------



## Recr3ational

Quote:


> Originally Posted by *DiceAir*
> 
> i mean my voltage after changing it to 1.250 on desktop usage is still 1.250V and doesn't drop to 0.950 so it's still good. Will test setting my voltage level lower but i think 1.250 is ok. I also asked about temps in the Air 540 thread and someone told me there that i should rather make my top fans as intake rather than exhaust. i should be getting lower cpu temps and lower gpu temps.


I'm not very good with air cooling since my rig's under water, but on mine, all of the fans apart from the rear exhaust are intakes. It was like that when it was running on air and it was fine, so if you can, it's a good idea to make the extra fans intakes.

As for the voltage: is your PC running in performance mode? Some people say that causes it, some say it doesn't. My first GPU is always at 1.3V since I have 3 monitors, but the second GPU idles as it's supposed to. If you have a 120Hz monitor, that seems to cause it too.


----------



## DiceAir

I fixed the voltage thing by unchecking "Force constant voltage" in MSI Afterburner. Now my GPU runs at 73°C with 83% fan speed when playing BF4. Maybe some more demanding games will push it a bit further, but for now it's OK. I'll still change my top fans to intake. BTW, my top fans sit above the radiator.


----------



## Devildog83

So much fun !!


----------



## Recr3ational

Quote:


> Originally Posted by *DiceAir*
> 
> I fixed the voltage thing by unchecking "Force constant voltage" in MSI afterburner. So now my gpu is running 73C on 83% fan speed when playing bf4. Maybe some more demanding games will push it a bit further but for now it's ok. Will still change my fans as intake at the top. BTW my fans on top is sitting above the radiator


Oh, forgot about constant voltage. It's really up to you, mate. Whatever you think is best.


----------



## DiceAir

Quote:


> Originally Posted by *Recr3ational*
> 
> Oh forgot about constant voltage. It's really up to you mate. What do you think it's best.


Well, so far so good. Without constant voltage I get better idle temps and it's stable in BF4 so far, so I'm happy with my cards now.


----------



## repo_man

I'm looking to watercool my R9 270X; what options are out there? What is everyone using? I'd prefer a uni-block, but I'm open to other options.


----------



## Majentrix

EK make a great universal block, look into the EK VGA Supremacy block, which is what I'm using at the moment. I can show you photos if you'd like.


----------



## repo_man

Quote:


> Originally Posted by *Majentrix*
> 
> EK make a great universal block, look into the EK VGA Supremacy block, which is what I'm using at the moment. I can show you photos if you'd like.


Yea, I'd love some pics!


----------



## Recr3ational

Quote:


> Originally Posted by *DiceAir*
> 
> well so far so good. Without constant voltage I can get better temps at idle and stable in BF4 so far so happy with my cards now.


Then you don't have to change anything.
I doubt it would make a lot of difference anyway.


----------



## cremelo

Small update '-'
New PSU; now I just need a new case and another Toxic.


----------



## mikemykeMB

Quote:


> Originally Posted by *repo_man*
> 
> I'm looking to watercool my R9 270X, what options are there out there? What is everyone using? I'd prefer a uni-block, but I'm open to other options.


Have to check the AMD/ATI GPU Mod Club, AKA "The Red Mod", if modding is your dig, or go with a custom loop from EK, Swiftech, Alphacool, etc. and piece parts together; ya know, the ones that have kits. I did a mod with a Kuhler 620 pump head and radiators, with a standalone MCP655 pump and tubing.


----------



## Majentrix

Lighting is awful, but I'm sure you can see what matters.
One thing to note is that any memory chips above the block will need very low-profile heatsinks on them. I don't have mine installed at the moment, but when I do put them on I'll be sure to post a photo.


----------



## repo_man

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> Lightning is awful but I'm sure you can see what matters.
> One thing to note is that any memory chips above the block will need very low profile heatsinks on them. I don't have mine installed at the moment but when I do put them on I'll be sure to post a photo.


Yeah, I was just about to ask which memory sinks you're using. I've WCed before, so it's nothing too new for me, just not sure what's good anymore atm, lol. Last time I had a full loop I was running an Intel LGA 775 Q6600 and a GTX 275.


----------



## NostraD

Thought this would be the place to get some opinions regarding the R9 280 just showing up on Newegg:
Looks like the cheapest R9 280 is just $10 less than the cheapest R9 280X. I've been waiting on this card, but now I'm thinking for $10 I might as well get the "X"?
What is the consensus? Oh, and it would primarily be used for mining at night and gaming by day.


----------



## Recr3ational

Quote:


> Originally Posted by *NostraD*
> 
> Thought this would be the place to get some opinions regarding R9 280 just showing up on N ewegg:
> Looks like the cheapest R9 280 is just $10 less than the cheapest R9 280X. I've been waiting on this card, but now i'm thinking for $10 I might as well get the "X" ?
> What is the concensus? Oh, primarily using it for mining at night and gaming by day.


$10 for a better card is a no brainer. Go for it. I love the 280x.


----------



## repo_man

Quote:


> Originally Posted by *NostraD*
> 
> Thought this would be the place to get some opinions regarding R9 280 just showing up on N ewegg:
> Looks like the cheapest R9 280 is just $10 less than the cheapest R9 280X. I've been waiting on this card, but now i'm thinking for $10 I might as well get the "X" ?
> What is the concensus? Oh, primarily using it for mining at night and gaming by day.


I originally bought an R9 270 and ended up RMAing it. I spent another $40 and got an R9 270X and haven't looked back. $10 for the better card is TOTALLY worth it. I mean, don't eat out twice this week and you've saved that much, lol.


----------



## Recr3ational

My second 280x came today. Tri fire?








I'll update my clocks when I get the blocks on it.


----------



## Devildog83

Quote:


> Originally Posted by *repo_man*
> 
> I originally bought an R9 270 and ended up RMAing it. I spent another $40 and got a R9270X and haven't looked back. $10 for the better card is TOTALLY worth it. I mean, don't eat out twice this week and you've saved that much, lol.


I agree, that's my days worth of munchies from the mini-mart.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> My second 280x came today. Tri fire?
> 
> 
> 
> 
> 
> 
> 
> 
> I'll update my clocks when I get the blocks on it.


Looking forward to that.


----------



## Recr3ational

So tempting man, I really want to go tri-fire.


I added a third rad to the rig.
The UK has run out of XSPC 45-degree fittings for some reason, so this has to do for a while.


----------



## smoke2

I've owned a Fractal R4 case for a few weeks and also bought a new ASUS DCII 280X TOP.
I've removed the top HDD cage, added an Enermax T.B.Silence UCTB14A fan in the upper front position, and set the regulator on the Enermax fan to maximum.
The case's regulator is set to 7V.
I have three fans installed (the two original R2 fans and one Enermax).

I played Company of Heroes 2 in a hot room for about an hour, and the max temperature according to HWiNFO was 81 degrees on the GPU and 88 degrees on the VRM.

Do you think that's an OK temperature, or is something seriously wrong?
What would happen if I didn't have any of the fans?

I'd be glad for any insights.


----------



## Recr3ational

Quote:


> Originally Posted by *smoke2*
> 
> I'm owning Fractal R4 case some weeks and bought also new ASUS DCII 280X TOP.
> I've removed top HDD cage, added Enermax T.B.Silence UCTB14A fan into the upper front position and regulator on Enermax fan adjusted to maximum.
> On regulator case it is set 7V value.
> I have three installed fans (two original R2 and one Enermax).
> 
> I've played Company of Heroes 2 in hotter room about an hour and the max. temperature according to HWinfo was 81 degrees on GPU and 88 degrees on VRM.
> 
> Do you think it's OK temperature or is something seriously wrong?
> What will be happened if I haven't got any of the fans?
> 
> Would be glad for any insights.


It's hot, but not "blowing up" hot.
It could be better though, so if you can make it cooler, that would be advisable. Add extra fans?

My new clocks are
1280 / 1650.
Temps are 25°C idle and 48°C after Heaven, so that's good.

I think I can get more.


----------



## repo_man

Quote:


> Originally Posted by *mikemykeMB*
> 
> Have to check the AMD/ATI GPU Mod Club - AKA "The Red Mod"', if your dig is modding or go cooly "custom" loop with EK-Swifttech-AlphaCool... piece part together--- etc...etc ya know ones that the have kits. Did a mod with Kuhler 620 pump head and radiators with stand alone pump MCP655 and tubing.


Thanks man. I've built my own loops before, though. I was asking here to see what specific blocks guys are using on these cards. Figured I'd go straight to the horse's mouth, so to speak.


----------



## mikemykeMB

Quote:


> Originally Posted by *repo_man*
> 
> Thanks man. I've built my own loops before, though. I was asking here to see what specific blocks guys were using on these cards. Figured I'd start at the horse's mouth, so to say.


Gotcha, I was leaning toward that notion, just my two cents. I'm on the fence about changing the block to something slimmer and sleeker, universal or whatnot. I don't see any full-cover blocks for 270Xs.


----------



## repo_man

Quote:


> Originally Posted by *mikemykeMB*
> 
> Gotcha..was leaning on that notion, just my two cents. I'm on the fence with changing the block to something slim-sleek, better..universal what not. Don't see any full cover's for 270x's.


I ran a uni-block on my last three cards and can't speak highly enough of them. For me, they pay dividends in longevity and usefulness. Plus, you'll get a better resale out of them later on. No one is paying good money for full-coverage blocks on cards 3 gens behind, ya know?


----------



## mikemykeMB

Quote:


> Originally Posted by *repo_man*
> 
> I ran a uniblock on my last three cards and can't speak highly enough of them. For me, they pay dividends for longevity and usefullness. Plus, you'll get a better resell out of them later one. No one is paying good money for full coverage blocks on cards 3 gens behind, ya know?


That's right, $$: the root of... well, good idea then to stay ahead on what makes the most sense there. The whole liquid-cooled, overclocked, memory-tweaked thing is just putting the power to the ground and seeing how it handles; like they say, deeper pockets get you faster down the road.


----------



## cremelo

Quote:


> Originally Posted by *smoke2*
> 
> I'm owning Fractal R4 case some weeks and bought also new ASUS DCII 280X TOP.
> I've removed top HDD cage, added Enermax T.B.Silence UCTB14A fan into the upper front position and regulator on Enermax fan adjusted to maximum.
> On regulator case it is set 7V value.
> I have three installed fans (two original R2 and one Enermax).
> 
> I've played Company of Heroes 2 in hotter room about an hour and the max. temperature according to HWinfo was 81 degrees on GPU and 88 degrees on VRM.
> 
> Do you think it's OK temperature or is something seriously wrong?
> What will be happened if I haven't got any of the fans?
> 
> Would be glad for any insights.


Send it in for RMA, or do like Recr3ational says and put it under water.








Those are very strange temps...


----------



## smoke2

Quote:


> Originally Posted by *Recr3ational*
> 
> Its hot but not "blowing up" hot.
> It could be better though. So if you could make it cooler it would be advisable. Add extra fans?
> 
> My new clocks are
> 1280 / 1650
> Temps are 25c idle 48c after heaven so that's good.
> 
> I think I can get more.


I'm only on the stock overclocked frequencies (1070/1600).

I was wondering about putting one fan on the bottom.
Which way should I orient the fan, as intake or exhaust?
Wouldn't it then steal air that the PSU fan next to it needs?
I have a carpet with short hair.


----------



## Recr3ational

Put wood underneath your PC. I've done the same; it helps. Also, intake is always handy.


----------



## Devildog83

I used 2 pieces of wood and a "Lazy Susan" so I could rotate the rig a bit.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I used 2 pieces of wood and a "Lazy Susan" so I could rotate the rig a bit.


"Lazy Susan"
Such an awesome name lol.


----------



## Devildog83

Should be "lazy Devil",







That thing makes it so much easier to work on too, but I have to figure out a way to stop it better because it moves a bit. I think I'll use a dowel and holes for different spots.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Should be "lazy Devil",
> 
> 
> 
> 
> 
> 
> 
> . That thing makes it so much easier work on too but I have to figure out a way to stop it better because it moves a bit. I think I will use a dowel and holes for different spots.


Self adhesive foam!


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Self adhesive foam!


I need it to move but be able to lock it down once I have it where I need it.


----------



## lightsout

What's up guys. I got into mining recently but am getting out of it. I'm going to keep two of my Sapphire Dual-X 270s for gaming. They seem to bench as well as the 780 I sold, so I should be good there.

I have been away from AMD for a while; these cards have voltage unlocked in TriXX, but in Afterburner, even if I check "unlock voltage", I can't adjust it in AB itself. Is there any trick to get this to work, or do I need to use TriXX for these Sapphire cards?

My conf file:

Code:


UnofficialOverclockingEULA   = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode      = 2
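Those two keys are what switch Afterburner into unofficial overclocking (mode 2, with the EULA string accepted). A hypothetical sanity check for a profile dump like that, just scanning key=value lines; the function name is made up for illustration:

```python
def unofficial_oc_enabled(cfg_text):
    """Return True if a cfg dump has unofficial overclocking accepted and on."""
    vals = {}
    for line in cfg_text.splitlines():
        line = line.strip()
        # skip comments and [section] headers, keep key=value pairs
        if "=" in line and not line.startswith((";", "[")):
            key, _, value = line.partition("=")
            vals[key.strip()] = value.strip()
    return (vals.get("UnofficialOverclockingMode") == "2"
            and vals.get("UnofficialOverclockingEULA", "").startswith("I confirm"))
```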


----------



## Recr3ational

My 280Xs can't be adjusted by AB either, so just use TriXX; it's not much of a difference anyway.


----------



## lightsout

OK. I'm just used to AB. I didn't really give TriXX a chance; I'll have to check it out. Thanks.


----------



## Recr3ational

Quote:


> Originally Posted by *lightsout*
> 
> OK. I'm just used to ab. Didn't really give trixx a chance I'll have to check it out. Thanks.


Yeah, me too. I loved Afterburner. You can still use its features, just change your clocks using TriXX.


----------



## lightsout

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> OK. I'm just used to ab. Didn't really give trixx a chance I'll have to check it out. Thanks.
> 
> 
> 
> Yeah me too. I loved afterburner. You can still use the features, just change you clocks using trixx

Thank you. +rep


----------



## Pionir

Which drivers should I install for R9 270 ?

With 13.12 I get "installation complete (warnings occurred)"...

Everything works fine until I enable CFX, then I get a BSOD:

Problem signature:
Problem Event Name: BlueScreen
OS Version: 6.1.7600.2.0.0.256.1
Locale ID: 4122

Additional information about the problem:
BCCode: 116
BCP1: FFFFFA80045194E0
BCP2: FFFFF880043816CC
BCP3: 0000000000000000
BCP4: 0000000000000002
OS Version: 6_1_7600
Service Pack: 0_0
Product: 256_1

I have 2x R9 cards: a Gigabyte 270 OC in PCIe slot 1 (1050 MHz) and an ASUS 270X in PCIe slot 2 (1050 MHz), and it seems MSI Afterburner has problems loading the application profile...


----------



## Recr3ational

Try out the new beta drivers; maybe they can sort your issues?


----------



## Pionir

Ok I will try these amd_catalyst_13.11_beta v9.5.


----------



## Recr3ational

Quote:


> Originally Posted by *Pionir*
> 
> Ok I will try these amd_catalyst_13.11_beta v9.5.


Try 14.2


----------



## Unknownm

Speaking of the 14.2 beta, I have an issue with it. In BF4, after finishing a game, when it loads the next level the second card only kicks into 2D clocks (500MHz compared to 1.1GHz). 13.12 works fine; it only seems to be 14.2.


----------



## Recr3ational

Quote:


> Originally Posted by *Unknownm*
> 
> Speaking of 14.2 beta I have a issue with them. In bf4 after finishing a game and it loads the next level the second card is only kicking in 2d clocks (500Mhz compared to 1.1Ghz). 13.12 works fine, only seems to 14.2


Yeah, it's causing a lot of issues.
Revert back if it causes you too much trouble. After about my 5th reinstall it seems to be fixed.
Make sure you uninstall it properly though.


----------



## Unknownm

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah its causing a lot of issues.
> revert back if it causes you to much. After like my 5th reinstall it seems to be fixed.
> Make sure you uninstall it properly though.


Display Driver Uninstaller. 13.12 is the only stable driver that doesn't cause any random errors or 2D clock issues. When will AMD release stable drivers and not "beta"?


----------



## Recr3ational

Quote:


> Originally Posted by *Unknownm*
> 
> Display Driver Uninstaller. 13.12 are the only stable drivers that don't cause any random errors or 2d clock. When will AMD release stable drivers and not "beta"


Yeah, sometimes it's like that; I don't know why.
Next time you uninstall, try the AMD CleanUp Utility.

I don't know, just trying to help.


----------



## rickyman0319

I am looking to buy a 270(X) for a mining rig and wondering which brand and model I should buy. It has to be quiet, not loud. Do you guys have any suggestions on which brand and model to get?


----------



## mAs81

The MSI Gaming series is very quiet. Have a look at them.


----------



## lightsout

Quote:


> Originally Posted by *rickyman0319*
> 
> I am looking to buy 270 (x) for mining rig. I am wondering what brand and model shall I buy . it has to be quiet and not loud. do u guys have any suggestion of what brand and model I shall buy?


I have been happy with the Sapphire Dual-X. I had four of them mining (the non-X is better since it's only one six-pin). I'm selling them though, as mining is in a down state.


----------



## Recr3ational

What are the safe temps for VRMs? Mine's at 70°C; is that okay?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Whats the safe temps for VRms? mines at 70 is it okay?


If that's max it's not bad, I could live with that at max if I was overclocked.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> If that's max it's not bad, I could live with that at max if I was overclocked.


Yeah, that's max. I think there's something wrong with my loop; it's higher than before. Gonna re-loop it, I think.


----------



## mikemykeMB

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah thats max, think theres something wrong with my loop. Higher than before. Gonna reloop it i think.


How much higher than before? On the 600T rig you built, doing a re-loop might get you better or worse; it'd be a damn good time to check for flow blockage and pump hiccups. Seems like a headache to redo a nice build with bent tubes, though.


----------



## Recr3ational

Quote:


> Originally Posted by *mikemykeMB*
> 
> How much than before?..600T rig u built and doing a re-loop might get you better for worse..be a damn good time to check for flow-blockage-pump hiccups.. seems to be a headache to do a re-do on a nice build-bend of tubes.


Like by 20 degrees. I think I know what it is. The UK has run out of 45-degree fittings, so I used a ghetto loop-the-loop when I added the extra rad. That's probably why. It's still within its limits, just annoying lol.


----------



## mikemykeMB

Quote:


> Originally Posted by *Recr3ational*
> 
> Like by 20 degrees. I think I know what it is. UK has ran out of 45 degree fitting so I used a ghetto loop the loop when I added an extra rad. Thats probably why. It's still with its limits just annoying lol


Oh, that doesn't seem functional or stylish, lol. You'll get it back within limits.


----------



## Pionir

Quote:


> Originally Posted by *Recr3ational*
> 
> Try 14.2


I will try 14.2 now.

Everything works pretty well, and I have no BSOD with one card.

The main problem is the GPU load: in the Crysis 3 "Welcome to the Jungle" stage, 57% average for the Gigabyte R9 270 OC in a single-card test.

The 2600K runs smoothly at circa 55% max load.

I tried this: http://answers.microsoft.com/en-us/windows/forum/windows_vista-hardware/catalyst-control-centre-host-application-has/0a0ef1b0-117d-43d8-a43f-9b24228b660b

I found some viruses and removed them, but it did not help.

I tried Driver Sweeper and Driver Fusion, both with the same result = it did not help = "installation complete, warnings occurred"!!


----------



## Recr3ational

Quote:


> Originally Posted by *Pionir*
> 
> I will try now 14.2.
> 
> Everything works pretty well, and I have no BSOD with one card.
> 
> The main problem is the GPU load, Crysis 3 Welcome to the Jungle stage, 57 % AVG for Gigabyte R9 27OC, single card test.
> 
> 2600K runs smoothly cca 55% max load.
> 
> I tried this ; http://answers.microsoft.com/en-us/windows/forum/windows_vista-hardware/catalyst-control-centre-host-application-has/0a0ef1b0-117d-43d8-a43f-9b24228b660b
> 
> I found some viruses and solve that, but it did not help.
> 
> I try Driver Sweeper and Driver Fusion, both the same result = It did not help = installation complete warnings occurred !!


Oh okay, that's why then. When you uninstalled, there were still remnants of the old drivers left.
That's why you get the errors, meaning the driver isn't properly installed.

Uninstall using the AMD CleanUp Utility.
Reboot.
Scan the registry using CCleaner or something similar.
Reboot.
Then try to install again.

If it's all working fine now, just leave it as it is.
That's what I do.


----------



## rdr09

Quote:


> Originally Posted by *Recr3ational*
> 
> Oh okay, Thats why then. When you uninstalled theres still remnants of old drivers still left.
> Thats why you get the errors. Meaning the driver isnt properly installed.
> 
> Uninstall using AMD CleanUP Utility,
> Reboot.
> Scan for registry, using CCleaner or somethign similar.
> Reboot.
> Then try to install again.
> 
> If its all working fine now. Just leave it as it is.
> That's what i do.


Rec, so do you really need 2 bridges? Those cards are fast, huh?


----------



## Recr3ational

Quote:


> Originally Posted by *rdr09*
> 
> Rec, so do you really need 2 bridges? those cards are fast, huh?


Oh yeah, 2 secs, I'll take one out. And the cards are fast, like 15%-20% faster than my 7950 CrossFire setup.

Edit: It seems like mine's working with one bridge. It doesn't say anything about needing two bridges either.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Oh yeah, 2 secs. Ill take one out. And the cards are fast. Like 15%-20% faster than my 7950 crossfire setup.
> 
> Edit: It seems like mines working with one bridge. Doesnt say anything about needing two bridges either.


Isn't the second bridge for tri-fire?


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Isn't the second bridge for tri-fire?


Yes, that's what I was confirming; someone said you needed two bridges.


----------



## rdr09

Quote:


> Originally Posted by *Recr3ational*
> 
> Oh yeah, 2 secs. Ill take one out. And the cards are fast. Like 15%-20% faster than my 7950 crossfire setup.
> 
> Edit: It seems like mines working with one bridge. Doesnt say anything about needing two bridges either.


----------



## Pionir

Sorry Recr3ational, but I don't want to use the AMD CleanUp Utility









It's a 65:35 chance - http://forums.guru3d.com/showthread.php?t=373855

I do it the old-fashioned way








- DISABLE the iGPU in the motherboard BIOS
- Uninstall via Control Panel
- Reboot
- In Device Manager, under Display adapters, it must show Standard VGA Graphics Adapter ...OK!
- Safe Mode + Driver Sweeper (with and without deleting all files, backups, and logs)
- Reboot
- CCleaner Registry clean
- Delete the AMD folder in Program Files (x64 and x86)








- Reboot and install the new driver - amd_catalyst_14.2_beta1.3 (current)

I've done it so many times, and again: "warnings occurred"









Sapphire TriXX: 1075 MHz stable so far... a 1125 MHz GPU clock is the final objective (same as the ASUS 270X).

We will see what happens when I run CFX @ 1125 MHz.


----------



## rickyman0319

What about the 280(X)? Which GPU cooler is silent or nearly silent?


----------



## mikemykeMB

Quote:


> Originally Posted by *Pionir*
> 
> Sorry Recr3ational, but I do not want to use AMD Clean UP Utility
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It is a 65:35 chance - http://forums.guru3d.com/showthread.php?t=373855
> 
> I do it in the old fashion way
> 
> 
> 
> 
> 
> 
> 
> 
> - DISABLE the iGPU in the motherboard BIOS
> - Control Panel uninstall
> - Reboot
> - Device Manager under Display adapters must be Standard VGA Graphics Adapter ...OK!
> - Safe mode + Driver Sweeper (with and without delete all files, backups and log)
> - Reboot
> - CCleaner Registy
> - Program files (x64 and x86) delete folder AMD
> 
> 
> 
> 
> 
> 
> 
> 
> - Reboot and Install new driver - amd_catalyst_14.2_beta1.3 (current)
> 
> I did it so many times and again "warnings occurred"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire TriXX 1075 MHz stable so far ...1125 MHz GPU clock is final objective (as ASUS 270X).
> 
> We will see what will happen when I run CFX @ 1125MHz.


What error occurs? I get an error related to decoding and drag-and-drop... try a custom install and see if all goes well.


----------



## Unknownm

Quote:


> Originally Posted by *Recr3ational*
> 
> Yes, thats what i was confirming, someone said you needed two bridges.


Wow, I wonder why mine requires both. The AMD drivers don't give me the option to enable CF with only one bridge plugged in (left or right); it just says "both bridge interconnects must be attached".


----------



## DeviousAddict

Quote:


> Originally Posted by *Unknownm*
> 
> wow I wonder why mine require both. AMD drivers don't give the option in CF to enable with any one bridge plugged in (left or right), it just says "both bridge interconnects must be attached"


I think they mean that both ends of the bridge must be connected to a card, rather than that both bridge connectors on each card must be used.
I only have one bridge attached and CF works fine with my R9 280Xs.


----------



## Unknownm

Quote:


> Originally Posted by *DeviousAddict*
> 
> I think they mean that both sides of the bridge must be connected to a card, rather than both bridge connections on each card should be used.
> I only have one bridge attached and CF works fine with my R9 280X's


Ah, but that still doesn't explain why I can't enable CF with one bridge... unless one of them is faulty. I should pull them out and test each one.


----------



## DeviousAddict

Quote:


> Originally Posted by *Unknownm*
> 
> ah but still doesn't explain why I can't enable CF with one bridge.. Unless one of them is faulty, should pull it out and test each one.


Sounds plausible; might be worth testing both bridges. But if it's working with both bridges and there are no issues having both connected, you might as well leave them on.


----------



## Recr3ational

Quote:


> Originally Posted by *DeviousAddict*
> 
> sounds plausible. might be worth testing both bridges. but if its working with both bridges and there are no issues haveing both connected, you might as well leave them on.


Maybe it's your card mate.


----------



## lightsout

I got a faulty bridge with a Crosshair IV once. It happens.


----------



## Pionir

Quote:


> Originally Posted by *mikemykeMB*
> 
> What error occurs?.. I get an error that relates with the decoding drag and drop..try a custom install and see if all goes well.


I have no errors, everything installed properly; here it is: http://forums.amd.com/game/messageview.cfm?catid=454&threadid=172023
Single GPU load is OK at 95-99%, but I have a 2-3x longer boot time.








I will publish a picture when I get home... it now runs at 1125 [email protected]%-60 °C max, stable (3x case fans), and I may even push it further...


----------



## mikemykeMB

Quote:


> Originally Posted by *Pionir*
> 
> I have no errors, everything installed properly, here this is it ; http://forums.amd.com/game/messageview.cfm?catid=454&threadid=172023
> Single GPU load is OK 95-99%, but I have 2-3x longer boot time
> 
> 
> 
> 
> 
> 
> 
> 
> I will publish a picture when I get home...now works at 1125 [email protected]%-60 oC max stable (3x case fans) and may even push further...


Cool, right on, now go bump up the MHz. Good luck!


----------



## DiceAir

Has anyone tried the NZXT Kraken G10 and Antec Kuhler 620 on an R9 280X? I'm just scared about VRM temps. What if I keep the VRM plate on the GPU and install the bracket and AIO cooler on at least my top card? If everything works out, I hope I can overclock my cards much further, but this will cost me so much that in the end it will be more than 2x non-reference R9 290s.


----------



## Pionir

Quote:


> Originally Posted by *mikemykeMB*
> 
> Cool, right on, now go bump up the Mhz'$ Good luck












pic-mic ;




----------



## DiceAir

So I'm running 2560x1440 @ 120Hz. My friend was over at my place this weekend; he's running a GTX 760 Gaming. He was playing on a 1080p 60Hz panel, and I don't know if I'm imagining things or what, but every game looked smoother than on my screen. Maybe the frame times are worse on my cards.

I'm running Windows 8.1 with the latest 14.2 drivers. We'll see once they release the new drivers this week; fingers crossed for tomorrow.
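Perceived smoothness tracks frame-time consistency more than average FPS, which is why a 60Hz panel can look smoother than a faster setup that stutters. Given a per-frame capture in milliseconds (e.g. from FRAPS or a similar logger), a rough comparison is average FPS versus a high-percentile frame time; this helper is a sketch using the simple nearest-rank percentile:

```python
import math

def frame_stats(frametimes_ms):
    """Summarize a frame-time capture: average FPS and 99th-percentile frame time."""
    s = sorted(frametimes_ms)

    # nearest-rank percentile: the k-th smallest sample, k = ceil(p/100 * n)
    def pct(p):
        return s[max(0, math.ceil(p / 100 * len(s)) - 1)]

    return {
        "avg_fps": 1000.0 * len(s) / sum(s),
        "p99_ms": pct(99),  # spikes here show up as visible stutter
    }
```

Two captures with the same average FPS can have very different `p99_ms`; the one with the lower p99 will feel smoother.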


----------



## Roaches

Just ordered an R9 270X Devil edition today for my secondary gaming rig / home server build. Hopefully it arrives tomorrow. Given how sexy these cards look, I might get a second one to CrossFire for fun.



Will take unboxing pics once it arrives,


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Just ordered a R9-270X devil edition today for my secondary gaming rig/ home server build. Hopefully it arrives tomorrow. As much how sexy these cards look, I might get a second to crossfire for fun.
> 
> 
> 
> Will take unboxing pics once it arrives,


Awesome, you're gonna love it. X-Fire will be awesome if you go there. The core is a tad limited on mine, but the memory can almost hit 1600 MHz (6.4 GHz effective). Is that going in the X58 Crimson Phantom?

Can't wait for pics and clocks.

Just noticed you pay taxes on your order, I am lucky not to have to deal with that yet.


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> Awesome, your gonna love it. X-Fire will be awesome if you go there. The core is a tad limited on mine but the memory can almost hit 1600 Mhz/6.4 Ghz. Is that going in the X58 Crimson Phantom?
> 
> Can't wait for pics and clocks.
> 
> Just noticed you pay taxes on your order, I am lucky not to have to deal with that yet.


Nah, it's going into a new build to make use of my FT02 that's catching dust in my living room workstation area. I've been wanting to do a pure AMD FX CrossFire build for some time, but the time to do so isn't always there, and the fluctuating GPU market in the AMD arena added further delay. I might just use the 2500K from my unfinished ITX build and my Z77 Extreme4 temporarily until I get my hands on a good OCing 990FX board, since I've had a brand-new FX-8350 sitting next to me unused for several months now.

My old X58 rig now belongs to my sis. I'm baffled by how well that thing stands up against modern games to this day; it's over 3 years old!


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Nah, it's going into a new build to make use of my FT02 that's catching dust in my living room workstation area. I've been wanting to do a pure AMD FX CrossFire build for some time, but the time to do so isn't always there, and the fluctuating GPU market in the AMD arena added further delay. I might just use the 2500K from my unfinished ITX build and my Z77 Extreme4 temporarily until I get my hands on a good OCing 990FX board, since I've had a brand-new FX-8350 sitting next to me unused for several months now.
> 
> My old X58 rig now belongs to my sis. I'm baffled by how well that thing stands up against modern games to this day; it's over 3 years old!


A CHVFZ would be nice for the 8350, they work well together. Bummer the 8350 just sits there not doin' nothin'.


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> A CHVFZ would be nice for the 8350, they work well together. Bummer the 8350 just sits there not doin' nothin'.


I've thought about the Crosshair V Formula-Z and 990FX Sabertooth, but recent waves of negative feedback about their customer service support have me looking the other way for the time being, even though I've yet to have an ASUS board fail on me. I might just pull the trigger on one if everything else fails to meet my requirements from other vendors.

I'm still on the fence because the other board vendors have their own problems, such as the Gigabyte 990FXA-UD3 MOSFET cooling issue and MSI VRM housefires. I'm not even sure Newegg sells the latest revision of the UD5 or UD7, which is nowhere to be found there at the time of writing. ASRock boards, on the other hand, are iffy based on mixed feedback I've gathered from Newegg and here.


----------



## mikemykeMB

Quote:


> Originally Posted by *Roaches*
> 
> I've thought about the Crosshair V Formula-Z and 990FX Sabertooth, but recent waves of negative feedback about their customer service support have me looking the other way for the time being, even though I've yet to have an ASUS board fail on me. I might just pull the trigger on one if everything else fails to meet my requirements from other vendors.
> 
> I'm still on the fence because the other board vendors have their own problems, such as the Gigabyte 990FXA-UD3 MOSFET cooling issue and MSI VRM housefires. I'm not even sure Newegg sells the latest revision of the UD5 or UD7, which is nowhere to be found there at the time of writing. ASRock boards, on the other hand, are iffy based on mixed feedback I've gathered from Newegg and here.


Sometimes the way feedback is presented creates a step away from a purchase because of all the bad ones. Like you said, you haven't had one fail, so why hold back? There comes a time to just do it and deal with it however it comes at ya. Oh, I sat on that fence, then jumped off it and came out grinning. Like the song.


----------



## dmfree88

Get the Sabertooth! From what I hear ASUS is the only way to go with AMD boards, and the Saberkitty is a tank. I personally hated the MSI board I had, and my Giga gives me issues with everything I do. I wish I woulda got the Sabertooth. The UD5 has not impressed me.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> I've thought about the Crosshair V Formula-Z and 990FX Sabertooth, but recent waves of negative feedback about their customer service support have me looking the other way for the time being, even though I've yet to have an ASUS board fail on me. I might just pull the trigger on one if everything else fails to meet my requirements from other vendors.
> 
> I'm still on the fence because the other board vendors have their own problems, such as the Gigabyte 990FXA-UD3 MOSFET cooling issue and MSI VRM housefires. I'm not even sure Newegg sells the latest revision of the UD5 or UD7, which is nowhere to be found there at the time of writing. ASRock boards, on the other hand, are iffy based on mixed feedback I've gathered from Newegg and here.


The ASRock board doesn't have LLC and has vdroop issues from what I've heard. The UD5 and UD7, if you were to ask over at the Vishera club, are good boards but a bit different to overclock. The Saberkitty and the Crosshair seem to be the ones that users swear by. By the way, I had to RMA my first one and did it without issue; it turned out it wasn't even the board that was faulty, it was the CPU, but they still sent a new one straight away. If you want a decent board and want to save a few bucks, the M5A99X EVO is a decent one too.


----------



## Recr3ational

If you want AMD I highly recommend the Gigabyte 990FXA-UD5 rev. 3.

It's an awesome board; I had it before upgrading to Intel. OCed my 8350 to 5.2 GHz on it.


----------



## cremelo

I have a question: if I CrossFire my 280X Toxic with the Asus R9 280X DC2 Top, will the Toxic take a noticeable performance loss? The Toxic runs at 1150 MHz / 6400 MHz and the Asus at 1070 MHz / 6400 MHz; if I set both the Asus and the Toxic to 1100 MHz, would that be a good thing?

I still can't decide whether to get the Asus or spend the money on more HDD space and memory instead, because I'm thinking of mining Bitcoin with 2 280Xs and playing some games.

Any tips?


----------



## Devildog83

Quote:


> Originally Posted by *cremelo*
> 
> I have a question: if I CrossFire my 280X Toxic with the Asus R9 280X DC2 Top, will the Toxic take a noticeable performance loss? The Toxic runs at 1150 MHz / 6400 MHz and the Asus at 1070 MHz / 6400 MHz; if I set both the Asus and the Toxic to 1100 MHz, would that be a good thing?
> 
> I still can't decide whether to get the Asus or spend the money on more HDD space and memory instead, because I'm thinking of mining Bitcoin with 2 280Xs and playing some games.
> 
> Any tips?


That's kinda what I do with my setup: I have the 2 cards running at 1200/1400, which is kinda the middle ground for both cards on the core. I leave the memory a bit low on both because I just don't see much of a boost from it, except for heat, which I don't like. As far as mining goes, the best thing is to test what the best clocks are, because a lot of the time lower clocks will net more hashrate. I would set up one profile for gaming and one for mining once you find the best clocks for both, and then you can switch back and forth as needed.

Example: for gaming you might run them at 1125/1650 (usually higher is better); for mining you might be better at 1000/1500 (sometimes lower is better).

Also, I've found that the top card normally gets hotter, so I would put the coolest card in the top slot.
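For what it's worth, the two-profile idea maps straight onto the miner's config file: cgminer/sgminer builds with ADL support accept `gpu-engine` and `gpu-memclock` options, so a mining profile can pin its own clocks while a gaming profile in Afterburner or Trixx handles the rest. A sketch of such a config fragment (the clocks are just the example numbers from the post above, not tuned values, and intensity needs per-card tuning):

```json
{
  "scrypt": true,
  "intensity": "13",
  "gpu-engine": "1000",
  "gpu-memclock": "1500"
}
```

Keeping the mining clocks in the miner's own config means launching the miner automatically applies them, so switching between gaming and mining is just a matter of which program you start.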


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> If you want AMD I highly recommend the GIgabyte 990FXA-UD5 REV:3
> 
> It's an awesome board I had it before upgrading to intel. Oced my 8350 to 5.2.GHZ on it.


See, just don't get the UD3 for the 8350. I have heard terrible things about the 2 together.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> See, just don't get the UD3 for the 8350. I have heard terrible things about the 2 together.


If you do get the UD5, don't get rev. 1; it hasn't got LLC support.


----------



## cremelo

Quote:


> Originally Posted by *Devildog83*
> 
> That's kinda what I do with my setup: I have the 2 cards running at 1200/1400, which is kinda the middle ground for both cards on the core. I leave the memory a bit low on both because I just don't see much of a boost from it, except for heat, which I don't like. As far as mining goes, the best thing is to test what the best clocks are, because a lot of the time lower clocks will net more hashrate. I would set up one profile for gaming and one for mining once you find the best clocks for both, and then you can switch back and forth as needed.
> 
> Example: for gaming you might run them at 1125/1650 (usually higher is better); for mining you might be better at 1000/1500 (sometimes lower is better).
> 
> Also, I've found that the top card normally gets hotter, so I would put the coolest card in the top slot.


I'm thinking I should leave the Toxic on top, because my case is still small and the Asus, being shorter, partly overlaps the HDD bays; besides, the Toxic ran a little cooler than the Asus.

My max temps on the Toxic are 75°C at 32°C ambient and 72°C at 18°C ambient. The Asus maybe hits 76°C at 32°C ambient and 74°C at 18°C ambient.

I hope the Asus at least comes back normal after the RMA; they claim not to have found any problem with it...

I'll follow your tip and set the clocks in between.

Maybe 1100 MHz / 6400 MHz.


----------



## Roaches

Oh great! Newegg decided to ship it from their Tennessee warehouse instead of from City of Industry.









Not sure how I'm gonna deal with shipping issues, because this was my only day off from work this week. A waste of 8 dollars if it takes longer than 3 days to arrive.


----------



## cremelo

I don't like this benchmark result.











Edit: I noticed DayZ was open on another monitor.







I'll do another test when I get home


----------



## Pionir

Probably most of you already know: Thief Mantle performance vs DX:
http://www.hardwareheaven.com/reviews/1956/pg1/thief-mantle-performance-full-article.html

Note: there's no CPU usage or total power consumption (W) in the review!


----------



## Recr3ational

Quote:


> Originally Posted by *Pionir*
> 
> Probably most of you already know: Thief Mantle performance vs DX:
> http://www.hardwareheaven.com/reviews/1956/pg1/thief-mantle-performance-full-article.html
> 
> Note: there's no CPU usage or total power consumption (W) in the review!


Wow, that's awesome. Over a 20% increase with the 280X, just from Mantle. I'm glad I've stuck with AMD now. I bought the 280X to replace my 7950 for £260, and got a 20% performance increase on top.

Well done AMD.

Can someone explain to me why the performance increase is super high with AMD CPUs?


----------



## lightsout

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Pionir*
> 
> Probably most of you allready know ; Thief: Mantle Performance vs DX ;
> http://www.hardwareheaven.com/reviews/1956/pg1/thief-mantle-performance-full-article.html
> 
> Note. there is no CPU usage and total power W consumption !
> 
> 
> 
> Wow. That's awesome. Over 20% increase with the 280x. Just by mantle. I'm glad I've stuck with AMD now. I bought the 280x to replace my 7950 for £260 just for 20% performance increase.
> 
> Well done AMD.
> 
> Can someone explain to me why the performance increase is super high with AMD CPUs?

Those are some pretty excellent results in the graphs, straight from AMD. The review got nothing like that. Seems fishy to me.

Or maybe they got a quick spike to 200 fps and call that a 50% increase.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Wow. That's awesome. Over 20% increase with the 280x. Just by mantle. I'm glad I've stuck with AMD now. I bought the 280x to replace my 7950 for £260 just for 20% performance increase.
> 
> Well done AMD.
> 
> Can someone explain to me why the performance increase is super high with AMD CPUs?


Just as when two people communicate through a translator, this works, but it isn't efficient. And it's the CPU that has to do all this extra work, translating and queuing data for the graphics card to process. PCs are meant to be the ultimate gaming platform - they have the power - but all this translation slows things down, and game developers approached AMD asking for something better.

Mantle reduces the CPU's workload by giving developers a way to talk to the GPU directly with much less translation. With less work for the CPU to do, programmers can squeeze much more performance from a system, delivering the greatest benefits in gaming systems where the CPU can be the bottleneck.

Now that Mantle has freed up some extra CPU capacity, we expect Mantle will lead to better games, and more of them, since Mantle makes game development easier.

That's not all Mantle will do for gamers. By shifting work to the GPU, a mid-range or older CPU isn't the same handicap it was before. With Mantle, the GPU becomes the critical part of the system, and GPU upgrades will have a bigger impact than before.

Source: AMD.com
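AMD's pitch boils down to a simple bottleneck model: frame time is set by whichever finishes last, the CPU submitting draw calls through the API or the GPU rendering them, and Mantle shrinks the per-call CPU cost. A toy sketch with invented numbers (neither the call counts nor the per-call costs are AMD's figures), purely to show why CPU-limited rigs gain the most:

```python
def frame_time_ms(draw_calls, cpu_us_per_call, gpu_ms):
    """Frame time is bounded by the slower of CPU submission and GPU rendering."""
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0
    return max(cpu_ms, gpu_ms)

# Invented numbers: 20,000 draw calls per frame, GPU needs 12 ms to render.
dx_ms     = frame_time_ms(20_000, 1.0, 12.0)  # assume ~1.0 us of API/driver work per call
mantle_ms = frame_time_ms(20_000, 0.3, 12.0)  # assume a much thinner API layer

print(round(1000 / dx_ms), round(1000 / mantle_ms))  # 50 83: CPU-bound vs GPU-bound
```

Double the per-call cost (a slower CPU) and the thick-API frame time doubles while the thin-API one stays GPU-bound, which is the claimed reason the biggest gains show up on CPU-limited systems.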


----------



## Recr3ational

Oh, I didn't realise it was directly from AMD. Missed that bit; sorry, I was at work.

Though saying that, even if it's, let's say, a 10% performance boost, it's still good.

That's basically like me spending £130 extra to buy a better GPU but instead getting it for free. That's the way I see it anyway. Any performance boost is a winner.


----------



## lightsout

Quote:


> Originally Posted by *Recr3ational*
> 
> Oh, I didn't realise it was directly from AMD. Missed that bit; sorry, I was at work.
> 
> Though saying that, even if it's, let's say, a 10% performance boost, it's still good.
> 
> That's basically like me spending £130 extra to buy a better GPU but instead getting it for free. That's the way I see it anyway. Any performance boost is a winner.


I agree, I'll take any gains. But a 290X getting a 50% boost at 1080p? That's crazy.


----------



## M1kuTheAwesome

Is there a custom BIOS for the 280X TurboDuo that allows more voltage? I know I can edit it myself, but if possible I'd rather use something that's proven to work for at least some people.


----------



## Recr3ational

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Is there a custom BIOS for the 280X TurboDuo that allows more voltage? I know I can edit it myself, but if possible I'd rather use something that's proven to work for at least some people.


I want to know too. It really needs more voltage; it's a hell of a card.

If you do edit the BIOS, give me a copy?


----------



## Bagmup

Can i join?


----------



## Devildog83

Quote:


> Originally Posted by *Bagmup*
> 
> Can i join?


*Bagmup* has been added. Nice set-up! Looking forward to some clocks and benches if you would.


----------



## eAT5

I RMA'ed my 7970s. Guess what came via FedEx today.


----------



## Recr3ational

Quote:


> Originally Posted by *eAT5*
> 
> i RMA'ed my 7970's, Guess what came FEDEX today.


Good choice!
Tell us the clocks, I wanna see how they compare to mine. Oh, and tell me the max voltage the sliders allow, please. Thanks!


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *eAT5*
> 
> i RMA'ed my 7970's, Guess what came FEDEX today.


Do you have any idea how hard I had to convince myself I don't need 2 of those? Well, you just ruined my hard work. I NEEEEEEEED another TurboDuo!! Anyone wanna buy my soul? Oh, I already sold it for the first card...







Oh well...
Ahem, anyone wanna buy my roommate?







Only 299 euros! (price of 1 TurboDuo here)


----------



## Recr3ational

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Do you have any idea how hard I had to convince myself I don't need 2 of those? Well, you just ruined my hard work. I NEEEEEEEED another TurboDuo!! Anyone wanna buy my soul? Oh, I already sold it for the first card...
> 
> 
> 
> 
> 
> 
> 
> Oh well...
> Ahem, anyone wanna buy my roommate?
> 
> 
> 
> 
> 
> 
> 
> Only 299 euros! (price of 1 TurboDuo here)


Yeah, I say that about everything. I don't need a watercooled computer that does more than I really need. But nope, I got one anyway.

Why aren't there more people with the TurboDuo? They're really decent cards.

I'm guessing the terrible cooler? Or?


----------



## Pionir

Quote:


> Originally Posted by *Recr3ational*
> 
> Can someone explain to me why the performance increase is super high with AMD CPUs?


Because of Level 2 cache size, mostly: AMD is 8/4, Intel is 4/8...

Compare these two, L1 and Level 2 cache size :
http://www.cpu-world.com/CPUs/Bulldozer/AMD-FX-Series%20FX-8350.html
http://www.cpu-world.com/CPUs/Core_i7/Intel-Core%20i7-4770K.html
Quote:


> Level 2 Cache is used for accepting data directly from the memory (RAM) of the PC and having it ready for the CPU to use a lot quicker than if the CPU had to wait for the RAM to deliver the data over the system bus directly.


http://www.overclock.net/t/1451891/amd-oxide-games-amd-mantle-presentation-and-demo

According to AMD's announcements, future computers will not need RAM (see the PS4 and Xbox One system specifications); they will use graphics memory + Mantle.
Quote:


> In April 2013, the term "hUMA" (for heterogenous Uniform Memory Access) began to appear in AMD promotional material to refer to CPU and GPU sharing the same system memory via cache coherent views. Advantages include an easier programming model and less copying of data between separate memory pools


It means = eight cores + 4 x 2 MB shared exclusive caches + GDDR5 (vs DDR3 RAM) system memory.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah I say that about everything. I don't need a watercooled computer that does more than I really need. But nope. I got one anyway.
> 
> Why isn't there a lot more people with the TurboDuo? They're really decent cards.
> 
> I'm guessing the terrible cooler? Or?


The cooler seems pretty OK to me. I got it up to a max of 71°C at 1300 mV with my custom fan profile. VRM temps were a few degrees higher, 74°C I think. That's why I want more voltage on this card: it obviously has headroom for more.
TBH this card wasn't my first choice either; I was thinking more Asus or Sapphire, but since the budget was tight the PowerColor caught my eye, and I'm very happy with how it turned out.
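A custom fan profile like that is just a piecewise-linear map from GPU temperature to fan duty, the same thing the curve editor in Afterburner or Trixx draws. A minimal sketch of the idea (the breakpoints here are invented for illustration, not the actual profile):

```python
def fan_percent(temp_c, curve=((40, 30), (60, 50), (75, 85), (85, 100))):
    """Linearly interpolate fan duty (%) between (temp_C, duty_%) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # idle floor below the first breakpoint
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # pin at max duty past the last breakpoint

print(fan_percent(30), fan_percent(71), fan_percent(90))
```

The trade-off is all in the breakpoints: steeper segments near your target temperature hold temps down at the cost of noise ramping up sooner.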


----------



## Bagmup

Quote:


> Originally Posted by *Devildog83*
> 
> *Bagmup* has been added. Nice set-up! Looking forward to some clocks and benches if you would.


Cheers mate.

So I'm a complete noob when it comes to OCing; in fact, this is my first gaming PC. I've been a dirty console player all my life, but the PS4 and Xbone really haven't excited me, so I thought, well, I'll jump into PC gaming.

Now to go find some "overclocking for dummies" threads.


----------



## Recr3ational

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Cooler seems pretty ok to me. I got it up to a max of 71C at 1300mV with my custom fan profile. VRM temps were a few degrees higher, 74 I think. That's why I want more voltage on this card: obviously has headroom for more.
> TBH this card wasn't my first choice either, I was thinking more Asus or Sapphire, but since budget was tight the Powercolor caught my eye and I am very happy with the turnout.


I was disappointed that I didn't get the MSI, but I heard it's voltage locked?
I got the PowerColor just because it fits my blocks.


----------



## Bagmup

Quote:


> Originally Posted by *Recr3ational*
> 
> I was dissapointed that i didnt get the Msi, but i heard that its voltaged locked?
> I got the power color just because it fits my blocks.


Yeah, they are voltage locked, which was quite disappointing when I found out. I'm not sure if all 270Xs are or not.


----------



## microkid21

My GPU is the PowerColor R9 270X Devil and I've experienced BSODs with the latest beta driver, Catalyst 14.3, and with previous drivers such as Catalyst 14.2, 14.1 and 13.2, while playing Call of Duty: Ghosts and Skyrim. Do you guys experience it too?
How did you manage to solve this? Please help.

*Specs*

PROCS: Intel core i3 2100 @ 3.1ghz
RAM: GSKILL Ripjaws 2 x 4GB 1866 DDR3
MOBO: Asrock Z68 Fatal1ty Professional GEN3
PSU: Corsair CS650M 80+ Gold
GPU: Powercolor R9 270x Devil

Thanks in advance!


----------



## Devildog83

Quote:


> Originally Posted by *microkid21*
> 
> My GPU is Powercolor R9 270x Devil and I have experienced BSOD with latest beta driver catalyst 14.3 and other previous driver such as Catalyst 14.2, Catalyst 14.1 and 13.2 while playing Call of Duty: Ghost and Skyrim. Do you guys experience it too?
> How did you manage to solve this? Please help.
> 
> *Specs*
> 
> PROCS: Intel core i3 2100 @ 3.1ghz
> RAM: GSKILL Ripjaws 2 x 4GB 1866 DDR3
> MOBO: Asrock Z68 Fatal1ty Professional GEN3
> PSU: Corsair CS650M 80+ Gold
> GPU: Powercolor R9 270x Devil
> 
> Thanks in advance!


I have 2 Devils and haven't had BSODs with them, on the 14.2 driver now. What clocks are you running in games?


----------



## microkid21

Quote:


> Originally Posted by *Devildog83*
> 
> I have 2 devils and haven't had BSOD's with them, 14.2 driver now. What clocks are you running in games?


The default clocks; the Devil runs @ 1150 MHz with a boost state of 1180 MHz. What games do you play? Upon checking the dump files from the BSOD using BlueScreenView, the description is *ATI Radeon Kernel Mode Driver. File path: C:\WINDOWS\system32\drivers\atikmdag.sys*. Any clue? Thanks


----------



## eAT5

Quote:


> Originally Posted by *Recr3ational*
> 
> Good choice!
> Tell us the clocks I wanna see how they compare to mine
> 
> 
> 
> 
> 
> 
> 
> . Oh and tell me the max the voltage sliders allow please. Thanks!


I can only get 1 to work; CrossFire seems impossible... gonna try to install the driver with only 1 card next, then put the second card in....

How do I check voltage?

Here are the stock clocks...


----------



## Devildog83

Quote:


> Originally Posted by *eAT5*
> 
> i can only get 1 to work, crossfire seems impossible... gonna try to instal driver with only 1 card next then put second card in....
> 
> how do i check voltage?
> 
> here is stock clocks...


Get HWiNFO64, it's great for your whole system.


----------



## Devildog83

Delete double post


----------



## Devildog83

I would use these settings - By the way, disable graphics overdrive in CCC.


----------






## eAT5

Quote:


> Originally Posted by *Devildog83*
> 
> I would use these settings - By the way, disable graphics overdrive in CCC.




It won't even boot with 2 cards, just shows a flashing dash in the upper left corner... but I get install errors; I'll try Afterburner.

Have HWiNFO, had to update it....
Quote:


> Originally Posted by *microkid21*
> 
> My GPU is Powercolor R9 270x Devil and I have experienced BSOD with latest beta driver catalyst 14.3 and other previous driver such as Catalyst 14.2, Catalyst 14.1 and 13.2 while playing Call of Duty: Ghost and Skyrim. Do you guys experience it too?
> How did you manage to solve this? Please help.
> 
> Specs
> 
> PROCS: Intel core i3 2100 @ 3.1ghz
> RAM: GSKILL Ripjaws 2 x 4GB 1866 DDR3
> MOBO: Asrock Z68 Fatal1ty Professional GEN3
> PSU: Corsair CS650M 80+ Gold
> GPU: Powercolor R9 270x Devil
> 
> Thanks in advance!


im running no beta drivers.... using 13.12

and


----------



## rickyman0319

Is Club 3D a good brand for video cards? Is it on sale?


----------



## eAT5

Quote:


> Originally Posted by *rickyman0319*
> 
> is club31 a good brand for video card? it is on sale?


review

http://www.techpowerup.com/159712/club-3d-announces-its-radeon-hd-7950-graphics-card.html


----------



## microkid21

Quote:


> Originally Posted by *eAT5*
> 
> it wont even boot with 2 cards. just shows a flashing dash in upper left corner... but i get install errors, i'll try afterburner.
> 
> have hiwinfo, had to update it ....
> im running no beta drivers.... using 13.12
> 
> and


I also tried that, to no avail. What's your card, by the way?


----------



## Bagmup

Is there any way of unlocking the voltage on an MSI 270X 2GB Gaming edition?

Also, since I put the second card in, I can't get anywhere near the OC I had running a single card. Is that normal?


----------



## eAT5

Quote:


> Originally Posted by *microkid21*
> 
> I also tried that with no avail. Whats your card by the way?


2 powercolor turboduo AXR9 280X OC

i just updated the photos in my rig on my profile if you want to look...


----------



## eAT5

Quote:


> Originally Posted by *Bagmup*
> 
> Is there any way of unlocking the voltage on an MSI 270x 2gb gaming edition?
> 
> Also, since i put the second card in, i can't get anywhere near the oc i had running a single card, is that normal?


Drivers are in shambles until Mantle and a solid driver are released. You'll probably have to wait.


----------



## Bagmup

Quote:


> Originally Posted by *eAT5*
> 
> Drivers are in shambles. when Mantle and a solid driver is released. probably have to wait.


I was able to get a stable 1200/1500 on a single 270x.

Furthest i can get two stable is 1140/1450.


----------



## eAT5

Quote:


> Originally Posted by *Bagmup*
> 
> I was able to get a stable 1200/1500 on a single 270x.
> 
> Furthest i can get two stable is 1140/1450.


I'm gonna try a different install method with the drivers.... gonna try 14.3. Every time I try to install with both cards in, I get a warning during installation that the 2nd GPU is not recognized. If it works I'll report back; if not I'll pull my hair out and cry.....

1030/1500 are the stock clocks on these factory-OC'ed cards; 1200 would be nice...

I've only got 1 in right now, running great... with IOMMU on....

Updated my BIOS from 1.90 beta to 1.90 final also.

I'm thinking about building another PC with the second card if I can't get results... a bare-bones system.


----------



## lightsout

Quote:


> Originally Posted by *Bagmup*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Recr3ational*
> 
> I was dissapointed that i didnt get the Msi, but i heard that its voltaged locked?
> I got the power color just because it fits my blocks.
> 
> 
> 
> Yeah, they are voltage locked, which was quite disappointing when i found out. I'm not sure if all 270x's are or not.

With my Sapphire 270s (non-X) I can change the voltage in TriXX; can't get it working in AB.


----------



## Recr3ational

Quote:


> Originally Posted by *lightsout*
> 
> With my Sapphire 270's (non-x) I can change the voltage in trixx, can't get it working in AB.


Yeah I remember you saying. Not a big issue. Sometimes afterbuner does that.


----------



## lightsout

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> With my Sapphire 270's (non-x) I can change the voltage in trixx, can't get it working in AB.
> 
> 
> 
> Yeah I remember you saying. Not a big issue. Sometimes afterbuner does that.

I wonder if it's only sapphire cards where trixx works and ab does not.


----------



## Recr3ational

Quote:


> Originally Posted by *lightsout*
> 
> I wonder if it's only sapphire cards where trixx works and ab does not.


I doubt it. Since it's software, I doubt the card can lock out a specific program.


----------



## Mackem

I'm fed up of constant artifacts in games with my R9 280X DirectCUII TOP. Are there any 280X's that don't artifact in games? I don't want to RMA it and have the same problem happen again. It's incredibly annoying..


----------



## eAT5

I got my cards to CrossFire flawlessly.

Things that took 15 hours to figure out:

I installed the driver with 1 card, an R9 280X TurboDuo OC.

#1 It only works with both BIOS switches on 2; when they were on 1, the cards had all sorts of errors and crashes. The CrossFire cable also has to have J1A on top. It did not work with IOMMU on in the BIOS.

#2 Added the 2nd card after a clean driver installation that worked on the single card.

#3 Re-booted. Booyah.

My mobo BIOS had to be current... was running the ASRock 990FX Fatal1ty Pro 2.0 beta, now running 2.0 final....

Took a lot of combos of failures to figure this out...

Also tried Windows 8.1... it helped with the clean install because my Windows 7 was tweaked to the bone, all sorts of dirty folders and stuff... a fresh Win 7 might work.

BF4 worked perfectly on the 1st try.... I was getting 45/60 FPS with 2 7970s, so Win 8 might have helped. Plus, UEFI cards won't work all the way with Win 7.

My FPS at Ultra 1080p is 95/130.

4xAA
Network smoothing 0
Frame pacing off


----------



## Devildog83

Quote:


> Originally Posted by *eAT5*
> 
> i got my cards to Crossfire flawless
> 
> things that took 15 hours to figure out:
> 
> i installed driver with 1 card, R9 280X Turbo Duo OC.
> 
> #1 it only works with both BIOS switches both on 2, when they were on 1 the had all sorts of Errors and Crashes. now the Crossfire cable has to have J1A on the top also. did not work with IOMMU on in BIOS.
> 
> #2 added 2nd card after a clean driver installation that worked on the single card.
> 
> #3Re- Booted, Booyah.
> 
> my MOBO Bios had to be Current... was running ASRock 990FX FATL1TY Pro 2.0beta, Now running 2.0 final....
> 
> took alot of combo's of failures to figure this out ...
> 
> also tried windows 8.1... helped with the clean instal cause my windows 7 was tweaked to the bone, all sorts of Dirty folders and stuff... a fresh win 7 might work.
> 
> BF4 worked Perfect 1st try.... i was getting 45/60 FPS with 2 7970's so win 8 might have helped. + UEFI cards wont work all the way with win 7 .
> 
> my FPS at ULTRA 1080 is 95/130
> 
> 4xaa
> network smoothing 0
> Framepacing off


I heard that the new MSI Lightning had an issue like that; they switched to BIOS #2 (LN2) and it worked. After that they were able to switch back to BIOS #1 and it worked fine there too.


----------



## Pionir




Quote:


> Originally Posted by *microkid21*
> 
> My GPU is Powercolor R9 270x Devil and I have experienced BSOD with latest beta driver catalyst 14.3 and other previous driver such as Catalyst 14.2, Catalyst 14.1 and 13.2 while playing Call of Duty: Ghost and Skyrim. Do you guys experience it too?
> How did you manage to solve this? Please help.
> 
> *Specs*
> 
> PROCS: Intel core i3 2100 @ 3.1ghz
> RAM: GSKILL Ripjaws 2 x 4GB 1866 DDR3
> MOBO: Asrock Z68 Fatal1ty Professional GEN3
> PSU: Corsair CS650M 80+ Gold
> GPU: Powercolor R9 270x Devil
> 
> Thanks in advance!





Quote:


> Originally Posted by *microkid21*
> 
> The default clocks which the devil runs @1150MHz with boost state of 1180MHz. What games do you play? Upon checking of dump files from BSOD, using Bluescreenviewer the description is *ATI Radeon Kernel Mode Driver. File path: C:\WINDOWS\system32\drivers\atikmdag.sys* any clue? Thanks


Yes, I am...
First, which graphics card was in the system before?
Find post #4781 of 4856, 4 days, 19 hours ago... (do not use CCleaner).

For a CrossFire installation, always install the driver with a single card first, then add the other at the same (non-OC) settings, with the weaker card going into the first PCIe slot...
With MSI Afterburner, sync settings for similar graphics processors and apply the OC (profile) at system startup.


----------



## Recr3ational

Anybody want to change the voltage on my bios? I'm willing to be the test rat.


----------



## Pionir

Quote:


> Originally Posted by *Mackem*
> 
> I'm fed up of constant artifacts in games with my R9 280X DirectCUII TOP. Are there any 280X's that don't artifact in games? I don't want to RMA it and have the same problem happen again. It's incredibly annoying..


Use FRAPS.
http://www.fraps.com/

If you are getting more than 100 FPS, some of that can be normal.

Solution = use Vsync and lock the FPS to 60 (your monitor's refresh rate).

Also try with the iGPU disabled and enabled in the motherboard BIOS.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *microkid21*
> 
> The default clocks which the devil runs @1150MHz with boost state of 1180MHz. What games do you play? Upon checking of dump files from BSOD, using Bluescreenviewer the description is *ATI Radeon Kernel Mode Driver. File path: C:\WINDOWS\system32\drivers\atikmdag.sys* any clue? Thanks


I think I had the same problem with my card. I fixed it with a clean driver reinstall, installing the driver without CCC, as that was probably the cause.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Recr3ational*
> 
> Anybody want to change the voltage on my bios? I'm willing to be the test rat.


TBH I might try it this weekend if I go crazy enough.
Sorry for double post, my phone is a turd.


----------



## Recr3ational

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> TBH I might try it this weekend if I go crazy enough.
> Sorry for double post, my phone is a turd.


If you edit the voltage, just raise it and send me a copy?


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Recr3ational*
> 
> If you edit the voltage, just raise it and send me a copy?


Sure. Might not happen till Sunday though; I caught the flu while visiting my folks and I'm not fully healed yet. The rig is on the other side of the country atm.


----------



## Recr3ational

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Sure. Might not happen til sunday though, caught the flu while visting my folks and I'm not fully healed yet. Rig is on the other side of the country atm.


Sweet, cheers. Hopefully we can get more out of these cards.


----------



## Bagmup

Quote:


> Originally Posted by *lightsout*
> 
> I wonder if it's only sapphire cards where trixx works and ab does not.


I had a look at Trixx and the voltage is unlocked, but unfortunately it only gives you the option to undervolt. The slider is maxed out at the card's default voltage.


----------



## Skye12977

I recently purchased an MSI R9 270X Hawk and I'm already glad that it almost compares to a 7970 we used to have.
http://i607.photobucket.com/albums/tt160/azenor2010/270x_zps35ea2ae5.png
Didn't want to post the picture (only the link to it).
I was able to run 1250 core and 1500 memory off the stock BIOS without problems, and was even happier to see the temps (despite the fan speeds not being at 100%) only at 60C.

I'm wondering where I could go to find an LN2 BIOS for this 270X Hawk.


----------



## lightsout

Quote:


> Originally Posted by *Bagmup*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> I wonder if it's only sapphire cards where trixx works and ab does not.
> 
> 
> 
> I had a look at Trixx and the voltage is unlocked, but unfortunately, it only gives you the option to undervolt. The slider is maxed out at the cards default voltage.

That sucks. On my cards it's kind of the opposite: I can move the slider both ways, but undervolting has no effect.


----------



## Devildog83

The Devils look even sweeter with new cable combs -


----------



## Arkanon

For those having trouble unlocking their voltage slider in AB, are you guys aware you have to uncheck "enable low-level hardware access interface"? There's a bug in the latest AB releases that doesn't give you any voltage control without unchecking it. Might be worth a try for those having problems unlocking their voltage slider.


----------



## Devildog83

Quote:


> Originally Posted by *Arkanon*
> 
> For those having trouble unlocking their voltage slider in AB, are you guys aware you have to uncheck "enable low-level hardware access interface"? There's a bug in the latest AB releases that doesn't give you any voltage control without unchecking it. Might be worth a try for those having problems unlocking their voltage slider.


The slider on my 270x is still locked but the one on my 7870 has always been unlocked up to 1.3v. Max I get out of the 270x is about 1.275 with a VDDC of 1.206.


----------



## Bagmup

Still locked with that unchecked.


----------



## Recr3ational

Hey, got my loop sorted for the minute.
I'm getting artifacts in game but not in Kombuster or OCCT,
so at least my card's all right.

Heat seems fine...
Any advice?


----------



## Skye12977

Quote:


> Originally Posted by *Skye12977*
> 
> I recently purchased a r9 270x MSI hawk and I'm already glad that it almost compares to a 7970 we used to have.
> http://i607.photobucket.com/albums/tt160/azenor2010/270x_zps35ea2ae5.png
> Didn't want to post the picture (only the link to it).
> I was able to run 1250 core and 1500 memory off from the stock bios without problems, was even happier to see the temps (despite the fan speeds not being 100%) only at 60C.
> 
> I'm wondering where I could go to find ln2 bios for this 270x hawk.


Halfway off topic of the thread,
but I never noticed how big my card was.

Top is the 780 Lightning
Bottom is the 270X Hawk


----------



## lightsout

Quote:


> Originally Posted by *Bagmup*
> 
> Still locked with that unchecked.


None of those settings worked for me either. Checked and unchecked a bunch of stuff. Still no go. Only works with trixx.


----------



## Roaches

It's here! The card looks and feels amazing in hand.











I have to give PowerColor huge props on the packaging quality. Top tier and on par with my Gigabyte 680 SOC packaging. And hello, new mousepad







Gonna spend late night putting my FT02 build together to test it.


----------



## Megadrone

I discovered that using post-processing in Battlefield 3, or HBAO instead of SSAO, gives me serious flickering squares or black spike artifacts. What's to blame in this case? It's a 280X, and downclocking doesn't help much. The game also freezes about half of the time and I have to terminate it from Task Manager.


----------



## Bruska

Can someone explain to me why my R9 270X gave me these results in the Valley bench?



The max I am able to do is 1555, but I don't know why one day I turned on my PC, ran the benchmark over and over again, and got that result, or close to it. And now I can't reach it, not even close. Any idea?


----------



## Skye12977

Quote:


> Originally Posted by *Bruska*
> 
> Can someone explain me why my r9 270x gave me these results in Valley Bench?
> 
> 
> 
> The max I am able to do is 1555, but don't know why one day I turned on my pc and run the benchmark over and over gain and gave that result, or close. And now I can't reach it, not even close. Any idea?


drivers? CPU over-clock?


----------



## Bruska

Quote:


> Originally Posted by *Skye12977*
> 
> drivers? CPU over-clock?


I was using 14.1, and everything was stock except for the processor, as you can see. I can't understand why it gave me those results, and only on one day.


----------



## Skye12977

Quote:


> Originally Posted by *Bruska*
> 
> I was using 14.1, and everything stock except for the processor as you can see. I can't understand why it gave me those results and only one day.


My bro's 3570K still registers as stock in Valley, and he is at 5.0.
I'm at stock clocks on a 3570K and got a 1601 with my 270X, but I'm not doing any OCing.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Its here! Card looks and feels amazing on hand.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have to give PowerColor huge props on the packaging quality. Top tier and just on par as my Gigabytes 680 SOC packages and hello new mousepad
> 
> 
> 
> 
> 
> 
> 
> Gonna spend late night putting my FT02 build together to test it.


Beautiful card, Roaches, you're in the club.


----------



## lightsout

Quote:


> Originally Posted by *Roaches*
> 
> Its here! Card looks and feels amazing on hand.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have to give PowerColor huge props on the packaging quality. Top tier and just on par as my Gigabytes 680 SOC packages and hello new mousepad
> 
> 
> 
> 
> 
> 
> 
> Gonna spend late night putting my FT02 build together to test it.


Reminds me of a DCII only with three fans.

Oh and time for a keyboard you will enjoy.


----------



## Devildog83

Quote:


> Originally Posted by *lightsout*
> 
> Reminds me of a DCII only with three fans.
> 
> Oh and time for a keyboard you will enjoy.


I noticed you are not on the members list, did I miss you somehow? Do you want me to add you?


----------



## lightsout

Quote:


> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> Reminds me of a DCII only with three fans.
> 
> Oh and time for a keyboard you will enjoy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I noticed you are not on the members list, did I miss you some how? Do you want me to add you?

Yeah, you can add me. I never asked; was I supposed to post a pic? How's this 

I only have two left. Superbiiz actually sent me 5 instead of 4, but I was a nice guy and told them. Didn't even get a thank you lol, just a return label in an email.


----------



## Devildog83

You're in, *lightsout*, such an honest John! I would love that deck of cards.


----------



## lightsout

Quote:


> Originally Posted by *Devildog83*
> 
> Your in *lightsout*, such an honest John! I would love that deck of cards.


I got a nice tax return, the only reason I could afford them. Had a little mining setup going with 4 x 270's and a 7870, but it was taking a lot of my time as I was constantly chasing the new coin. Decided to go back to easier times, when I just used my rig for browsing and gaming. Kept a pair of them for crossfire. I'm pretty happy with the performance of the little guys.









and thanks for the entrance btw.


----------



## JaredLaskey82

JaredLaskey: 2x ASUS DirectCU II Top-1070(BST 1100)/1600 (Painted white to match my RIG)


----------



## Majentrix

In retrospect I probably shouldn't have done this, but hey the block was only $40.


----------



## Devildog83

Quote:


> Originally Posted by *JaredLaskey82*
> 
> 
> 
> JaredLaskey: 2x ASUS DirectCU II Top-1070(BST 1100)/1600 (Painted white to match my RIG)


I love the paint job, I am adding you now.


----------



## lightsout

Hey guys, my overclock is not maxed out in crossfire. Is there a way to make the cards stay at their OC all the time when in use? I have disabled ULPS. I am using Trixx to have voltage control, but I have to open AB to get the RivaTuner OSD to show up.

Running Valley, the cards sometimes go to 1200 (the OC I have them at) but most of the time drop to 920MHz.
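For anyone else disabling ULPS by hand, it is a registry value. A hedged sketch below, not gospel: the display class key {4d36e968-...} holds one numbered subkey per GPU, and which index (0000, 0001, ...) belongs to which AMD adapter varies per system, so the path shown is only an example; set the value under each AMD adapter's key and reboot.

```
Windows Registry Editor Version 5.00

; Example path only - check each numbered GPU subkey (0000, 0001, ...) on your own system
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Utilities that offer a "disable ULPS" option typically flip this same value, so a manual edit should only be needed if the checkbox doesn't stick.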


----------



## Devildog83

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In retrospect I probably shouldn't have done this, but hey the block was only $40.


I bet it's cooler. I get you though; unless you have a full block and a nice backplate, some of the air coolers look sweet. I will most likely never water-cool my Devils. They are cool enough now, and they look so killer the way they are.


----------



## lightsout

Is there a special trick I need to do to get my cards to run at max clocks while benching? They are all over the place, is it something to do with boost?


----------



## JaredLaskey82

Just painting the last card and the crossfire bridge, and then I can install it sometime in the coming week.


----------



## Recr3ational

Quote:


> Originally Posted by *lightsout*
> 
> Is there a special trick I need to do to get my cards to run at max clocks while benching? They are all over the place, is it something to do with boost?


That's why I never get boost.
Can you somehow change it?


----------



## JaredLaskey82

It's JaredLaskey.







Not JaredLiskey.


----------



## lightsout

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> Is there a special trick I need to do to get my cards to run at max clocks while benching? They are all over the place, is it something to do with boost?
> 
> 
> 
> That's why I never get boost.
> Can you somehow change it?

I don't think so, I believe it's a BIOS thing. Is that what's going on? Pretty crappy for benching; even in something like Valley they spend more time at 900 than they do at 1200. I guess it's whatever. I'm not a huge bencher, but I wanted to see what they could do. I know they are being held back quite a bit bouncing all around like that.


----------



## Recr3ational

Quote:


> Originally Posted by *lightsout*
> 
> I don't think so. I believe its a bios thing. Is that whats going on? Pretty crappy for benching, even in something like valley they spend more time at 900 than they do at 1200. I guess its whatever. I'm not a huge bencher but wanted to see what they could do. I know they are being held back quite a bit bouncing all around like that.


Is it overclocked higher than boost?


----------



## lightsout

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> I don't think so. I believe its a bios thing. Is that whats going on? Pretty crappy for benching, even in something like valley they spend more time at 900 than they do at 1200. I guess its whatever. I'm not a huge bencher but wanted to see what they could do. I know they are being held back quite a bit bouncing all around like that.
> 
> 
> 
> Is it overclocked higher than boost?

Yeah, the boost is only 945. Stock is 920. Although at stock clocks, as far as AB is concerned, it will always run at 945. Only when I go past that does it bounce from the OC'd clock down to 920.

I am running two in xfire if that matters. ULPS is disabled.


----------



## Roaches

Finally got it up and running, and it's looking sexy. So quiet at both idle and load it's almost not there!

Not sure if my temps are okay; earlier I got around the high 70s during my last stress test, but this time it's hovering around the 60s.


----------



## lightsout

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> I don't think so. I believe its a bios thing. Is that whats going on? Pretty crappy for benching, even in something like valley they spend more time at 900 than they do at 1200. I guess its whatever. I'm not a huge bencher but wanted to see what they could do. I know they are being held back quite a bit bouncing all around like that.
> 
> 
> 
> Is it overclocked higher than boost?

So it appears to happen when I get Trixx involved. Which sucks, because that's the only way I can adjust voltage. If I set 1100 in AB, they lock at that clock. If I open Trixx, it shows them at 1100, and I can change the voltage and see the change in GPU-Z. But then the clocks start bouncing. Weird.


----------



## Roaches

Some more tests; oddly, better overall temps than before.

Peaked at 66 degrees Celsius in the FF benchmark.




Currently downloading my steam games to conduct more stock benches later.


----------



## Unknownm

So I don't know what to say about using 2 CF bridges anymore. Tested each cable and each connector; with a single bridge, all of them say

"The AMD CrossFireX Internal bridge interconnect linking your graphics card together are not properly attached. Both bridge interconnects must be attached. For more information, see the user's guide for your graphics card"

Then, if two are plugged in, "Enable AMD CrossFireX" becomes clickable and crossfire works. I was thinking maybe my motherboard didn't support CF, but it does; the Gigabyte website confirms it, and CPU-Z reports the motherboard model as "Z87-UD5H-CF", where I assume CF means CrossFire-enabled. I don't understand how you 280X users can crossfire with one bridge while I'm forced to use two...


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> I love the paint job, I am adding you now.


How do you guys go about painting your cards? What kind of paint do you use?

I have these awesome Sapphire Tri-X cards, but a black and red rig. The thought of painting the yellow parts red has crossed my mind more than once. Does doing this void your warranty? Considering I paid a LOT more for my cards than most people have, I don't want to void it.


----------



## JaredLaskey82

I have no idea. The Gigabyte site does say the board is supported for both 2 way XFire and SLI.

The only thing I can think of is:

Making sure your drivers for both your Mobo and AMD Catalyst are up to date.
http://www.gigabyte.com/products/product-page.aspx?pid=4483#dl
http://support.amd.com/en-us/download

Also checking the Bios for the Mobo could be a good idea too.

I am assuming you are using the same two cards (Sapphire, XFX, ASUS and so on). If you just bought a new 280X to crossfire with your old 7970, you need to make sure that the 7970 is in the top slot, as that governs the crossfire, and you need to overclock/underclock the new 280X to the same clocks for it to work.

Can you let us know what drivers you are using and the card setup?


----------



## JaredLaskey82

I just used quick-dry enamel spray paint, with isopropyl alcohol to remove fingerprint oils and so forth before painting my cards and RAM heat spreaders.

Check out this YouTube vid from Tek Syndicate that gave me the idea in the first place. They are painting a plastic cover, but the method is still the same.

http://www.youtube.com/watch?v=c0yYrcf-0jY


----------



## eAT5

Quote:


> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *eAT5*
> 
> i got my cards to Crossfire flawless
> 
> things that took 15 hours to figure out:
> 
> i installed driver with 1 card, R9 280X Turbo Duo OC.
> 
> #1 it only works with both BIOS switches both on 2, when they were on 1 the had all sorts of Errors and Crashes. now the Crossfire cable has to have J1A on the top also. did not work with IOMMU on in BIOS.
> 
> #2 added 2nd card after a clean driver installation that worked on the single card.
> 
> #3Re- Booted, Booyah.
> 
> my MOBO Bios had to be Current... was running ASRock 990FX FATL1TY Pro 2.0beta, Now running 2.0 final....
> 
> took a lot of combinations of failures to figure this out...
> 
> Also tried Windows 8.1... it helped with the clean install because my Windows 7 was tweaked to the bone, all sorts of dirty folders and stuff... a fresh Win 7 might work.
> 
> BF4 worked perfect on the first try... I was getting 45/60 FPS with 2 7970's, so Win 8 might have helped. Plus UEFI cards won't work all the way with Win 7.
> 
> my FPS at ULTRA 1080 is 95/130
> 
> 4xaa
> network smoothing 0
> Framepacing off
> 
> I heard that the new MSI Lightning had an issue like that and they switched to BIOS #2 (LN2) and it worked. After that they were able to switch back to BIOS #1 and it worked fine there too.
> 
> 
> 
> Is 1 the stock BIOS and 2 the OC BIOS on an OC card? Because mine shows up as a boosted state in CCC or GPU-Z without me changing anything.


----------



## Unknownm

8xAA


0xAA


8xAA


0xAA


----------



## Devildog83

*JaredLaskey*

Got the name fixed. I love the white, it's time to do a hide-the-PSU mod.


*lightsout*

Running Valley or any other program, the clocks and voltages will go up and down depending on load. It's normal for readings to have peaks and valleys like that; you will see it in benches and during games, as the card will only run as fast and hard as it's asked to. Having said that, if you have Trixx and Afterburner both installed at the same time, there may be conflicts between the hardware monitoring both programs are running, plus the monitoring in CCC, Unigine, and Futuremark. All of these programs detect hardware in their own way and may be causing conflicts.

*Roaches*

You may have just needed a bit of burn-in time on the GPU/thermal paste or something like that. If you don't mind a bit of noise, turn up the fan profile to keep them as cool as possible; it can't hurt.

*Neurotix*

I used ceramic-infused engine paint, but any paint that's good for plastic should work.


----------



## lightsout

Quote:


> Originally Posted by *Devildog83*
> 
> *JaredLaskey*
> 
> Got the name fixed. I love the white, it's time to do a hide the PSU mod.
> 
> 
> *ligtsout*
> 
> Running Valley or any other program the clocks and voltages will go up and down depending on load. It's normal for readings to have peaks and Valleys like that, you will see it on bench's and during games as the card will only run as fast and hard as asked. Having said that if you have Trixx and Afterburner both installed at the same time there may be conflicts there from the system info that both programs are running plus the system info on the CCC, Unigen and Futuremark. All of these programs detect hardware in there own way and may be causing conflicts.
> 
> *Roaches*
> 
> You may have just had a bit of burn -in time on the GPU/Thermal paste or something like that. If you don't mind a bit of noise turn up the fan profile to keep them as cool as possible, it can't hurt.
> 
> *Neurotix*
> 
> I used ceramic infused engine paint but any paint that's good for plastic should work.


I know that in some benches the clock will fluctuate, like 3DMark, but Valley should pin the card at max. And it does when I take Trixx out of the equation. Even running Trixx alone, the clocks seem to bounce and the score suffers. Lame, because I can't change voltage without Trixx. Oh well, they appear to be stable at 1100 with the power limit at 20%, which is good enough for me.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> *JaredLaskey*
> 
> Got the name fixed. I love the white, it's time to do a hide the PSU mod.
> 
> 
> *ligtsout*
> 
> Running Valley or any other program the clocks and voltages will go up and down depending on load. It's normal for readings to have peaks and Valleys like that, you will see it on bench's and during games as the card will only run as fast and hard as asked. Having said that if you have Trixx and Afterburner both installed at the same time there may be conflicts there from the system info that both programs are running plus the system info on the CCC, Unigen and Futuremark. All of these programs detect hardware in there own way and may be causing conflicts.
> 
> *Roaches*
> 
> You may have just had a bit of burn -in time on the GPU/Thermal paste or something like that. If you don't mind a bit of noise turn up the fan profile to keep them as cool as possible, it can't hurt.
> 
> *Neurotix*
> 
> I used ceramic infused engine paint but any paint that's good for plastic should work.


I used enamel paint. Good stuff.

If you have lights showing through it, make sure you paint it white first.
Then whatever colour.


----------



## JaredLaskey82

Quote:


> Originally Posted by *Devildog83*
> 
> *JaredLaskey*
> 
> Got the name fixed. I love the white, it's time to do a hide the PSU mod.
> 
> 
> *ligtsout*
> 
> Running Valley or any other program the clocks and voltages will go up and down depending on load. It's normal for readings to have peaks and Valleys like that, you will see it on bench's and during games as the card will only run as fast and hard as asked. Having said that if you have Trixx and Afterburner both installed at the same time there may be conflicts there from the system info that both programs are running plus the system info on the CCC, Unigen and Futuremark. All of these programs detect hardware in there own way and may be causing conflicts.
> 
> *Roaches*
> 
> You may have just had a bit of burn -in time on the GPU/Thermal paste or something like that. If you don't mind a bit of noise turn up the fan profile to keep them as cool as possible, it can't hurt.
> 
> *Neurotix*
> 
> I used ceramic infused engine paint but any paint that's good for plastic should work.


Already under way.


----------



## Devildog83

Quote:


> Originally Posted by *JaredLaskey82*
> 
> Already under way.


Excellent!! What are you going to use?


----------



## cremelo

Now I just need to enable CrossFireX when I get home (and buy a new case)








PS: Sorry for the potato camera


----------



## passinos

Got 2x 7970 CF (1 reference with a Syscooling block, 1 XFX non-reference that I can't find a block for)

Need 2 dual-link DVI for my two Q-Nix (2x 2560x1440)

1) If I put a 280X in the top slot and CF with the 7970, will I get 2 dual-link DVIs? (The 280X has 2 DL-DVI but the 7970 had only 1)
2) Can any 280X use a 7970 reference waterblock?

thanks


----------



## Recr3ational

Quote:


> Originally Posted by *passinos*
> 
> Got 2x7970 CF (1 ref with a syscooling block, 1 XFX non-ref and cant find a block)
> 
> Need 2 Dual Link DVI for my Two Q-Nix (2x 2560x1440)
> 
> 1) If I put a 280x in top slot and CF with 7970 will I get 2 Dual-Link DVIs? (280x has 2 DL-DVI but 7970's had only 1)
> 2) any 280x can use a 7970 Ref Waterblock?
> 
> thanks


PowerColor and MSI both use EK 7970 blocks.

I have two PowerColor Turbo Duos with EK blocks.


----------



## cremelo

Toxic is very hot


----------



## heroxoot

So my poor 7970 Lightning was reduced to an MSI G series R9 280X, and I cannot get MSI AB to unlock the voltage. Also, at highish fan speeds, like 50%+, one of the fans squeaks, and that's exactly why I sent my 7970 Lightning in for replacement..... It doesn't do it that often so far, though.

So how can I get MSI AB to let me adjust the voltage?


----------



## passinos

Quote:


> Originally Posted by *passinos*
> 
> Got 2x7970 CF (1 ref with a syscooling block, 1 XFX non-ref and cant find a block)
> 
> Need 2 Dual Link DVI for my Two Q-Nix (2x 2560x1440)
> 
> 1) If I put a 280x in top slot and CF with 7970 will I get 2 Dual-Link DVIs? (280x has 2 DL-DVI but 7970's had only 1)
> 2) any 280x can use a 7970 Ref Waterblock?
> 
> thanks


Quote:


> Originally Posted by *Recr3ational*
> 
> Powercolor and MSI both uses ek 7970 blocks.
> 
> I have two PowerColour Turbo Duos with EK


Does the PowerColor or the MSI have 2x dual-link DVI?

Thanks


----------



## FilipePT

I'm thinking of buying the Asus R9 270X TOP ( http://www.techpowerup.com/reviews/ASUS/R9_270X_Direct_Cu_II_TOP/3.html ), but I saw that it does not have any cooling on the VRM or the VRAM memory chips. Is this Asus cutting costs?
Normally other cards come with aluminum heatsinks on the VRM MOSFETs and the VRAM, but on this Asus they are all bare.



Will this affect the longevity of the card?


----------



## microkid21

I'll just have to ask the same question as before, because I have not yet received good feedback. My problem is that I experience BSODs with the latest beta driver, Catalyst 14.3, and previous drivers such as Catalyst 14.2, 14.1, and 13.2 while playing Call of Duty: Ghosts and Skyrim. Do you guys experience it too? What do I need to do to overcome this, or how did you manage to solve it? Please help.

*Specs*

PROCS: Intel core i3 2100 @ 3.1ghz
RAM: GSKILL Ripjaws 2 x 4GB 1866 DDR3
MOBO: Asrock Z68 Fatal1ty Professional GEN3
PSU: Corsair CS650M 80+ Gold
GPU: Powercolor R9 270x Devil

Thanks in advance!


----------



## mAs81

Quote:


> Originally Posted by *heroxoot*
> 
> So my poor 7970 lightning was reduced to a MSI G series R9 280x, and I cannot get MSI AB to unlock the voltage. Also at highish fan speed, like 50%+ one of the fans squeeks and its exactly why I sent my 7970 lightning in for replacement..... It doesn't do it that often so far though.
> So how can I get MSI AB to let me adjust voltage?


If you have already tried Settings > General > unlock voltage control/monitoring/force constant voltage, try this.
I also get that squeaking noise, but at much higher RPM (at about +/-70%), together with some mild coil whine. Since I'm gaming with headphones it doesn't really bother me (and it isn't very loud to begin with).








At 1250 mV core voltage and a +20% power limit I get a stable 1200/1600 OC and a minimum of 40 fps in most games @ 1080p.

Hope this helps..
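If the checkboxes alone don't do it, the old fallback was editing MSIAfterburner.cfg directly with Afterburner closed. A hedged sketch, not gospel: the EULA line must match Afterburner's expected wording exactly, and this mainly lifts the official clock limits; it won't add voltage control for cards whose voltage controller AB simply doesn't support.

```
; MSIAfterburner.cfg, under the [ATIADLHAL] section
; (default install path is assumed here: C:\Program Files (x86)\MSI Afterburner\)
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

Restart Afterburner after saving. If the sliders are still locked, the card's voltage chip likely isn't one AB recognises, which would match why Trixx works on some of these cards when AB doesn't.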


----------



## heroxoot

Quote:


> Originally Posted by *mAs81*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> So my poor 7970 lightning was reduced to a MSI G series R9 280x, and I cannot get MSI AB to unlock the voltage. Also at highish fan speed, like 50%+ one of the fans squeeks and its exactly why I sent my 7970 lightning in for replacement..... It doesn't do it that often so far though.
> So how can I get MSI AB to let me adjust voltage?
> 
> 
> 
> If you already have tried from the settings>general>unlock voltage control/monitoring/force constant voltage,try this.
> I also get that squeaking noise,but at much higher rpm (at about +/-70%),together with some mild coil whine,but since I'm gaming with headphones it doesn't really bother me(and it isn't so very loud to begin with)
> 
> 
> 
> 
> 
> 
> 
> 
> At 1250 mV Core Voltage and +20% Power limit I get a stable 1200/1600 OC and min 40 fps at most games @ 1080p
> 
> Hope this helps..

Thanks for the info. So I was monitoring it in a game, and the fans aren't even hitting 1000 RPM at 50% speed; they go up to around 700 and drop back down to 500ish. The fans are confirmed defective. An MSI tech is supposed to call me back tomorrow with information from the RMA department. Looks like I have to send this one in now.

I should never have sent in my 7970 Lightning. I should have waited till I could afford a 3rd-party cooler for it. Very upsetting.


----------



## cremelo

Well, I have a problem: when I play, I use my R9 280X Toxic as the primary GPU and the Asus 280X as the secondary GPU....
But only the Asus runs at 99% GPU usage; the Toxic reaches a maximum of 70% usage, and my processor is locked at 99% usage. It's an i5 3570K @ 4.0GHz with Vcore at 1.1709.
Should I increase the processor overclock to make better use of the GPUs? Should I use the same clock for both GPUs, like 1100MHz on the Asus (1070MHz stock) and the Toxic (1150MHz stock)? And is it normal for the second GPU's fan to turn off when the computer is at idle?

I find it strange because I am not able to harness the full use of both GPUs....


----------



## Amph

I have a strange problem with my rig.

For now I'm running only two 280Xs: one is the old revision with the 0.39 BIOS and Hynix memory, the other has the Elpida 0.43 BIOS.

Now, if I put the Hynix one in the first PCIe slot and the Elpida in the second, the Hynix runs 15° hotter (which already makes no sense; it shouldn't matter that the CPU is near it). But if I swap them (Elpida in the first and Hynix in the second slot), the Elpida runs only 8° hotter.

Something is very wrong with the motherboard, which is an old ASRock Extreme6 P67.

All this is while mining; the problem persists even with cooler algorithms like X11 or the Heavycoin algo.

----------



## Recr3ational

@cremelo

Right, having something at 100% means it's the limiting part, so you really don't want everything at 100%. I'm guessing your CPU is the issue?

Can anyone clarify?

I think it is. I just remembered that I needed to overclock my old 8350 to like 4.9GHz to use my old 7950s properly, so I recommend OCing your i5 to like 4.5 if you can cool it.

Also, just to add:
now that my i7 is at 4.5 it rarely goes up to 100%,
whereas my 280Xs do, even when overclocked as high as they can go.


----------



## passinos

Quote:


> Originally Posted by *cremelo*
> 
> Well I have a problem: When I play, I use my R9 280X Toxic as primary GPU and Asus 280X as secondary GPU....
> But only the Asus hits 99% GPU usage; the Toxic reaches a maximum of 70%, while my processor sits locked at 99% usage (an i5 3570K @ 4.0GHz with a Vcore of 1.1709V).
> Should I increase the processor overclock to make better use of the GPUs? Do I have to use the same clock on both GPUs, like 1100MHz on the Asus (1070MHz stock) and the Toxic (1150MHz stock)? And is it normal for the second GPU's fan to be off when the computer is idle?
> 
> I find it strange that I am not able to make full use of both GPUs....


Side question: are both your cards 2x Dual-Link DVI?


----------



## cremelo

Quote:


> Originally Posted by *Recr3ational*
> 
> @cremelo
> 
> Right, having something at 100% means it's maxed out and holding everything else back, so you really don't want anything pegged at 100%. I'm guessing your CPU is the issue?
> 
> Can anyone clarify?
> 
> I think it is. I just remembered that I needed to overclock my old 8350 to around 4.9GHz to use my old 7950s properly; I recommend OCing your i5 to around 4.5 if you can cool it.
> 
> Also just to add,
> now that my i7 is at 4.5 it rarely goes up to 100%,
> whereas my 280Xs do, even when overclocked as high as they can go.


I believe the processor is holding back the GPUs =/
When I get home I will try to raise it from 4.0 to 4.5, if the Vcore and temperature don't get too high.
This was my fear









Quote:


> Originally Posted by *passinos*
> 
> Side question: are both your cards 2x Dual-Link DVI?


The Asus has 1x Single-Link DVI-D, 1x Dual-Link DVI-I, 1x HDMI, and 1x DisplayPort, and the Sapphire has 1x HDMI (with 3D), 2x Mini-DisplayPort, 1x Single-Link DVI-D, and 1x Dual-Link DVI-I...
I use HDMI and DVI on the first and 1 DVI on the second.....


----------



## Recr3ational

Quote:


> Originally Posted by *cremelo*
> 
> I believe the processor is holding back the GPUs =/
> When I get home I will try to raise it from 4.0 to 4.5, if the Vcore and temperature don't get too high.
> This was my fear
> 
> 
> 
> 
> 
> 
> 
> 
> The Asus has 1x Single-Link DVI-D, 1x Dual-Link DVI-I, 1x HDMI, and 1x DisplayPort, and the Sapphire has 1x HDMI (with 3D), 2x Mini-DisplayPort, 1x Single-Link DVI-D, and 1x Dual-Link DVI-I...
> I use HDMI and DVI on the first and 1 DVI on the second.....


You don't need it to go to 4.5. Just go as high as you're happy with. If your game plays fine at the settings you want, at 60fps, leave it be. There's no reason to fix something that isn't broken.


----------



## passinos

Quote:


> Originally Posted by *cremelo*
> 
> I believe the processor is holding back the GPUs =/
> When I get home I will try to raise it from 4.0 to 4.5, if the Vcore and temperature don't get too high.
> This was my fear
> 
> 
> 
> 
> 
> 
> 
> 
> The Asus has 1x Single-Link DVI-D, 1x Dual-Link DVI-I, 1x HDMI, and 1x DisplayPort, and the Sapphire has 1x HDMI (with 3D), 2x Mini-DisplayPort, 1x Single-Link DVI-D, and 1x Dual-Link DVI-I...
> I use HDMI and DVI on the first and 1 DVI on the second.....


I didn't know you could use the display outputs on both cards in CF. I thought all ports on the second card were disabled.
At least my 7970's won't do it.


----------



## Recr3ational

Quote:


> Originally Posted by *passinos*
> 
> I didn't know you could use the display outputs on both cards in CF. I thought all ports on the second card were disabled.
> At least my 7970's won't do it.


I know it's so cool!
I just found out like 4 days ago. Now I'm using my second GPU to drive all my monitors; with my 7950s they had to be on the top card.


----------



## passinos

Quote:


> Originally Posted by *Recr3ational*
> 
> I know it's so cool!
> I just found out like 4 days ago. Now I'm using my second gpu to show all my monitors, my 7950s had to have it on top.


Damn, awesome.
Gots to get me my 290's now


----------



## Recr3ational

Quote:


> Originally Posted by *passinos*
> 
> Damn, awesome.
> Gots to get me my 290's now


Man, I think it's way overpowered to have two. No game is going to be optimised to use both GPUs. The reason I bought 2x 280Xs is that I have to have two cards; I hate the look of a single card.


----------



## JaredLaskey82

Quote:


> Originally Posted by *cremelo*
> 
> Well I have a problem: When I play, I use my R9 280X Toxic as primary GPU and Asus 280X as secondary GPU....
> But only Asus uses 99% of the GPU being Toxic reaches a maximum of 70% of use and my processor reaches 99% of locked use and which is a i5 3570k @ 4.0Ghz whit Vcore on 1.1709.
> Should I increase the overclock the processor for best use of GPU???? Do I have to use a standard clock for both GPU? like 1100Mhz on Asus ( 1070Mhz Stock ) and Toxic ( 1150Mhz stock ) and is Normal the second GPU off the fan when using the computer at idle?
> 
> I find it strange because I am not able to harness the full use of both GPU....


Of course they have to be the same; otherwise it can break one of the cards. They have to be in sync on GPU clock. There should be a way in your overclocking software to sync them together.


----------



## cremelo

Quote:


> Originally Posted by *JaredLaskey82*
> 
> Of course they have to be the same. It will break one of the cards. They have to be in sync for gpu clock. There should be a way in your overclocking software to sync them together.


So the only option it shows me is the Toxic's clock (in that case I would have to raise the Asus from 1070MHz to 1150MHz)....

But obviously I will not keep the CrossFire setup, since my friend wants to buy the Asus for Bitcoin mining, and with the price he wants to pay I'll try to get another 280X Toxic, or perhaps a 290X Toxic when it releases









In my country an Asus R9 280X DC2 Top is US$650, but I can buy on eBay, and with the rest of the money I have saved buy a 290X or a 280X Toxic in my country for US$800....


----------



## passinos

Quote:


> Originally Posted by *Recr3ational*
> 
> Man, I think it's way overpowered to have two. No game is going to be optimised to use both GPUs. The reason I bought 2x 280Xs is that I have to have two cards; I hate the look of a single card.


Quote:


> Originally Posted by *cremelo*
> 
> So the only option it shows me is the Toxic's clock (in that case I would have to raise the Asus from 1070MHz to 1150MHz)....
> 
> But obviously I will not keep the CrossFire setup, since my friend wants to buy the Asus for Bitcoin mining, and with the price he wants to pay I'll try to get another 280X Toxic, or perhaps a 290X Toxic when it releases
> 
> In my country an Asus R9 280X DC2 Top is US$650, but I can buy on eBay, and with the rest of the money I have saved buy a 290X or a 280X Toxic in my country for US$800....


My bad, I thought cremelo had 2x 290's. The Asus 280X DCII does have 2x DL-DVI, but the Sapphire only has 1x DL-DVI.
Maybe it's the firmware of the 280X that allows both cards' outputs to be used at the same time.

And 2x 7970 barely gets me 96 fps in BF4.
2x 290's OC'd under water would be better.


----------



## Recr3ational

Quote:


> Originally Posted by *passinos*
> 
> My bad, I thought cremelo had 2x 290's. The Asus 280X DCII does have 2x DL-DVI, but the Sapphire only has 1x DL-DVI.
> Maybe it's the firmware of the 280X that allows both cards' outputs to be used at the same time.
> 
> And 2x 7970 barely gets me 96 fps in BF4.
> 2x 290's OC'd under water would be better.


Get 2x 280x? Under water?
Mine gets over 120fps. Even though I don't even need it that high.


----------



## passinos

Quote:


> Originally Posted by *Recr3ational*
> 
> Get 2x 280x? Under water?
> Mine gets over 120fps. Even though I don't even need it that high.


Got 2x 7970's right now (one reference card on water, the other non-reference on air).

I need 2x DL-DVI and a waterblock for the 2nd, non-reference card.

My plan was to get one 280X with 2x DL-DVI that I can get a water block for (maybe the Asus DCII), and hope that if it's the primary GPU I get the 2x DL-DVI when I CF it with my 7970.


----------



## Amph

Quote:


> Originally Posted by *Amph*
> 
> i have a strange problem with my rig
> 
> for now i'm running with only two 280x: one is from the old manufacturer with the 0.39 BIOS and Hynix memory, the other has the Elpida BIOS 0.43
> 
> now if i put the Hynix one in the first PCI-E slot and the Elpida in the second, the Hynix runs 15° hotter (this already makes no sense; it shouldn't matter that the CPU is near it), but if i swap them (so Elpida in the first and Hynix in the second slot), the Elpida runs only 8° hotter
> 
> something is very wrong with the motherboard, which is an old ASRock Extreme6 P67
> 
> all this is while mining, and the problem persists even with cooler algos like X11 or the Heavycoin algo


i solved my problem

the Hynix card in the first slot was causing the problem; now with it in the third slot and the Elpida in the first, everything is ok, only a 5° temp difference


----------



## Spade616

Fired up the HIS utility today and saw that my 270X's VDDC was at 1300mV (it's normally at 1200), which I think is about the max for the 270X (not sure). Could this have done any sort of damage? (I was just idling and playing Dota 2 anyway.) I'm very conservative with adjusting my voltage, and I was just surprised that it was somehow set to max.


----------



## dmfree88

One of my cards or my PSU is near death... I can't figure it out, but a strange fan-scraping type noise happens only under load (it's not a fan; I have checked all of them, and they run at constant speed so they shouldn't be affected by load anyway). I can't pinpoint the sound, but I think something is about to give, because the noise seems to be progressively getting worse.


----------



## Recr3ational

Quote:


> Originally Posted by *dmfree88*
> 
> One of my cards or my PSU is near death... I can't figure it out, but a strange fan-scraping type noise happens only under load (it's not a fan; I have checked all of them, and they run at constant speed so they shouldn't be affected by load anyway). I can't pinpoint the sound, but I think something is about to give, because the noise seems to be progressively getting worse.


Scraping? OR coil whine?


----------



## Devildog83

Quote:


> Originally Posted by *Spade616*
> 
> Fired up the HIS utility today and saw that my 270X's VDDC was at 1300mV (it's normally at 1200), which I think is about the max for the 270X (not sure). Could this have done any sort of damage? (I was just idling and playing Dota 2 anyway.) I'm very conservative with adjusting my voltage, and I was just surprised that it was somehow set to max.


Unless your core voltage is stuck there even at idle, I don't think it will be a problem. Is there a "force constant voltage" box checked or something? What is your voltage at idle?

My core volts go up to 1.275 or so even though the VDDC is 1.216.


----------



## shilka

Quote:


> Originally Posted by *dmfree88*
> 
> One of my cards or my PSU is near death... I can't figure it out, but a strange fan-scraping type noise happens only under load (it's not a fan; I have checked all of them, and they run at constant speed so they shouldn't be affected by load anyway). I can't pinpoint the sound, but I think something is about to give, because the noise seems to be progressively getting worse.


What kind of noise and where does it come from?


----------



## Devildog83

Afterburner kept telling me that the beta version I was using was going to expire so I deleted it and tried HIS Powertune and for some crazy reason it allows me to adjust the VDDC up to 1.320 on both cards. I am going to try to do some overclocking and see if I can get more out of my cards.


----------



## Spade616

Quote:


> Originally Posted by *Devildog83*
> 
> I don't think unless your core voltage is stuck there even at idle, it will be a problem. Is there a box with force constant voltage that is checked or something, What is your voltage at idle?
> 
> My core volts go up to 1.275 or so even thought the VDDC is 1.216.


With the HIS utility you pretty much just drag the slider until you have the voltage you want and press "apply". That's why I found it odd that it was on 1.3V when I haven't even gotten around to tweaking it yet. It hasn't happened again though; it's always on 1.2V now (0.875-0.97V when idling on the desktop). I'm thinking it might have been some sort of bug with the app. Well, even if it was accidentally set to 1.3V for like half a day, I was pretty much idling during that time anyway, so the voltage probably never even got close to hitting 1.3V. I did play one game though, so I hope it didn't stress the chip too much. I'm just really cautious with voltages now, as I almost fried my old GPU lol


----------



## Devildog83

I did 1 run of Valley @ 1250/1462 and broke 70 FPS. I will push it up some more. I have to redo the thermal paste on the 7870 or something, because it hit 88 degrees and that's too hot for my liking. I may move it to the bottom slot and install the 14.3 driver.



----------



## FilipePT

Hello

I have an Asus R9 270X TOP that came from the factory at 1120MHz.
I'd rather have more longevity from the card than extra performance, so I want to underclock it to 1050MHz.
Can I do this without any problems in GPU Tweak or MSI Afterburner? Do I also have to lower the voltage?
thank you


----------



## Recr3ational

Quote:


> Originally Posted by *FilipePT*
> 
> Hello
> 
> I have an Asus R9 270X TOP that came from the factory at 1120MHz.
> I'd rather have more longevity from the card than extra performance, so I want to underclock it to 1050MHz.
> Can I do this without any problems in GPU Tweak or MSI Afterburner? Do I also have to lower the voltage?
> thank you


It should be fine. Also, if the manufacturer ships those clocks as stock, it's perfectly fine to keep it that way.


----------



## dmfree88

Quote:


> Originally Posted by *shilka*
> 
> What kind of noise and where does it come from?


It sounds like a fan rubbing. Like, I start cgminer and I hear what sounds like a fan revving up while scraping. Then it "evens out" and just makes a constant CD- or fan-like scraping noise: Ssssksssksssksskssksssksssk

I thought it was coil whine, but it's so strange. I can't pinpoint where it's coming from. I also get a strange pulsation under heavy loads, like opening a wallet while mining, or mining on CPU and GPU together: Wurrrrrrrrrwurrrrrwurrrrwurrrrwurrwurrwurrwurwurwurwurwurwurrrrrrrrrrrrrrrrrrrrrr

Usually it "evens out" to a solid wurrr. Then if I stop the CPU load it does kind of the opposite, and the noise usually goes away completely or becomes a much quieter pulsation.

The two noises seem unrelated, and I can't find the origin of either one, but both are affected by load: one seemingly by GPU load, the other by CPU load.


----------



## shilka

Quote:


> Originally Posted by *dmfree88*
> 
> It sounds like a fan rubbing. Like, I start cgminer and I hear what sounds like a fan revving up while scraping. Then it "evens out" and just makes a constant CD- or fan-like scraping noise: Ssssksssksssksskssksssksssk
> 
> I thought it was coil whine, but it's so strange. I can't pinpoint where it's coming from. I also get a strange pulsation under heavy loads, like opening a wallet while mining, or mining on CPU and GPU together: Wurrrrrrrrrwurrrrrwurrrrwurrrrwurrwurrwurrwurwurwurwurwurwurrrrrrrrrrrrrrrrrrrrrr
> 
> Usually it "evens out" to a solid wurrr. Then if I stop the CPU load it does kind of the opposite, and the noise usually goes away completely or becomes a much quieter pulsation.
> 
> The two noises seem unrelated, and I can't find the origin of either one, but both are affected by load: one seemingly by GPU load, the other by CPU load.


Try and rule out the cards by taking one of them out

Or you could just play it safe and get a new PSU unless you have another one laying around you can try


----------



## dmfree88

Unfortunately it's set up with dual PSUs already. I'm trying to buy a 2nd PC so I don't have this issue, but the noises happened prior to adding the 2nd PSU and 3rd GPU. Just a mess of a rig. I hope it survives long enough.


----------



## FilipePT

Quote:


> Originally Posted by *Recr3ational*
> 
> It should be fine. Also, if the manufacturer ships those clocks as stock, it's perfectly fine to keep it that way.


but I have the TOP version of the card, which comes with a big factory overclock; I want to lower the clock and voltage a little so it has a longer life


----------



## Recr3ational

Quote:


> Originally Posted by *FilipePT*
> 
> but I have the TOP version of the card, which comes with a big factory overclock; I want to lower the clock and voltage a little so it has a longer life


Under clocking is fine.


----------



## heroxoot

Quote:


> Originally Posted by *FilipePT*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Recr3ational*
> 
> It should be fine. Also, if the manufacturer ships those clocks as stock, it's perfectly fine to keep it that way.
> 
> 
> 
> but I have the TOP version of the card, which comes with a big factory overclock; I want to lower the clock and voltage a little so it has a longer life
Click to expand...

"Factory overclock" is just what they call it when they sell a boosted model that has a higher clock than the reference design. It's perfectly fine to let it run like that, and I doubt it will live any longer otherwise. Hell, I think the fans die way sooner than the actual GPU does. But a fan dying can take the card with it.

It's 50/50.


----------



## Northern Isnlad

I'm using the R9 270X, and irrespective of the game I'm playing, the edges in games don't look right; the edges of objects are not smooth and appear distorted compared to how they should look. Please help me out. I'm using the latest beta driver.


----------



## runelotus

A bit of a suggestion to MSI


----------



## FLaguy954

I just snagged an MSI Gaming R9 270 off Newegg for $162! How is the overclocking potential with a 500W PSU and a single 6-pin connector?


----------



## shilka

Quote:


> Originally Posted by *FLaguy954*
> 
> I just snagged me a MSI Gaming R9 270 off of Newegg for $162! How is the overclocking potential with a 500W PSU, and single 6 pin connector?


Which 500 watt?


----------



## FLaguy954

I have Evga 500W 80 Plus (the revised one that came out recently).


----------



## shilka

Quote:


> Originally Posted by *FLaguy954*
> 
> I have Evga 500W 80 Plus (the revised one that came out recently).


Can you still return it?


----------



## FLaguy954

I can't return it. But why should I? I haven't had any problems with it so far, and it's a better 500W PSU than the Corsair CX500.

EDIT: Damn, I just realized you are the PSU expert around here. I took a look at your advice thread and now I regret my purchase


----------



## shilka

Quote:


> Originally Posted by *FLaguy954*
> 
> I can't return it. But why should I? I haven't had any problems with it so far and it's a better 500W PSU than the Corsair 500 CX.
> 
> EDIT: Damn, I just realized you are the PSU expert around here. I took a look at your advice thread and now I regret my purchase


Actually, the CX500 was better.

If you can't return it, then keep it until you have money for something better. But I would not OC with a PSU like that.


----------



## FLaguy954

Okay. Since you're here, is it better to get a 550W PSU or a 500W? And what would be the best recommended one under $60?


----------



## shilka

Quote:


> Originally Posted by *FLaguy954*
> 
> Okay. Since you're here, Is it better to get a 550W PSU or a 500W?


It's pretty much the same unless you are going to overvolt, but something of better/higher quality is what you should look for.
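As a rough sanity check on the 500W-vs-550W question, a back-of-the-envelope power budget is enough. A minimal sketch; the TDP figures in the usage line are approximate public numbers (assumptions, not measurements), and the 20% margin is an arbitrary rule of thumb:

```python
# Back-of-the-envelope PSU headroom check: sum the big consumers and
# keep roughly 20% of the PSU's rating in reserve for overclocking,
# capacitor aging, and transient spikes.

def psu_headroom(psu_watts, parts_watts, margin=0.8):
    """Return spare watts if total draw must stay under `margin` of rating."""
    draw = sum(parts_watts)
    budget = psu_watts * margin   # usable budget after the safety margin
    return budget - draw

# Illustrative figures: R9 270 ~150 W, a stock i5 ~84 W, rest of system ~60 W
spare = psu_headroom(500, [150, 84, 60])
```

A positive result means the unit has room on paper; whether it delivers that cleanly is a question of build quality, which is shilka's point.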


----------



## FLaguy954

Thanks for the info! Decided to go with the XFX 550 Core Edition (there's a sweet deal going on with a $25 rebate).


----------



## runelotus

Quote:


> Originally Posted by *shilka*
> 
> It's pretty much the same unless you are going to overvolt, but something of better/higher quality is what you should look for.


btw
Speaking of PSU's

I'm thinking of buying a 600W Gold/Platinum certified PSU.
I've been choosing between:
Seasonic X-650
FSP CM Aurum 92+ 650
Antec EarthWatts Platinum 650

Which would you recommend between these 3 PSUs?

TIA!


----------



## shilka

Quote:


> Originally Posted by *runelotus*
> 
> btw
> Speaking of PSU's
> 
> I'm thinking of buying a 600W Gold/Platinum certified PSU.
> I've been choosing between:
> Seasonic X-650
> FSP CM Aurum 92+ 650
> Antec EarthWatts Platinum 650
> 
> Which would you recommend between these 3 PSUs?
> 
> TIA!


The EarthWatts Platinum and the FSP Aurum 92+ are the same unit, but the voltage regulation is not very good, so that leaves only the Seasonic X.

Unless there are other units you can find?


----------



## Devildog83

Quote:


> Originally Posted by *runelotus*
> 
> btw
> Speaking of PSU's
> 
> I'm thinking of buying a 600W Gold/Platinum certified PSU.
> I've been choosing between:
> Seasonic X-650
> FSP CM Aurum 92+ 650
> Antec EarthWatts Platinum 650
> 
> Which would you recommend between these 3 PSUs?
> 
> TIA!


The Seasonic SS-660XP2 Platinum is on sale right now for $135 plus a $15 rebate at Newegg. I use this unit to power my FX 8350 @ 4.8, X-Fired 7870/R9 270X Devils, and 9 fans plus multiple LED lights, and I have plenty of power. Very stable power, and I never hear it.


----------



## lightsout

Quote:


> Originally Posted by *FLaguy954*
> 
> I just snagged me a MSI Gaming R9 270 off of Newegg for $162! How is the overclocking potential with a 500W PSU, and single 6 pin connector?


I don't think your card allows voltage control. So whatever you can get with raising the power limit.


----------



## FLaguy954

Quote:


> Originally Posted by *lightsout*
> 
> I don't think your card allows voltage control. So whatever you can get with raising the power limit.


Not even with custom/modded bios?


----------



## lightsout

Quote:


> Originally Posted by *FLaguy954*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lightsout*
> 
> I don't think your card allows voltage control. So whatever you can get with raising the power limit.
> 
> 
> 
> Not even with custom/modded bios?
Click to expand...

I can't comment on that. It may work with a modded bios. I haven't messed with it myself.


----------



## Recr3ational

I'm still waiting for a modded 280x bios


----------



## runelotus

Quote:


> Originally Posted by *shilka*
> 
> The EarthWatts Platinum and the FSP Aurum 92+ are the same unit, but the voltage regulation is not very good, so that leaves only the Seasonic X.
> 
> Unless there are other units you can find?


Other PSUs to consider here are the Corsair RM series, Cooler Master V series, and Antec HCG Pro.


----------



## shilka

Quote:


> Originally Posted by *runelotus*
> 
> Other PSUs to consider here are the Corsair RM series, Cooler Master V series, and Antec HCG Pro.


Cooler Master V series is pretty much the best series in its class.

It is the semi-modular one you are talking about, right?

The Corsair RM is not something you should buy unless it's really cheap.


----------



## runelotus

Quote:


> Originally Posted by *shilka*
> 
> Cooler Master V series is pretty much the best series in its class.
> 
> It is the semi-modular one you are talking about, right?
> 
> The Corsair RM is not something you should buy unless it's really cheap.


Yes, it's the CM V750S model, which is the same price as the Corsair RM650 and Seasonic X-650, and slightly more expensive than the FSP Aurum 650 92+.


----------



## Recr3ational

Guys, is it normal to have coil whine when loading up a game, and then once it's at 200ish fps it's perfectly fine lol?


----------



## Bruska

Quote:


> Originally Posted by *Recr3ational*
> 
> Guys, is it normal to have coil whine when loading up a game. Then once its at 200ish fps its perfectly fine lol?


I always get coil whine in the intro of every game I play, but then it disappears. I also get coil whine when I quit the Valley benchmark, when it displays the frozen image.


----------



## Devildog83

Tell that machine to quit whining.


----------



## Unknownm

Quote:


> Originally Posted by *Recr3ational*
> 
> Guys, is it normal to have coil whine when loading up a game. Then once its at 200ish fps its perfectly fine lol?


I only get coil whine on both of my cards at 90%+ GPU load. BF4 most of the time only takes about 40-50% on both GPUs with a frame limit of 60fps, which doesn't produce any whine.


----------



## Recr3ational

Quote:


> Originally Posted by *Bruska*
> 
> I always get coil whine in the intro of every game I play, but then it disappears. I also get coil whine when I quit the Valley benchmark, when it displays the frozen image.


I think it's when you get like 3000 fps; it only does it when it's that high. I don't mind, just wondering if it was just me.


----------



## Skye12977

Has anyone ever had the problem where your card displays the motherboard screen, then goes to a black screen where you can move the mouse, but the mouse gets put back to the center of the screen after a few seconds? I can't use my graphics card and have to use my integrated graphics until I can fix this :/


----------



## JaredLaskey82

Quote:


> Originally Posted by *Devildog83*
> 
> *JaredLaskey*
> 
> Got the name fixed. I love the white, it's time to do a hide the PSU mod.
> 
> 
> *lightsout*
> 
> Running Valley or any other program, the clocks and voltages will go up and down depending on load. It's normal for readings to have peaks and valleys like that; you will see it in benches and during games, as the card will only run as fast and hard as asked. Having said that, if you have Trixx and Afterburner both installed at the same time, there may be conflicts from the system info that both programs are reading, plus the system info in CCC, Unigine and Futuremark. All of these programs detect hardware in their own way and may be causing conflicts.
> 
> *Roaches*
> 
> You may have just had a bit of burn -in time on the GPU/Thermal paste or something like that. If you don't mind a bit of noise turn up the fan profile to keep them as cool as possible, it can't hurt.
> 
> *Neurotix*
> 
> I used ceramic infused engine paint but any paint that's good for plastic should work.


Well I am all done now. Well for now.

RAM modules painted, both Asus R9 280X DCU2 shrouds painted white, and the PSU cover made and installed.


----------



## Devildog83

That looks pretty sweet







I finished too -


----------



## JaredLaskey82

Nice. I just started getting into acrylic sheets and doing mods with them.


----------



## Spade616

Quote:


> Originally Posted by *FLaguy954*
> 
> Okay. Since you're here, Is it better to get a 550W PSU or a 500W? And what would be the best recommended one under $60?


If you're going to be running just one card, even a 450-watt is sufficient. (sig rig)


----------



## JaredLaskey82

Quote:


> Originally Posted by *Spade616*
> 
> if you're going to be running just one card, even just a 450-watt is sufficient.(sig rig)


CX Series™ Modular CX500M ATX Power Supply - 500 Watt 80 PLUS® Bronze Certified Modular PSU could be a good option.
$69 from Newegg.
$59 after rebate.

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=+CX+Series%E2%84%A2+Modular+CX500M+ATX+Power+Supply&N=-1&isNodeId=1


----------



## shilka

Quote:


> Originally Posted by *runelotus*
> 
> Yes its the CM V750s model which is same price as the Corsair RM650 and Seasonic X-650 ang slightly more expensive than the FSP Aurum 650 92+


Seasonic X is the best of those with the CM V second


----------



## Unknownm

Since owning my first CrossFire setup, coming from SLI, I never thought Windows TDR would be the cause of my random freezes. When it was enabled, 2 seconds was the default timeout before it recovers, and I always got a message saying the drivers had recovered. After disabling TDR the message is gone, applications run fine with both cards, BF4 starts up fine, and Furmark uses both GPUs @ 99%, compared to one card only doing 50% before.
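For reference, the TDR behaviour described above is controlled by registry values under `HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers` (documented by Microsoft). A sketch of the relevant values; note that setting `TdrLevel` to 0 disables recovery entirely, so a genuine driver hang will freeze the machine, and raising `TdrDelay` instead is the gentler option:

```
Windows Registry Editor Version 5.00

; Timeout Detection and Recovery (TDR) -- edit at your own risk
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
; 0 = timeout detection disabled, 3 = default (recover on timeout)
"TdrLevel"=dword:00000000
; seconds the GPU may be unresponsive before a reset (default is 2)
"TdrDelay"=dword:00000008
```

Back up the registry before changing either value, and a reboot is needed for them to take effect.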


----------



## raven86

Hi guys, I am new here. I have a Devil R9 270X with [email protected] stock and [email protected] I am just worried about my gaming temps; they go up to around 80C but stay mostly around the mid 70s. Are those temps OK? At idle the card stays at around 29C.


----------



## Devildog83

Quote:


> Originally Posted by *raven86*
> 
> Hi guys. I am new here. I have a Devil R9 270X with [email protected] stock and [email protected] I am just worried about my gaming temps. They go up to around 80C but stay mostly around mid 70's. Are the temps ok? At idle though the card stays at around 29C.


I will add you if you like. As for the temps, they do seem a little high for stock clocks, but not dangerous. I would set up a fan profile that ramps up the fans a bit more when the temps get to around 70C; the stock profile does not cool mine very well, although it's very quiet. It could be the thermal paste too; it may need to be reapplied with better paste, although my 270X Devil doesn't get that high even overclocked, but then my fan profile won't allow it. Also, the airflow in your case might be causing higher temps. What is it like in yours?
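The kind of custom fan profile being suggested boils down to a curve of (temperature, fan duty) points with interpolation in between. A minimal sketch; the curve points below are illustrative assumptions, not PowerColor's or any utility's defaults:

```python
# Linear-interpolated fan curve: quiet at idle, ramping hard past 70 C.
# Points are (temperature C, fan duty %) -- illustrative values only.

CURVE = [(30, 20), (50, 35), (70, 60), (80, 85), (90, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Return fan duty % for a GPU temperature, interpolating the curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: full speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

Tools like Afterburner or Trixx apply the same idea through their fan-curve editors; the shape of the curve is what matters.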


----------



## raven86

Yeah, sure you can add me. I have a front and a bottom fan for intake; the top 2 fans and the rear fan are exhaust. All the fans are Xigmatek XLF series ones. The case is a BitFenix Merc Alpha; it has vents on the right side. Are they messing up my airflow?


----------



## danilon62

My 280x looks better each day


----------



## Recr3ational

Raven, remove the top two fans and put them on the side panel? Or you could buy 2 more fans. Fans blowing directly on the cards will knock a few degrees off your card.

My rule is to always have more intake than exhaust. I actually don't have a single exhaust.


----------



## raven86

Well, I tried Devil's advice and went with a custom fan profile. After that, as soon as the GPU hits 70 the fans go into overdrive. I did some more research on this, and the thing with these cards is that they are really silent. My friend has a Toxic 270X, and the fans on his card spin a bit faster than mine even at idle. I think PowerColor went with low-speed fans to keep it quiet.


----------



## Recr3ational

Quote:


> Originally Posted by *raven86*
> 
> Well I tried devil's advice and went with a custom fan profile. After that as soon as the gpu hits 70, the fans go into overdrive. I did some more research on this and the thing with these cards is that they are really silent. My friend has a toxic 270x and the fans on his card spin a bit faster than mine even at idle. I think powercolor went with low speed fans to keep it quiet.


Yeah, I think the PowerColor fans suck. They make good cards, but the cooler is below par. You could always buy a block for it.


----------



## Devildog83

Quote:


> Originally Posted by *raven86*
> 
> Well I tried devil's advice and went with a custom fan profile. After that as soon as the gpu hits 70, the fans go into overdrive. I did some more research on this and the thing with these cards is that they are really silent. My friend has a toxic 270x and the fans on his card spin a bit faster than mine even at idle. I think powercolor went with low speed fans to keep it quiet.


I don't think the fans suck at all; mine stays very cool even at full load and overclocked. The fans on the Devils, at least on mine, are not silent. They aren't crazy loud, but not silent by any means. I can't speak for the rest of PowerColor's cards, but I can tell you the heatsinks on these work great, and if you are having issues it's either airflow or thermal paste. What are you using for an overclocking tool?


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I don't think the fans suck at all, mine stays very cool even at full load and overclocked. The Fans on the Devils, at least on mine are not silent, they aren't crazy loud but not silent by any means. I can't speak for the rest of Powercolors cards but I can tell you the heat-sinks on these work great and if you are having issues it's either with airflow or thermal paste. What are you using for an overclockng tool?


I've never tested the Devils, but the Turbo Duo is meh. Does Afterburner work on yours, DevilDog?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Never tested the devils, but the turbo duo is meh. Does after burner work on yours DevilDog?


Yes it does, but I switched to HIS iTurbo because it allows me to adjust VDDC up to 1,320mV. The Devil has a monster heatsink with 3 fans. They are set a bit lower speed-wise at the low end but ramp up quite nicely when needed, and aren't too horribly loud when they do.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Yes it does but I switched to HIS turbo because it allows me to adjust VDDC up to 1,320mV. The Devil has a monster heat -sink with 3 fans. They are set a bit lower speed wise at the low end but will ramp up quite nicely when needed and aren't too horribly loud when they are.


Might have to give that a go. My Turbo Duo has hit a voltage wall, so maybe that can sort the problem. Cheers dude


----------



## wilflare

Is there any news on AMD downsampling?
I wonder when they will implement something similar to GameStream... it would be nice to stream Steam games to my Android phone


----------



## raven86

I agree Devil, the fans are barely audible at idle, but they definitely need a custom profile to deliver the cooling performance expected of a three-fan solution. I moved my fans around a bit: now I have two intake fans in front, plus one at the bottom and another on the side panel in front of the card. Only two fans are on exhaust now, rear and top. I do have a JetFlo around here somewhere; maybe I'll go with that. The trouble with a block is that they aren't available where I live, and it just wouldn't be as sexy as the Devil.
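A custom fan profile like the one described is basically a piecewise-linear curve mapping GPU temperature to fan duty. Here's a minimal sketch; the curve points below are illustrative examples, not PowerColor's actual defaults:

```python
# Sketch of a custom GPU fan profile: piecewise-linear interpolation
# from temperature (deg C) to fan duty (%). Points are illustrative.

CURVE = [(30, 20), (50, 35), (70, 60), (80, 100)]  # (temp, fan %)

def fan_percent(temp, curve=CURVE):
    """Linearly interpolate fan duty between curve points; clamp
    below the first point and above the last."""
    if temp <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp <= t1:
            return f0 + (f1 - f0) * (temp - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_percent(70))  # 60.0 at the 70 deg C knee
```

With a steeper segment above 70°C, the fans "go into overdrive" past that point exactly as described earlier in the thread.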


----------



## Arkanon

Tried the iTurbo software from HIS. Strangely enough, GPU-Z now registers my max VDDC at 1.4V. Is it safe to run the graphics card at that voltage, since it's about 100mV over the max specified by AMD itself?


----------



## DeviousAddict

Does anyone know if they have made a waterblock for the XFX DD R9 280X yet? I don't want a generic block; I want one that fits perfectly.
Cheers peeps


----------



## Recr3ational

Devious, I think the EK 7970 full-cover block fits *some* XFX 280Xs. It fits my PowerColor, so it might fit yours.


----------



## Arkanon

Request to update my clocks in first post: 1350/1500
Max voltage actually shows up as 1.412 and it makes the card absolutely fly. There's probably even more in it, though the card is running a bit hot now.

Edit: Max stable clocks 1360/1500


----------



## Devildog83

Quote:


> Originally Posted by *Arkanon*
> 
> Tried the iTurbo software from HIS. Strangely enough, GPU-Z now registers my max VDDC at 1.4V. Is it safe to run the graphics card at that voltage, since it's about 100mV over the max specified by AMD itself?


The card should only use as much as needed, which will be determined by the clocks and the load. Watch the heat, and use HWiNFO64 to properly monitor your volts and temps.

Clocks updated!!


----------



## Arkanon

Quote:


> Originally Posted by *Devildog83*
> 
> The card should only use as much as needed which will be determined by the clocks and the load. Watch the heat and use HWinfo64 to properly monitor you volts and heat.
> 
> Clocks updated!!


Yes, of course; the 1.412 VDDC was while running benchmarks. Strange, because as far as I know the voltage was hard-locked at about 1.3V. Oh well, I managed to squeeze another 80MHz out of the card, and with a custom fan profile it doesn't go over 74°C.

http://www.3dmark.com/3dm11/8175687

10k+ graphics score


----------



## Recr3ational

I'm gonna try this HIS iTurbo. Sounds like the thing I need.


----------



## Arkanon

Here's the proof.


----------



## Devildog83

Quote:


> Originally Posted by *Arkanon*
> 
> Yes ofc, the 1.412 VDDC was while running benchmarks. Strange cuz afaik the voltage was hard locked at about 1.3V. Oh well, i managed to squeeze another 80Mhz out of the card and with a custom fan profile it doesn't go over 74°c.
> 
> http://www.3dmark.com/3dm11/8175687
> 
> 10k+ graphics score


That's very nice, I can't get near 1360 out of my 270x.


----------



## Obsyd

Hello there everyone!
First post and I will start with a question!








I own a Sapphire R9 280X Vapor-X. It's been working beautifully for a few months now, but I would like to watercool it. Can someone tell me if I can fit an all-in-one block on this little card? (I have Googled around but found nothing.)
As soon as I clean my case I will post a few pictures









Thank you!


----------



## Recr3ational

Quote:


> Originally Posted by *Obsyd*
> 
> Hello there everyone!
> First post and I will start with a question!
> 
> 
> 
> 
> 
> 
> 
> 
> I own a Sapphire R9 280X Vapor-X. It's working beautifully since a few months now but I would like to watercool it. Can someone tell me if I can fit an all in one block on this little card? (I have googled around but nothing)
> As soon as I clean my case I will post a few pictures
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you!


Hmm, I can only help if you take a picture of the PCB for me. Chances are it uses a 7970 PCB, so if you post a picture of it I can confirm it for you.


----------



## DeviousAddict

Quote:


> Originally Posted by *Recr3ational*
> 
> Devious, I think the EK 7970 full-cover block fits *some* XFX 280Xs. It fits my PowerColor, so it might fit yours.


Cheers Rec'.
I've tried that auto-build tool on the EK website, but it only offers the universal VGA blocks, not a full-cover one. I'll email EK to see if they can confirm the 7970 waterblock will work; I don't want to fork out the money just to find it doesn't fit.


----------



## Recr3ational

Quote:


> Originally Posted by *DeviousAddict*
> 
> Cheers Rec'
> i've tried that auto build thing on the EK website but it only offers the VGA blocks and not a full PCB one. I'll email EK to see if they can confirm the 7970 waterblock will work. I don't want to fork out the money just to find they dont fit.


Yeah if you're lucky they might fit. Good luck mate.


----------



## KingT

Quote:


> Originally Posted by *Recr3ational*
> 
> Devious, I think the EK 7970 full-cover block fits *some* XFX 280Xs. It fits my PowerColor, so it might fit yours.


I see you have the PowerColor R9 280X Turbo Duo. Can you please tell me how good the card is?
I know it's a reference HD7970 PCB with the Turbo Duo cooler; how good is that cooler, if you've tried it on air first?

Also, does the card come with unlocked core voltage control, since it has a Chill voltage controller?

CHEERS..


----------



## Recr3ational

Quote:


> Originally Posted by *KingT*
> 
> I see you have the PowerColor R9 280X Turbo Duo. Can you please tell me how good the card is?
> I know it's a reference HD7970 PCB with the Turbo Duo cooler; how good is that cooler, if you've tried it on air first?
> 
> Also, does the card come with unlocked core voltage control, since it has a Chill voltage controller?
> 
> CHEERS..


The air cooler is meh, just average, though the card has potential under water. The voltage is unlocked, but the limit is lower than what the card could handle; its overclockability is higher than the voltage slider can take it. I'm currently waiting on a fellow Turbo Duo user to mod the BIOS to see if he can raise the voltage.

I don't know what Chill voltage is.


----------



## beklogxd

Is the extra $ worth it going from a 270 to a 270X?


----------



## KingT

Quote:


> Originally Posted by *Recr3ational*
> 
> Air cooler is meh. It's average. It has potential under water though. The voltage is unlock but it has a limit lower than the card can go. I mean the card overclockability is higher than the voltage can take it. I'm currently waiting on a fellow turbo duo use to mod the bios to see if he can raise the voltage.
> 
> I don't know what chill voltage is.


Chill is the name of the chip on the PCB that controls voltage; it's commonly used among various manufacturers and considered the best choice.

I'm interested in this card as I can get it pretty cheap (265 euros), and I would use it mostly for mining and for gaming in CrossFire with my Asus HD7950 DC2 TOP V2.

That's why I was interested in voltage control, as I want to undervolt it for mining.

I can also get the PowerColor R9 280 Turbo Duo, the non-X version (HD7950 equivalent), for 250 euros; it has an identical PCB (8+6 pin) and identical cooler, the difference being the shader count (1792) and the core/memory clocks (960/1250MHz).

I was wondering if flashing an R9 280X BIOS onto this R9 280 would work, since the PCB is 100% identical, but it would probably only raise the core/memory clocks to 1030/1500 while the shader count stayed at 1792.

CHEERS..


----------



## Arkanon

Depends, really. There's only about a 5% performance difference between the 270 and the 270X. In most cases you can simply overclock your 270 to match 270X stock clocks (and probably a bit further). If you want to push further than that (1150MHz and up), there's no guarantee the 270's single 6-pin power connector can deliver enough power to sustain higher stable overclocks. Also, the 270 and 270X are the exact same chip; I'm only assuming the 270Xs get the better-binned chips and the 270s the slightly lesser ones.
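The value question boils down to simple price-per-performance arithmetic, using the ~5% gap above. The prices below are hypothetical placeholders, not quotes:

```python
# Rough 270 vs 270X value check: cost divided by relative performance.
# Prices are hypothetical placeholders; performance gap ~5% per the thread.

def price_per_perf(price, relative_perf):
    """Lower is better: effective cost per unit of performance."""
    return price / relative_perf

r9_270  = price_per_perf(160.0, 1.00)   # baseline card
r9_270x = price_per_perf(180.0, 1.05)   # ~5% faster, higher price

# The 270X is only the better value if its price premium is under ~5%.
print(round(r9_270, 2), round(r9_270x, 2))
```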


----------



## Recr3ational

Quote:


> Originally Posted by *KingT*
> 
> Chill is a name of the chip on PCB that controls voltage, it's common used among various manufacturers and considered the best choice.
> 
> I'm interested in this card as I can get it pretty cheap (265 euros) and I would use it mostly for mining and gaming in CrossFire with my Asus HD7950 DC2 TOP V2.
> 
> That's why I was interested in voltage control ability as I want to undervolt it for mining.
> 
> I can also get the Powercolor R9 280 Turbo Duo, the non-X version (HD7950 equivalent), for 250 euros; it has an identical PCB (8+6 pin) and identical cooler, the difference being the shader count (1792) and the core/memory clocks (960/1250MHz).
> 
> I was wondering if a BIOS flash with R9 280X BIOS would work on this R9 280 card as it has 100% identical PCB, but probably it would give only increase in core/memory clocks to 1030/1500, but shader count would remain @ 1792.
> 
> CHEERS..


If you can afford it, buy the 280X.
The PowerColor is a great card; the cooler is the only bad thing about it.

But it's worth it if you're undervolting and underclocking the cards.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Recr3ational*
> 
> Air cooler is meh. It's average. It has potential under water though. The voltage is unlock but it has a limit lower than the card can go. I mean the card overclockability is higher than the voltage can take it. I'm currently waiting on a fellow turbo duo use to mod the bios to see if he can raise the voltage.
> 
> I don't know what chill voltage is.


I'm starting to feel the voltage might be hardware-limited; BIOS mods won't take it past 1,300mV no matter what. And hardware modding is scary.


----------



## eAT5

I'm trying this today in 3DMark to see if I can beat my best tune...


----------



## Unknownm

Quote:


> Originally Posted by *eAT5*
> 
> im trying this today on 3d mark to see if i beat my best tune...


Hey, nice system. Do you have 3DMark benchmarks available? Just wondering how our systems compare in each one.


----------



## raven86

Quote:


> Originally Posted by *Devildog83*
> 
> I don't think the fans suck at all, mine stays very cool even at full load and overclocked. The Fans on the Devils, at least on mine are not silent, they aren't crazy loud but not silent by any means. I can't speak for the rest of Powercolors cards but I can tell you the heat-sinks on these work great and if you are having issues it's either with airflow or thermal paste. What are you using for an overclockng tool?


I am using Catalyst for overclocking. I don't want to go with voltage mods, so I am sticking with Catalyst. I bought some MX-4 to reapply to the GPU; hopefully it works. Case airflow is fine, as I tested my friend's Toxic R9 270X in my case and it didn't go past 52°C during the Heaven benchmark. Also, the ambient is in the lower 30s, so I think that is also contributing to the high temps.


----------



## Devildog83

Quote:


> Originally Posted by *raven86*
> 
> I am using catalyst for over-clocking. I don't want to go with voltage mods, so i am sticking with catalyst. I have bought some mx-4 to reapply to the gpu. Hopefully it works. Case airflow is fine as I tested my friend's toxic r9 270x in my case and it didn't go past 52C during heaven benchmark. Also the ambient is in the lower 30's so I think that is also causing the high temps.


Using something like Afterburner or PowerColor's own GPU tuning tool isn't a voltage mod; they are tools made by the manufacturers for clocking and tuning your card. You severely limit your options if you just use CCC, but that is your choice.


----------



## Recr3ational

Hmm, HIS iTurbo only allows 1,320mV VDDC for me, so it's rather pointless using it.

I'll wait until I get the modded BIOS.


----------



## Arkanon

Have you checked GPU-Z for VDDC validation? The HIS iTurbo slider only goes up to 1.325V for me as well, but for some reason it overvolts my card to 1.412V in GPU-Z. I tend to believe the GPU-Z readout, as it's hard to believe I gained another 80MHz going from 1.3V to 1.325V.


----------



## Recr3ational

Quote:


> Originally Posted by *Arkanon*
> 
> Have you checked GPU-Z for VDCC validation? HIS iturbo slider only goes up to 1.325V for me as well, but for some reason it overvolts my card to 1.412V in GPU-Z and i tend to believe the voltage readout from GPU-Z as it's kinda hard to believe i gained another 80MHz going from 1.3V to 1.325


That just makes me not want to use it even more, lol.


----------



## Arkanon

Yeah, it's a bit dodgy. In the end I couldn't care less, really. The card seems fine at 1.41V and doesn't go over 75°C with a custom fan profile. I've still got 23 months of warranty on the card, and I suspect that if it dies from overvolting it too much, it will happen in the coming months. So all in all, it's a risk, but one I'm more than willing to take given the warranty.


----------



## eAT5

Quote:


> Originally Posted by *Recr3ational*
> 
> If you can afford it. Buy the 280x.
> Power colour is a great card. The cooler is the only bad thing about it.
> 
> But it's worth it if your undervolting and under clocking the cards.


Which cooler? The Turbo Duo has an awesome cooler, and it's quiet....
Quote:


> Originally Posted by *eAT5*
> 
> im trying this today on 3d mark to see if i beat my best tune...


This tune was past my sweet spot. ^^^^^

Quote:


> Originally Posted by *Unknownm*
> 
> hey nice system, do you have 3dmark benchmarks available? just wondering what our system compare to in each one


My stock clocks vs. my OCs and XBoost state.

http://www.3dmark.com/compare/cg/1583502/cg/1583549


----------



## raven86

So I bought some Arctic MX-4 and changed the TIM on my Devil. I was getting up to 84°C while gaming; now it won't go past 70°C during the Heaven benchmark or gaming. The fan hardly goes above 48%, and all that with an ambient in the lower 30s! No custom fan profile; the card controls the fan on its own. I have the card at 1200MHz/1500MHz.


----------



## lightsout

Quote:


> Originally Posted by *raven86*
> 
> So i bought some Arctic mx-4 and changed the TIM on my devil. I was getting upto 84C on gaming. Now it wont go past 70 during heaven benchmark or gaming. The fan hardly goes above 48% and all that with an ambient of lower 30's!!! No custom fan profile, the card controls the fan on its own. I have the card at 1200Mhz/1500Mhz.


Wow, sounds like it had a bad mount or was loose or something, or just the worst TIM job in history. Either way, congrats. Did you have to void a warranty sticker to do it?


----------



## raven86

Nope. Even if I did, I can't RMA it from my country. From the reviews I've seen online, it's probably the quality of the TIM applied at the factory.


----------



## Devildog83

Quote:


> Originally Posted by *raven86*
> 
> Nope. Even if I did, I cant RMA it from my country. From the reviews I looked on the internet, its probably the quality of the TIM used by the guys over in China.


Awesome, I am going to do this right away on my Devils; my 7870 is getting a bit too warm now.


----------



## raven86

Definitely do it! I just got off playing Crysis for 2 hours, and it stayed at around 68°C with a spike to 73°C. I used RadeonPro to monitor the temps via the OSD feature.


----------



## PCpwnz

Hey, I just purchased an XFX 280 and am looking to watercool it using the NZXT G10. Does anyone have experience with this? I'd like to know whether I should buy a shim for it and what RAM heatsinks I should use. Thank you!


----------



## Recr3ational

Quote:


> Originally Posted by *PCpwnz*
> 
> Hey I just purchased a XFX 280 and am looking to watercool it using the NZXT G10. Does anyone have any experience with this? I am looking to know if i should buy a shim for it and what ram heatsinks I should use. Thank you!


I think the G10 is very adaptable, so I'm confident it will fit. Just keep an eye on the VRAM temps. Which cooler are you cooling it with?


----------



## Unknownm

This card is locked; I can only use VBE 7.0.0.7b to enable up to 1.256V (still 1.18V under load). The stock BIOS only allows 1.18V in TriXX, while the custom one lets me apply 1.256V.

If only it had 1.3V, it would do 1200MHz no problem.


----------



## Alanthor

Hi! New member here.
I also have an R9 270X. To be more precise, I have the Sapphire Radeon R9 270X Dual-X 2GB OC Edition. Lol, the name is so long.

It's about 1 week old, and I have OC'ed it a bit. Here are my settings in CCC:

GPU Clock - 1155MHz
Memory Clock - 1425MHz
Power Control - +20%

I also have a little question: does anyone know what will increase my FPS more in games running on high, e.g. BF4, the GPU clock or the memory clock? The temperature measured 62°C when playing BF4 on high.

Overall, I am very satisfied with my purchase. It only cost me 1,500 SEK, approx 160 euros, with BF4 included


----------



## neurotix

Quote:


> Originally Posted by *Alanthor*
> 
> Hi! New member here.
> I also have a R9 270X. To be more precisly, I have the Sapphire Radeon R9 Dual-X 270X 2GB OC Edition. Lol, the name is so long.
> 
> It's about 1 week old, and I have OC'ed it abit. Here is my settings in CCC.
> 
> GPU Clock - 1155MHz
> Memory Clock - 1425MHz
> Power Control - 20%+
> 
> I also got a little question. Anyone know's what will increase my FPS in games when running on high, e.x BF4. GPU Clock or Memory clock? Temperature measured to 62 degrees celsius when playing BF4 on high.
> 
> In overall, I am very satisfied with my purchase. It only costed me 1.500 SEK. Approx 160 euros, and BF4 included


Core clock is king. GPU clock will increase your fps much more drastically than a RAM overclock.


----------



## Alanthor

Quote:


> Originally Posted by *neurotix*
> 
> Core clock is king. GPU clock will increase your fps much more drastically than a RAM overclock.


So, in order to increase my FPS the most, it's the memory clock I should increase? I may be a little lost right now, but is the memory clock the core clock? How many MHz per step should I increase when OC'ing a graphics card? I usually increase by 10MHz, then play a few games and surf for some hours to see if it's stable. Am I doing it the right way?


----------



## neurotix

Quote:


> Originally Posted by *Alanthor*
> 
> So, in order to increase my FPS in the best and drastic way, it is the Memory clock I should increase? I may be a little lost right now, but is Memory clock Core clock? How many MHz per step should I increase when OC'ing a graphic card? I usually increase 10MHz, play a few games and surfing for some hours and see if it's stable. Am I doing it in a good way?


Quote:


> Originally Posted by *neurotix*
> 
> Core clock is king. GPU clock will increase your fps much more drastically than a RAM overclock.


RAM is memory. I said the core clock will increase your performance much more than the RAM, which is the memory.

Are you sure you should be doing this?


----------



## Alanthor

Quote:


> Originally Posted by *neurotix*
> 
> RAM is memory. I said core clock will increase your performance much more than RAM, which is memory.
> 
> Are you sure you should be doing this?


Well... in my world (might just be me, hehe), RAM is the memory sticks, like Corsair or something. I also blame being very tired, haha.

But let me know if I got you: RAM is the same as the memory clock in graphics-card terms? So, black on white, increasing the memory clock is best if I'm looking for higher FPS? And yes, I'm sure I will be doing this. You were a rookie once too, you know..

Btw, another question. Out of nowhere, when I try to launch the game Apache Air Assault, I get an error message like "FATAL ERROR: prog.engine2/shaders/ShaderBlock.ccp..." or something. Is that game-side, or is it something with my rig? I have never gotten it before; I can play MW3 and BF4 without any errors. All drivers are up to date.

EDIT:
Wow... I just ran the OCCT GPU 3D test and got a bluescreen after 4 seconds. Have I clocked it too much?

Code:


Problem signature:
  Problem Event Name:   BlueScreen
  OS Version:   6.1.7600.2.0.0.256.1
  Locale ID:    1053

Additional information about the problem:
  BCCode:       a0000001
  BCP1: 0000000000000005
  BCP2: 0000000000000000
  BCP3: 0000000000000000
  BCP4: 0000000000000000
  OS Version:   6_1_7600
  Service Pack: 0_0
  Product:      256_1

Files that help describe the problem:
  C:\Windows\Minidump\040214-28392-01.dmp
  C:\Users\robin\AppData\Local\Temp\WER-56737-0.sysdata.xml

Read our privacy statement online:
  http://go.microsoft.com/fwlink/?linkid=104288&clcid=0x0409

If the online privacy statement is not available, please read our privacy statement offline:
  C:\Windows\system32\en-US\erofflps.txt


----------



## Recr3ational

Memory = RAM = Memory
RAM sticks = Memory

The core clock will be labeled as Core Clock


----------



## neurotix

Quote:


> Originally Posted by *Alanthor*
> 
> Well.. In my world (Might just be me, hehe), but RAM is the memory sticks, like Corsair or something. I also do blame that I am very tired, haha.
> 
> But let me know if I got you. RAM is also the same as Memory Clock in graphic card manners? So, black on white, increasing Memory clock is the best if im looking for higher FPS? And yes, Im sure I will be doing this. You have been a rookie too you know.. snip
> 
> [/code]


The GPU actually has GDDR5 RAM chips soldered to the PCB; this is what your memory clock controls (technically it's your frame buffer, but it DOES operate at a fixed frequency).

No, increasing memory clock is not the best if you're looking for performance. Core clock is king. Core clock is what gives you fps. Memory clock gives fps too, but not as much. If you can run a higher core clock but have to lower your memory clock to do so, you *always* want to take the higher core clock.

As far as overclocking goes, the method you're using is fine, but it's a bit too tedious for my taste.

It is okay to raise your overclock by 25mhz at a time.

The best way to do this is to download the Unigine Valley 1.0 benchmark. Open it, set it to the Extreme HD preset, and click Run (hit F9 to start the bench). Run your card at stock, note your fps, and write it down. Personally, I use Sapphire TriXX for overclocking. Try raising your card's core clock to 1100MHz with about 1.225V (I don't know which card you have because you never said, but any 270X or 280X should be able to do this). Run Valley again, note your increased fps, and write it down. Now put your card back to stock settings but overclock the RAM by 100MHz (all cards should handle this), run the benchmark, and write down your fps with just the memory increase. The core increase should net you MUCH more fps than the RAM increase.

Anyway, down to business. Raise your core clock by 25MHz (say, 1125MHz with 1.225V). Make sure you set your fans to 100% while you overclock so the card runs cool, since you're adding voltage. Run Valley and look for artifacts (they should look like flashing blue dots). If you artifact, raise your core voltage and run the bench again until you don't get artifacts. Repeat this, raising the core clock in 25MHz increments, until you reach the point where adding voltage won't get rid of the artifacts. Once you're there, you know your chip is unstable at that clock, and it likely never will be stable there. With a 270X or 280X that point will probably be somewhere over 1200MHz, but bad cards can't even do 1200MHz without artifacts. Back down to your last stable setting and voltage; this is your max core overclock.

Next, do the same thing for your RAM: raise it in 25MHz increments until you artifact in Valley. You might even get a black screen, requiring a system reboot. The difference is that you can't add voltage to stabilize the RAM. The plus side of doing it this way is that you end up with a nice round number (I'd prefer 1175MHz to something weird like 1163MHz).

After you find your max stable core and memory overclocks, go play games to make sure they're stable. Valley will usually run fine even at an unstable overclock, but some games will crash within a few seconds of gameplay if your overclock is too high and your chip can't handle it. My golden 270X can pass Valley at 1300MHz but crashes right away in any DirectX 11 game.
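The incremental search above can be sketched in Python. The stability test here is a simulated stand-in for an actual Valley run, and the helper names are hypothetical, but the loop structure is the same: step the clock up, try voltages until one passes, and stop when nothing stabilizes it:

```python
# Sketch of the incremental overclocking search described above.
# run_stability_test() stands in for a Valley run; here it is simulated
# with a made-up "true" stability limit so the loop can be demonstrated.

def find_max_stable(start_clock, step, voltages, run_stability_test):
    """Raise the clock in fixed steps; at each step, try voltages in
    order until one passes. Stop when no voltage passes, and return
    the last (clock, voltage) pair that did."""
    best = None
    clock = start_clock
    while True:
        passed = next((v for v in voltages
                       if run_stability_test(clock, v)), None)
        if passed is None:
            return best          # no voltage stabilizes this clock
        best = (clock, passed)
        clock += step            # try the next 25 MHz step

# Simulated card: stable if clock <= 1100 + 4000 * (voltage - 1.2)
def simulated_test(clock, voltage):
    return clock <= 1100 + 4000 * (voltage - 1.2)

voltages = [1.200, 1.225, 1.250]  # hypothetical slider steps
print(find_max_stable(1100, 25, voltages, simulated_test))
```

On real hardware the "test" is of course a benchmark run plus your own eyes watching for artifacts, so the loop is manual, but the stopping condition is exactly the one described in the post.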

As far as your game goes, I have no idea. Google the error message. If it's an old game or an unpopular game it might not play well with newer cards and AMD drivers.

The blue screen probably means you're unstable, especially if it was a BSOD in an ATI DLL; I can't tell whether it was from that crash dump.

Also, when you get the time, fill out a rigbuilder and put it in your sig. Click rigbuilder in the top right, fill EVERYTHING in, then go to your profile -> edit signature -> show stuff off in my signature dropdown box -> select the rig -> save.


----------



## electrocabeza

Testing my new Sapphire 270X Toxic: 1320MHz at 1.25V, stable for 3DMark 11... Memory is stock because it doesn't help much in this benchmark.

I think it can get to 1350MHz with a little more voltage... GPU temps are great: 58°C in 3DM11 with fans at 50%.


----------



## neurotix

Quote:


> Originally Posted by *electrocabeza*
> 
> Testing my new Sapphire 270X Toxic, 1320MHz 1.25V Stable for 3DMark 11... Memory is stock because in this benchmark it doesn't help to much in this benchmark.
> 
> I think it can get 1350MHz with a little more of voltage... GPU Temps are great, 58 degrees in 3DM11 with fans at 50%.


What's the ASIC quality on that card?

Also, something looks wrong with that bench, your score should be much higher, especially at 1320mhz. This is what I did with a 270X @ 1270mhz and an FX-8350.

http://hwbot.org/submission/2480766_neurotix_3dmark11___performance_radeon_r9_270x_9927_marks

You should be breaking 10k with 1320mhz and a Haswell processor.


----------



## cyph3rz

I've been thinking of spray painting the shroud of my Gigabyte R9 280X (but not the fans) a nice gloss red. Has anyone painted the shroud of their card?


----------



## DiceAir

Why can I only do 1150MHz on the core? When I go higher, my games crash.

I'm already at 1.263V on the core; is there anything else I should change? Max GPU temp is 75°C, and I'm only running one card.


----------



## electrocabeza

Quote:


> Originally Posted by *neurotix*
> 
> What's the ASIC on that card?
> 
> Also, something looks wrong with that bench, your score should be much higher, especially at 1320mhz. This is what I did with a 270X @ 1270mhz and an FX-8350.
> 
> http://hwbot.org/submission/2480766_neurotix_3dmark11___performance_radeon_r9_270x_9927_marks
> 
> You should be breaking 10k with 1320mhz and a Haswell processor.


It's 83.2%...

Yeah, I noticed that, but I don't know why it's happening. I'm going to try other drivers over the weekend to see if I can get better scores.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> What's the ASIC on that card?
> 
> Also, something looks wrong with that bench, your score should be much higher, especially at 1320mhz. This is what I did with a 270X @ 1270mhz and an FX-8350.
> 
> http://hwbot.org/submission/2480766_neurotix_3dmark11___performance_radeon_r9_270x_9927_marks
> 
> You should be breaking 10k with 1320mhz and a Haswell processor.


Neurotix, did you have tessellation off? His is a valid score with tessellation on; it makes a huge difference.


----------



## Devildog83

Quote:


> Originally Posted by *cyph3rz*
> 
> I've been thinking of spray painting the shroud (but not the fans) a nice gloss red color of my Gigabyte R9 280X. Has anyone painted the shroud of their card?


Just use any paint made for plastics.


----------



## Alanthor

Sapphire Radeon R9 270X 2GB (DUAL-X OC EDITION), running stable at 1150MHz core clock and 1400MHz memory clock.







Anything more on the core clock causes a bluescreen. Even if I increase the VDDC to 1240 (1.40 according to HWMonitor), it still bluescreens if I take the core clock to 1160MHz. Should I increase the VDDC even more to get higher core clocks?

Btw, does overclocking the GPU and raising its VDDC increase heat on the northbridge? Because when I had the CPU at 4.31GHz and 1.4V, I'd almost burn my finger if I touched it for more than 3 seconds.


----------



## Devildog83

Quote:


> Originally Posted by *Alanthor*
> 
> Sapphire Radeon R9 270X 2GB (DUAL-X OC EDITION). Running stable on 1150MHz Core clock and 1400MHz Memory clock
> 
> 
> 
> 
> 
> 
> 
> 
> Anything more on the Core clock causes bluescreen. Even if I increase VDDC to 1240 (1.40 according to HWMonitor), it still causes bluescreen if I take Core clock at 1160MHz. Should I increase the VDDC even more to get higher core clocks?
> 
> Btw, does overclocking a GPU and VDDC cause a increase of heat on the North Bridge? Because when I had the CPU on 4.31GHz and a voltage of 1.4, I almost burn my finger if I touch it more than 3 seconds.


Could you post what your CPU and motherboard are, so I might be able to help? You might be stressing the NB, but changing voltage and frequency may help. As much info as possible will help us help you.


----------



## lightsout

Quote:


> Originally Posted by *Alanthor*
> 
> Sapphire Radeon R9 270X 2GB (DUAL-X OC EDITION). Running stable on 1150MHz Core clock and 1400MHz Memory clock
> 
> 
> 
> 
> 
> 
> 
> Anything more on the Core clock causes bluescreen. Even if I increase VDDC to 1240 (1.40 according to HWMonitor), it still causes bluescreen if I take Core clock at 1160MHz. Should I increase the VDDC even more to get higher core clocks?
> 
> Btw, does overclocking a GPU and VDDC cause a increase of heat on the North Bridge? Because when I had the CPU on 4.31GHz and a voltage of 1.4, I almost burn my finger if I touch it more than 3 seconds.


What program are you using to change the VDDC with that card?


----------



## Alanthor

I use Sapphire TriXX to change the VDDC, and CCC to change the clocks (they are synced). I put my CPU back down to 4.12GHz because the PC didn't boot when I took it to 4.5GHz; it showed no display, and when it did, it got stuck on "Entering setup" when I tried to get into the BIOS. The NB heatsink was really, really hot.

I left it off for 10 minutes, and then I was able to get into the BIOS and decrease the CPU FSB and voltage.

The GPU is now stable, but I want to OC it more. Does increasing the GPU's VDDC increase the heat on the northbridge? Because I don't want that.

I have an MSI 970A-G43 as my mobo and an AMD FX-4300 Black Edition


----------



## Recr3ational

Quote:


> Originally Posted by *Alanthor*
> 
> I use Sapphire TrixXx to change the VDDC, and CCC to change the clocks. (They are synced). I putted my CPU down to 4.12GHz again, cuz the pc didnt boot when I got it to 4.5... It didnt show any display, and when it did, it got stucked on "Entering setup" when i tryed to get into BIOS. The NB heatsink was really really hot.
> 
> I had it off for 10 minutes, and then I was able to get into the BIOS and decrease the CPU FSB and voltage.
> 
> The GPU is now stable, but I wanna OC it more. But does increasing the GPU's VDDC cause a heat increase on the northbridge? Because I don't want that.
> 
> My motherboard is an MSI 970A-G43, with an AMD FX-4300 Black Edition.


I don't want to be that guy, but that board isn't very good for overclocking. I had it myself in my second rig. That's the reason your CPU isn't stable at 4.5. Look into a 990FX board; they're reasonably cheap now. Also, if you can, try to OC with just the multiplier and voltage. I had issues messing with the FSB on that board.

As for VDDC raising northbridge temps? I haven't got a definite answer on that, though in my experience I've never had an issue with the northbridge temps going up.


----------



## Alanthor

Ok. I'll buy a new motherboard tomorrow, so I can overclock more!









But OnT.
As stated, my graphics card is only stable at 1140MHz core clock and 1400MHz memory clock. If I increase the VDDC on the graphics card, will it allow me to overclock it more?

I'm buying a Sabertooth 990FX R2.0 tomorrow


----------



## Recr3ational

Quote:


> Originally Posted by *Alanthor*
> 
> Ok. I'll buy a new motherboard tomorrow, so I can overclock more!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But OnT.
> As stated, my graphics card is only stable at 1140MHz core clock and 1400MHz memory clock. If I increase the VDDC on the graphics card, will it allow me to overclock it more?
> 
> I'm buying a Sabertooth 990FX R2.0 tomorrow


Good choice, solid board.
Yes, if you INCREASE VDDC, it might be stable. Sometimes it's not really worth it, though, as it increases temps, so just be careful. Also, "stable" is used loosely around here: 100 hours stable in benchmarks doesn't mean it's still going to work in games. Just increase the clock until you're happy, and raise the voltage slowly. Play some games or whatever; if it crashes, increase it more.
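That trial-and-error routine can be sketched as a simple loop. Purely illustrative: `is_stable` stands in for actually playing games or benchmarking at each setting, and the step sizes and voltage ceiling below are made-up placeholder numbers, not recommendations for any specific card.

```python
# Sketch of the "raise the clock, bump voltage only when it crashes" routine.
# is_stable(clock, vddc) is a placeholder for real-world testing; the limits
# here are illustrative values, not advice for any particular card.

MAX_VDDC_MV = 1250      # hypothetical safety ceiling in millivolts
CLOCK_STEP_MHZ = 10     # raise the core clock in small steps
VDDC_STEP_MV = 6        # raise voltage in small steps only when needed

def find_stable_oc(start_clock, start_vddc, is_stable):
    clock, vddc = start_clock, start_vddc
    while True:
        candidate = clock + CLOCK_STEP_MHZ
        if is_stable(candidate, vddc):
            clock = candidate               # higher clock held at this voltage
        elif vddc + VDDC_STEP_MV <= MAX_VDDC_MV:
            vddc += VDDC_STEP_MV            # crashed: give it a bit more VDDC
        else:
            return clock, vddc              # hit the voltage ceiling, stop

# Toy stand-in for testing: pretend the card needs ~6 mV per 10 MHz past 1150.
demo = lambda clk, mv: clk <= 1150 + (mv - 1200) * 10 // 6
print(find_stable_oc(1150, 1200, demo))
```

The point of the loop matches the advice above: the clock only moves up while the current voltage holds, and voltage only moves up when a crash forces it.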


----------



## uroshnish

Add me to R9 270X owners







I got Sapphire Vapor-X R9 270X
Thanks in advance


----------



## Devildog83

Quote:


> Originally Posted by *Alanthor*
> 
> Ok. I'll buy a new motherboard tomorrow, so I can overclock more!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But OnT.
> As stated, My graphic card is only stable at 1140MHz core clock and 1400MHz Mem clock. If I increase the VDDC on tjhe graphic card, will it allow me to overclock it more?
> 
> Im buying Sabertooth 990FX R2.0 tomorrow


That's a great choice; the Vishera chips don't like the 970 chipset too much. If you need help overclocking it, let me know. I have the CHVFZ, and the BIOS and overclocking are about the same.

By the by, I heard that using CCC and another overclocking tool together can cause instability due to system info conflicts. Just a thought.


----------



## Devildog83

Quote:


> Originally Posted by *uroshnish*
> 
> Add me to R9 270X owners
> 
> 
> 
> 
> 
> 
> 
> I got Sapphire Vapor-X R9 270X
> Thanks in advance


You have been added, welcome!!! How about some clocks when you get done playing with it?


----------



## M1kuTheAwesome

Has anyone tried using paste instead of pads under the VRM heatsink? I'm not 100% sure if mine has pads or not, but I assume so. Could I get better VRM temps by replacing the pads with MX-4? Last night while playing Far Cry 3 my VRM temps went up to 85C at some point. Not dangerous, I believe, but I'd like it lower.


----------



## lightsout

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Has anyone tried using paste instead of pads under the VRM heatsink? I'm not 100% sure if mine has pads or not, but I assume so. Could I get better VRM temps by replacing the pads with MX-4? Last night while playing Far Cry 3 my VRM temps went up to 85C at some point. Not dangerous, I believe, but I'd like it lower.


I would be careful; if you do it you could have quite a mess on your hands, since the VRMs are so small the paste might smear everywhere.


----------



## amalinkin

Hello everyone!
I've just recently bought an XFX R9-280X-TDBD card, version 2.2

And I was really surprised, because the card sets the voltage to 1.275 in games at a frequency of 1080MHz (per MSI AB monitoring)!

But in games like Crysis 3 and Battlefield 4, within an hour of gameplay I got a couple of crashes, with messages like "Application Crysis3.exe has stopped working".

Could this be happening because the frequency and voltage are very high?
Thank you.


----------



## Tobe404

Add me to the list please...

Gigabyte R9 280x Rev 2

Modified BIOS: 1.095v
Core Clock: 1085
Memory Clock: 1485

Stock is 1.163v / 1100 / 1500, but I'd rather lower the voltage for slightly lower clocks any day.
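As a rough illustration of why that trade tends to pay off: dynamic power in a chip scales roughly with V²·f, so a small voltage drop buys a disproportionate power and heat reduction. A back-of-the-envelope sketch using the numbers above; this first-order proportionality ignores leakage and memory power, so treat the result as a ballpark ratio, not a measured figure.

```python
# First-order estimate: dynamic power ~ C * V^2 * f. Comparing the stock
# and undervolted settings from the post above. Leakage and VRAM power are
# ignored, so this is only a rough ratio.

def relative_power(v_new, f_new, v_old, f_old):
    """Return new dynamic power as a fraction of the old."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

stock = (1.163, 1100)      # volts, core MHz
undervolt = (1.095, 1085)

ratio = relative_power(*undervolt, *stock)
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power")
```

The voltage term is squared, which is why dropping ~6% of voltage saves noticeably more power than dropping ~1% of clock costs in performance.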


----------



## microkid21

Has anyone here tried to update the BIOS on their PowerColor R9 270X Devil? What was the result? Why did you update? I'm having a problem with my card and I just want to know if updating the BIOS can solve it. Thanks in advance!


----------



## Recr3ational

Quote:


> Originally Posted by *Tobe404*
> 
> Add me to the list please...
> 
> Gigabyte R9 280x Rev 2
> 
> Modified BIOS: 1.095v
> Core Clock: 1085
> Memory Clock: 1485
> 
> Stock is 1.163v / 1100 / 1500, but I'd rather lower the voltage for slightly lower clocks any day.


Would you be willing to mod my BIOS?


----------



## Unknownm

Okay, I have a feeling that flashing my Gigabyte GV-R928XOC-3GD from the rev 1 BIOS to the rev 2 BIOS to unlock 1.3V doesn't actually give me 1.3V. The rev 1 BIOS only allows 1.256V, while rev 2 allows 1.3V.
At 1.256V I get an 1180MHz OC (1100 stock), but when applying 1.3V I don't gain any overclock at all; it still artifacts around 1190MHz.


----------



## M1kuTheAwesome

How could I measure the power consumption of my card? And if I want to undervolt the card to reduce power consumption, what do I do with the power limiter? Just bored and thought I might give it a try.
Cheers.


----------



## Devildog83

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> How could I measure the power consumption of my card? And if I want to undervolt the card to reduce power consumption, what do I do with the power limiter? Just bored and thought I might give it a try.
> Cheers.


Do you have HWiNFO64? It should tell you what power your card is gobbling up.


----------



## Devildog83

Quote:


> Originally Posted by *microkid21*
> 
> Has anyone here tried to update the BIOS on their PowerColor R9 270X Devil? What was the result? Why did you update? I'm having a problem with my card and I just want to know if updating the BIOS can solve it. Thanks in advance!


I have not modded the BIOS on mine. What kind of issue do you have?


----------



## Devildog83

*Tobe404* & *amalinkin* have been added, Welcome !!


----------



## Tobe404

Quote:


> Originally Posted by *Recr3ational*
> 
> Do you want to mod my bios?


I didn't modify my own; I just flashed a BIOS that someone else had already modified.
I can try to find the site I got the modified BIOS from if you like...
Or I could just send you the modified BIOS file? Up to you.

I should add that I used TriXX to downclock/undervolt my card.
Even with a modified BIOS, Afterburner still wouldn't allow voltage adjustment.
Using TriXX without the modified BIOS supposedly allowed me to change the voltage, but it always reverted back to the original 1.163v once a game was fired up. Hence the need for a modified BIOS.


----------



## Recr3ational

Quote:


> Originally Posted by *Tobe404*
> 
> I didn't modify my own; I just flashed a BIOS that someone else had already modified.
> I can try to find the site I got the modified BIOS from if you like...
> Or I could just send you the modified BIOS file? Up to you.
> 
> I should add that I used TriXX to downclock/undervolt my card.
> Even with a modified BIOS, Afterburner still wouldn't allow voltage adjustment.
> Using TriXX without the modified BIOS supposedly allowed me to change the voltage, but it always reverted back to the original 1.163v once a game was fired up. Hence the need for a modified BIOS.


Hmm, I wanted a higher voltage wall. On the Turbo Duo, the OC ability surpasses the voltage limit.


----------



## Praston

New member here with Asus R9 270X TOP


----------



## Alanthor

Weeh! Sabertooth 990FX AMD3+ platform, inside my rig right now









Overclocking god! Here I come


----------



## Recr3ational

Quote:


> Originally Posted by *Alanthor*
> 
> Weeh! Sabertooth 990FX AMD3+ platform, inside my rig right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Overclocking god! Here I come


Good man. You like it? You're welcome btw.








Try an OC on it. One at a time, remember: CPU first, then GPU.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Devildog83*
> 
> Do you have HWinfo64, this should tell you what power your card is gobbling up.


How did I miss that? It's been staring me in the face for hours.


----------



## Tobiman

Quote:


> Originally Posted by *amalinkin*
> 
> Hello everyone!
> I've just recently bought a XFX R9-280X-TDBD card version 2.2
> 
> And I was surprised a lot, becouse card sets voltage 1.275 in games, frequency is 1080MHz! MSI AB monitoring
> 
> But, in the games like Crysis 3, Battlefield 4 - in 1 hour gameplay I got a couple of crashes! With the messages like "Application Crysis3.exe has stopped working".
> 
> Can it happen because the frequency and voltage are very high?
> Thank you.


I doubt it. A crash as a result of setting core frequency too high would be more noticeable in the form of artifacts or complete system freeze. To be sure that this isn't the case, I'd compare your core voltage to what others with the same card are getting.


----------



## GuestVeea

Does anyone know when Catalyst 14.4 is to be released, and if it will have BF4/Mantle optimizations for a 280X?


----------



## Tobiman

Quote:


> Originally Posted by *Recr3ational*
> 
> Hmm, I wanted a higher voltage wall. On the Turbo Duo, the OC ability surpasses the voltage limit.


TechPowerUp should have a ton of BIOSes for 280Xs. Find out which one unlocks voltage and flash it.


----------



## Recr3ational

Quote:


> Originally Posted by *Tobiman*
> 
> TechPowerUp should have a ton of BIOSes for 280Xs. Find out which one unlocks voltage and flash it.


Mine is unlocked; I just want the limit higher, and I can't find one higher than mine.


----------



## Devildog83

Quote:


> Originally Posted by *Alanthor*
> 
> Weeh! Sabertooth 990FX AMD3+ platform, inside my rig right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Overclocking god! Here I come


Yeah baby!! The Sabertooth is, by all accounts, a strong overclocker and a beast of a board.

Edit: I hope no one beats their board.


----------



## Devildog83

Quote:


> Originally Posted by *Praston*
> 
> New member here with Asus R9 270X TOP


You are now, and if you were to post the clocks you're running, you'd be a club insider. OK, just kidding, but clocks to add by your name would be cool.


----------



## Praston

Default setup 1120 / 5600


----------



## amalinkin

Quote:


> Originally Posted by *Tobiman*
> 
> I doubt it. A crash as a result of setting core frequency too high would be more noticeable in the form of artifacts or complete system freeze. To be sure that this isn't the case, I'd compare your core voltage to what others with the same card are getting.


As I figured out, the problem isn't card stability.
The card works well at the highest core and memory frequencies.
The card is still stable at [email protected] MEM.
At 1180, Windows crashes after 5 minutes of gameplay.

But I've got a question!
Does anyone know how to change the voltage at max load?
I've tried to change it in MSI AB, and nothing happened!

It only changes the voltage for the 800MHz state.
Thank you!


----------



## JaredLaskey82

Well I think I have my rig to where I want it (for now).


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *JaredLaskey82*
> 
> Well I think I have my rig to where I want it (for now).


Wow. That looks amazing.








I've had a soft spot for white cases and white-themed builds for a while now. At one point I even thought about spraying my own case white, but it isn't the best looker anyway, so I might as well wait until I start a new build someday in a few years


----------



## rdr09

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Wow. That looks amazing.
> 
> 
> 
> 
> 
> 
> 
> 
> I've had a soft spot for white cases and white-themed builds for a while now. At one point I even thought about spraying my own case white, but it isn't the best looker anyway, so I might as well wait until I start a new build someday in a few years


I agree, so well planned.


----------



## JaredLaskey82

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Wow. That looks amazing.
> 
> 
> 
> 
> 
> 
> 
> 
> I've had a soft spot for white cases and white-themed builds for a while now. At one point I even thought about spraying my own case white, but it isn't the best looker anyway, so I might as well wait until I start a new build someday in a few years


NZXT has the new H440 mid tower case with a built-in PSU cover. I think they look good, but it's a bit small for me, as I live in a warm country and need good airflow.


----------



## mAs81

Very nice rig JaredLaskey, kudos








Don't forget to put it in your signature!!


----------



## Devildog83

*Praston* clocks added

*JaredLaskey82* Very sweet build!! The only thing I would do is find full backplates for the GPUs. I just hate the look of the PCBs. I guess I am just OCD about it.

*Everyone* If you like white cases, take a look at the Corsair 760T. So nice looking, so much room, and it can really show off your build. I like the H440 too, but radiator space is limited, and I am going full tower soon just for that reason, to go to a full loop.


----------



## Recr3ational

The 760T looks awesome apart from the front. I'm waiting for the 600T v2.

Also, about the backplates: you could always make a custom one for like £5.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> The 760T looks awesome apart from the front. I'm waiting for the 600T v2.
> 
> Also, about the backplates: you could always make a custom one for like £5.


I don't mind the front at all; it's the top that has me worried. The plate on the top can't be used if you have any airflow that needs to go through it, and I am not fond of the way it looks off. I have already figured out how to solve that problem with some sweet 360mm grills. The front has clean lines and plenty of airflow. I never was a huge fan of the 600T; it's not horrible, but not my style. I have the C70, which I am getting bored of, and it's too small. I want something with more room than the C70 or 600T, so the 750D or 760T will be the choice I have to make.


----------



## Recr3ational

Yeah, the 600T is small as well, but if you're able to use a Dremel you could work around it, like with other cases. Personally I'd choose the 750D over the 760T.

1: it's just a decent, solid case and has a lot of loop options. Even more if you're willing to cut some of the case.

2: it's a bit older so you might find it cheap somewhere.

3: Dat window.


----------



## mikemykeMB

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah, the 600T is small as well, but if you're able to use a Dremel you could work around it, like with other cases. Personally I'd choose the 750D over the 760T.
> 
> 1: it's just a decent, solid case and has a lot of loop options. Even more if you're willing to cut some of the case.
> 
> 2: it's a bit older so you might find it cheap somewhere.
> 
> 3: Dat window.


Hahahaa, "Dat" Dare Wind'r on the 760t has some endless modding tricks I think, U could paint over some, make holes for side fans-radiators, cut lines with mesh screen(s)...etc, but yeah @ the price and to chop at it seems cruel to an already up-scaled case.

What is this 600T v2 you speak of?


----------



## Recr3ational

Quote:


> Originally Posted by *mikemykeMB*
> 
> Hahahaa, "dat window" on the 760T has some endless modding tricks, I think. You could paint over some of it, make holes for side fans/radiators, cut lines with mesh screen(s), etc. But yeah, at that price, chopping at it seems cruel for an already up-scaled case.
> 
> What is this 600T v2 you speak of?


Corsair George released a silhouette of two cases, one mini-ITX and one full tower, over at the 600T club thread. He told us to wait until June/July for further details.

He didn't tell us the name, so we're just calling it the v2.


----------



## mikemykeMB

Quote:


> Originally Posted by *Recr3ational*
> 
> Corsair George released a silhouette of two cases, one mini-ITX and one full tower, over at the 600T club thread. He told us to wait until June/July for further details.
> 
> He didn't tell us the name, so we're just calling it the v2.


OK, cool, we'll just have to wait and see. Thanks!!
Meantime, I'll have to arrange a new setup with the 760T or this new v2... I kinda wanted the 750D before, but went with the sleek 600T. Now it has become a compulsive mood to make/build/mod one or the other.


----------



## Recr3ational

Quote:


> Originally Posted by *mikemykeMB*
> 
> OK, cool, we'll just have to wait and see. Thanks!!
> Meantime, I'll have to arrange a new setup with the 760T or this new v2... I kinda wanted the 750D before, but went with the sleek 600T. Now it has become a compulsive mood to make/build/mod one or the other.


The 600T is a sweet case man, I love mine


----------



## mikemykeMB

Quote:


> Originally Posted by *Recr3ational*
> 
> The 600T is a sweet case man, I love mine


True that, especially when a personal touch is made to them. I just had to paint the inside of a dark case frame; huge difference in ownership.
I've been thinking of using some acrylic to make up a backplate for the GPU, adding an LED or two on the edges, with some paint just to hide the PCB. Who knows?!


----------



## Recr3ational

Quote:


> Originally Posted by *mikemykeMB*
> 
> True that, especially when a personal touch is made to them. I just had to paint the inside of a dark case frame; huge difference in ownership.
> I've been thinking of using some acrylic to make up a backplate for the GPU, adding an LED or two on the edges, with some paint just to hide the PCB. Who knows?!


Like this?


Mine's without the LED, obviously.


----------



## mikemykeMB

Quote:


> Originally Posted by *Recr3ational*
> 
> Like this?
> 
> Mine's without the LED, obviously.


Yeah, something like what you've done, but with some small lines cut in a fashion to allow some air. Gonna have to use a friend's table saw!!


----------



## Roboyto

Just wondering if anyone can give any insight on the VisionTek R9 280X?



http://www.newegg.com/Product/Product.aspx?Item=14-129-280&turntoflow=activity&turntoxauthtoken=MTMzMDEzMToxNDI4Mjc4MjE4Nzg5OmI4Nzg2NjFkYmMxMGU4YWVmM2I4ZjAzMGIxOWExYjUx&turntosku=N82E16814129280#turntodone

The price is pretty killer considering the warranty and included backplate, but I am having trouble finding any solid info on the card. The cryptocurrency area of Reddit doesn't have much nice to say about their cooling ability, but I'm not concerned with running the card 24/7 at 100% load. The few Newegg reviews that there are don't fare well either, stating DOA in the last few months.


----------



## Alanthor

Quote:


> Originally Posted by *Recr3ational*
> 
> Good man. You like it? You're welcome btw.
> 
> 
> 
> 
> 
> 
> 
> 
> Try and OC on it. One at a time remember. CPU first then GPU.


It's installed and going!







4.3GHz stable atm







But I don't like that the CPU FSB OC overclocks my RAM as well...

And I don't like that if I go above a 1120MHz core clock on my GPU (1020MHz stock), I get a bluescreen. My VDDC is 1.240. Will increasing the VDDC let me OC more? I use Sapphire TriXX.

specs:
Sabertooth 990FX R2.0
AMD FX-4300 Black edition
Sapphire Radeon R9 270X 2GB GDDR5. (DUAL-X, OC Edition)


----------



## Recr3ational

Quote:


> Originally Posted by *Alanthor*
> 
> It's installed and going!
> 
> 
> 
> 
> 
> 
> 
> 4.3GHz stable atm
> 
> 
> 
> 
> 
> 
> 
> But I don't like that the CPU FSB OC overclocks my RAM as well...
> 
> And I don't like that if I go above a 1120MHz core clock on my GPU (1020MHz stock), I get a bluescreen. My VDDC is 1.240. Will increasing the VDDC let me OC more? I use Sapphire TriXX.
> 
> specs:
> Sabertooth 990FX R2.0
> AMD FX-4300 Black edition
> Sapphire Radeon R9 270X 2GB GDDR5. (DUAL-X, OC Edition)


Just leave the bus speed alone, mate. Just run the multiplier and voltage.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah the 600T is small as well, but if able to use a dremel you could work around it. Like other cases. Personally I'll choose the 750D over the 760T.
> 
> 1: it's just a decent, solid case and has a lot of loop options. Even more if you're willing to cut some of the case.
> 
> 2: it's a bit older so you might find it cheap somewhere.
> 
> 3: Dat window.


Yeah, the 760T and 750D are the same inside, and the lines are sleek and clean on the 750D. It's also cheaper, at about $160. The front is a bit more restricted, but with intake at the bottom you should still have enough airflow into the case with some good SP fans. Microcenter has it right now for $140 plus a $10 rebate.


----------



## Devildog83

Quote:


> Originally Posted by *Alanthor*
> 
> It's installed and going!
> 
> 
> 
> 
> 
> 
> 
> 4.3GHz stable atm
> 
> 
> 
> 
> 
> 
> 
> But I don't like that the CPU FSB OC overclocks my RAM as well...
> 
> And I don't like that if I go above a 1120MHz core clock on my GPU (1020MHz stock), I get a bluescreen. My VDDC is 1.240. Will increasing the VDDC let me OC more? I use Sapphire TriXX.
> 
> specs:
> Sabertooth 990FX R2.0
> AMD FX-4300 Black edition
> Sapphire Radeon R9 270X 2GB GDDR5. (DUAL-X, OC Edition)


If you overclock using the FSB, you have to set the RAM, HT link, and CPU/NB frequencies again, because all of these will change. Just set the FSB and then reset your other values afterwards.
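The reason everything drifts is that on these boards every major frequency is a multiple of the reference (FSB) clock. A quick sketch; the multiplier and divider values below are typical illustrative examples, not settings for any particular board.

```python
# On AM3+ boards every major frequency is derived from the reference (FSB)
# clock, which is why raising it drags the RAM, HT link, and CPU/NB along.
# The ratios below are illustrative examples only.

def derived_clocks(fsb_mhz, cpu_mult, ram_mult, ht_mult, nb_mult):
    return {
        "cpu_mhz": fsb_mhz * cpu_mult,
        "ram_mts": fsb_mhz * ram_mult,     # effective DDR3 rate, MT/s
        "ht_mhz": fsb_mhz * ht_mult,
        "cpu_nb_mhz": fsb_mhz * nb_mult,
    }

# Stock 200 MHz reference vs. a 10% FSB overclock with unchanged ratios:
print(derived_clocks(200, 19, 8, 13, 11))   # e.g. 3800 MHz CPU, DDR3-1600
print(derived_clocks(220, 19, 8, 13, 11))   # everything rises ~10%
```

So after raising the FSB you drop the RAM divider (and, if needed, the HT and CPU/NB ratios) to bring those back near spec, which is exactly the "reset your other values afterwards" step.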


----------



## microkid21

Quote:


> Originally Posted by *Devildog83*
> 
> I have not modded the bios on mine, what kind of issue do you have?


BSODs, crashes to desktop with the "Display driver stopped responding and has recovered" message, and game freezes where the audio still continues. I've used different drivers, from the official one up to the latest beta. My method of installing a new driver is to first remove the previous one using DDU or the AMD cleanup utility, or by freshly installing the OS.


----------



## Devildog83

Quote:


> Originally Posted by *microkid21*
> 
> BSODs, crashes to desktop with the "Display driver stopped responding and has recovered" message, and game freezes where the audio still continues. I've used different drivers, from the official one up to the latest beta. My method of installing a new driver is to first remove the previous one using DDU or the AMD cleanup utility, or by freshly installing the OS.


Ok, what clocks are you trying to run? By itself I can run mine at 1235/1575 without issue, but I run everyday clocks of 1200/1400 with my 7870, and when I feel like ramping it up I go to 1250/1450. I have run them together as high as 1260/1475, but that's pushing the crossfire setup, because the 7870 has voltage adjustment up to 1.3 in AB on the core and the 270X does not. I can push the 270X way further on the memory, though. I keep the 7870 on top because it allows me to stretch the core clocks out a bit more with less stress on the 270X, and I keep the memory clocks down because there is not much performance increase past 1475 anyway.


----------



## heroxoot

I think MSI is about to send me back the defective 280X I sent in. The RMA now shows NTF, which seems to mean No Trouble Found. I've got a bad feeling.


----------



## microkid21

Quote:


> Originally Posted by *Devildog83*
> 
> Ok, what clocks are you trying to run? By itself I can run mine at 1235/1575 without issue, but I run everyday clocks of 1200/1400 with my 7870, and when I feel like ramping it up I go to 1250/1450. I have run them together as high as 1260/1475, but that's pushing the crossfire setup, because the 7870 has voltage adjustment up to 1.3 in AB on the core and the 270X does not. I can push the 270X way further on the memory, though. I keep the 7870 on top because it allows me to stretch the core clocks out a bit more with less stress on the 270X, and I keep the memory clocks down because there is not much performance increase past 1475 anyway.


The stock clocks of the R9 270X Devil, which are 1180/1400. I didn't overclock my card. I'm playing games like COD: Ghosts, BF4, and Skyrim.


----------



## Devildog83

Quote:


> Originally Posted by *microkid21*
> 
> The stock clocks of the R9 270X Devil, which are 1180/1400. I didn't overclock my card. I'm playing games like COD: Ghosts, BF4, and Skyrim.


Are you using Afterburner, and if you are, is graphics Overdrive disabled in CCC? PowerColor's clocking tool is not really that good; I only use Afterburner and turn off AMD graphics Overdrive, because they conflict.


----------



## microkid21

Quote:


> Originally Posted by *Devildog83*
> 
> Are you using Afterburner, and if you are, is graphics Overdrive disabled in CCC? PowerColor's clocking tool is not really that good; I only use Afterburner and turn off AMD graphics Overdrive, because they conflict.


If I use Afterburner, then what will happen? Do I need to overclock to get rid of the BSOD?


----------



## mAs81

Hey guys, since we all like OC'ing and benchmarks, why not do it for a cause? I've seen some crazy numbers in this thread, so I'll just pop the question: if any of you would like to post your numbers, I'm sure you know that there's a 3D Fanboy Competition: nVidia vs AMD!! going on at OCN. So go there and help the Red Team out!!! Sorry for being







but the Green Team is currently kicking our butts..








Also, I just want to say that nobody put me up to this; I just thought it'd be cool to beat those Green Fanboys once and for all


----------



## eAT5

Quote:


> Originally Posted by *microkid21*
> 
> If I use Afterburner, then what will happen? Do I need to overclock to get rid off the BSOD?


BSOD? Don't ruin your card, especially if it's already OC'ed.

Quote:


> Powercolor Devil HD R9 270X : Core 1223MHz, Memory 1592MHz
> 
> When I looked at the PowerColor Devil HD 7870 I found that the core really would stretch its legs once you started adding voltage to the mix. Unfortunately the only way to see any boost at all was by increasing the power limits to +20. Using that adjustment as a baseline I adjusted the fan speed to 100% to keep the components including the core, memory, and 7+1+1 phase power system cool to maximize the clock speed potential using the installed cooling solution. By using this method I was able to gain an additional 43Mhz over the rated boost clock speed of 1180Mhz. Kind of a disappointment when you see just what the architecture can do. However that disappointment in the core clock speed is tempered with the massive boost in memory clock speed that helps offset the performance deficit. A gain of 192Mhz in memory clock speed does help drive performance but not as much as core clock speed. Right from the factory Powercolor put a big tune on this card to allow it to perform at a level above the competition. By doing so they did not leave a lot of meat on the bone for the enthusiast. At least with the core clock speed. Once we get some voltage tuning ability on this one we should see the card spread its Devlish wings.


----------



## microkid21

OK, I managed to solve my issue, which had been giving me headaches for days.







And I just want to share for all those having an issue where their card keeps getting a blue screen while playing or pushing the card to its limit, with an error code of 0xa000001 or an atikmdag.sys-related issue, and also the "Display driver stopped responding and has recovered" message like mine. I brought my card to the shop where I purchased it and reported what was happening: I told them I was having BSODs while playing intense games such as COD: Ghosts, BF4, and Skyrim, using different drivers from 13.12 up to the latest beta. I was properly uninstalling the drivers using software like DDU. My PSU is a Corsair 650W, and my RAM sticks are in good shape. When they tested it, the BSOD did not occur while playing games. Then I asked them to run a sensor to monitor the temps of the card, and looking at it, the temperature was only 60-62C, which is way cooler than what I'm getting in my case. (It's summertime, so expect it to be hotter.) Maybe it was because the room where they conducted the testing was air conditioned. Usually the blue screen occurs within 15 to 30 minutes, but it did not. So my card is NTF (No Trouble Found).









After that, I installed the card in my system and increased the fan speed to 50%, because it is set to auto by default. After playing for an hour, the blue screen did not occur, and there was no "Display driver stopped responding and has recovered" popup message in the system tray either.









Bottom line: I think R9 cards, especially the 270X, will shut down or BSOD if the VRM and GPU temps reach 73C+ on the card's auto fan speed. So it's an overheating issue caused by the card's poor auto fan settings. You can fix it by increasing the fan speed of the card manually (mine is set to 50%, or adjust depending on how hot it gets), or by lowering your GPU and memory clocks, which I do not recommend, because if you're a gamer/enthusiast you want the best performance, and I will not underclock my card. By the way, my card is the PowerColor R9 270X Devil. I don't know if this will work for you, but just give it a try and show the results.
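This fix amounts to replacing the quiet auto fan profile with a more aggressive one. The kind of custom fan curve you would set in Afterburner or TriXX is just linear interpolation between temperature/duty points; a sketch with made-up points (not the card's actual stock profile):

```python
# Linear interpolation over (temp_C, fan_%) points, the same idea as a
# custom fan curve in Afterburner or TriXX. These points are illustrative,
# chosen to ramp hard before the ~73 C trouble zone described above.

CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]

def fan_duty(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # interpolate between the two surrounding curve points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]   # pinned to 100% past the last point

print(fan_duty(60))   # 50% at 60 C, well above a quiet auto profile
print(fan_duty(70))   # 70% already, before temps reach the crash zone
```

A fixed 50% setting like the one above works too; the curve version just keeps the card quieter at idle while still spinning up before the VRMs get near their problem temperature.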


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> OK, I managed to solve my issue, which had been giving me headaches for days.
> 
> 
> 
> 
> 
> 
> 
> And I just want to share for all those having an issue where their card keeps getting a blue screen while playing or pushing the card to its limit, with an error code of 0xa000001 or an atikmdag.sys-related issue, and also the "Display driver stopped responding and has recovered" message like mine. I brought my card to the shop where I purchased it and reported what was happening: I told them I was having BSODs while playing intense games such as COD: Ghosts, BF4, and Skyrim, using different drivers from 13.12 up to the latest beta. I was properly uninstalling the drivers using software like DDU. My PSU is a Corsair 650W, and my RAM sticks are in good shape. When they tested it, the BSOD did not occur while playing games. Then I asked them to run a sensor to monitor the temps of the card, and looking at it, the temperature was only 60-62C, which is way cooler than what I'm getting in my case. (It's summertime, so expect it to be hotter.) Maybe it was because the room where they conducted the testing was air conditioned. Usually the blue screen occurs within 15 to 30 minutes, but it did not. So my card is NTF (No Trouble Found).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After that, I installed the card in my system and increased the fan speed to 50%, because it is set to auto by default. After playing for an hour, the blue screen did not occur, and there was no "Display driver stopped responding and has recovered" popup message in the system tray either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bottom line: I think R9 cards, especially the 270X, will shut down or BSOD if the VRM and GPU temps reach 73C+ on the card's auto fan speed. So it's an overheating issue caused by the card's poor auto fan settings. You can fix it by increasing the fan speed of the card manually (mine is set to 50%, or adjust depending on how hot it gets), or by lowering your GPU and memory clocks, which I do not recommend, because if you're a gamer/enthusiast you want the best performance, and I will not underclock my card. By the way, my card is the PowerColor R9 270X Devil. I don't know if this will work for you, but just give it a try and show the results.


The Devil 270X has a pretty high overclock out of the box, boosting to 1180 on the core. If you are trying to push it further than that, you will need to add voltage with Afterburner, Trixx, or something of the like. I had a watercooled ASUS DC2 7870 that I was able to push to around 1225 on the core; after that I started having problems. I read a number of reviews on the Devil 270X, and there isn't much room for that card to go past 1180 on the core since it is already well past GHz Edition specs. I just bought one and am waiting for it to come in the mail for my wife.










The core and VRMs shouldn't have a problem around 70°C; if they were reaching 90°C+, then that would likely be cause for concern.

That message you're getting could be due to temperatures or to unstable overclocking. I see the pic of your rig has the side panel off the case; it may help to use side-panel fans to supply fresh air to the card instead of forcing a constant fan speed on it.


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> The Devil 270X has a pretty high overclock out of the box, boosting to 1180 on the core. If you are trying to push it further than that, you will need to add voltage with Afterburner, Trixx, or something of the like. I had a watercooled ASUS DC2 7870 that I was able to push to around 1225 on the core; after that I started having problems. I read a number of reviews on the Devil 270X, and there isn't much room for that card to go past 1180 on the core since it is already well past GHz Edition specs. I just bought one and am waiting for it to come in the mail for my wife.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The core and VRMs shouldn't have a problem around 70°C; if they were reaching 90°C+, then that would likely be cause for concern.
> 
> That message you're getting could be due to temperatures or to unstable overclocking. I see the pic of your rig has the side panel off the case; it may help to use side-panel fans to supply fresh air to the card instead of forcing a constant fan speed on it.


Again, I'm not trying to overclock my card and have no intention to, since it is overclocked out of the box and I don't want to shorten its lifespan. I'm getting BSODs with every setting on the card at default/auto. I don't know whether the core or VRM is OK at 73°C+, which is what mine reaches while running games; within 15-30 minutes the blue screen occurs. So instead of leaving the fan on auto (while playing games it only spins at 30-40%, and the VRM and core hit 73°C+ in my case), I set it manually to 50% just to dissipate the heat, and the result is good: the game did not crash, no BSOD at all, and I managed to play for an hour with no issues. I will keep these settings, play longer, and post the results. The picture you saw is my old system; I've made a lot of changes since. As for the side panel being off, that was for picture purposes only; the side panel of the Thor V2 is not windowed, so you can't see inside clearly.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> Again, I'm not trying to overclock my card and have no intention to, since it is overclocked out of the box and I don't want to shorten its lifespan. I'm getting BSODs with every setting on the card at default/auto. I don't know whether the core or VRM is OK at 73°C+, which is what mine reaches while running games; within 15-30 minutes the blue screen occurs. So instead of leaving the fan on auto (while playing games it only spins at 30-40%, and the VRM and core hit 73°C+ in my case), I set it manually to 50% just to dissipate the heat, and the result is good: the game did not crash, no BSOD at all, and I managed to play for an hour with no issues. I will keep these settings, play longer, and post the results. The picture you saw is my old system; I've made a lot of changes since. As for the side panel being off, that was for picture purposes only; the side panel of the Thor V2 is not windowed, so you can't see inside clearly.


The card shouldn't have issues in the 70°C range though. AMD tech support told me about core temperature when I contacted them regarding CrossFire for 6790s a ways back: the core should be able to operate near 100°C without faltering, and most VRMs are rated to function over 100°C. Not that it's a good idea to run either that hot, but they can. It's good that forcing a constant fan speed has fixed the problem, but which VRM is reaching 73°C+? Is it the primary VRM1 for the core, or the secondary VRM2 for the RAM? If the secondary VRM is reaching 73°C+, then the primary is likely much hotter than that, which could be the source of your problem.



Powercolor has appropriate sinks attached to the VRMs, and you can see the high quality chokes there as well.


----------



## microkid21

I don't know, but after I changed my fan settings to manual my problems went away. My problem is this, and I already posted it here before:




That is what I was getting before. Can you tell me what causes this? If you can, and we manage to find the solution, then I might agree that the card is not overheating.
Here is the list of what I've done so far that did not sort things out:

1. Installed drivers 13.12, 13.10, 14.1, 14.2, and 14.3 (uninstalling the previous driver with DDU), then cleaned the registry with CCleaner
2. Followed AMD's instructions for atikmdag.sys-related problems: http://support.amd.com/en-us/kb-articles/Pages/737-27116RadeonSeries-ATIKMDAGhasstoppedrespondingerrormessages.aspx
3. Followed instructions from other forums with the same problem as mine: http://forums.guru3d.com/showthread.php?t=363233
4. Reformatted the PC / fresh OS install
5. Updated the motherboard BIOS
6. Drivers and chipset are up to date
7. Ran Memtest on both RAM sticks (passed)
8. No bad sectors on my hard drives
9. PSU: Corsair CS650M 80+ Gold
10. Updated Windows


----------



## diggiddi

Run WhoCrashed; it should give you info on your BSOD. It looks like a driver issue.

http://www.tomshardware.com/answers/id-1690717/bsod-0xa0000001-error.html


----------



## microkid21

Quote:


> Originally Posted by *diggiddi*
> 
> Run WhoCrashed; it should give you info on your BSOD. It looks like a driver issue.
> 
> http://www.tomshardware.com/answers/id-1690717/bsod-0xa0000001-error.html


Like BlueScreenView, which is what I used. I already managed to solve this.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> I don't know, but after I changed my fan settings to manual my problems went away. My problem is this, and I already posted it here before:
> 
> That is what I was getting before. Can you tell me what causes this? If you can, and we manage to find the solution, then I might agree that the card is not overheating.
> Here is the list of what I've done so far that did not sort things out:
> 
> 7. Ran Memtest on both RAM sticks (passed)


You never gave an answer to my question about VRM temperature. Is it VRM1 or VRM2 at 73+? These cards can easily run at 70C on the core, if yours is having problems doing that then maybe there is an issue with the card. Use GPU-Z or HWInfo to monitor VRM temps.
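If you'd rather have a record than watch the sensors live, GPU-Z can also log its sensor readings to a comma-separated text file while you game, so you can see exactly how hot VRM1 got right before a crash. A throwaway Python sketch of pulling the peak out of such a log; the sample rows and the column header here are stand-ins, so check the first line of your own log for the exact names:

```python
import csv
import io

# Sample rows in the comma-separated layout of a GPU-Z sensor log.
# The header text varies by card and GPU-Z version, so treat
# "VRM Temperature [C]" as a placeholder for your log's actual column.
log_text = """Date, GPU Temperature [C], VRM Temperature [C]
2014-04-01 20:15:01, 61.0, 66.0
2014-04-01 20:15:02, 63.0, 71.0
2014-04-01 20:15:03, 62.0, 73.5
"""

def peak_column(text: str, column: str) -> float:
    """Return the maximum value recorded in one numeric column of the log."""
    reader = csv.DictReader(io.StringIO(text), skipinitialspace=True)
    return max(float(row[column]) for row in reader)

print(peak_column(log_text, "VRM Temperature [C]"))  # 73.5 in this sample
```

Point the same idea at the log file GPU-Z writes and you'll know whether that 73°C reading is really the worst the card sees.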

I looked at several reviews of the Devil 270X and none of them have the card breaching 68C under worse conditions than game play; some were Furmark or 3DMark Firestrike loops for 30 minutes. Either the card needs better airflow to it, or there could be something wrong with the way the HSF is mounted/seated.

I should be receiving my Devil tomorrow and I will have it to do a direct comparison with yours.

Do you have different RAM to try? Memtest isn't always a definitive test for bad RAM.


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> You never gave an answer to my question about VRM temperature. Is it VRM1 or VRM2 at 73+? These cards can easily run at 70C on the core, if yours is having problems doing that then maybe there is an issue with the card. Use GPU-Z or HWInfo to monitor VRM temps.
> 
> I looked at several reviews of the Devil 270X and none of them have the card breaching 68C under worse conditions than game play; some were Furmark or 3DMark Firestrike loops for 30 minutes. Either the card needs better airflow to it, or there could be something wrong with the way the HSF is mounted/seated.
> 
> I should be receiving my Devil tomorrow and I will have it to do a direct comparison with yours.
> 
> Do you have different RAM to try? Memtest isn't always a definitive test for bad RAM.


It's VRM1 that gets to 73°C. GPU-Z and HWiNFO are what I used to monitor my temps. Like I said in my previous post:
Quote:


> When they tested it, no BSOD occurred while playing games. I then asked them to run a sensor tool to monitor the card's temperatures, and it read only 60-62°C, which is way cooler than what I get in my case (it's summertime, so expect it to be hotter), probably because the room where they did the testing is air conditioned. The blue screen usually hits within 15-30 minutes, but it didn't. So my card came back NTF (No Trouble Found).


The card runs warmer in my case because my room is not air conditioned, and note that it's summertime, so expect it to be hotter. Of course the temperature varies depending on how you test and on the room temperature. There is nothing wrong with how the HSF is seated/mounted, because testing in another PC at the shop showed the card running much cooler in their air-conditioned room. In fact it is normal.

I don't have other RAM available; I just have the 2x4GB kit. But Memtest is more reliable than most; what do you prefer? I also tried Windows Memory Diagnostics and it found no problem with my RAM. I will set the fan to auto, install one stick, and post the result later. If I don't get a BSOD while testing one stick with the fan on auto, then maybe one of my sticks is bad.


----------



## eAT5

Quote:


> Originally Posted by *microkid21*
> 
> It's VRM1 that gets to 73°C. GPU-Z and HWiNFO are what I used to monitor my temps. Like I said in my previous post:
> The card runs warmer in my case because my room is not air conditioned, and note that it's summertime, so expect it to be hotter. Of course the temperature varies depending on how you test and on the room temperature. There is nothing wrong with how the HSF is seated/mounted, because testing in another PC at the shop showed the card running much cooler in their air-conditioned room. In fact it is normal.
> 
> I don't have other RAM available; I just have the 2x4GB kit. But Memtest is more reliable than most; what do you prefer? I also tried Windows Memory Diagnostics and it found no problem with my RAM. I will set the fan to auto, install one stick, and post the result later. If I don't get a BSOD while testing one stick with the fan on auto, then maybe one of my sticks is bad.


Mine starts to overheat if the room is hot. The AC makes all the difference: 7 case fans and 4 GPU fans, plus I chopped out sections of my case and removed the empty HDD bays so a fan blows directly onto the GPUs. Way better.


----------



## eAT5

I've got 2 PowerColor R9 280X Turbo Duos at 1040/1500.


----------



## E-mil

I have an MSI R9 270X Gaming.

I don't want to use Afterburner or other programs for custom fan profiles, because they crash my online games.
I mostly play BF2 BC Vietnam and Far Cry 3.

I just want to know: is it possible to edit and flash the BIOS myself, changing only the fan profile?

Can I do that with VBE7, and what about this warning:
"Modifications only affect the legacy BIOS, and after saving, UEFI image will be disabled (in case you have UEFI vBIOS)"

Please, can someone explain this legacy/UEFI BIOS thing to me?

Thanks


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> It's VRM1 that gets to 73°C. GPU-Z and HWiNFO are what I used to monitor my temps. Like I said in my previous post:
> The card runs warmer in my case because my room is not air conditioned, and note that it's summertime, so expect it to be hotter. Of course the temperature varies depending on how you test and on the room temperature. There is nothing wrong with how the HSF is seated/mounted, because testing in another PC at the shop showed the card running much cooler in their air-conditioned room. In fact it is normal.
> 
> I don't have other RAM available; I just have the 2x4GB kit. But Memtest is more reliable than most; what do you prefer? I also tried Windows Memory Diagnostics and it found no problem with my RAM. I will set the fan to auto, install one stick, and post the result later. If I don't get a BSOD while testing one stick with the fan on auto, then maybe one of my sticks is bad.


Not sure what else to say about your issue then, other than it doesn't make any sense from my personal experience. My Sapphire Vapor-X HD 4890 didn't exhibit issues in gaming until it was in the mid-to-upper 90s °C when overclocked and overvolted. I have had GCN cards easily exceed 70°C once they were overclocked and mining BTC, a far more strenuous affair than playing a game for a couple of hours. I was running a 7950 and a 7870 in a Cooler Master HAF XB, both air cooled.

I believe I have had this very same atikmdag.sys error before (it looks very familiar), and faulty RAM was the culprit. I tried everything you have done: fresh Windows with updates, all drivers numerous times, BIOS, checked PSU, hard drives, etc.

I don't put much stock in synthetic RAM tests, because they don't exactly replicate real-world use. Several times (4, if I recall) in my experience with strange, unexplainable, or hard-to-pinpoint issues, faulty RAM has been the culprit even after passing Memtest86 from DOS for 12+ hours of testing. The last time I ran into an issue like this, I actually purchased MemTest Pro from HCI Design and let it run for 12+ hours overnight, and the test came back clean; it wasn't until I swapped in different RAM that the issues ceased. This has happened with Crucial Ballistix, G.Skill Ripjaws X, G.Skill Ares, and Corsair Vengeance, all DDR3 of varying speeds.

Pull the RAM out and re-seat the DIMMs; a tiny piece of dust can ruin everything. Use one DIMM as you mentioned, and try different slots; the motherboard could be faulty as well.


----------



## mAs81

Quote:


> Originally Posted by *E-mil*
> 
> I have an MSI R9 270X Gaming.
> I don't want to use Afterburner or other programs for custom fan profiles, because they crash my online games.
> I mostly play BF2 BC Vietnam and Far Cry 3.
> I just want to know: is it possible to edit and flash the BIOS myself, changing only the fan profile?
> Can I do that with VBE7, and what about this warning:
> "Modifications only affect the legacy BIOS, and after saving, UEFI image will be disabled (in case you have UEFI vBIOS)"
> Please, can someone explain this legacy/UEFI BIOS thing to me?
> Thanks


Hi and welcome to OCN








Does CCC fan profiling also crash your games? That seems strange to me.
I believe that with VBE7 you can make a new fan profile for your GPU.
A nice tutorial on flashing your GPU BIOS can be found here.
UEFI (Unified Extensible Firmware Interface) is a standard firmware interface for PCs, designed to replace the BIOS (basic input/output system).
I believe the 270X Gaming supports both legacy and UEFI BIOS, meaning the card will work whether your motherboard has an older legacy BIOS or a newer graphical UEFI firmware.
_food for thought_


----------



## Devildog83

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *microkid21*
> 
> OK, I managed to solve my issue, which had been giving me headaches for days.
> 
> 
> 
> 
> 
> 
> 
> I just want to share this for everyone whose card keeps blue-screening while playing or being pushed to its limit, with bug check code 0xa0000001, an atikmdag.sys-related error, or the "Display driver stopped responding and has recovered" message like mine. I brought my card to the shop where I purchased it and reported what was happening: BSODs while playing intense games such as COD: Ghosts, BF4, and Skyrim, on different drivers from 13.12 up to the latest beta, each time properly uninstalling the old driver with DDU. My PSU is a Corsair 650W, and my RAM sticks are in good shape. When they tested it, no BSOD occurred while playing games. I then asked them to run a sensor tool to monitor the card's temperatures, and it read only 60-62°C, which is way cooler than what I get in my case (it's summertime, so expect it to be hotter), probably because the room where they did the testing is air conditioned. The blue screen usually hits within 15-30 minutes, but it didn't. So my card came back NTF (No Trouble Found).






After that I reinstalled the card in my system and raised the fan speed to 50%, since it is set to auto by default. After playing for an hour, no blue screen occurred and no "Display driver stopped responding and has recovered" popup appeared in the system tray.









Bottom line: I think R9 cards, especially the 270X, will shut down or BSOD if the VRM and GPU temps reach 73°C+ on the card's auto fan profile. It's an overheating issue caused by the card's conservative auto fan settings. You can work around it by raising the fan speed manually (mine is set to 50%; adjust for how hot yours runs), or by lowering your GPU and memory clocks, which I don't recommend: if you're a gamer/enthusiast you want the best performance, so I won't underclock my card. By the way, my card is a PowerColor R9 270X Devil. I don't know if this will work for you, but give it a try and post your results.









Yep, you should set a user fan profile that keeps the card cool. Mine is set to run at 60% above 50°C and 80% above 65°C. I can't hear the fans much at all when gaming, but I'm not paying close attention either.
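If it helps to picture it, a profile like that is just a stepped temperature-to-duty mapping. Here's a tiny Python sketch of the same idea; the 30% floor below 50°C is made up for illustration, and this isn't any vendor's fan API, just the shape of the curve:

```python
# Stepped fan profile: 60% duty above 50C, 80% above 65C,
# with an assumed quiet 30% floor below that.
def fan_duty(temp_c: float) -> int:
    """Map a GPU temperature in Celsius to a fan duty cycle in percent."""
    if temp_c >= 65:
        return 80
    if temp_c >= 50:
        return 60
    return 30

# Spot-check the curve at idle, gaming, and hot-room temperatures.
for t in (40, 55, 70):
    print(t, "C ->", fan_duty(t), "%")
```

Afterburner and Trixx let you draw the same kind of curve graphically; the point is just that duty jumps at the thresholds instead of waiting for the auto profile to catch up.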


----------



## Devildog83

*eAT5* Has been added! Welcome to the club.


----------



## M1kuTheAwesome

A quick MS Paint edit later, and here are my Valley Extreme HD (the Custom preset is just Extreme HD in windowed mode) and Heaven Extreme scores. The card is pushed as far as it goes (1219/1770); the CPU is at its thermal limit, so I can't clock it higher until I upgrade my cooling. To be honest, I have no idea whether these scores are good or bad, but from what I've seen they look okay; feel free to correct me though.

How much could I gain from overclocking my RAM? It's currently at stock 1600MHz, and I could do 2133 for suicide runs. One thing I've noticed is that an unstable CPU will bring my scores down even when clocked higher; could RAM have the same effect?
Feel free to post your scores with similar hardware setups; I welcome friendly competition.


----------



## MaNNe88

I have an XFX R9 280X Black Edition. The dilemma I'm having is that I have 2 DVI-D monitors and a TV that I want to hook up to the graphics card. Apparently, in order to add the TV to this setup I need an active DisplayPort-to-HDMI adapter, but the one I bought didn't work, so I guess it wasn't an "active" adapter. I'd be forever grateful if a kind soul here could link me to a working active adapter that doesn't cost an arm and a leg. Preferably from a shop in an EU country so I don't have to pay import taxes, but at the very least they should ship to Finland. Thanks in advance!


----------



## microkid21

OK, an update: I played COD: Ghosts a while ago with only 1 stick of RAM installed, in slot 2 of my motherboard. I did not adjust the fan speed, just left it on auto. Another crash/BSOD.


*0xa0000001 bug check code, atikmdag.sys.* I'm using 13.12 on a newly installed OS.

I will try the other stick and see what happens. Again, I'm leaving the GPU settings on auto.


----------



## microkid21

2nd attempt: RAM stick 2 in slot 2 of the motherboard. I played a little longer than on the first attempt; no BSOD, but the game crashed and the "Display driver stopped responding and has recovered" message appeared in the system tray. Again, no adjustments were made to the GPU settings, everything at auto/default, and I did not raise the fan speed either; I just left it on auto.

The 3rd attempt will be installing stick 1 in slot 1 (the slot nearest the processor), with no adjustments, just default GPU settings.


----------



## GuestVeea

Hello all. I recently did a new build, so I no longer have a 270X. I would like to upgrade to an XFX 280X Black Edition at 1150MHz core and 1600MHz memory.

Other Specs: Corsair Obsidian 750d
ASrock Fatal1ty 990FX Killer
AMD FX-8350 Black Edition @4.7GHz
XFX R9 280x Black Edition
8gb AMD Radeon Entertainment Series DDR3 Ram
Corsair H100i
Crucial M4 64gb SSD
WD Blue 500gb HDD

What do you guys think of this build? I would appreciate feedback. Thank you!


----------



## heroxoot

Quote:


> Originally Posted by *GuestVeea*
> 
> Hello all. I recently did a new build, so I no longer have a 270X. I would like to upgrade to an XFX 280X Black Edition at 1150MHz core and 1600MHz memory.
> 
> Other Specs: Corsair Obsidian 750d
> ASrock Fatal1ty 990FX Killer
> AMD FX-8350 Black Edition @4.7GHz
> XFX R9 280x Black Edition
> 8gb AMD Radeon Entertainment Series DDR3 Ram
> Corsair H100i
> Crucial M4 64gb SSD
> WD Blue 500gb HDD
> 
> What do you guys think of this build? I would appreciate feedback. Thank you!


Looks like a fine build and a damn decent overclock on the CPU. My only problem is that the hard drive is too small; I've got to say, at one point my 1TB was 700GB of games. Maybe you don't have that problem yet, but eventually it happens.


----------



## Devildog83

Quote:


> Originally Posted by *GuestVeea*
> 
> Hello all. I recently did a new build, so I no longer have a 270X. I would like to upgrade to an XFX 280X Black Edition at 1150MHz core and 1600MHz memory.
> 
> Other Specs: Corsair Obsidian 750d
> ASrock Fatal1ty 990FX Killer
> AMD FX-8350 Black Edition @4.7GHz
> XFX R9 280x Black Edition
> 8gb AMD Radeon Entertainment Series DDR3 Ram
> Corsair H100i
> Crucial M4 64gb SSD
> WD Blue 500gb HDD
> 
> What do you guys think of this build? I would appreciate feedback. Thank you!


Updated - Very Nice !!


----------



## Recr3ational

Quote:


> Originally Posted by *GuestVeea*
> 
> Hello all. I recently did a new build, so I no longer have a 270X. I would like to upgrade to an XFX 280X Black Edition at 1150MHz core and 1600MHz memory.
> 
> Other Specs: Corsair Obsidian 750d
> ASrock Fatal1ty 990FX Killer
> AMD FX-8350 Black Edition @4.7GHz
> XFX R9 280x Black Edition
> 8gb AMD Radeon Entertainment Series DDR3 Ram
> Corsair H100i
> Crucial M4 64gb SSD
> WD Blue 500gb HDD
> 
> What do you guys think of this build? I would appreciate feedback. Thank you!


Good rig. I enjoyed my time with the 8350. How cool does the CPU run? Because at 4.7-5.1GHz I had to put mine under a custom loop, as my H100 wasn't good enough.


----------



## smoke2

Please, can the owners of non-reference 280Xs from PowerColor, HIS, Club3D, and the new Sapphire Tri-X / Vapor-X post their VRM temps?
I have a poorly cooled case.
It would be very appreciated.


----------



## GuestVeea

Quote:


> Originally Posted by *Recr3ational*
> 
> Good rig. I enjoyed my time with the 8350. How cool does the CPU run? Because at 4.7-5.1GHz I had to put mine under a custom loop, as my H100 wasn't good enough.


It's doing just fine. While playing BF4 it hits 40-50°C. Right now, idling, it's at 18°C.


----------



## GuestVeea

Quote:


> Originally Posted by *heroxoot*
> 
> Looks like a fine build and a damn decent overclock on the cpu. My only problem is the hard drive is too small. I've gotta say at one point my 1tb was 700gb of games. Maybe you don't have that problem yet, but eventually it happens.


Ah, okay good plan. I will look into a bigger drive


----------



## Recr3ational

Quote:


> Originally Posted by *GuestVeea*
> 
> It's doing just fine. While playing BF4 it gets 40-50c. Right now idling it's at 18c.


Okay, that's good. Just be careful, dude. It's a deadly CPU lol.


----------



## GuestVeea

Quote:


> Originally Posted by *Recr3ational*
> 
> Okay, that's good. Just be careful, dude. It's a deadly CPU lol.


Alrighty I will. Thank you!


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> Pull the RAM out and re-seat the DIMMs; a tiny piece of dust can ruin everything. Use one DIMM as you mentioned, and try different slots; the motherboard could be faulty as well.




For the 3rd attempt I installed stick 1 in slot 1 (the slot nearest the processor), with no adjustments made, just default GPU settings.
Another BSOD!







So every RAM configuration I've tried so far has given me crashes/BSODs, even after re-seating the DIMMs and trying one stick at a time. What are your thoughts? Note this is my 3rd attempt now. Do you have any other idea what the culprit is, or should I just stick with the fix I found?


----------



## mikemykeMB

Quote:


> Originally Posted by *microkid21*
> 
> 
> 
> For the 3rd attempt I installed stick 1 in slot 1 (the slot nearest the processor), with no adjustments made, just default GPU settings.
> Another BSOD!
> 
> 
> 
> 
> 
> 
> 
> So every RAM configuration I've tried so far has given me crashes/BSODs, even after re-seating the DIMMs and trying one stick at a time. What are your thoughts? Note this is my 3rd attempt now. Do you have any other idea what the culprit is, or should I just stick with the fix I found?


Have you read through this?
http://support.amd.com/en-us/kb-articles/Pages/737-27116RadeonSeries-ATIKMDAGhasstoppedrespondingerrormessages.aspx

I found that a corrupt save file was causing the BSODs for me.


----------



## microkid21

Quote:


> Originally Posted by *mikemykeMB*
> 
> Have you read through this?
> http://support.amd.com/en-us/kb-articles/Pages/737-27116RadeonSeries-ATIKMDAGhasstoppedrespondingerrormessages.aspx
> 
> I found that a corrupt save file was causing the BSODs for me.


Yep, I've tried that; it did not sort things out. Almost every day that I've played the game, I've gotten a BSOD. It's not a corrupt save: I'm just selecting missions and trying to at least finish them, but this crash/BSOD keeps blocking my way.







I have found a solution for this, but Roboyto doesn't seem to agree that it is PowerColor's conservative fan profile causing my card to overheat. I know 70°C+ is still supposed to be safe, but you never know.


----------



## mikemykeMB

Quote:


> Originally Posted by *microkid21*
> 
> Yep, I've tried that; it did not sort things out. Almost every day that I've played the game, I've gotten a BSOD. It's not a corrupt save: I'm just selecting missions and trying to at least finish them, but this crash/BSOD keeps blocking my way.
> 
> 
> 
> 
> 
> 
> 
> I have found a solution for this, but Roboyto doesn't seem to agree that it is PowerColor's conservative fan profile causing my card to overheat. I know 70°C+ is still supposed to be safe, but you never know.


Ahh, so your solution worked: with 1 DIMM installed there was no BSOD, but still a crash. Seems odd... so a temp issue sets off the crash? Well, there are 2 problems then: when there is no BSOD, the game crashes with a possible driver error. I'm confused... sorry, shutting up.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> 
> 
> For the 3rd attempt I installed stick 1 in slot 1 (the slot nearest the processor), with no adjustments made, just default GPU settings.
> Another BSOD!
> 
> 
> 
> 
> 
> 
> 
> So every RAM configuration I've tried so far has given me crashes/BSODs, even after re-seating the DIMMs and trying one stick at a time. What are your thoughts? Note this is my 3rd attempt now. Do you have any other idea what the culprit is, or should I just stick with the fix I found?


If you want to _nearly_ eliminate the RAM as a possibility, then you have 5 more configurations to attempt if you are on your 3rd try; each DIMM must be tested in each slot individually. That still wouldn't guarantee the RAM isn't bad, because you don't have different RAM to substitute. If you've got a friend who's a PC gamer, ask to borrow theirs for a couple of hours; then you would know absolutely that the RAM isn't interfering. Or head to a local store if you have a Best Buy around, purchase some RAM, and return it if you still have issues.
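For anyone keeping count, the full sweep is just every stick in every slot, one at a time. A throwaway Python sketch of that checklist, with made-up stick and slot labels (2 sticks across 4 slots, matching this board):

```python
from itertools import product

# One stick at a time in each physical slot: 2 sticks x 4 slots = 8 runs.
sticks = ["stick1", "stick2"]
slots = ["slot1", "slot2", "slot3", "slot4"]

checklist = [f"{stick} in {slot}" for stick, slot in product(sticks, slots)]
for i, config in enumerate(checklist, start=1):
    print(f"Test {i}: {config}")

print(len(checklist), "single-stick configurations total")
```

Eight total, so three attempts down leaves five to go; only a BSOD in every one of the eight would point away from a single bad stick or slot.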

I *really* find it improbable that 73°C would cause the card to BSOD or have driver crashes. Nothing else is overclocked, correct?

Have you tried removing and re-installing the GPU, or a different PCIe slot? A friend of mine just had quirks with his PowerColor 7870 GHz, and after re-seating the GPU everything was OK. If you have compressed air handy, blow out the PCIe slots to make sure nothing is interfering with contact.

Examine the GPU pins and the PCIe slot itself to make sure nothing is out of the ordinary.

My Devil 270X came in the mail today, so I will likely be testing it this evening. I will let you know how it performs.

How long have you had the card BTW? Has it been giving you problems since it was new? Have you recently changed anything else and now you're having these problems?


----------



## eAT5

Quote:


> Originally Posted by *GuestVeea*
> 
> It's doing just fine. While playing BF4 it gets 40-50c. Right now idling it's at 18c.


I get 37°C in BF4. People always think this chip runs hot because they read something on the internet...


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Good rig. I enjoyed my time with the 8350. How cool does the CPU run? Because at 4.7-5.1GHz I had to put mine under a custom loop, as my H100 wasn't good enough.


Mine sits at 4.8 and does just fine, 55°C or less under Prime95, with an H100i. I keep it under 1.5V by setting LLC to high and the HT link at 2400, with the CPU/NB frequency at 2200 to match the RAM. For 4.9 or higher I would need over 1.5V, and I don't have the cooling for that.


----------



## microkid21

Quote:


> Originally Posted by *mikemykeMB*
> 
> Ahh..so your solution worked with 1dimm installed w/out BSOD but a crash. Seems odd,... then with a temp issue it sets off a crash? Well there is 2 problems then, and when there is no BSOD, the game crashes with a possible driver error. i'm confused..sorry-shutting up


My solution is in post #5114 of 5148 (page 512). While gaming, the card and VRM temps reach 73°C+ at default settings; no overclocking has been done, and I have no intention of doing any. The fan was on its default setting during that state, spinning at only 30%, which I think cannot dissipate the heat (considering the room temperature), and I'm 90% sure that's what causes my system to crash/BSOD. Then I set the fan speed manually to 50%, and it didn't crash for 1 hour of gaming; I even managed to finish a mission, which is contrary to the usual.

Quote:


> Originally Posted by *Roboyto*
> 
> If you want to _nearly_ eliminate the RAM as a possibility, then you have 5 more configurations to attempt if you are on your 3rd try; each DIMM must be tested in each slot individually. That still wouldn't guarantee the RAM isn't bad, because you don't have different RAM to substitute. If you've got a friend who's a PC gamer, ask to borrow theirs for a couple of hours; then you would know absolutely that the RAM isn't interfering. Or head to a local store if you have a Best Buy around, purchase some RAM, and return it if you still have issues.
> 
> I *really* find it improbable that 73°C would cause the card to BSOD or have driver crashes. Nothing else is overclocked, correct?
> 
> Have you tried removing and re-installing the GPU, or a different PCIe slot? A friend of mine just had quirks with his PowerColor 7870 GHz, and after re-seating the GPU everything was OK. If you have compressed air handy, blow out the PCIe slots to make sure nothing is interfering with contact.
> 
> Examine the GPU pins and the PCIe slot itself to make sure nothing is out of the ordinary.
> 
> My Devil 270X came in the mail today, so I will likely be testing it this evening. I will let you know how it performs.
> 
> How long have you had the card BTW? Has it been giving you problems since it was new? Have you recently changed anything else and now you're having these problems?


I only have two RAM sticks, and they are a set. During my 1st attempt, I installed RAM1 in slot 2 of the motherboard, and a BSOD occurred. On the 2nd attempt, with RAM2 in slot 2, the "display driver stopped responding..." message appeared. I did not try to run it again in that same configuration, but if I do, I guess it'll BSOD again for sure. So on the 3rd attempt, RAM1 in slot 1 of the motherboard, it BSOD'd again with the same error code as attempt no. 1. Later I will try installing RAM2 in slot 1 and see how it goes; I have 2 more slots to try, the 3rd one and the last. If these slots give me a blue screen again, then are 2 of my RAM sticks faulty? Or are all my RAM slots faulty? Which is kinda weird. I always clean my system twice a month, so I'm sure dust interference is not the problem here.

Default settings; I have no intention of overclocking my card because it's already factory overclocked.

Again, I brought my card to the shop where I bought it, thinking it was faulty because of what I'd experienced, and they tested it and it works flawlessly; no BSOD occurred, and the card runs cooler where they tested it (the room is air-conditioned), so the GPU pins are in good shape. I also tried using the 2nd PCIe slot, but to no avail.

I bought it on the last day of January, so the card is only 2 months old now. At first, of course, it did not show signs of problems, but after a week or so this weird "driver stopped responding..." message appeared. At first I thought it was only a driver issue, so I waited for another revision from AMD; the first driver I tried was 13.12, which is what I'm currently using right now. For almost two months and 4 drivers (including the beta drivers with Mantle), the problem has stayed the same. I have not done anything else to my card since day 1, just the fan speed settings, which also fixed the problem.


----------



## Devildog83

Quote:


> Originally Posted by *microkid21*
> 
> My solution for this is located at post #5114 of 5148 (Page 512) While gaming the temps of the card and VRM is reaching 73+ using default settings no overclocking has been made, I have no intention of doing so. The fan speed is set to default during that state and it is only spinning 30% which I think it can not dissipate the heat (consider the room temperature) and I'm 90% sure that causes my system to crash / BSOD. Then I set the fan speed to manual with 50% and it did not crash for 1 hour of gaming and I manage to finish 1 mission which is contrary to the usual.
> I only have two RAMs, and it is a set. During my 1st attempt, I installed ram1 to slot 2 of the motherboard, BSOD occurs. 2nd attempt ram2 to slot 2 the "display driver stopped responding..." message appears. I did not try to run it again on that same configuration but If I do I guess It'll BSOD again for sure. So now the 3rd attempt ram1 to slot1 of the motherboard, again it BSOD with the same error code from attempt no. 1 Later I will try to install the ram2 to slot1 and see how it goes, I have 2 more slots to try the 3rd one and the last. If this slots gives me Blue screen again then 2 of my ram are faulty? Or all my ram slots are faulty? which is kinda weird. I always clean my system twice a month so I'm sure that dust interfering is not the problem here.
> 
> Default settings, I have no intention of overclocking my card because its already factory overclocked.
> 
> Again, I brought my card to the shop where I bought it to think that it was faulty because of what I've experience and they tested it and works flawlessly, no BSOD occurs and the cards run cooler where they tested it (room air conditioned) so GPU pins are in good shape. I also tried using the 2nd PCIe slot but to no avail.
> 
> I bought it last day of January, so the card is only 2 months old now. At first of course it did not show signs of problems, but after a week I guess this weird "driver stopped responding...." message appears. At first I think it was only a driver issue, so I waited for another revision from AMD, the first driver I tried is the 13.12 which is I'm currently using right now. For almost two months 4 drivers that I used (including the beta drivers with mantle) the problem is still the same. I did not do anything else to my card since day 1 just the fan speed settings which also fixed the problem.


Do you have any way of testing that card in another machine? Have you tried the GPU in another slot? It just seems weird to me.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> My solution for this is located at post #5114 of 5148 (Page 512) While gaming the temps of the card and VRM is reaching 73+ using default settings no overclocking has been made, I have no intention of doing so. *The fan speed is set to default during that state and it is only spinning 30%* which I think it can not dissipate the heat (consider the room temperature) and I'm 90% sure that causes my system to crash / BSOD. Then I set the fan speed to manual with 50% and it did not crash for 1 hour of gaming and I manage to finish 1 mission which is contrary to the usual.
> I only have two RAMs, and it is a set. During my 1st attempt, I installed ram1 to slot 2 of the motherboard, BSOD occurs. 2nd attempt ram2 to slot 2 the "display driver stopped responding..." message appears. I did not try to run it again on that same configuration but If I do I guess It'll BSOD again for sure. So now the 3rd attempt ram1 to slot1 of the motherboard, again it BSOD with the same error code from attempt no. 1 Later I will try to install the ram2 to slot1 and see how it goes, I have 2 more slots to try the 3rd one and the last. *If this slots gives me Blue screen again then 2 of my ram are faulty?* Or all my ram slots are faulty? which is kinda weird. I always clean my system twice a month so I'm sure that dust interfering is not the problem here.
> 
> Default settings, I have no intention of overclocking my card because its already factory overclocked.
> 
> Again, I brought my card to the shop where I bought it to think that it was faulty because of what I've experience and they tested it and works flawlessly, no BSOD occurs and the cards run cooler where they tested it (room air conditioned) so GPU pins are in good shape. I also tried using the 2nd PCIe slot but to no avail.
> 
> I bought it last day of January, so the card is only 2 months old now. At first of course it did not show signs of problems, but after a week I guess this weird "driver stopped responding...." message appears. At first I think it was only a driver issue, so I waited for another revision from AMD, the first driver I tried is the 13.12 which is I'm currently using right now. For almost two months 4 drivers that I used (including the beta drivers with mantle) the problem is still the same. I did not do anything else to my card since day 1 just the fan speed settings which also fixed the problem.


If on Auto settings the fans are only going to 30% then something is not right.

I am in the midst of testing my 270X right now, and I have left all settings for the GPU at stock. Just ran FFXIV Benchmark, with all graphics settings maxed, and the fan speed automatically went to 54% to keep temperatures in check.



As much as I don't like Furmark, I am running it now for the sake of comparison between our cards and to hopefully find a solution for you. It is halfway through the 15 minute burn-in test, and temperature has peaked at 75C and is holding steady at 74C with the fans running at 54%.







Regarding your RAM: It would be unlikely that all RAM slots are bad. It would be more likely both DIMMs are bad. You should finish testing each DIMM in the remaining slots to be certain.
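For anyone following along, the sweep described above (each stick alone in each slot) works out to a simple cross product. A rough Python sketch of the checklist, assuming a four-slot board and made-up stick names, nothing from the thread itself:

```python
from itertools import product

def dimm_test_matrix(sticks, slots):
    """Enumerate every single-DIMM configuration to test.

    Booting with one stick at a time in each slot isolates whether a
    failure follows the stick (bad DIMM) or the slot (bad motherboard).
    """
    return list(product(sticks, slots))

# Two sticks across four slots -> 8 single-DIMM boots to try.
configs = dimm_test_matrix(["ram1", "ram2"], [1, 2, 3, 4])
for stick, slot in configs:
    print(f"Boot with {stick} alone in slot {slot}, run the game, note any BSOD/crash")
```

If a crash follows one stick through every slot, suspect the DIMM; if it follows one slot regardless of stick, suspect the board.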


----------



## microkid21

Quote:


> Originally Posted by *Devildog83*
> 
> Do you have any way of testing that card in another machine? Have you tried the GPU in another slot? It just seems weird to me.


The card was also tested at the shop last Sunday (2 days ago) and they found no problems at all. I watched and monitored the temps while they conducted the testing, but the result was NTF (no trouble found).
Quote:


> Again, I brought my card to the shop where I bought it to think that it was faulty because of what I've experience and they tested it and works flawlessly, no BSOD occurs and the cards run cooler where they tested it (room air conditioned) so GPU pins are in good shape. *I also tried using the 2nd PCIe slot but to no avail.*


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> The card is also tested in the shop last Sunday (2 days ago) and they found no problems at all. I watched them and monitor the temps while they conduct the testing but the result is NTF.


Pics are posted above.

FurMark is about as bad of a load as can be given to a GPU, and mine peaked at 75C. It is in an HTPC case presently for test purposes and isn't getting very good airflow; ambient temp in my condo is presently 72F. The FFXIV benchmark peaked at 69C, with the fans also going to 54% on auto.



Are you certain of that 73C temperature? If the fans are staying at 30% then the card would likely exceed 73C. I am running the FFXIV benchmark again with the fans forced to 30% to see where the temperatures end up.


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> If on Auto settings the fans are only going to 30% then something is not right.
> 
> I am in the midst of testing my 270X right now, and I have left all settings for the GPU at stock. Just ran FFXIV Benchmark, with all graphics settings maxed, and the fan speed automatically went to 54% to keep temperatures in check.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> As much as I don't like Furmark, I am running it now for sake of comparison between our cards and to hopefully find a solution for you. It is half way through the 15 minute burn-in test, and temperature has peaked at 75C and is holding steady at 74C with the fans running at 54%.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Regarding your RAM: It would be unlikely that all RAM slots are bad. It would be more likely both DIMMs are bad. You should finish testing each DIMM in the remaining slots to be certain.


During my first week, I also ran FurMark and we share the same results: it reached 75C running for 30 minutes and showed no signs of problems. When I get home, I'll run FurMark again with the fan left on auto to see if the fan speed increases, and compare it to your results. Later I'll test the two other slots and post the results.

Oh I've noticed that we don't have the same BIOS version. Mine is 015.040.000.000.000000. Maybe this is the reason?


----------



## Devildog83

Nice card!!









I wonder why they didn't put R9 270x Devil on top like they did the HD 7870 Devil?


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> During my first week, I also run Furmark and we share the same results. It reach 75c running it for 30mins, and it show no signs of problems. When I got home, I will try to run the furmark again and leave the fan to auto and lets see if the fan speed increases and compare it to your results.. Later I will try to test the two other slots and show the results.
> 
> Oh I've noticed that we don't have the same BIOS version. Mine is 015.040.000.000.000000. Maybe this is the reason?


The BIOS could be affecting it; not sure, though.

Forced the fan to 30%, and about 90% of the way through the FFXIV benchmark the computer BSOD'd, undoubtedly due to overheating...and look what we have here: atikmdag.sys



I would run a benchmark to see where the fans automatically settle; if you see the temp getting too high, cut the benchmark short. No need to abuse the card any more than necessary. If you aren't on driver version 13.12, scrub with DDU and go back to it.

If you have any software like afterburner, Trixx, Precision X, etc installed remove all of them.

Check Catalyst Control Center and make sure that Overdrive hasn't been enabled to alter fan profiles.


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> Are you certain of that 73C temperature? If the fans are staying at 30% then the card would likely exceed 73C. I am running the FFXIV benchmark again with the fans forced to 30% to see where the temperatures end up.


I'll check it again just to make sure. The 30% that I'm getting is while gaming. I don't know if it is the same while running Furmark or any other benchmarks.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> Nice card!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wonder why they didn't put R9 270x Devil on top like they did the HD 7870 Devil?


This thing is extremely nice! I am very, very impressed with the quality of this card for its price. As an added bonus I took a gamble on a NewEgg open box for $183. The box itself had been opened, but the GPU was still perfectly sealed in the static bag.







All the original packaging, accessories, and mouse pad are there and in perfect condition.

The out-of-box overclock is pretty impressive. I had an Asus DC2 that I put a waterblock on, and it was only capable of hitting ~1225 on the core. I'm anxious to see how well this (not so) little Devil will perform.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> I'll check it again just to make sure. The 30% that I'm getting is while gaming. I don't know if it is the same while running Furmark or any other benchmarks.


Even if it is not exceeding 30% while gaming that is no good. A game may not force absolute worst case temps like Furmark, but can easily load the GPU to the point of overheating if the fans don't exceed 30%. The cooler is good, but it's not THAT good.
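For context, the auto fan control being discussed here is essentially a temperature-to-duty curve. A minimal sketch of the piecewise-linear style of curve that tools like MSI Afterburner let you define; every breakpoint below is illustrative, not the 270X's actual factory profile:

```python
def fan_duty(temp_c, curve=((40, 30), (60, 45), (75, 60), (90, 100))):
    """Piecewise-linear fan curve: map GPU temperature (C) to fan duty (%).

    Below the first breakpoint the fan idles at the minimum duty; above
    the last it pins at maximum. Between breakpoints, interpolate linearly.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(73))  # with these made-up breakpoints, 73C maps to 58% duty
```

A card whose fan never climbs past the idle segment regardless of temperature, as described above, is behaving like a curve with no upper breakpoints at all.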


----------



## cyph3rz

Ok, I did it! I painted the shroud of my Gigabyte R9 280X REV1.0 for the fun of it. I should have been more patient painting it, but it came out pretty good. I used Tamiya TS-49 bright red, a popular color, and it closely matches the red of my Corsair AF120 fan rings. I'll be getting a red-sleeved 24-pin mobo cable and red-sleeved PCIe cables soon. Let me know what you think, guys!


----------



## microkid21

@Devildog83 What's the BIOS version of your Devil R9 270x?
Mine is 015.040.000.000.000000
Quote:


> Originally Posted by *Roboyto*
> 
> The BIOS could be effecting it, not sure though.
> 
> Forced the fan to 30% and about 90% through the FFXIV benchmark computer BSOD undoubtedly due to overheating...and look what we have here atikmdag.sys
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I would try to run a benchmark to see where the fans automatically run at, if you see the temp getting too high cut the benchmark short..no need to abuse the card anymore than necessary. If you aren't on driver version 13.12, scrub with DDU and go back to it.
> 
> If you have any software like afterburner, Trixx, Precision X, etc installed remove all of them.
> 
> Check Catalyst Control Center and make sure that Overdrive hasn't been enabled to alter fan profiles.


Oh, I forgot to revert to 13.12. I also reported this issue to PowerColor on Thursday, April 3, 2014, and they instructed me to use a driver from them, which I think is newer than 13.12. This is what they said:
Quote:


> This driver is newer than 13.12 and it is a WHQL driver. But this driver just AMD release to us not on website. I hope this driver can help you solve the problem, if it can't solve, please do RMA service to check card.
> Best Regards
> Polar


I reported this issue to PowerColor last week and two of their guys replied: the first one sent me the driver (newer than 13.12), and the other told me to flash my VBIOS. I will not post the links because he asked me not to:
Quote:


> One more thing, please do not release the BIOS attached to the public. Flashing the BIOS may result the damage to the card.


----------



## Roboyto

Quote:
Originally Posted by *microkid21* 


> Oh I forgot to revert back to 13.12, I also report this issue to Powercolor on Thursday, April 3, 2014 and they instructed me to use the driver from them which is I think more advanced than 13.12. This is what they said
> I have reported this issue to Powercolor last week and two of their guys replied, the first one sent me the driver (newer than 13.12) and the other is telling me to flashed my VBIOS I will not post the links because he tells me to >


Test their drivers and even that BIOS to see what happens.

If you can't get the fans to run properly on auto, you may have to RMA the card. Hopefully they can do an advance RMA and ship you a new card first.

It's not that difficult to change fan settings, but auto is convenient and it SHOULD work. Let me know what happens...I'm going back to building in this Corsair 250D now

http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition

Best of luck to you


----------



## microkid21

How do I flash my BIOS then? Have you tried flashing your VBIOS before?


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> How will I flashed my BIOS then? Did you try to flashed your vbios before?


I have, but it has been quite some time since I have played around with it.

This would probably be a good place to start reading

http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards

You could always contact PowerColor tech support for some specific instructions as well.


----------



## microkid21

This is the instructions from Powercolor tech
Quote:


> The fixed BIOS is attached.
> Please try to flash the BIOS by following these steps.
> 
> Please make sure you update it in pure DOS mode. Here you can see information about how to create a USB pen disk that boots to DOS:
> http://www.overclock.net/ati/671675-how-flash-your-5850-bios.html
> 
> 1. Extract the BIOS file and utility to a hard disk or USB pen disk. Here is an example using a USB pen disk.
> 2. Restart your PC in MS-DOS mode.
> 3. At the DOS prompt, change directory to your BIOS location, for example C:\>fixed\
> 4. Type "atiflash.exe -ai" to check your VGA card adapter location. If you have only one adapter, the adapter should be at 0 (zero).
> Type "atiflash -s 0 back.bin" to save your original BIOS (backup.bat). This can help you keep the original BIOS.
> Then type "atiflash -p 0 F3401LAA.ROM -f" to update the VGA BIOS. "-p" is to program, "-f" is to program forcibly (update.bat).
> 5. Restart the PC to finish when you see the "Restart System To Complete VBIOS Update" message.
> 
> If you still have the same problems after you have done the procedure above, I think it is better to do RMA service.
> 
> One more thing, please do not release the BIOS attached to the public. Flashing the BIOS may result in damage to the card.
> 
> BR
> 
> Don Cho
> PR and Marketing Consultant
> PowerColor


Are ATIFlash and ATIWinFlash the same?


----------



## ricko99

Does anyone have any idea why I can't underclock my Asus R9 280X? I can overclock it, but whenever I try to underclock, e.g. 1070/1575, it stays at the stock clocks of 1070/1600.









EDIT: apparently in-game it shows 1070/1575, but GPU-Z doesn't show it. Weird.


----------



## E-mil

Quote:


> Originally Posted by *mAs81*
> 
> Hi and welcome to OCN
> 
> 
> 
> 
> 
> 
> 
> 
> Does CCC fan profiling also crash your games?..It seems strange to me..
> I believe that with VBE7 you can make a new fan profile for your gpu.
> A nice tutorial on flashing your gpu Bios can be found here
> UEFI(Unified Extensible Firmware Interface) is a standard firmware interface for PCs, designed to replace BIOS (basic input/output system).
> I believe that the 270X Gaming supports both legacy and UEFI BIOS,meaning if your computer has an older legacy BIOS or a newer graphical UEFI BIOS then this video card will work on either one.
> _food for thought_


Thank you mAs81 for the response.

I didn't try CCC fan profiling, but I will. Are you sure it's possible to make a custom fan profile with CCC?

About crashing games: only a few games are affected, and I know it's not a problem with the graphics card, but with PunkBuster and bad game code.
I have a Maximus IV MBO and Win7 64-bit.

The "warning" I am talking about, I read here (first post):
http://www.overclock.net/t/1395490/ati-hd-7950-7970-bios-mod-thread

VBE7 does not officially support R9 cards, but the R9 270/280/290 are the same cards as the HD 7870/7950/7970.


----------



## D3VIL82

Hi

I just bought an ASUS R9 280X GPU but I'm facing a problem.

When I first boot the system I get the Intel splash screen. If I leave the system alone, it goes to a BLACK screen (not a blank screen) with a 0_ in the bottom right-hand corner. On the motherboard itself, I get 00 on the LED display.

What I have tried is booting to a bootable USB and/or bootable DVD.

I simply select the USB or DVD. What happens? As soon as I press the Enter key on my keyboard, a little 0 pops up in the bottom right-hand corner of the monitor. The system does not even attempt to process my command to boot to the media. It's like it is predetermined that it will not boot no matter what I do.

I was told that 00 on the LED means the board is in a normal boot state and that there may be a problem with the board. I have no other tricks up my sleeve when it comes to making this board boot. I have removed all unnecessary devices and tried the most basic boot, with no luck.

Processor -i7 2600k
Motherboard - DZ68ZV
Graphics Card - ASUS r9 280X DirectCU II TOP
PSU - AX750


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> Even if it is not exceeding 30% while gaming that is no good. A game may not force absolute worst case temps like Furmark, but can easily load the GPU to the point of overheating if the fans don't exceed 30%. The cooler is good, but it's not THAT good.


Ok, I ran FurMark a while ago, and these are the results. All settings are default. The fan speed is set to auto, and it jumps to 50% at 72C immediately.


And this is the weird part: look at my clock speed in pic no. 1, it only shows 1150. And after the burn-in benchmark, this is what it shows: GPU core clock and memory are 1180/1400, with a maximum temp of 81C at 60% fan speed (auto). No BSOD occurred.


It only shows that the HSF is in good shape, and that it is not the auto fan setting that causes my system to BSOD while gaming. Sorry for pointing that one out; I guess I made a mistake. One thing I noticed: look at my 12V rail. Is it normal that while running FurMark the sensors show it at only 11.9/11.8? My PSU is a Corsair CS650M 80+ Gold; that seems kinda low for the 12V rail, right?

I'll continue the RAM testing with attempt no. 4, RAM no. 2 installed in slot no. 1 of the motherboard, and see how it goes.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> This is the instructions from Powercolor tech
> Is ATIFlash and ATIWinFlash the same?


No; ATIFlash is the DOS version, so if they suggested going through DOS, then do that.


----------



## microkid21

I'm not satisfied with the first attempt at running the FurMark benchmark, so I had to run it again, and before that I noticed this weird increase in clock rate while idling. What does this mean?


And after the benchmark ended, I took a screenshot, and GPU-Z shows:


The 12V rail drops to 11.88. Is this normal?


----------



## Devildog83

Quote:


> Originally Posted by *microkid21*
> 
> @Devildog83 What's the BIOS version of your Devil R9 270x?
> Mine is 015.040.000.000.000000


Mine is the same.


----------



## Devildog83

Quote:


> Originally Posted by *microkid21*
> 
> I'm not satisfied on the first attempt of running furmark benchmark so I had to run it again and before that I noticed this weird increase in clock rate while idling. What does this mean?
> 
> 
> And after the benchmark ends, I take a screenshot and GPU z shows.
> 
> 
> The 12v rail drops to 11.88. Is this normal?


11.88 is OK; at 11.50 I would worry a bit.
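That rule of thumb lines up with the ATX spec, which allows roughly ±5% on the +12V rail (about 11.4V to 12.6V). A quick sanity check, assuming the software sensor reading is accurate in the first place:

```python
def rail_ok(reading, nominal=12.0, tolerance=0.05):
    """Return True if a rail reading falls inside ATX's +/-5% window."""
    lo, hi = nominal * (1 - tolerance), nominal * (1 + tolerance)
    return lo <= reading <= hi

print(rail_ok(11.88))  # inside the 11.4-12.6 window, fine
print(rail_ok(11.50))  # still in spec, but getting close to the 11.4 floor
print(rail_ok(11.30))  # out of spec, worth investigating
```

As noted elsewhere in the thread, motherboard sensors often misread rails, so a multimeter is the real arbiter.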


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> This thing is extremely nice! I am very, very impressed with the quality of this card for its price. As an added bonus I took a gamble on a NewEgg open box for $183. The box itself had been open, but the GPU was still perfectly sealed in the static bag
> 
> 
> 
> 
> 
> 
> 
> All the original packaging, accessories, and mouse pad are there and in perfect condition.
> 
> The out of box overclock is pretty impressive. I had an Asus DC2 that I put a waterblock on and it was only capable of hitting ~1225 on the core. I'm anxious to see how well this (not so) little Devil will perform


My HD 7870 was a review sample/open box and it has rocked from day 1. It's a chance you take but the retail was $260 and I paid $210. I paid $230 for the 270x.


----------



## M1kuTheAwesome

Took the time and found the patience to replace the thermal pad under the VRM heatsink with MX4 thermal paste. My VRM1 temps (OC 1219MHz core @ 1300mV) went from 81C to 77C in the Sleeping Dogs benchmark (3x loop), the most heat-productive and power-consuming real-world test I've found. I'll keep an eye on the temps for a while, but it seems I knocked those few degrees off to keep VRM temps under 80.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> 11.88 is OK, 11.50 I would worry a bit.


That voltage is normal.

Don't use GPU-Z to check clocks during FurMark; mine didn't report right either. What did FurMark say at the top of the page? The card should boost to 1180, as mine did.

I would continue to test RAM in different slots and if that still gives you BSOD I would go to a local store and buy some RAM to test with. Just make sure you can return it once the package is open.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> My HD 7870 was a review sample/open box and it has rocked from day 1. It's a chance you take but the retail was $260 and I paid $210. I paid $230 for the 270x.


I've had about 75% of the open-box items from NewEgg or Microcenter be solid as a rock. Sometimes they're DOA or just poor performers, which was probably why they were returned in the first place.


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> That voltage is normal.
> 
> Don't use GPU-z to check clocks during furmark, mine didn't report right either. What did furmark say at the top of the page? The card should boost to 1180, as mine did.



Quote:


> Originally Posted by *Roboyto*
> I would continue to test RAM in different slots and if that still gives you BSOD I would go to a local store and buy some RAM to test with. Just make sure you can return it once the package is open.


Ok, attempt no. 4: this test ran with no issues/problems encountered! I managed to finish 1 mission.
RAM no. 2 in slot no. 1








These are the results.




Notice that on the 2nd attempt, RAM stick no. 2 in slot no. 2 of the motherboard, I played a little longer than on the first attempt; no BSOD, but the game crashed with the "Display driver stopped responding and has recovered" message in the system tray.

I guess RAM no. 2 does not have a problem, but let's see; I'll continue testing with the 2 remaining slots.


----------



## Skye12977

Alright, I'm trying to see how far I can push my 270X for the AMD vs. Nvidia fanboy competition.
I'm not able to figure out how to add more voltage to it; I really want to get my core past 1300.


----------



## heroxoot

Quote:


> Originally Posted by *microkid21*
> 
> I'm not satisfied on the first attempt of running furmark benchmark so I had to run it again and before that I noticed this weird increase in clock rate while idling. What does this mean?
> 
> 
> And after the benchmark ends, I take a screenshot and GPU z shows.
> 
> 
> The 12v rail drops to 11.88. Is this normal?


Never trust software. If I trusted software, my 3.3V rail showing 2.9V would be worrying. But since neither of my HDDs has ever shut down on me, I can only assume it's wrong. Use a multimeter to test your PSU.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> 
> Ok attempt no. 4 : This is the test with no issues / problem encountered! I manage to finish 1 mission
> Ram no. 2 at slot no. 1
> 
> 
> 
> 
> 
> 
> 
> 
> These are the results.
> 
> 
> 
> 
> Noticed that 2nd attempt which is Ram stick no. 2 on slot no. 2 of the motherboard, I've played a little longer than the first attempt, no BSOD but the game crash with the "Display Driver stopped responding and has recovered" message appeared on system tray.
> 
> I guess Ram no. 2 does not have a problem but lets see, I'll continue the testing with the 2 remaining slots.


Good to see the core/memory are running at appropriate speeds in Furmark.

Finish testing both RAM sticks in all slots, and make sure you're noting the results of each slot/stick combination. You may find a pattern of some sort that tells you what is acting up.


----------



## MarlowXim

In for day 7

stq : 10


----------



## Roboyto

Quote:


> Originally Posted by *D3VIL82*
> 
> Hi
> 
> I just bought an ASUS r9 280X GPU but facing some problem
> 
> When I first boot the system I get the intel splash screen. If I leave the system alone, it goes to a BLACK SCREEN (Not Blank Screen) with a 0_ in the bottom right hand corner . On the actual motherboard, I get 00 on the LED display.
> 
> I have tried to do is Boot to a Bootable USB and/or Bootable DVD
> 
> I simply select the USB or DVD. What happens? As soon as I press the enter key on my keyboard, a little 0 pops up in the bottom right hand corner of the monitor. The system does not even attempt to process my command to boot to the media. It's like it is predeterminedthat it will not boot no matter what I do.
> 
> I was told that 00 on the LED means the board is in normal boot state and that there may be a problem with the board. I have no other tricks up my sleeve when it comes to making this board boot. I have removed all unnecessary devices and tried the most basic boot to no luck.
> 
> Processor -i7 2600k
> Motherboard - DZ68ZV
> Graphics Card - ASUS r9 280X DirectCU II TOP
> PSU - AX750


Try disconnecting everything and removing the RAM. You should get POST beeps/codes on the LED display for no RAM installed. If it doesn't respond to this critical portion of the POST then the board is bad.


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> Good to see the core/memory are running at appropriate speeds in Furmark.
> 
> Finish testing both RAM sticks in all slots, make sure you're notating the results of each slot/stick combination. You may find a pattern of some sort to tell you what is acting up.


Attempt 5: RAM1 in slot no. 3 of the motherboard.
The result was a fail. I tested that configuration twice, and the BSOD occurred both times.


On attempt no. 6, let's see if RAM2 can pass the test in slot no. 3. If it does, then I might say that I have a faulty RAM stick.


----------



## JaredLaskey82

Right. I have two EK-FC R9-280X DCU2 backplates on the way.

I was just wondering if anyone has taken apart their ASUS 280X DCU2 TOP cards to watercool them, and so on.
I am not watercooling, just putting the backplates on each card after painting them white.

I just want to know if there are any thermal pads between the reference DCU2 heatsink and the memory modules or VRM that I am going to have to replace.

I have good thermal compound I am going to be using, but I want to know now so I can have everything ready to go and not have downtime on my cards.


----------



## JaredLaskey82

Quote:


> Originally Posted by *cyph3rz*
> 
> Ok I did it! I painted the shroud of my Gigabyte R9 280X REV1.0 for the fun of it. I should have been more patient painting it but it came out pretty good. I used Tamiya TS-49 bright red color which is a popular color and it closely matches the red of my Corsair AF120 fan rings. I'll be getting red sleeved 24 pin mobo cable and red sleeved PCI-e cable soon. Let me know what you think guys!


Not bad. Works well with the colour scheme you have going on.
I think I may have started a trend here.


----------



## microkid21

Attempt 6: RAM2 in slot no. 3 of the motherboard.
Successful. I played for an hour and managed to finish two missions with no issues encountered.

The culprit behind days of BSODs with error code *0xa000001 (atikmdag.sys) ATI Kernel Mode Driver* is that one of my RAM sticks is faulty.
It's clear now. I'll have to RMA it before it can damage my whole system. Thank you Roboyto for the tips and for pointing out the cause of this weird crash / BSOD.

Devildog83, please add me to the list. Thank you!
*Powercolor R9 270X Devil 1180 / 1400*


----------



## Devildog83

*microkid21* has been added. Welcome!! I am glad you have figured out your problem.


----------



## heroxoot

Quote:


> Originally Posted by *microkid21*
> 
> Attempt 6: RAM2 in slot no. 3 of the motherboard.
> Successful. I played for an hour and managed to finish two missions with no issues encountered.
> 
> The culprit behind days of BSODs with error code *0xa000001 (atikmdag.sys) ATI Kernel Mode Driver* is that one of my RAM sticks is faulty.
> It's clear now. I'll have to RMA it before it can damage my whole system. Thank you Roboyto for the tips and for pointing out the cause of this weird crash / BSOD.


If both your RAM sticks came in one kit, they make you RMA the whole kit. That may be an issue considering you only have a 2x4GB kit. I had a kit that was completely bad and had to send it in; G.Skill asks for the whole kit back. RMA is pretty quick though.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> Attempt 6: RAM2 in slot no. 3 of the motherboard.
> Successful. I played for an hour and managed to finish two missions with no issues encountered.
> 
> The culprit behind days of BSODs with error code *0xa000001 (atikmdag.sys) ATI Kernel Mode Driver* is that one of my RAM sticks is faulty.
> It's clear now. I'll have to RMA it before it can damage my whole system. Thank you Roboyto for the tips and for pointing out the cause of this weird crash / BSOD.
> 
> Devildog83, please add me to the list. Thank you!
> *Powercolor R9 270X Devil 1180 / 1400*
> 
> 
> Spoiler: Warning: Spoiler!


I'm glad that you got to the heart of the issue.

When you contact G.Skill for the RMA, ask if they can do an advance RMA. If they offer it, they will put a pending charge on a debit/credit card and ship you new RAM; you incur only a small shipping fee. Once you get the new RAM, just drop the faulty RAM in the box and send it back with the included pre-paid shipping label. Once they receive the faulty RAM they remove the charge from your card, and you have zero downtime.


----------



## Roboyto

Quote:


> Originally Posted by *JaredLaskey82*
> 
> Right. I have two EK-FC R9-280X DCU2 backplates on the way.
> 
> I was just wondering if anyone has taken appart their ASUS 280X DCU2 Top cards to watercool and so on.
> I am not water cooling, just putting the backplates on each card after painting them white.
> 
> Just want to know if there is any thermal pads between the ref DCU2 heatsink and memory modules or VRM that I am going to have to replace.
> 
> I have good thermal compound I am going to be using, but want to know now so I can have everything ready to go and not have to wait to order thermal pads in and have down time on my cards.


http://www.techpowerup.com/reviews/ASUS/R9_280X_Direct_Cu_II_TOP/4.html

It doesn't look like the card has any heatsinks on the RAM, so you wouldn't have to change anything there if you didn't want to.



The VRMs already have a heatsink on them, and it is easily removable. You can see the spring-loaded pin that holds the heatsink down; squeeze the pin from the backside of the card and it should come off easily.

I don't know if these cards have issues with VRM temperatures, but if you wanted to upgrade the thermal pad my suggestion would be Fujipoly Ultra Extreme. I used these pads as an upgrade from what came with my XSPC waterblock for my R9 290, and temperatures came down 25 percent! http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

If you decide to upgrade the thermal pads you will need this: http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g8c487s1797&id=TLswenkp

To really make use of the backplates I would suggest using thermal pads on the backside of the VRMs as well to make contact with the backplate; turning it into a heatsink. VRMs in Orange

If you decide to also add thermal pads to the rear of the card, you will need a second one of the pads linked above.



Just make sure the backplates will fit with that bracket that is on the top of the card. Bracket in Yellow

Edit: Just realized you said you bought the backplate specifically for DC2 cards.


----------



## JaredLaskey82

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.techpowerup.com/reviews/ASUS/R9_280X_Direct_Cu_II_TOP/4.html
> 
> It doesn't look like the card has any heatsinks on the RAM. So you wouldn't have to change anything if you didn't want to.
> 
> 
> 
> The VRMs have a heatsink on them already, and it would be easily removable. You can see the spring loaded button that holds the heatsink down. They just require squeezing the pin from the backside of the card and it should come off easily.
> 
> I don't know if these cards have issues with VRM temperatures, but if you wanted to upgrade the thermal pad my suggestion would be Fujipoly Ultra Extreme. I used these pads as an upgrade from what came with my XSPC waterblock for my R9 290, and temperatures came down 25 percent! http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> 
> If you decide to upgrade the thermal pads you will need this: http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g8c487s1797&id=TLswenkp
> 
> To really make use of the backplates I would suggest using thermal pads on the backside of the VRMs as well to make contact with the backplate; turning it into a heatsink. VRMs in Orange
> If you decide to also add thermal pads to the rear of the card you would need a 2nd pad that I linked above.
> 
> 
> Just make sure the backplates will fit with that bracket that is on the top of the card. Bracket in Yellow
> Edit: Just realized you said you bought the backplate specifically for DC2 cards.


The EK backplates come with thermal pads for the VRMs and memory, so I am set there. I just did not want to pull apart my cards and then find I need new pads to connect the heatsink to the memory and so on. If I did need them, I would order them now so I have everything ready before pulling apart both cards.


----------



## Roboyto

Quote:


> Originally Posted by *JaredLaskey82*
> 
> The EK backplates come with thermal pads for the VRMs and memory, so I am set there. I just did not want to pull apart my cards and then find I need new pads to connect the heatsink to the memory and so on. If I did need them, I would order them now so I have everything ready before pulling apart both cards.


There have been several people in the 290X thread with EK blocks who got temperature drops similar to mine by changing the thermal pads; you may want to consider it.


----------



## JaredLaskey82

Quote:


> Originally Posted by *Roboyto*
> 
> There have been several people in the 290X thread with EK blocks that have had similar temperature drops to mine by changing thermal pad; you may want to consider it.


But you just said there are no thermal pads connected directly to the heatsink.

My temps are good, as I am using Arctic Silver 5 on the GPU and my case is well ventilated. Even overclocked to 1100 MHz I am not hitting over 78°C.


----------



## Recr3ational

Be careful with AS5.
I say this because SOME people say it's conductive. I've never had any issues, but sometimes I get people literally shouting at me that it is, lol.


----------



## Roboyto

Quote:


> Originally Posted by *JaredLaskey82*
> 
> But you just said there are no thermal pads connected directly to the heatsink.
> 
> My temps are good as I am using Arctic Silver 5 for the GPU and my case is well ventilated. Even overclocked to 1100 MHz I am not hitting over 78 Deg.


I said the RAM didn't have anything on it.

The VRMs do have a heatsink, and thus a thermal pad underneath. If you plan on going past 1100 and adding more voltage, I would keep the VRMs as cool as possible.


----------



## microkid21

Quote:


> Originally Posted by *heroxoot*
> 
> If both your ram came in 1 kit they make you RMA the whole kit. You may have an issue considering you only have 2x4gb kit. I had a kit that was completely bad and had to send it in, but Gskill asks for the whole kit back. RMA is pretty quick tho.


They are a kit, but that's OK. I'll just RMA them both.


----------



## JaredLaskey82

Quote:


> Originally Posted by *Roboyto*
> 
> I said the RAM didn't have anything on them.
> 
> The VRMs do have a heatsink and thus thermal pad underneath. If you plan on going further than 1100 and adding more voltage I would want to keep the VRMs as cool as possible.


Ah yes.

I'm more of a gamer than an overclocker, so a Fire Strike score of 10582 is enough to keep me happy for now.
If I wanted to go higher, I would build a custom cooling loop.


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> I'm glad that you got to the heart of the issue.
> 
> When you contact G.Skill for RMA ask if they can do an advanced RMA. If they have this option, they will put a pending charge on a debit/credit card and ship you new RAM; you will incur a small fee for shipping. Once you get the new RAM, just drop your faulty RAM in the box and send it back with the included pre-paid shipping label. Once they receive the faulty RAM they remove the charge from your debit/credit card and you have 0 down time.


There's still a shop warranty on my RAM, so I'll have to return it to the shop where I purchased it. And it came as a kit, which is why I'll have to return both sticks.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> There's still a shop warranty on my RAM, so I'll have to return it to the shop where I purchased it. And it came as a kit, which is why I'll have to return both sticks.


Even better!


----------



## Devildog83

Quote:


> Originally Posted by *microkid21*
> 
> There's still a shop warranty on my RAM, so I'll have to return it to the shop where I purchased it. And it came as a kit, which is why I'll have to return both sticks.


May the RMA gods be kind to you, and may your RAM arrive in a speedy manner and function perfectly.


----------



## Roboyto

Quote:


> Originally Posted by *JaredLaskey82*
> 
> Ah yes.
> 
> Im more of a gamer than an overclocker, so a Firestrike mark of 10582 is enough to keep me happy for now.
> If i wanted to go higher I will be building a custom cooling loop.


Loops are fun, but expensive! Very happy with my 290 under water; I got a superb overclocker, 1300/1700. I'm able to run Eyefinity on the one card with very respectable frame rates in everything I've played so far.


----------



## microkid21

Quote:


> Originally Posted by *Devildog83*
> 
> May the RMA gods be kind to you, and may your RAM arrive in a speedy manner and function perfectly.


Thanks a lot!







I'll test it immediately after the RMA, one stick at a time.


----------



## JaredLaskey82

Quote:


> Originally Posted by *Roboyto*
> 
> Loops are fun, but expensive! Very happy with my 290 under water, got a superb overclocker; 1300/1700. Able to run eyefinity with the one card with very respectable frames in everything I've played so far.


I know. I am looking at around $700 AUD for the loop I am wanting to build in 6 months or so.


----------



## mxfreek09

Got my Gigabyte R9 270X tonight. I threw it in my spare-parts system and was playing around with some OCs. Here are the results so far. (Stock is on the left, OC on the right.)


----------



## Devildog83

Quote:


> Originally Posted by *mxfreek09*
> 
> Got my Gigabyte R9 270x tonight. I threw it in my spare parts system and was playing around with some OC's. Heres the results so far. (Stock is on left - OC on right)


What clocks? Not a bad overclocked score there. Almost as good as my 7870 Devil's high in Valley.


----------



## mxfreek09

Quote:


> Originally Posted by *Devildog83*
> 
> What clocks? Not a bad overclocked score there. Almost as good as my 7870 Devil high in valley.


1200/1500 is where I have been sitting stable. I made one run at 1225/1500 but haven't been able to get it again. I am pretty happy with it as is though; 65°C seems to be its max, and that's in a little Core 1000. If I stop being lazy I may move it into my main rig and see how it runs.


----------



## NBAasDOGG

Hi gents,

I bought myself two MSI R9 280X Gaming 3G cards. This thing is awesome and has a new PCB compared to the older version (this one is wider, with 10 VRMs). I'm actually very lucky it is not voltage locked, but the problem is that MSI Afterburner (beta 19) does not allow more than +100mV, which gives me 1.256V and a stable 1220/1800. My question is: how can I get 1.3V on these cards, and how do I change the memory voltage? I remember that previous betas allowed a tweak to the voltage slider, but that is not possible with beta 19. I really want to put more volts into these cards, but how?

Thanks, guys


----------



## Recr3ational

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Hi gents,
> 
> I bought myself two MSI R9 280X Gaming 3G cards. This thing is awesome and has a new PCB compared to the older version (this one is wider, with 10 VRMs). I'm actually very lucky it is not voltage locked, but the problem is that MSI Afterburner (beta 19) does not allow more than +100mV, which gives me 1.256V and a stable 1220/1800. My question is: how can I get 1.3V on these cards, and how do I change the memory voltage? I remember that previous betas allowed a tweak to the voltage slider, but that is not possible with beta 19. I really want to put more volts into these cards, but how?
> 
> Thanks, guys


Are you sure it's not voltage locked?
Maybe it's just voltage limited?


----------



## NBAasDOGG

Quote:


> Originally Posted by *Recr3ational*
> 
> Are you sure its not voltage locked?
> Maybe its just voltage limited?


Nope, it's not voltage locked. On stock volts it's not possible to hit even 1160MHz, but when I add +100mV I get a stable 1220MHz, maybe even more. When I set -100mV the card runs about 9°C cooler. BTW, I also use a multimeter, and the voltage reads 1.256-1.26V when overvolted. At stock it undervolts to 0.850V.

So, do you know how to get 1.3V?


----------



## Recr3ational

No, most of our cards' sliders max out at 1.3V. Have you tried TriXX?


----------



## Devildog83

*mxfreek09 and NBAasDOGG* have been added. Welcome!!

NBAasDOGG, I have put your memory clocks at 1600 for now. If you would like them updated, just post them.


----------



## cremelo

I'm Back








My Toxic is still at stock







Already sold my Asus









But now, with 16GB of RAM, it looks like the Toxic has better performance....





PS: Does anyone know if EVGA Precision works well with the 280X? My MSI Afterburner beta says it's expired.


----------



## Devildog83

Quote:


> Originally Posted by *cremelo*
> 
> I'm Back
> 
> 
> 
> 
> 
> 
> 
> 
> Still on Stock my Toxic
> 
> 
> 
> 
> 
> 
> 
> Already sell my Asus
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But now with 16Gb Ram its looks the Toxic have better performance....
> 
> 
> 
> 
> 
> PS: Anyone know if the EVGA Precision works well with 280x ? My MSI Afterburner Beta says expired


Afterburner has a new beta. http://event.msi.com/vga/afterburner/download.htm


----------



## Algy

Hi guys, I have a little problem over here. If I let my PC idle for about 15 minutes, the display turns off and sometimes won't wake up. This didn't happen with my 570, but it does now with the 280X.
I'm using the latest beta, 14.3. Any help? Thanks in advance, and sorry for my English.


----------



## Roboyto

Quote:


> Originally Posted by *Algy*
> 
> Hi guys, I have a little problem over here. If I let my PC idle for about 15 minutes, the display turns off and sometimes won't wake up. This didn't happen with my 570, but it does now with the 280X.
> I'm using the latest beta, 14.3. Any help? Thanks in advance, and sorry for my English.


You may want to try the 13.12 drivers; they are more stable than the beta versions. Use Display Driver Uninstaller to get rid of the old drivers first:

http://www.guru3d.com/files_get/display_driver_uninstaller_download,9.html

See how that works for you.


----------



## Anonizer

Hello there! Just got my Sapphire 270X Vapor-X today, and I never thought the bottleneck would hit this hard; this is the first time I'm experiencing it. LOL.








Can you guys recommend some programs to test it with?
I already tested it with Unigine Heaven Benchmark 4.0. Is using FurMark safe? I read some posts about it killing GPUs.
Is it safe to put the power limit to max (+50)? Also, what does the UEFI button do? I'm using Windows 8.1 Pro with the latest updates. Thanks!









Sorry for the noob questions; this is my first time getting a GPU like this. The only GPU I've used is a Sapphire 5670, and the rest were integrated ones.








The driver version is 14.3.
Does anyone have an idea why my GPU-Z isn't like this one?


Spoiler: GPU-Z






Here's mine.


Spoiler: Mine!


----------



## BruceB

Quote:


> Originally Posted by *Anonizer*
> 
> Hello there! Just got my Sapphire 270X Vapor-X today, and never thought the bottleneck would be this hard as this is the first time experiencing it. LOL.
> 
> 
> 
> 
> 
> 
> 
> 
> Can you guys recommend some programs to test it with?
> I already tested it with Unigine Heaven Benchmark 4.0, is using FurMark safe? I read some posts about it killing gpus....


I recommend 3DMark (the free demo version on Steam is good). As for FurMark: it's fine to use on GPUs, just keep an eye on the temperature. The max for a 280X is 85°C; I expect it would be the same for a 270X, but check that just to be sure. If your card gets hotter than the max, stop FurMark IMMEDIATELY and let it cool down. If your card consistently gets much hotter than the max temp (i.e. also during normal gaming), then RMA it.

Powercolor cards are pretty well known for having poorly-applied thermal paste (like my 280X did); it's a known problem causing them to overheat (92°C+), and they're more than happy to replace the card (I got my replacement in 8 working days!).

So, yes, FurMark is fine, but *watch the temps*.
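The watch-the-temps rule is simple enough to automate. A minimal sketch, assuming a hypothetical caller-supplied `read_gpu_temp()` function (real readings would come from HWiNFO, GPU-Z logging, or a vendor tool; this does not talk to actual hardware):

```python
import time

MAX_TEMP_C = 85  # quoted max for a 280X; check your own card's spec

def watch_temps(read_gpu_temp, poll_seconds=2, max_temp=MAX_TEMP_C):
    """Poll a temperature source and bail out the moment it hits max_temp.

    read_gpu_temp is a caller-supplied function returning degrees C.
    Returns the offending reading so the caller can stop the stress
    test (e.g. FurMark) immediately.
    """
    while True:
        temp = read_gpu_temp()
        if temp >= max_temp:
            return temp
        time.sleep(poll_seconds)

# Example with a fake sensor ramping up under load:
readings = iter([62, 71, 79, 86])
final = watch_temps(lambda: next(readings), poll_seconds=0)
print(final)  # 86
```

This is just the decision logic; in practice most people simply keep a monitoring window open next to the benchmark.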









Quote:


> Originally Posted by *Anonizer*
> 
> Anyone has an idea on why my GPU-Z isn't like this one
> 
> 
> Spoiler: GPU-Z
> 
> 
> 
> 
> 
> 
> Here's mine.
> 
> 
> Spoiler: Mine!


Different version numbers? One is 0.7.6 and the other is 0.7.7.


----------



## Anonizer

Quote:


> Originally Posted by *BruceB*
> 
> I recommend 3DMark (the free demo version on Steam is good). As for FurMark: it's fine to use on GPUs, just keep an eye on the temperature. The max for a 280X is 85°C; I expect it would be the same for a 270X, but check that just to be sure. If your card gets hotter than the max, stop FurMark IMMEDIATELY and let it cool down. If your card consistently gets much hotter than the max temp (i.e. also during normal gaming), then RMA it.
> 
> Powercolor cards are pretty well known for having poorly-applied thermal paste (like my 280X did); it's a known problem causing them to overheat (92°C+), and they're more than happy to replace the card (I got my replacement in 8 working days!).
> 
> So, yes, FurMark is fine, but *watch the temps*.
> 
> 
> 
> 
> 
> 
> 
> 
> Different Version number? one is 0.7.6 and the other is 0.7.7.


I see, thanks for answering my questions and for the tips! I'll download 3DMark now, then FurMark later. Did you push the power limit on your R9 280X to the max through Afterburner?

As for GPU-Z, I don't think the version is the problem(?), because I browsed the other pages and saw someone on 0.7.7 whose copy shows complete details. Thanks for pointing that out though; I'll be laughing at myself if that was the case.









Spoiler: GPU-Z!



http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/5170#post_22078389


----------



## BruceB

No, I haven't OC'd my 280X yet, but if I were going to push it, I'd check what the max voltage is (a quick Google says 1.3V, but don't take my word for it) just to be sure I wasn't going to fry my card. Afterburner is pretty good at not letting you damage your card with overvolting, but I think it's best to do your research on this one!

Having said that, some cards are _voltage locked_ (some of the 280Xs, anyway), which means that no matter what you enter in Afterburner, the voltage won't change. Google should be able to tell you whether your card is voltage locked or not.

As for GPU-Z, I'm still on 0.7.2, so I'm way out of the loop on this one; maybe someone else knows the answer?


----------



## NBAasDOGG

OK gents,

I found a hacky way to bypass the MSI Afterburner voltage slider. Now I can add +500 or -500 mV on the card.
I will post how to do it soon!


----------



## Roboyto

Quote:


> Originally Posted by *Anonizer*
> 
> Hello there! Just got my Sapphire 270X Vapor-X today, and never thought the bottleneck would be this hard as this is the first time experiencing it. LOL.
> 
> 
> 
> 
> 
> 
> 
> 
> Can you guys recommend some programs to test it with?
> I already tested it with Unigine Heaven Benchmark 4.0, is using FurMark safe? I read some posts about it killing gpus.
> Is it safe to put the power limit to max (+50)? Also what does the UEFI button do? I'm using Windows Pro 8.1 that has been updated to the latest updates earlier. Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry for the noob questions, cause this is my first time getting a GPU like this, GPUs I used are Sapphire 5670, and the rest are integrated ones.
> 
> 
> 
> 
> 
> 
> 
> 
> Driver version is 14.3.
> Anyone has an idea on why my GPU-Z isn't like this one
> 
> 
> Spoiler: GPU-Z
> 
> 
> 
> 
> 
> 
> Here's mine.
> 
> 
> Spoiler: Mine!


Unigine is great, as is anything 3DMark. The best stability test I have found these days is the Final Fantasy XIV benchmark.

FurMark gives you an absolute worst-case scenario, so temperature and power draw will be worse than with just about anything else you subject your card to. It was killing GPUs more frequently back when Nvidia Fermi was new and those cards ran as hot as, or hotter than, the reference 290(X) does now. It COULD do damage if you're not careful, but more than likely the system will BSOD to stop the card from cooking once temperatures get too high.

If you have strange issues with games or benchmarks, I would suggest the 13.12 drivers, as the 14.X releases are still having the bugs worked out.

+50% power is OK; it increases the amperage/wattage the card can draw, usually extending overclocking headroom. Just be mindful of temperatures when increasing this, voltage, and clocks.

Your GPU-Z looks different because your card has fewer sensors to monitor, although yours does look a little scarce.


----------



## Devildog83

Quote:


> Originally Posted by *Anonizer*
> 
> Hello there! Just got my Sapphire 270X Vapor-X today, and never thought the bottleneck would be this hard as this is the first time experiencing it. LOL.
> 
> 
> 
> 
> 
> 
> 
> 
> Can you guys recommend some programs to test it with?
> I already tested it with Unigine Heaven Benchmark 4.0, is using FurMark safe? I read some posts about it killing gpus.
> Is it safe to put the power limit to max (+50)? Also what does the UEFI button do? I'm using Windows Pro 8.1 that has been updated to the latest updates earlier. Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry for the noob questions, cause this is my first time getting a GPU like this, GPUs I used are Sapphire 5670, and the rest are integrated ones.
> 
> 
> 
> 
> 
> 
> 
> 
> Driver version is 14.3.
> Anyone has an idea on why my GPU-Z isn't like this one
> 
> 
> Spoiler: GPU-Z
> 
> 
> 
> 
> 
> 
> Here's mine.
> 
> 
> Spoiler: Mine!


You're in the club!!!

Unigine Valley and 3DMark Vantage crash the most for me, and more than anything else they will let me know when I am not stable. Also try Catzilla; it's a good system bench and fun to watch.

I would not use FurMark unless you're just doing a burn-in, because it's too intensive and doesn't reflect real-world conditions. Just my opinion.

I would use HWiNFO64 to keep track of GPU voltages and temps.


----------



## Roboyto

Quote:


> Originally Posted by *NBAasDOGG*
> 
> OK gents,
> 
> I found a hacky way to bypass the MSI Afterburner voltage slider. Now I can add +500 or -500 mV on the card.
> I will post how to do it soon!


I would be very careful with doing this. A slip of the finger with too much and you could fry your card in an instant.


----------



## Anonizer

Quote:


> Originally Posted by *Roboyto*
> 
> Unigine is great as well as anything 3DMark. Best stability test I have come to find these days is the Final Fantasy XIV Benchmark.
> 
> Furmark gives you an absolute worst case scenario so temperature and power draw from that is going to be worse than just about anything else you subject your card to. It was killing GPUs back in the day more frequently when Nvidia Fermi was new and those cards ran as hot as, or hotter, than the reference 290(X) now. It COULD do damage if you're not careful, but more than likely the GPU will cause a BSOD to stop it from blowing up once temperatures get too high.
> 
> If you have strange issues with games or benchmarks, I would suggest using the 13.12 drivers as anything 14.X are still having the bugs worked out.
> 
> 50% power is OK, and it is going to increase the amperage/wattage the card will draw, usually extending Overclocking abilities. Just be mindful of temperatures when increasing this, voltage and clocks.
> 
> The GPU-Z looks different because your card has less sensors to monitor various stats.. Although yours does look a little scarce..


Oh, OK. I'll stick with Unigine and 3DMark for now and FFXIV later! So overclocking is not recommended then, since I'm only using an Antec VP450.









Yes, the Sensors tab looks the same as my 5670's. Is that a bad thing?








Quote:


> Originally Posted by *Devildog83*
> 
> You're in the club!!!
> 
> Unigine Valley and 3DMark Vantage crash the most for me, and more than anything else they will let me know when I am not stable. Also try Catzilla; it's a good system bench and fun to watch.
> 
> I would not use FurMark unless you're just doing a burn-in, because it's too intensive and doesn't reflect real-world conditions. Just my opinion.
> 
> I would use HWiNFO64 to keep track of GPU voltages and temps.


Thanks! I'll try Catzilla too, so that I can make sure this card is 99.9% good, lol.

I got HWiNFO64 earlier; there's not much detail on VRMs etc. Not sure what I'm missing here.










Spoiler: HWINFO








Just finished running 3DMark, and here are the results:


Spoiler: 3DMARK



I need to upgrade my CPU as *soon* as possible.



Will try the latest beta driver later!









Thanks for the help BruceB, Roboyto and Devildog83!









----------



## Roboyto

Quote:


> Originally Posted by *Anonizer*
> 
> Hello there! Just got my Sapphire 270X Vapor-X today, and never thought the bottleneck would be this hard as this is the first time experiencing it. LOL.


I just picked up a Powercolor 270X Devil edition and am very pleased with it so far. I haven't had a chance to OC yet, but I'm looking forward to it.

I do know the rough limits of these cards from experience with a 7870 I had under a full waterblock. My ASUS DC2 7870 would start to hit a core limit around 1200MHz even with additional power/voltage. I don't know whether the 270X has had any improvements, so you may be able to exceed those clocks. The GCN architecture seems to peak in the high-1200 to low-1300 range when pushing max voltage... unless you get a magical card, of course :-D

I was very pleased to see Powercolor's factory overclock on this 270X at 1180, knowing full well what a 7870 is capable of at 1200+. From all the reviews I have seen of the Devil, it likely won't push far past ~1225, but I'm certainly going to give it a shot. I'll have results from my endeavors up here and in my build log soon.


----------



## Roboyto

Quote:


> Originally Posted by *Anonizer*
> 
> Oh, ok. I'll stick with Unigine and 3DMark for now and FFXIV later! So overclocking then is not recommended since I'm only using antec vp 450.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the help BruceB, Roboyto and Devildog83!
> 
> 
> 
> 
> 
> 
> 
> +


What 80+ rating is that PSU? I'm only using a 450W PSU at the moment, but it is 80+ Gold rated. Rosewill Capstone PSUs are a great product for the price! They're usually made by Super Flower, with wonderful performance across all ranges of load.

I fully intend to find the maximum clocks for this card, and I'll have my i7 3770K pushing at least 4.5GHz.
I'll let you know if I run into any power shortage with my PSU.


----------



## bardacuda

Quote:


> Originally Posted by *Anonizer*
> 
> Hello there! Just got my Sapphire 270X Vapor-X today, and never thought the bottleneck would be this hard as this is the first time experiencing it. LOL.
> 
> 
> 
> 
> 
> 
> 
> 
> Can you guys recommend some programs to test it with?
> I already tested it with Unigine Heaven Benchmark 4.0, is using FurMark safe? I read some posts about it killing gpus.
> Is it safe to put the power limit to max (+50)? Also what does the UEFI button do? I'm using Windows Pro 8.1 that has been updated to the latest updates earlier. Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry for the noob questions, cause this is my first time getting a GPU like this, GPUs I used are Sapphire 5670, and the rest are integrated ones.
> 
> 
> 
> 
> 
> 
> 
> 
> Driver version is 14.3.
> Anyone has an idea on why my GPU-Z isn't like this one
> 
> 
> Spoiler: GPU-Z
> 
> 
> 
> 
> 
> 
> Here's mine.
> 
> 
> Spoiler: Mine!


What the power limit does is throttle the voltage and/or clocks to stay within a certain TDP range. Say your card has a TDP of 150W, but you apply an overclock that would cause it to draw 160W under load: it will drop to a lower power state while under load, reducing performance.

Increasing this slider does not cause the card to suddenly draw any more power than it normally would; it just raises the point at which your card will throttle.

Normally you don't need to touch this, as the card will stay under the TDP even with a healthy overclock, although with my ASUS 270 I had to set it to +20% to keep it from throttling, because the BIOS rates the TDP at a measly 113W (less than it needs, so I just modded the BIOS to give it a 136W TDP and left the power slider at +0%; same difference).

So basically it's fine to increase this as much as you want, and as long as you keep an eye on your temps and don't apply harmful voltage levels it is safe. But if you want it to be useful as a failsafe, it's best to keep it as low as possible, and only increase it IF you experience nuisance throttling, and doing so still does not put you above a harmful thermal threshold.
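The budget check described above can be sketched numerically. This is a toy model of the decision, not AMD's actual PowerTune implementation:

```python
def effective_limit_w(bios_tdp_w, power_slider_pct):
    """Power budget after the power-limit slider is applied."""
    return bios_tdp_w * (1 + power_slider_pct / 100.0)

def throttles(draw_w, bios_tdp_w, power_slider_pct=0):
    """True if the card would drop to a lower power state to stay in budget."""
    return draw_w > effective_limit_w(bios_tdp_w, power_slider_pct)

# The 150W example from the post: a 160W overclock trips the stock limit...
print(throttles(160, 150))      # True
# ...but not with the slider at +20%.
print(throttles(160, 150, 20))  # False

# The ASUS 270 case: a 113W BIOS TDP at +20% gives the same budget as a
# modded 136W TDP at +0% (113 * 1.2 = 135.6 ~ 136W), hence "same difference".
print(round(effective_limit_w(113, 20), 1))  # 135.6
```

The real mechanism also reacts to voltage and temperature, but the arithmetic above is why raising the slider and raising the BIOS TDP are interchangeable here.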


----------



## BruceB

When it comes to PSUs, always go for _quality_ not _quantity_, and don't forget it's not just about power, but also how constant the energy is (some say how 'clean' the energy is) and many other factors.

Check this thread: *My old thread* about halfway down the first page _shilka_ lists a few good PSUs to look at.

Also check this out: *OCN FAQ: Recommended Power Supplies*
I have a Seasonic 660xp (80+ Platinum) and I can say that I've had no problems and that their customer support is really good (I needed some more SATA power cables for my modular PSU; no problem!).

One last thing: like _shilka_ says, *PSU brands are meaningless, look up the OEM*. i.e. Google who made the unit, not whose sticker is on the side!


----------



## bardacuda

Just to add to Bruce's post, which is good info btw (notice I also have a Seasonic in my sig rig): shilka has a thread with links to all the relevant info for the different PSU brands. Basically anything manufactured by Seasonic or Super Flower is usually good quality. Other ones can be good too, but you need to look at them on a case-by-case basis to make a determination.

Shilka's PSU Index Thread


----------



## BruceB

Quote:


> Originally Posted by *bardacuda*
> 
> Just to add to Bruce's post which is good info btw (notice I also have a Seasonic in my sig rig) Silka has a thread that with all the links to all the relevant info for the different PSU brands. Basically anything manufactured by Seasonic or Superflower is usually good quality. Other ones can be good too but you need to look at them on a case-by-case basis to make a determination.
> 
> Silka's PSU Index Thread


Lol. That's the second link in my post!

Great minds think alike!


----------



## Roboyto

Quote:


> Originally Posted by *bardacuda*
> 
> What the power limit does is, it will throttle the voltage and/or clocks to stay within a certain TDP range. So say your card has a TDP of 150W, but you apply an overclock that would cause it to draw 160W under load, it will throttle to a lower power state while under load, reducing performance.
> 
> Increasing this slider does not cause it to suddenly draw any more power than it normally would anyway....it just increases the point at which your card will throttle.
> 
> Normally you don't need to do anything with this as it will still stay under the TDP even with a healthy overclock, although with my ASUS 270 I had to put it at +20% to keep from throttling as the BIOS rates the TDP at a measly 113W (which is less than it needs to be so I just modded the BIOS to make it have a 136W TDP and leave the power slider at +0%....same diff)
> 
> So basically it's fine to increase this as much as you want and as long as you keep an eye on your temps and don't apply harmful voltage levels it is safe, but if you want it to be useful as a failsafe, it's best to keep it as low as possible, and only increase it IF you experience nuisance throttling, and doing so still does not put you above a harmful thermal threshold.


Power is *not* voltage.

The power slider increases the amount of amperage, and thus wattage, the card can draw. It's true it won't do it suddenly; it will only take advantage of that additional amperage under load.

If you're overclocking then you likely do need to increase this on just about any card. When I'm going to OC and bench a card, the power slider is the first thing that gets maxed out.

Your 270 only has (1) PCIe 6-pin power connector, good for 75W; add the 75W the slot provides and your card should be capable of 150W of power. The 270 and 270X are identical spec-wise except for the PCIe power connectors. 270s only have (1) 6-pin PCIe connector, which severely limits their OC capabilities compared to the 270X with (2) 6-pin connectors.

For example, my R9 290 has a 6-pin and 8-pin connector. That would be 75(slot)+ 75(6-pin) + 150(8-pin) = 300W total. When I push my card well past stock clocks, as high as 1300/1700, I have the voltage and power sliders maxed. My card draws 359W under benchmark loads with power at 50% and +200mV in Trixx. See photo below:


The PCIe slot gives 75W of power, a 6-pin is 75W, and an 8-pin is 150W. Your card is capable of 150W max by PCIe specification. Your BIOS had it set to 113W probably because of the capabilities of the cooling and power delivery system on the card; the 270 isn't meant to be an outstanding overclocker. However, you could use the power slider to bring that allowance up, probably very close to or slightly over 150W; you have just gone about it a different way. Don't use the slider on top of your BIOS modification though, as that could allow too much amperage/wattage for your card.
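The spec budgets above (75W slot, 75W per 6-pin, 150W per 8-pin) add up as a simple sum; here's the arithmetic for the cards mentioned, as a quick sketch:

```python
# PCIe spec power budgets: 75W from the slot, 75W per 6-pin,
# 150W per 8-pin auxiliary connector.
PCIE_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(*connectors: str) -> int:
    """Spec maximum: the slot plus the card's external connectors."""
    return PCIE_WATTS["slot"] + sum(PCIE_WATTS[c] for c in connectors)

print(board_power_budget("6-pin"))            # R9 270:  150W
print(board_power_budget("6-pin", "6-pin"))   # R9 270X: 225W
print(board_power_budget("6-pin", "8-pin"))   # R9 290:  300W
```

As noted later in the thread, these are ratings, not hard caps; a card pushed far enough can draw beyond them.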

The card will throttle when it runs into a heat issue. If the card doesn't have enough voltage or power for certain clock speeds on the core/memory it will likely display artifacts, freeze, or BSOD.

If you're using something like Trixx or AB you *probably* won't be able to apply a harmful amount of voltage *unless* you are flashing BIOSes or modding these programs to go outside their designated limits. Some cards respond very well to additional voltage and you will be able to keep increasing your speeds as the voltage goes up...this is not always the case however. Some cards hit a wall and no matter how much additional voltage they receive, the speeds don't increase...this is The Silicon Lottery.

Adding voltage can cause the card to wear/degrade faster than normal, but odds are you won't kill the card with the voltage allowed in these programs UNLESS you allow the card to overheat dramatically and most likely repeatedly. It is unlikely that just by increasing the voltage slider to max in Trixx or AB that you will fry your card.

As you said, you must be mindful of temperatures. The core temperature will cause throttling undoubtedly, but if you let the VRMs get too hot then you can have more serious problems.


----------



## bardacuda

Yes, this is all true. I didn't intend to say that voltage and wattage are the same; rather, the wattage is directly proportional to the square of the voltage, so increasing the voltage will increase the power draw. Power is the product of voltage and amperage (P = E x I, where E is the voltage and I is the amperage), but I is itself proportional to the voltage for a fixed resistance (E = I x R, where R is the resistance). So by increasing the voltage you increase the amperage in the same proportion, hence the power increasing with the _square_ of the voltage. An increase in frequency also increases power, but the effect is small compared to the voltage increase alone, so we can basically ignore it.

Anyway, just be aware that increasing voltage increases your power draw quadratically, and if you overclock you _may_ need to raise the power limit to avoid throttling, which in turn makes the power limit less useful as a failsafe. The point, though, is that increasing the power limit alone, without changing the applied voltage or load, does not do anything in and of itself.
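The P = E x I and E = I x R relationships above combine into P = E²/R, which is where the square comes from. A tiny worked example (the voltages are made up):

```python
# P = E * I and E = I * R  =>  P = E**2 / R, so at fixed resistance
# the relative power draw scales with the square of the voltage ratio.

def power_scale(v_new: float, v_old: float) -> float:
    """Relative power draw after a voltage change, fixed resistance."""
    return (v_new / v_old) ** 2

# A 10% voltage bump, e.g. 1.20V -> 1.32V, costs ~21% more power:
print(power_scale(1.32, 1.20))   # ~1.21x
```

This is why a modest-looking voltage increase can push a card past its power limit even when the clocks barely change.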


----------



## BruceB

Quote:


> Originally Posted by *Roboyto*
> 
> For example, my R9 290 has a 6-pin and 8-pin connector. That would be *75(slot)+ 75(6-pin) + 150(8-pin) = 300W total*. When I push my card well past stock clocks, as high as 1300/1700, I have the voltage and power sliders maxed. *My card draws 359W* under benchmark loads with power at 50% and +200mV in Trixx. See photo below:


Drawing 359W from a 300W input.
Just out of interest: does that work OK?
I expected the card to shut down if it tried to draw more power than it could.


----------



## shilka

Quote:


> Originally Posted by *Roboyto*
> 
> What 80+ rating is that PSU? I'm only using a 450W PSU at the moment, it is 80+ Gold rated however. Rosewill Capstone PSUs are an extremely great product for the price! Usually made by Superflower with wonderful performance under all ranges of load.
> 
> I fully intend to find the maximum clocks for this card and will have my i7 3770k pushing at least 4.5Ghz
> I'll let you know if I run into any power shortage from my PSU


Sorry I'm a little late in here. Anyway, what I wanted to say is that 80 Plus has nothing to do with quality.
You can find junk that's Gold rated; the most famous (or infamous) examples are the Raidmax AE and the EVGA SuperNOVA NEX models.
Quote:


> Originally Posted by *BruceB*
> 
> One last Thing: like _shilka_ says *PSU brands are meaningless, look up the OEM*. i.e. Google who made the unit, not who's sticker is on the side!


This is better than Google:

http://www.realhardtechx.com/index_archivos/Page5471.htm


----------



## FtW 420

Quote:


> Originally Posted by *BruceB*
> 
> Drawing 359W from 300W Input.
> Just out of interest: does that work ok?
> I expected the card to shut down if it tried to draw more power than it could.


The power connectors' specs are just ratings; the 6-pin PCI-E plugs are rated for 75W like the PCI-E slot, but a card can still pull 500W+ from 2 x 6-pin connectors & the slot if you push it that far.


----------



## bardacuda

Quote:


> Originally Posted by *BruceB*
> 
> Drawing 359W from 300W Input.
> Just out of interest: does that work ok?
> I expected the card to shut down if it tried to draw more power than it could.


That would probably have more to do with your motherboard and PSU, since it is drawing more wattage from the PCI-E slot and/or 6-pin/8-pin connectors than they're rated for. I always wondered, though, whether the extra 59W in this case gets split among the 3 sources (and in what proportions), or whether the PCI-E slot is hard-capped at 75W and all the extra power comes from the PSU cables. I would be wary of drawing extra power through the PCI-E slot, although my concerns are not necessarily justified and it probably depends on the quality of the board. Drawing the extra power from the cables only would just make them produce more heat than they are supposed to, and I doubt that would really be a problem in a well-ventilated case, although it probably puts more stress on the PSU internally as well.


----------



## BruceB

OK, so these numbers are guideline minimums rather than a hard-and-fast max value then? Good to know; makes it even more worthwhile to get decent components!


----------



## shilka

PCI-E cables can and often do carry more power than they are rated for. Not that anything bad is going to happen; the cables can deal with the extra power.


----------



## Anonizer

Quote:


> Originally Posted by *Roboyto*
> 
> What 80+ rating is that PSU? I'm only using a 450W PSU at the moment, it is 80+ Gold rated however. Rosewill Capstone PSUs are an extremely great product for the price! Usually made by Superflower with wonderful performance under all ranges of load.
> 
> I fully intend to find the maximum clocks for this card and will have my i7 3770k pushing at least 4.5Ghz
> I'll let you know if I run into any power shortage from my PSU


Yes, the 270X Devil's clocks are pretty high! I'll be waiting for the results of your 270X Devil! :thumbsup: I hope you have a magical card there.







The reviews say it can't qualify for 80+ certification because it doesn't have a PFC circuit, though the efficiency of the PSU is at 80~85%. Overclocks everywhere! Thanks! :thumbsup:
Quote:


> Originally Posted by *bardacuda*
> 
> What the power limit does is, it will throttle the voltage and/or clocks to stay within a certain TDP range. So say your card has a TDP of 150W, but you apply an overclock that would cause it to draw 160W under load, it will throttle to a lower power state while under load, reducing performance.
> 
> Increasing this slider does not cause it to suddenly draw any more power than it normally would anyway....it just increases the point at which your card will throttle.
> 
> Normally you don't need to do anything with this as it will still stay under the TDP even with a healthy overclock, although with my ASUS 270 I had to put it at +20% to keep from throttling as the BIOS rates the TDP at a measly 113W (which is less than it needs to be so I just modded the BIOS to make it have a 136W TDP and leave the power slider at +0%....same diff)
> 
> So basically it's fine to increase this as much as you want and as long as you keep an eye on your temps and don't apply harmful voltage levels it is safe, but if you want it to be useful as a failsafe, it's best to keep it as low as possible, and only increase it IF you experience nuisance throttling, and doing so still does not put you above a harmful thermal threshold.


Oh, thanks for the informative posts bardacuda, Roboyto. That clears things up about the power limit. I'll take note of that one. +

-

I've been overclocking the card. It runs Heaven fine at 1170/1450, but gives a red screen in the Fire Strike portion of 3DMark until I lowered it to 1140/1450. As for the memory, I can only get it to 1460; anything above causes an "AMD driver has stopped responding" in Heaven, so I didn't bother to test it in 3DMark. I didn't adjust anything besides those two; fan speed at 70%, and the temps never got above 60C. Power limit and voltage at defaults though.


----------



## bardacuda

Quote:


> Originally Posted by *Anonizer*
> 
> Yes, the 270x devil's clocks are pretty high! I'll be waiting the results of your 270x devil! :thumbsup: I hope you have a magical card there
> 
> 
> 
> 
> 
> 
> 
> The reviews says that it can't qualify for 80+ certification cause it doesn't have a PFC circuit, though the efficiency of the PSU is at 80~85%. Overclocks everywhere! Thanks! :thumbsup:
> Oh, thanks for the informative posts bardacuda, roboyto. That clears up the things about that power limit thing. I'll take note of that one. +
> 
> -
> 
> Had been overclocking the card, runs fine at heaven at 1170/1450 but gives a red screen at the firestrike portion of 3dMark until I lowered it at 1140/1450. As for the memory I can only get it to 1460 anything above will cause a "amd driver has stopped responding" at heaven, so didn't bother to test it on 3dMark. I didn't adjust anything besides those two, fan speed at 70% and the temps never got up to 60c. Power limit and voltage at defaults though.


What type of memory do your cards have? Mine both have Elpida and are basically unstable over 1400.







I can get through the Valley benchmark at 1500 (and even 1600, although with some pretty severe artifacting), but any increase over 1500 results in a loss of performance anyway, and for long-term stability (24/7 mining) 1500 doesn't work for me. When trying to find your max stable clocks, it's best to leave the memory at stock (normally 1400) and increase only the core (and voltage) until you find your max core, running your stability tests after each change to verify. After you've done that, leave the core and voltage at your newly found maximums, then start increasing the memory, stability testing after each increase, until you've found your max mem clock. Then you're done.

With this method, when you find an instability, you'll know whether it's the core or the mem that's the problem and can adjust the right one.

Keep in mind you may have to increase your power limit too, and that's perfectly fine...just make sure you keep your voltage and temps in check. If you notice a drop in performance (or even if you don't) when you're doing your stability testing, check AB or the GPU-Z sensors tab to make sure your GPU usage is staying near 100% and your clocks haven't adjusted downward. If they have, that means you need to increase the power limit. Remember, the power limit is just that...the limit on how much power your GPU will allow itself to draw...and is just an extra safety measure. Putting it to +50% is basically saying "I don't want this safety feature, turn it off." Putting it to +20% is more like saying "I still want this safety feature to work, but what I consider safe is higher than what the manufacturer considers safe, because I'm keeping an eye on my volts and temps anyway."
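The core-first, then-memory procedure above boils down to a simple loop. A rough sketch, where `stability_test` stands in for a real Heaven/Valley/3DMark run and the step sizes and limits are illustrative, not recommendations:

```python
# Sketch of the one-variable-at-a-time clock search described above.
# stability_test is a placeholder for a real benchmark pass/fail run.

def find_max(start_mhz, step_mhz, hard_limit_mhz, stability_test):
    """Raise one clock in small steps until a test fails, then back off."""
    clock = start_mhz
    while (clock + step_mhz <= hard_limit_mhz
           and stability_test(clock + step_mhz)):
        clock += step_mhz
    return clock

# Pretend this card becomes unstable above 1140 MHz core, 1450 MHz mem:
core = find_max(1050, 10, 1300, lambda mhz: mhz <= 1140)  # mem left at stock
mem = find_max(1400, 10, 1600, lambda mhz: mhz <= 1450)   # core fixed at max
print(core, mem)  # 1140 1450
```

Because only one clock moves between tests, any failure points unambiguously at the core or the memory, which is the whole benefit of the method.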


----------



## BruceB

@Anonizer:
I'd love to see your _Fire Strike_ benchmark when you've run it!


----------



## Anonizer

Quote:


> Originally Posted by *bardacuda*
> 
> What type of memory do your cards have? Mine both have elpida and are basically unstable over 1400
> 
> 
> 
> 
> 
> 
> 
> I can get through Valley benchmark at 1500 (and even 1600 although with some pretty severe artifacting) although any increase over 1500 results in a loss in performance anyway, but for long-term stability (24/7 mining) 1500 doesn't work for me. When trying to find your max stable clocks, it's best to leave the memory at stock (normally 1400), and increase only the core (and voltage) till you find your max core, doing your stability tests after each change to verify. After you've done that, leave the core and voltage set to your newly found maximums, and then start increasing the memory, and stability testing after each increase until you've find your max mem clock. Then you're done.
> 
> With this method, when you find an instability, you'll know whether it's the core or the mem that's the problem and can adjust the right one.
> 
> Keep in mind you may have to increase your power limit too and it's perfectly fine...just make sure you keep your voltage and temps in check. If you notice a drop in performance (or even if you don't) when you're doing your stability testing, check AB, or GPU-Z sensor tab, to make sure your GPU usage is staying near 100% and your clocks haven't adjusted downward. If they have that means that you need to increase the power limit. Remember power limit is just that...the limit to how much power your GPU will allow itself to draw...and is just an extra safety measure. Putting it to +50% is basically saying "I don't want this safety feature, turn it off." Putting it to +20% is more like saying "I still want this safety feature to work, but what I consider safe is higher than what the manufacturer considers safe, because it is, because I'm being careful about my volts and temps anyway."


I have Elpida too.







Yes, I experienced it before with my 5670; I couldn't even add +50 to the memory clock. lol. I've been doing +10MHz per run on the core first, after every round of successful stability tests, then proceeding to the memory, though sometimes I want to skip it since it'll probably crash, or when it passes it comes with a performance loss.







I'll probably try to fiddle with the voltage and power limits tomorrow. I don't want to do any harm to the card because of my inexperience, since I still have no idea what the "safe" max voltage is or how much I should increase it each try.







Thanks for the advice!







I hope this will overclock well when voltage and power limit come into play.
Quote:


> Originally Posted by *BruceB*
> 
> @Anonizer:
> I'd love to see your _Fire Strike_ benchmark when you"ve run it!!





Spoiler: Firestrike with 14.3 and 1140/1450







Will try to run again using 14.4!


----------



## BruceB

Quote:


> Originally Posted by *Anonizer*
> 
> Will try to run again using 14.4!












That's a good score, I think an OC'd 270X might be the sweet spot for price/performance.


----------



## Bruska

Please add me: Sapphire Vapor-X R9 270X --> 1250 - 1500

By the way, is this result good for my OC?


----------



## bardacuda

Quote:


> Originally Posted by *Bruska*
> 
> Please add me: Shapphire Vapor X r9 270X --> 1250 - 1500
> 
> Between, is this result good for my OC?
> 
> 
> Spoiler: Warning: Spoiler!


That's pretty much exactly where it should be with those clocks. If you tweak the texture filtering in catalyst you should get another fps or two.



You should do a bench for the valley thread







Right now I'm the only 270 (X or no) posted over there and it's kinda lonely. I only have ppl's old 7870 scores to compare it to









http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/0_100


----------



## Anonizer

@Bruska: I think that's pretty good! 1250!







What memory type do you have?


----------



## Bruska

Quote:


> Originally Posted by *bardacuda*
> 
> That's pretty much exactly where it should be with those clocks. If you tweak the texture filtering in catalyst you should get another fps or two.
> 
> 
> 
> You should do a bench for the valley thread
> 
> 
> 
> 
> 
> 
> 
> Right now I'm the only 270 (X or no) posted over there and it's kinda lonely. I only have ppl's old 7870 scores to compare it to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/0_100


Thanks for the tip, didn't know that. I'll change it and see what happens. Does it also affect in-game performance?
Quote:


> Originally Posted by *Anonizer*
> 
> @Bruska: I think that's pretty good! 1250!
> 
> 
> 
> 
> 
> 
> 
> What memory type do you have?


I have Hynix memory, but I set it to only 1500 in the benchmark because it crashes in games.


----------



## bardacuda

Quote:


> Originally Posted by *Bruska*
> 
> Thanks for the tip, didn't know that. I'll change it and see what happens. It also affects in game performance?


No, because you don't change the global settings. You have to specifically add valley.exe under the 3D Application Settings tab in CCC, and then change the settings for that program only. The Valley thread I linked has all the details on how to do this in the original post, plus some tips for other things you can do to squeeze out the last bit of performance (setting the process to real-time, ending explorer.exe and other non-essential processes in Task Manager, etc.), although those other tweaks do basically nothing for your score compared to the texture filtering tweaks.


----------



## Roboyto

Quote:
Originally Posted by *BruceB* 

Drawing 359W from 300W Input.
Just out of interest: does that work ok?
I expected the card to shut down if it tried to draw more power than it could.

Worked just fine for all the benches I was running; 359W was the highest I saw. I was using 3DMark11, Unigine Heaven/Valley, and FFXIV benchmark.

If you can dissipate the heat then you can push them further than intended. Keeping the components cooler increases their efficiency as well. Check that link out, by attaching an AIO to a 290X they got a measurable drop in power consumption of 26W.

http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/4

Quote:
Originally Posted by *FtW 420* 

The power connectors' rated specs are ratings, the 6 pin pci-e plugs are rated for 75W like the pci-e slot, but a card can still pull 500W + from 2 x 6pin connectors & the slot if you push it that far.

I'm sure they can, and from the looks of your avatar that probably happens in scenarios where LN2 is involved; an unlikely scenario for most of us.

Quote:
Originally Posted by *shilka* 

Sorry I'm a little late in here. Anyway, what I wanted to say is that 80 Plus has nothing to do with quality.
You can find junk that's Gold rated; the most famous (or infamous) examples are the Raidmax AE and the EVGA SuperNOVA NEX models.
This is better then google

http://www.realhardtechx.com/index_archivos/Page5471.htm

I wouldn't recommend anything Raidmax to anyone, probably ever. *Odds are* you are better off with something that has a high rating than with something that has no rating at all. A little research before purchasing a PSU will tell you whether or not the rating holds true.

Thanks for that link









Quote:
Originally Posted by *BruceB* 










That's a good score, I think an OC'd 270X might be the sweet spot for price/performance.

The 7870/270X is arguably the price/performance king, especially if you're running 1080p. $200 is a great deal for the performance you get with these cards.

Quote:
Originally Posted by *bardacuda* 

When trying to find your max stable clocks, it's best to leave the memory at stock (normally 1400), and increase only the core (and voltage) till you find your max core, doing your stability tests after each change to verify. After you've done that, leave the core and voltage set to your newly found maximums, and then start increasing the memory, and stability testing after each increase until you've find your max mem clock. Then you're done.

With this method, when you find an instability, you'll know whether it's the core or the mem that's the problem and can adjust the right one.

Keep in mind you may have to increase your power limit too and it's perfectly fine...just make sure you keep your voltage and temps in check.

That's precisely how I get my overclocks. It is time consuming, but if you want to know your card's true maximum capabilities, this is it.

Keep track of your clocks, volts, temps, and bench scores when you do this. Eventually you'll see a trend showing where pushing the card past a certain point stops yielding much of a performance gain.

My 290 is capable of ~1300/1700 +200mV with power maxed. I haven't run my card at those clocks since I found them because it is extra unnecessary strain on the card, my MoBo, and PSU. After looking at my OC/Bench spreadsheet I found that 1200/1500 was where I was getting 90-95% of the performance of abusing the card at absolute max. For gaming the card is at 1200/1500 +87mV.
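The spreadsheet trend described above is easy to see if you log (clock, score) pairs and look at the marginal gain per step. A sketch with made-up sample numbers:

```python
# Log (core clock MHz, bench score) pairs and compute the marginal gain
# per MHz between runs to spot diminishing returns. Sample data is made up.

runs = [(1100, 10000), (1200, 10800), (1250, 11050), (1300, 11150)]

for (c0, s0), (c1, s1) in zip(runs, runs[1:]):
    gain_per_mhz = (s1 - s0) / (c1 - c0)
    print(f"{c0}->{c1} MHz: {gain_per_mhz:.1f} points/MHz")
```

The points-per-MHz figure shrinks each step (8.0, then 5.0, then 2.0 here); the sensible 24/7 clock is where that figure falls off, not the absolute maximum.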

Quote:
Originally Posted by *Anonizer* 

Yes, the 270x devil's clocks are pretty high! I'll be waiting the results of your 270x devil! :thumbsup: *I hope you have a magical card there*







The reviews says that it can't qualify for 80+ certification cause it doesn't have a PFC circuit, though the efficiency of the PSU is at 80~85%. Overclocks everywhere! Thanks! :thumbsup:
Oh, thanks for the informative posts bardacuda, roboyto. That clears up the things about that power limit thing. I'll take note of that one. +
-
Had been overclocking the card, runs fine at heaven at 1170/1450 but gives a red screen at the firestrike portion of 3dMark until I lowered it at 1140/1450. As for the memory I can only get it to 1460 anything above will cause a "amd driver has stopped responding" at heaven, so didn't bother to test it on 3dMark. I didn't adjust anything besides those two, fan speed at 70% and the temps never got up to 60c. *Power limit and voltage at defaults though.*

Me too







I have been fortunate with nearly all of my AMD card purchases. I have had phenomenal performers: (3) HD4890s, an HD7770, HD7870, HD7950, and my R9 290.

My buddy just bought a Visiontek R9 280X, so we'll be testing it out very soon. He ordered a Kraken G10 off of ebay last night and a Corsair H55 to install on it









If your card is hitting 1160 on stock volts/power it sounds like you *may* have a very good performer there. The GPU core can safely run in excess of 90C, and most VRMs are capable of tolerating over 100C. I wouldn't really suggest letting either of them hang in the 80+ range for extended periods of time however.

Quote:
Originally Posted by *Anonizer* 


> I'll probably try to fiddle with the voltage and power limits tomorrow, since I don't want to do any harm to the card cause of my inexperience since I still have no idea on what voltage should be the "safe" max and how much should I increase every try.
> 
> 
> 
> 
> 
> 
> 
> Thanks for the advise!
> 
> 
> 
> 
> 
> 
> 
> I'll hope this will overclock well when voltage and power limit comes into play.


Small jumps in voltage, maybe 7-10mV each time. With air cooling you will likely run into problems keeping the card cool enough before you get outside of safe voltages. Make sure you keep an eye on those VRM temps once you start adding volts/power


----------



## Devildog83

Quote:


> Originally Posted by *Bruska*
> 
> Please add me: Shapphire Vapor X r9 270X --> 1250 - 1500
> 
> Between, is this result good for my OC?


You have been added. Welcome!!! Not a bad Valley score. Here is an HD7870 Devil run I did a while back; I can't get the 270X that high and won't try now because they are in CrossFire.


----------



## Anonizer

Quote:


> Originally Posted by *Roboyto*
> 
> That's precisely how I get my overclocks. It is time consuming, but if you want to know your cards true maximum capabilities this is it.
> 
> Keep track of your clocks, volts, temps, and bench scores when you do this. You'll start to see a trend eventually for where pushing the card past a certain point isn't yielding that much of a performance gain.
> 
> My 290 is capable of ~1300/1700 +200mV with power maxed. I haven't run my card at those clocks since I found them because it is extra unnecessary strain on the card, my MoBo, and PSU. After looking at my OC/Bench spreadsheet I found that 1200/1500 was where I was getting 90-95% of the performance of abusing the card at absolute max. For gaming the card is at 1200/1500 +87mV.
> 
> Me too
> 
> 
> 
> 
> 
> 
> 
> I have been fortunate with nearly all of my AMD card purchases. I have had phenomenal performers of: (3) HD4890s, HD7770, HD7870, HD7950, and my R9 290.
> 
> My buddy just bought a Visiontek R9 280X, so we'll be testing it out very soon. He ordered a Kraken G10 off of ebay last night and a Corsair H55 to install on it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If your card is hitting 1160 on stock volts/power it sounds like you may have a very *good* performer there. The GPU core can safely run in excess of 90C, and most VRMs are capable of tolerating over 100C. I wouldn't really suggest letting either of them hang in the 80+ range for extended periods of time however.
> 
> Small jumps in voltage, maybe 7-10mV each time. With air cooling you will likely run into problems keeping the card cool enough before you get outside of safe voltages. Make sure you keep an eye on those VRM temps once you start adding volts/power


Yeah, it's definitely time consuming, especially when you run 3DMark only to find out it crashes at the last part of Fire Strike.







Thanks, noted!
Overclocks!







I hope so, so that I can have more fun with this new card. I find overclocking more fun than playing games; not sure what's the problem with me. lol. There's a problem though: I don't know how to check my VRM temps. HWiNFO64 only shows VRM Power Out, and GPU-Z only shows up to Memory Usage (Dynamic) and VDDC.

=

I just searched for BIOS versions of the R9 270X Vapor-X:


Spoiler: Bios



http://www.techpowerup.com/vgabios/index.php?manufacturer=Sapphire&model=R9+270X


Mine's "015.039.000.001.000000" and the latest one is "015.041.000.000.000000"
Should I upgrade? Is it safe?


----------



## Roboyto

Quote:


> Originally Posted by *Anonizer*
> 
> Yeah, it's definitely time consuming especially when running 3dMark then find out that it'll crash at the last part of firestrike.
> 
> 
> 
> 
> 
> 
> 
> Thanks, noted!
> Overclocks!
> 
> 
> 
> 
> 
> 
> 
> I hope, so that I can have more fun with this new card. I find overclocking more fun than playing games, not sure what's the problem with me. lol. There's a problem though, I don't know how to check my VRM temps, HWinfo64 only shows VRM Power Out, GPU-Z only shows up to Memory Usage (Dynamic) and VDDC.
> 
> =
> 
> I just searched about bios versions of r9 270x vapor-x
> 
> 
> Spoiler: Bios
> 
> 
> 
> http://www.techpowerup.com/vgabios/index.php?manufacturer=Sapphire&model=R9+270X
> 
> 
> Mine's "015.039.000.001.000000" and the latest one is "015.041.000.000.000000"
> Should I upgrade? Is it safe?


I already forgot about your limited list of stats in GPU-Z...hmm, I'm not sure why your card is like that. My Devil 270X has the VRM temperature.

Maybe try contacting the manufacturer regarding the BIOS flash and the seemingly small list of statistics shown in GPU-Z and HWiNFO. I can't give you a definitive answer on how safe the flash is...it may work fine, or it may not.


----------



## Anonizer

Quote:


> Originally Posted by *Roboyto*
> 
> I forgot already about your limited list of stats in GPU-Z...hmm, I'm not sure why your card is like that. My Devil 270X has the VRM temperature.
> 
> Maybe try contacting manufacturer regarding the BIOS flash and the seemingly small list of statistics shown in GPU-Z and HWiNFO. I can't give you an informative answer on how safe the flash is...It may work fine, and it may not.


Elpida memory that can't be overclocked + no VRM temps. Oh why









I already messaged them, I hope they reply soon!









=

Stock clocks, power limit, voltage.


Spoiler: Fire Strike Run







Core - 1210 Mem - 1450 VDDC - 1240 Power Limit - 0 (Not changed)

Max temps not exceeding 65c at 100% fan speed.


Spoiler: Fire Strike Run







When I add ~5MHz to the core/memory it results in a red screen in Fire Strike. I tested +50 on the power limit and 1300 VDDC, still a red screen.


----------



## bardacuda

@Anonizer

Sounds like you've reached the memory limit. The memory doesn't respond to the voltage slider, since that only changes the voltage applied to the core. A red screen is also the kind of thing that happens with unstable VRAM. An unstable core would more likely result in flickering artifacts here and there (but not the whole screen) or a BSOD/reboot.

Going that far above stock voltage should easily get you another 5MHz on the core (and more likely up to 1250+ MHz stable).

This is why you should only increase one thing or the other between stability tests, but again it looks like you've already found your max mem clock (1450), so just leave it there and focus on the core.

EDIT: Just wanted to add: on air, 1300mV VDDC is probably around the range where you want to start thinking about your VRM temps. Since you don't have a sensor, you'll basically have to be careful and cross your fingers. If you're going to play around with 1.3V or more, you should at least have a fan blowing directly on the PCB of the card. I like to just take the side panel off the case and blast it with a regular stand fan or a box fan, like a person would use in their living room.
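The change-one-setting-between-stability-tests approach described above can be sketched as a simple loop. This is just an illustration of the procedure, not a real overclocking API; `is_stable` is a hypothetical stand-in for an actual benchmark pass:

```python
def find_max_stable(start_mhz, step_mhz, is_stable, limit_mhz=1500):
    """Raise one clock (core OR memory, never both) one step at a
    time, keeping every other setting fixed, until the stability
    test fails; return the last clock that passed."""
    clock = start_mhz
    while clock + step_mhz <= limit_mhz and is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Example with a pretend card whose core red-screens above 1210 MHz:
print(find_max_stable(1100, 10, lambda mhz: mhz <= 1210))  # 1210
```

The point of stepping only one variable is that when a run crashes, you know exactly which change caused it.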


----------



## Anonizer

@bardacuda

I see! Thanks! I overclocked the core first until I found the highest value that doesn't give me a red screen (1210). I didn't change the memory while doing that; the default memory clock of this card is 1450.
While overclocking this card, the only things I have experienced are a red screen that makes the PC reboot after 5 seconds, and "AMD display driver stopped responding".

The card will stop functioning if something goes wrong with the VRMs, right? In MSI AB the core voltage slider is grayed out even after enabling unofficial overclocking and checking the unlock voltage options. In Sapphire Trixx the max VDDC value is 1300. Guess I'm lucky, since my side panel fan blows directly at the PCB of this card, it's pretty close, and it runs at max speed.

So if I wanted to add another couple of MHz on the core I should downclock the memory?


----------



## Recr3ational

If you're at 1.3V on air, those crashes mean either overheating or lack of voltage. What was the core clock?


----------



## bardacuda

Ohh ok...hmm...sounds like it's a core stability issue then and not the mem in that case. Please ignore my last post lol. I wouldn't downclock the memory at this point, and you probably have some headroom to actually increase the memory, but you need to work out the issues with the core first.

It is very strange that 1210 MHz works with 1.24V though...and 1215 MHz crashes with 1.3V. I mean...you do hit a wall at some point, but really it's more of a "slope that gets very steep very quickly" than an actual wall. An extra 60mV should get you an extra 5MHz even if 1210MHz is your "wall".

EDIT: About the VRMs, the answer is yes. If you kill your VRM circuits you kill your card and you will have to hope they take an RMA.

Maybe check in the GPU-Z sensors tab to verify that Trixx actually applied 1.3V and you weren't still running at 1.24...cuz like I say, it's very strange that only a 5MHz increase would become unstable with an extra 60mV applied.

EDIT2: If you're using the Basic version of 3DMark, although it is a good stability test, I'd recommend something else for a quick stability check before sitting through 3 different demos and benchmarks. Something like MSI Kombustor or FurMark would be OK for a quick 5-minute check, or try one of the Unigine benchmarks for a longer but less intensive test (still way quicker than 3DMark). After you've found the general range of where your wall is, plus your thermal and voltage limits, then try getting 3DMark stable last. You may have to lower your OC a bit to stay stable in Fire Strike and under the thermal ceiling after sitting through all the other demos and benches along the way, as opposed to, say, Valley. But at least you'll know you're already very close to stable and will probably only have to knock off 10-20MHz or add 10-20mV or so. Should save a lot of time.
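That cheap-tests-first, 3DMark-last ordering can be sketched as a tiered check. The tier names and pass/fail conditions below are made up for illustration; the idea is just that a failure in an early, short tier saves you from sitting through a full 3DMark run:

```python
def tiered_check(clock, tests):
    """tests: (name, run_fn) pairs ordered cheapest-first.
    Bail out at the first failure, so an unstable clock costs
    only a few minutes instead of a full benchmark suite."""
    for name, run_fn in tests:
        if not run_fn(clock):
            return name  # which tier failed
    return None  # passed everything

# Pretend tiers: a FurMark-style quick burn, Valley, then Fire Strike,
# each with its own (fake) crash threshold:
tests = [
    ("quick burn (5 min)", lambda c: c <= 1230),
    ("Unigine Valley",     lambda c: c <= 1220),
    ("Fire Strike",        lambda c: c <= 1210),
]
print(tiered_check(1225, tests))  # Valley fails before Fire Strike runs
print(tiered_check(1200, tests))  # None -> passed all tiers
```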


----------



## BruceB

Just a quick question about OC capabilities of various cards:
My PowerColor 280X is made by _Tul Corp._, but when you look at 3DMark's leaderboard for single 280X GPUs, most of the top-ranking/highest-overclocked cards are from _PC Partner_.
Which brands are manufactured by _PC Partner_? Why are those cards better for OCing?


----------



## Majentrix

Sapphire is PC Partner, and the top cards are probably the Toxic edition 280Xs, which use top-binned GPUs and high-quality memory.


----------



## Xtreme21

I just picked up a XFX DD R9 280 (non-X) Black Edition!



Installed yesterday and today I'm going to push her to her limits.


----------



## Roboyto

Quote:


> Originally Posted by *Anonizer*
> @bardacuda
> 
> I see! Thanks! I overclocked the core first until I found the value that doesn't give me a red screen (1210). I didn't change the memory while doing that, the default memory clock of this card is 1450.
> When overclocking this card the only things I have experienced are "red screen that makes pc reboot after 5 seconds" and "amd display driver stopped responding".
> 
> The card will stop functioning if something goes wrong with the VRMs right? On MSI AB the core voltage slider is grayed out even after enabling unofficial overclocking and checking the unlock voltage options. On sapphire trixx the max value of the *VDDC is 1300*. Guess I'm lucky since my side panel fan blows directly to the PCB of this card and it's pretty near too, it also runs at max speed.
> 
> *So if I wanted to add another couple of MHz on the core I should downclock the memory?*


You may have just hit the limit of your card, honestly. You have to figure the 270X is still a 7870, which at stock runs a 1000MHz core clock. Getting to 1200 is a 20% boost, which is a respectable OC; I have had cards that get far less than 20%. _You're just seeing less overclocking headroom because your card came with an overclock of 1100MHz already._

My ASUS DC2 only got a little further than your clocks with similar voltage while being fully water cooled. I don't know how many of these cards get far past 1200.

The core speed is much more important on these cards. If you have overclocked the RAM, reducing it back to stock speeds might help you push the core a little further.

I've been trying to locate a picture of the heatsink on that card and I can't find one. You may want to remove the heatsink to see if/how it is cooling the VRM.

It looks like there are 8 screws in total to remove your HSF.


I checked EK's CoolingConfigurator website and it does *NOT* look like your card has heatsinks on the VRMs; this is likely inhibiting your ability to overclock. The VRMs for the core/board are marked in yellow; there are 5 in total, the little black squares. The orange one is the VRM for the RAM, which is less important. At the very least, the lack of heatsinks will reduce the longevity of the card.



If you're at 1.3V then you are likely pushing the limits of air cooling for the VRM, especially if your card does not have heatsinks on the VRM.

This is a picture of my Devil with the heatsink/fan removed; I have the VRM heatsinks marked in yellow. Your card should look similar to this, but may not be exact. The Devil 270X has upgraded VRMs for enhanced OCing and stability.


If you want to add heatsinks, it's not a terribly difficult procedure; you would just have to make sure they clear the fins of the cooler.

It's surprising that the Vapor-X edition doesn't have VRM sinks; my HD4890 did.


----------



## BruceB

Quote:


> Originally Posted by *Majentrix*
> 
> Sapphire is PC Partner, and the top cards are probably the Toxic edition 280xs, which use top binned GPUs and high-quality memory.


Sapphire _is_ PC Partner?! That's interesting. Do they make cards for other brands?
The Toxic is a sick card, it pretty much dominates the 280X scene. Has anyone OC'd a Powercolor before?


----------



## Devildog83

Quote:


> Originally Posted by *BruceB*
> 
> Sapphire _is_ PC Partner?! That's interesting. Do they make cards for other brands?
> The Toxic is a sick card, it pretty much dominates the 280X scene. Has anyone OC'd a Powercolor before?


Not a 280x but I have overclocked the hell out of my Powercolor 270x and 7870.


----------



## Roboyto

Quote:


> Originally Posted by *BruceB*
> 
> Sapphire _is_ PC Partner?! That's interesting. Do they make cards for other brands?
> The Toxic is a sick card, it pretty much dominates the 280X scene. Has anyone OC'd a Powercolor before?


I'm in the midst of OCing my Powercolor 270X Devil Edition. It came at 1180/1400 out of the box and it has more headroom than I had anticipated. I will be posting results soon.


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> I'm in the midst of OCing my Powercolor 270X Devil Edition. It came at 1180/1400 out of the box and it has more headroom than I had anticipated. I will be posting results soon.


I've been looking back at some of my old clocks of the 7870 Devil and found this. Probably close to the highest stable core clock, but the memory could have gone higher; I just don't think it made a performance difference. Both the 7870 and 270x Devils have 6GHz memory chips, so you'd think it would go higher. The 270x is stable at 1240/1575. Can't find any clips of the 270x clocks because it went almost straight to X-Fire.


----------



## Recr3ational

Quote:


> Originally Posted by *BruceB*
> 
> Sapphire _is_ PC Partner?! That's interesting. Do they make cards for other brands?
> The Toxic is a sick card, it pretty much dominates the 280X scene. Has anyone OC'd a Powercolor before?


I've got the PowerColor 280X.
It's at 1250/1650 atm. It would go so much higher, but the voltage limits it.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> I've been looking back at some of my old clocks of the 7870 Devil and found this. Probably close to the highest stable core clock, but the memory could have gone higher; I just don't think it made a performance difference. Both the 7870 and 270x Devils have 6GHz memory chips, so you'd think it would go higher. The 270x is stable at 1240/1575.


Are they 6GHz chips that are downclocked? The 270X Devil is only at 5.6GHz out of the box, which is 600MHz over the reference 7870.


----------



## Horsemama1956

Traded my 770 for an MSI 270X Gaming and cash. Which drivers should I use, the 13.12s or a beta driver? I don't play BF4 or Thief, so I don't need Mantle.

I doubt I'm going to bother pushing it any further.
http://www.techpowerup.com/gpuz/4ngwx/


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> Are they 6GHz chips that are down clocked? 270X Devil only at 5.6GHz out of the box, which is 600MHz over reference 7870


I believe so; the memory controller or the BIOS may lock them down or something. The 270x will go past that with ease, to 6350 or so. I would much rather have higher core clocks, but I think the volt lock keeps the 270x a bit low.

You have been added. Let me know when you get your overclock done and I will post clocks. You got a pic?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> I've got the PowerColor 280X.
> It's at 1250/1650 atm. It would go so much higher, but the voltage limits it.


Hey Rec, how is the PSU cover coming? By the way, what are you making it out of?


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> I believe so, the memory controller or the bios may lock them down or something. The 270x will go past that with ease to 6350 or so. I would much rather have higher core clocks but I think volt lock keeps the 270x a bit low.
> 
> You have been added. Let me know when you get your overclock done and I will post clocks. You got a pic?




Best I have at the moment, using my phone while benching from my comfy couch :-D

The 270X is in my HTPC for my wife. The new Batman is coming out soon and wasn't available on PS3...so I sold my PS3 to upgrade my HTPC for Batman.


----------



## diggiddi

Quote:


> Originally Posted by *Horsemama1956*
> 
> Traded my 770 for an MSI 270X Gaming and cash. Which drivers should I use, the 13.12s or a beta driver? I don't play BF4 or Thief, so I don't need Mantle.
> 
> I doubt I'm going to bother pushing it any further.
> http://www.techpowerup.com/gpuz/4ngwx/


Just go for the beta; if it has issues, then the WHQL.


----------



## boot318

Hi guys! I have a Sapphire Vapor-X R9 280X and am currently overclocking it. Is this card, or that particular BIOS, voltage locked? I can't get over 1.191 for anything...even if I set it to 1.3. The 'Power Limit' seems to be working, though. Thankfully I can overvolt the memory. I guess my true question is: how much voltage is getting into the danger zone on GPUs? Currently 1.551V.

[EDIT] I guess the VDDC went down to 1.189 since I raised the memory volts a little bit.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> I believe so, the memory controller or the bios may lock them down or something. The 270x will go past that with ease to 6350 or so. I would much rather have higher core clocks but I think volt lock keeps the 270x a bit low.
> 
> You have been added. Let me know when you get your overclock done and I will post clocks. You got a pic?


PowerColor R9 270X Devil Edition

1260/1570 @ 1300mV +20% power

Screenies for FFXIV Benchmark and 3DMark11. I find FFXIV to be a more strenuous test than 3DMark and Unigine; If I'm stable in FFXIV, then I'll be stable in either of the others.

3DMark11 Score could be better, but I haven't played around with CPU OC in this board yet. When I had it in my Z77 Extreme6, cooled by custom loop with 360mm radiator, it was 100% stable at 4.8GHz. Since then it has been delidded, but now it is only cooled by an Antec Kuhler 620.

http://www.techpowerup.com/gpuz/adu6x/



http://www.3dmark.com/3dm11/8223810



Next step is removing HSF to upgrade TIM and thermal pads for VRMs...they are both getting much too hot for my liking. Hopefully the Fujipoly Ultra Extreme will make a significant impact like it did with my XSPC waterblock on my R9 290.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Hey Rec, how is the PSU cover coming? By the way, what are you making it out of?


Hey dude, it's going well, just need to sand it down and paint it.









I'm using 3mm frosted acrylic.


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> PowerColor R9 270X Devil Edition
> 1260/1570 @ 1300mV +20% power
> 
> Screenies for FFXIV Benchmark and 3DMark11. I find FFXIV to be a more strenuous test than 3DMark and Unigine; If I'm stable in FFXIV, then I'll be stable in either of the others.
> 
> 3DMark11 Score could be better, but I haven't played around with CPU OC in this board yet. When I had it in my Z77 Extreme6, cooled by custom loop with 360mm radiator, it was 100% stable at 4.8GHz. Since then it has been delidded, but now it is only cooled by an Antec Kuhler 620.
> 
> http://www.techpowerup.com/gpuz/adu6x/
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8223810
> 
> 
> 
> 
> Next step is removing HSF to upgrade TIM and thermal pads for VRMs...they are both getting much too hot for my liking. Hopefully the Fujipoly Ultra Extreme will make a significant impact like it did with my XSPC waterblock on my R9 290.


So Trixx allows you to get 1300mV? Mine wouldn't allow any voltage increase at all.

Just downloaded the newest version and it allows me to up it to 1.3 also. That's awesome, I am ready to pump it up a bit.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> So Trixx allows you to get 1300mV? Mine wouldn't allow any voltage increase at all.
> 
> Just downloaded the newest version and it allows me to up it to 1.3 also. That's awesome, I am ready to pump it up a bit.


Sweet, push it to the limit!









I will likely be making a lengthy post here soon regarding the stock heatsinks for VRM. I'll let you know if my upgrade to them has made any difference.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> So Trixx allows you to get 1300mV? Mine wouldn't allow any voltage increase at all.
> 
> Just downloaded the newest version and it allows me to up it to 1.3 also. That's awesome, I am ready to pump it up a bit.


Quote:


> Originally Posted by *Roboyto*
> 
> Sweet, push it to the limit!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will likely be making a lengthy post here soon regarding the stock heatsinks for VRM. I'll let you know if my upgrade to them has made any difference.


*So I removed the cooler on the Devil to see if I could improve temperatures for VRMs and core. I am pleased to say that I was able to decrease both temperatures by ~10% with a little effort.*



Spoiler: New TIM & Thermal Pads - Lots of Pics!



*WAY too much paste from the factory!*





*Thermal tape BARELY making any contact with VRMs*.

*The larger sink pulled off with barely any effort at all.*



*This smaller sink was making considerably better contact than the first, but it is still sub-par.*



*I didn't want to remove the thermal tape from the factory sinks in case I ever have to RMA the card, I want to be certain I can put it back to stock configuration. I had some sinks lying around from some Arctic Accelero coolers, that I put some Fujipoly Extreme thermal pad on.* *The contact is much better on the thermal pad compared to the tape.*



*The sinks I had were too wide to fit in between the chokes and caps, so out came the Dremel/safety glasses to do some trimming*











*It fits in there now*











*Chopped up a few to cover all VRMs thoroughly*



*Perfect Fit*



*Next was cleaning off all that extra paste...Arcticlean FTW!*







*Scraped off the heatsink*



*There was a tiny little bit of tape stuck on the VRMs, had to clean that off as well. ArctiClean took it off with ease.*



*Cut some thermal pad*



*All in place*



*New TIM*



*HSF wouldn't go back on, 2 sinks were too tall...trimmed em down*







*The fruits of my labor:*

*FFXIV Core reduced 9C and VRMs reduced 7C*





*3DMark11 Core reduced 8C VRMs reduced 7C*


----------



## Tobe404

Thought I might as well ask this in here...

How close are the GPU and VRM temps, and what's the difference between them as a general rule? In this case it's specific to a 280X (Gigabyte Rev 2).

I have no VRM temp monitor on my card which is the main reason I'm asking.

Just tested 1185/1500 in Heaven 2.5. It ran through okay and got to about 65C max GPU temp. Hoping to push it to 1200 on the core. This is at 1175mV. Then I'll see how the memory handles being OC'd.

Cheers.


----------



## Roboyto

Quote:


> Originally Posted by *Tobe404*
> 
> Thought I might as well ask this in here...
> 
> How close/what's the difference between the GPU and VRM temps as a general rule? Although in this case it's more specific to a 280x (Gigabyte Rev 2).
> 
> I have no VRM temp monitor on my card which is the main reason I'm asking.
> 
> Just tested 1185/1500 in Heaven 2.5. It ran through okay and got to about 65C max GPU temp. Hoping to push it to 1200 on the core. This is at 1175mV. Then I'll see how the memory handles being OC'd.
> 
> Cheers.


*Just did some extensive benchmarking with a friend's VisionTek R9 280X, which doesn't have as nice a cooler as your Gigabyte. On his card the core/VRM1 temperatures were always within a few degrees C of each other, if not matching. The VisionTek does have a backplate from the factory, so that could be assisting with VRM cooling.*

*Your card does have a heatsink on the VRM1 so that is a good thing. The 2.0 board is not reference, just in case you were planning to go full water.*



*The Visiontek cooler really surprised me at how well it cooled both the core/VRM1, as well as how quiet it is at 80% fan. The clocks on it are decent 1190/1700 @1300mV +20% power; and that is with Elpida RAM.*

*You've got nice clocks for that voltage.*


----------



## Roboyto

Quote:


> Originally Posted by *boot318*
> 
> 
> 
> Hi guys! I have an Sapphire Vapor-X R9 280x and currently overclocking it. Is this card, or that particular BIOS, voltage lock? I can't get over 1.191 for nothing.... even if I set it to 1.3. The 'Power Limit' seems to be working though. Thankfully I can overvolt the memory. I guess my true question is "How much voltage is getting into the danger zone on GPUs?" Currently 1.551v.
> 
> [EDIT] I guess the VDDC went down 1.189 since I raised the memory volts a little bit.



*VDDC is the value you're going to want to watch. If you were at 1.551V on the core, the card would likely be toast.*


*I'm not sure what MVDDC is exactly because my cards don't display that value. My educated guess would be memory voltage, as most GDDR5 operates at around 1.5V*


*The danger zone on these cards I believe is north of 1.3V; someone else here will have a more exact answer for you, since I don't have extensive experience with the 79XX/280(X) cards.*


*I highly doubt the card is voltage locked. If it were hardware limited you wouldn't be able to adjust the setting in an OC program. I experienced this with an XFX 7950 I briefly owned.*

*What program are you using to OC? If it is Trixx make sure you have ULPS disabled and 'Force Constant Voltage' checked in the options tab. If you are using AfterBurner I suggest giving Trixx a shot as well.*



*You must be adding a decent amount of voltage to the memory, it is rare to see VRM2 hotter than VRM1.*


----------



## Devildog83

*Roboyto*, it's nice to see that drop. My 270x is in the lower slot and I have zero heat issues with it, but the 7870 does have serious heat issues. Running Valley at 1250/1460 in X-Fire, the 7870 hit 91C, which is way hotter than the 270x at around 70C max. I had already planned to replace the TIM and pads soon. It's very good to see those temp drops.


----------



## JCH979

Very nice post Roboyto









I've been contemplating replacing the TIM and maybe the pads as well on my card now that temps are starting to go up. I've never done anything like this though, the thought of keeping the warranty sticker intact or messing something up has me a bit shaky..


----------



## Tobe404

Just did 3 runs of Heaven 2.5 at 1200/1500/1165mV with no crashes. Everything at its highest setting at 1080p. Would this be considered stable? Yet to game at these clocks, so we'll see on that front.

I tried upping the memory but that actually made my FPS worse.

The best I ever got (out of luck, because I haven't been able to repeat it after numerous runs at the same clocks, albeit only 1 extra average frame and 8 extra at max lol) was 48.2 average/118 max/26 min/1215 overall score (forgot to save it).

Consistent numbers I get around 47 average/110 max/24 min/1185 overall score.

Oh and max temp was 67c


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> *Roboyto*, it's nice to see that drop. My 270x is in the lower slot and I have zero heat issues with it but the 7870 does have serious heat issues. Running Valley at 1250/1460 in X-Fire the 7870 hit 91C which is way hotter than the 270x around 70C max. I am going to replace the TIM and pads soon and had planned to do so. It's very good to see those temp drops.


*You should at least be able to cool that core temperature down with some good thermal paste.*

*With your cards hanging upside down as opposed to mine standing vertically, you may need something with a little more adhesion than thermal pads*







*. I am going to check the card after about a week to make sure the sinks are still making good contact with the pads.*

*I'm also contemplating sticking some thermal pads between the backplate and VRMs to assist in cooling even further. The 270X has some ventilation holes on the backplate for VRMs, but I don't have any way to get airflow over that spot on the card.*

Quote:


> Originally Posted by *JCH979*
> 
> Very nice post Roboyto
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been contemplating replacing the TIM and maybe the pads as well on my card now that temps are starting to go up. I've never done anything like this though, the thought of keeping the warranty sticker intact or messing something up has me a bit shaky..


*Thank you!*

*Do you have the ASUS 280X that is in your rig sig? If so, ASUS shouldn't void your warranty when you remove the factory heatsink. I had my 7870 DC2 under water, returned it to stock, sold it to a friend-of-a-friend who somehow managed to blow it up with 'a tiny overclock'







*

*I was a nice guy and RMA'd the card for him, and had no issues whatsoever.*

*If you really want to be sure that your warranty will be intact after removing the HSF, either call or e-mail ASUS and ask them. I really don't think it will void your warranty when EK makes blocks specifically for the DC2 cards*









*I was looking on EK's CoolingConfigurator site and there are a few versions of the 280X DC2. Depending on which one you have, the card could have a fairly extravagant heatsink for practically the whole PCB, covering RAM and VRMs.*

*Original*



*V2*


----------



## Roboyto

Quote:


> Originally Posted by *Tobe404*
> 
> Just did 3 runs of Heaven 2.5 at 1200/1500/1165v with no crashes. Everything at it's highest setting at 1080p. Would this be considered stable? Yet to game at these clocks so we'll see on that front.
> 
> I tried upping the memory but that actually made my FPS worse.
> 
> The best I ever got (out of luck, because I haven't been able to repeat it after numerous runs at the same clocks, albeit only 1 extra average frame and 8 extra at max lol) was 48.2 average/118 max/26 min/1215 overall score (forgot to save it).
> 
> Consistent numbers I get around 47 average/110 max/24 min/1185 overall score.
> 
> Oh and max temp was 67c


*Should be OK, but you'll have to play some games to know for sure. As good a test as benchmarks are, games load the card differently and may not be 100% stable. If you do run into issues, usually a very small decrease in clock speed will fix the problem.*

*Heaven/Valley can vary by 1-2 FPS even with the exact same clocks; from my experience that is normal.*

*Nice temps, should add your computer with the rig builder in your profile







*
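That 1-2 FPS run-to-run spread is easy to sanity-check: only treat a result as a real change if it falls outside the usual variance band. A rough sketch; the 2 FPS tolerance is just an assumption based on the numbers quoted in this thread:

```python
def is_real_change(baseline_runs, new_run, tolerance_fps=2.0):
    """Compare a new benchmark result against the average of
    earlier runs at identical clocks; anything inside the
    tolerance band is just normal run-to-run noise."""
    avg = sum(baseline_runs) / len(baseline_runs)
    return abs(new_run - avg) > tolerance_fps

# Tobe404's numbers: consistent ~47 FPS average, one lucky run at 48.2
print(is_real_change([47.0, 46.9, 47.1], 48.2))  # False -> just noise
```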


----------



## Unknownm

I was reading that the 280X's RAM has pre-set speed timings: if I clocked it to 1600MHz (from 1500) it would switch timings and become stable.

When I apply 1600MHz to the memory, 3D applications run, but after 10 minutes a grey screen comes up (memory error). Are there any BIOS mods or memory voltage mods to push my memory higher?


----------



## Roboyto

Quote:


> Originally Posted by *Unknownm*
> 
> I was reading up that 280x ram has pre-set speed timings. If I clocked 1600mhz (from 1500) it would switch timings and become stable.
> 
> When I apply 1600mhz to memory the 3d applications runs, but after 10 minutes grey screen comes on (memory error). Any BIOS mods or memory voltage mods to push my memory higher?


For Trixx make sure you have ULPS disabled and 'Force Constant Voltage' checked in the Settings Tab. Have you increased the Power Limit? That may help. If it's not stable at 1600, you can always try to lower the clock 5-10MHz.

You could try Afterburner. Depending on if your card will allow, I believe the Aux Voltage setting in AB adjusts memory voltage. Adding a little voltage may help.

*It's really hit or miss with Elpida RAM.* Not nearly as many cards with Elpida RAM get good memory clocks compared to Hynix.

My buddy just bought a VisionTek 280X with Elpida RAM, and his is rock solid at 1175/1700. I was shocked considering my PowerColor Devil 270X has same RAM chips in it and I could only manage 1565 on the memory.


----------



## Unknownm

Quote:


> Originally Posted by *Roboyto*
> 
> For Trixx make sure you have ULPS disabled and 'Force Constant Voltage' checked in the Settings Tab. Have you increased the Power Limit? That may help. If it's not stable at 1600, you can always try to lower the clock 5-10MHz.
> 
> You could try Afterburner. Depending on if your card will allow, I believe the Aux Voltage setting in AB adjusts memory voltage. Adding a little voltage may help.
> 
> *It's really hit or miss with Elpida RAM.* Not nearly as many cards with Elpida RAM get good memory clocks compared to Hynix.
> 
> My buddy just bought a VisionTek 280X with Elpida RAM, and his is rock solid at 1175/1700. I was shocked considering my PowerColor Devil 270X has same RAM chips in it and I could only manage 1565 on the memory.


Both options are already set in Trixx. Also I don't have any aux voltage settings in AB, AB doesn't even give me a voltage setting for my gpu unlike Trixx

Also yes 20% powerlimit is set in Trixx/AMD Overdrive


----------



## Roboyto

Quote:


> Originally Posted by *Unknownm*
> 
> Both options are already set in Trixx. Also I don't have any aux voltage settings in AB, AB doesn't even give me a voltage setting for my gpu unlike Trixx
> 
> Also yes 20% powerlimit is set in Trixx/AMD Overdrive


*Well...only other thing I can think of is driver version. Are you on a beta 14.X? They are still a little buggy, but 13.12 is solid.*

*Otherwise, you may have simply hit the limit for your card on the RAM clock.*

*From my experience, a grey/black screen is definitely caused by too high a memory clock; you may just have to back it off a little bit, unfortunately.*


----------



## boot318

Quote:


> Originally Posted by *Roboyto*
> 
> 
> *VDDC is the value you're going to want to watch. If you were at 1.551V on the core the card would, likely, be toast. *
> 
> *I'm not sure what MVDDC is exactly because my cards don't display that value. My educated guess would be memory voltage, as most GDDR5 operates at around 1.5V*
> 
> *Dangerzone on these cards I believe is north of 1.3V; Someone else here will have a more exact answer for you since I don't have extensive experience with the 79XX/280(X) cards.*
> 
> *I highly doubt the card is voltage locked. If it were hardware limited you wouldn't be able to adjust the setting in an OC program. I experienced this with an XFX 7950 I briefly owned.*
> *What program are you using to OC? If it is Trixx make sure you have ULPS disabled and 'Force Constant Voltage' checked in the options tab. If you are using AfterBurner I suggest giving Trixx a shot as well.*
> 
> 
> 
> *You must be adding a decent amount of voltage to the memory, it is rare to see VRM2 hotter than VRM1.*


Thanks!!! +rep

The only reason I started messing with MVDDC (which I assume is memory voltage) is because VDDC wouldn't go over 1.191, no matter what I set. I tried Trixx but switched to MSI. If I didn't touch the MVDDC I could barely get over 1600 on the memory. I'll try disabling ULPS and 'Force Constant Voltage', but I could live with the card as-is if the memory voltage isn't too high: .051 over stock. I could probably lower it a little, also.


----------



## Roboyto

Quote:


> Originally Posted by *boot318*
> 
> Thanks!!! +rep
> 
> The only reason I started messing with MVDDC (which I assume is memory voltage) is because VDDC wouldn't go over 1.191, no matter what I set. I tried Trixx but switched to MSI. If I didn't touch the MVDDC I could barely get over 1600 on the memory. I'll try disabling ULPS and 'Force Constant Voltage', but I could live with the card if the memory voltage isn't too high: .051 over stock. I probably could lower it a little too.


*You're welcome!*

*The force constant voltage might get you a little more OC on the core, hard to say for sure until you give it a go.*

*I don't think you're really hurting anything on the memory with .051V additional. I was surprised to see VRM2 hotter than VRM1*


----------



## Arkanon

A couple of weeks on a VDDC of 1.412V. Card still running strong: no thermal throttling whatsoever, no glitches, no hiccups. Fan speed locked at 50% max and not exceeding 75°C. After some considerable testing it's safe to assume that a 270X at clock speeds above 1350MHz on the core matches, or comes close to matching, a stock out-of-the-box 280X/7970GHz.


----------



## Roboyto

Quote:


> Originally Posted by *Arkanon*
> 
> A couple of weeks on a VDDC of 1.412V. Card still running strong: no thermal throttling whatsoever, no glitches, no hiccups. Fan speed locked at 50% max and not exceeding 75°C. After some considerable testing it's safe to assume that a 270X at clock speeds above 1350MHz on the core *matches, or comes close to matching, a stock out-of-the-box 280X/7970GHz.*


*I would agree that it comes near 280X stock speeds. I just did some benching on my 270X Devil and a VisionTek 280X. For the FFXIV benchmark, my card at 1260/1565 scored 90% as high as the stock 280X.*

*It is safe to say that a well clocked 270X can easily match/best a 7950/280 at stock speeds.*

*P.S. You must have one hell of a 270X to hit 1350. Got some pics?*

*What card/cooler is that to not exceed 75C at that voltage/clocks and only 50% fan speed? My 270X at 1260/1565 1.3V was hitting 72C in 3DMark11 with fans at 80%*


----------



## Unknownm

Quote:


> Originally Posted by *Roboyto*
> 
> *Well...only other thing I can think of is driver version. Are you on a beta 14.X? They are still a little buggy, but 13.12 is solid.*
> 
> *Otherwise, you may have simply hit the limit for your card on the RAM clock.*
> 
> *From my experience, a black screen is definitely caused by too high a memory clock; you may just have to back it off a little bit, unfortunately.*


The only difference I ever saw was from flashing my rev01 card to the rev02 BIOS, which allowed the Trixx voltage slider to go up to 1.3V from 1.256V. However, after applying 1.3V I could not get any better results with the core overclock, so I assume it doesn't actually apply 1.3V and my limit is 1.256V.


----------



## Zero_

After 4 months on this thread, finally bothered myself enough to do a screencap. Can I join?


----------



## boot318

Quote:


> Originally Posted by *Roboyto*
> 
> *You're welcome!*
> 
> *The force constant voltage might get you a little more OC on the core, hard to say for sure until you give it a go.*
> 
> *I don't think you're really hurting anything on the memory with .051V additional. I was surprised to see VRM2 hotter than VRM1*


It finally works! I don't know how to feel about having to force constant voltage, but I don't have AB apply the overclock when I boot anyway. Kinda strange because at 1.25V with the core @1200, under load I see the VDDC around 1.20-1.21. Poor components, or is that normal for a GPU? Makes me want to set the VDDC to 1.3 and go for maximum overclock, since I know the memory can be pushed harder than 1700.


----------



## Anonizer

Quote:


> Originally Posted by *Roboyto*
> 
> You may have just hit the limit of your card honestly, you have to figure the 270X is still a 7870 which stock are 1000 core clock. Getting to 1200 is a 20% boost, which is a respectable OC. I have had cards that get far less than 20%. _You're just seeing less of the overclock capabilities because your card came with an overclock of 1100 MHz already._
> 
> My ASUS DC2 only got a little further than your clocks with similar voltage while being fully water cooled. I don't know how many of these cards get very far past 1200.
> 
> The core speed is much more important on these cards. If you have Overclocked the RAM, reducing it back to stock speeds might help you push the core a little further.
> 
> I've been trying to locate a picture of the heatsink on that card and I can't find one. You may want to remove the heatsink to see if/how it is cooling the VRM.
> 
> It looks like there are 8 screws in total to remove your HSF.
> 
> 
> I checked on EK's CoolingConfigurator website and it does *NOT* look like your card has heatsinks on the VRMs, this is likely inhibiting your ability to overclock. The VRMs for core/board are marked with the yellow, there are 5 in total; the little black squares. The Orange one is the VRM for RAM, and this is less important. At the very least, the lack of heatsinks will reduce the longevity of the card.
> 
> 
> 
> If you're at 1.3V then you are likely pushing the limits of air cooling for the VRM, especially if your card does not have heatsinks on the VRM.
> 
> This is a picture of my devil with heatsink/fan removed, I have the heatsinks for VRM marked in yellow. Your card should look similar to this, but may not be exact. The Devil 270X has upgraded VRMs for enhanced OCing and stability.
> 
> 
> If you want to add heatsinks it's not a terribly difficult procedure, would just have to make sure that they will clear the fins of the cooler.
> 
> It is surprising that the Vapor-X edition doesn't have VRM sinks, my HD4890 did


Sapphire support replied to my ticket. Here's what they said:

"Did you see VRM temp before for this card ? Some of model VRM temp information has been remove"

I'm wondering if I could install heatsinks on the VRMs, because the thermal pads (please correct me if I'm wrong since I don't know what they are, just taking a guess) press on the VRMs.
The HSF looks like this


Spoiler: HSF







I read the RMA/warranty policy and it says:

1. Product Warranty will not be valid even if returned after purchased for the following cases:
- Products that are defaced or physically damaged and modified by customer.

2. GPU or ASIC device (VGA) is sensitive to thermal (heat) issue. The heat sink or fan is designed to meet the requirements for reliability of the product and the warranty is rendered invalid if the product is dismantled or the heat sink or cooler fan are removed as this may causes damage to the GPU or ASIC device . Non-compliance will cause the warranty of the product to be void and repair will be at the users cost.

Will Sapphire know if I removed the HSF? Or if I changed the thermal paste?


----------



## deepcool0922

Joined!


----------



## Roboyto

Quote:


> Originally Posted by *boot318*
> 
> It finally works! I don't know how to feel about having to force constant voltage, but I don't have AB apply the overclock when I boot anyway. Kinda strange because at 1.25V with the core @1200, under load I see the VDDC around 1.20-1.21. Poor components, or is that normal for a GPU? Makes me want to set the VDDC to 1.3 and go for maximum overclock, since I know the memory can be pushed harder than 1700.


*Force constant voltage is only applied while the card is under load; it will still reduce voltage/clocks when necessary. No worries.*

*The difference in voltage is likely due to voltage droop, this is common on just about any card.*

*As long as you can keep the core/VRMs cool enough, moving the voltage over to 1.3V is fine.*


----------



## Roboyto

Quote:


> Originally Posted by *Anonizer*
> 
> Sapphire support replied to my ticket. Here's what they said:
> 
> "Did you see VRM temp before for this card ? Some of model VRM temp information has been remove"
> 
> I'm wondering if I could install heatsinks on the VRMs, because the thermal pads (please correct me if I'm wrong since I don't know what they are, just taking a guess) press on the VRMs.
> The HSF looks like this
> 
> 
> Spoiler: HSF
> 
> 
> 
> 
> 
> 
> 
> I read the RMA/warranty policy and it says:
> 
> 1. Product Warranty will not be valid even if returned after purchased for the following cases:
> - Products that are defaced or physically damaged and modified by customer.
> 
> 2. GPU or ASIC device (VGA) is sensitive to thermal (heat) issue. The heat sink or fan is designed to meet the requirements for reliability of the product and the warranty is rendered invalid if the product is dismantled or the heat sink or cooler fan are removed as this may causes damage to the GPU or ASIC device . Non-compliance will cause the warranty of the product to be void and repair will be at the users cost.
> 
> Will Sapphire know if I removed the HSF? Or if I changed the thermal paste?


*If that is what the HSF looks like, then the purple pads would be there for contacting the VRMs.*

*Does your card have 'Warranty Void' stickers on any of the screws? If there are stickers over the screws stating 'Warranty Void' or something of the sort, then it is at your discretion if you want to remove the HSF or not.*

*Typically your warranty will still be valid as long as there is no physical damage to the card and it is returned to stock configuration.*


----------



## Obsyd

As promised here is the card.


I'm having weird problems with it lately. Sometimes when I browse the web my monitor goes black, as if the signal is lost for a second or two, and then it comes back. While gaming I've never experienced this or any other problem.
I'm using HDMI and I'm not overclocking the card yet. I'm on the 015.041 BIOS. The problem happens in both Windows and Linux.
Is this "normal"? Or should I be worried?


----------



## Anonizer

Quote:


> Originally Posted by *Roboyto*
> 
> *If that is what the HSF looks like, then the purple pads would be there for contacting the VRMs.*
> 
> *Does your card have 'Warranty Void' stickers on any of the screws? If there are stickers over the screws stating 'Warranty Void' or something of the sort, then it is at your discretion if you want to remove the HSF or not.*
> 
> *Typically your warranty will still be valid as long as there is no physical damage to the card and it is returned to stock configuration.*


Just removed the HSF, and it does look like that. So the HSF also cools the VRMs, then?

Nope, luckily it doesn't have any 'Warranty Void' stickers.

Thanks for the help!


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> *Force constant voltage is only applied while the card is under load; it will still reduce voltage/clocks when necessary. No worries.*
> 
> *The difference in voltage is likely due to voltage droop, this is common on just about any card.*
> 
> *As long as you can keep the core/VRMs cool enough, moving the voltage over to 1.3V is fine.*

Edit: I had the sensor set to show max.


----------



## Recr3ational

Has anyone actually managed to go above 1.3V yet? I want to push my cards more.


----------



## Calaphos

The Asus R9 280X or a GTX 770 for gaming at 1440p?
I would prefer to play modern and future games at least on high settings.
Which card performs better?


----------



## Devildog83

Quote:


> Originally Posted by *Calaphos*
> 
> The Asus R9 280X or a GTX 770 for gaming at 1440p?
> I would prefer to play modern and future games at least on high settings.
> Which card performs better?


From the reviews I have seen, the 280X is slightly better in performance at 1440p. Most likely due to its 3 GB of memory versus just 2 GB on the 770.


----------



## diggiddi

Quote:


> Originally Posted by *Roboyto*
> 
> *Force constant voltage is only applied while the card is under load; it will still reduce voltage/clocks when necessary. No worries.*
> 
> *The difference in voltage is likely due to voltage droop, this is common on just about any card.*
> 
> *As long as you can keep the core/VRMs cool enough, moving the voltage over to 1.3V is fine.*


That is not good for overclocking right? ie leave that unchecked when trying to OC


----------



## Arkanon

Quote:


> Originally Posted by *Roboyto*
> 
> *I would agree that it comes near 280X stock speeds. I just did some benching on my 270X Devil and a VisionTek 280X. For the FFXIV benchmark, my card at 1260/1565 scored 90% as high as the stock 280X.*
> 
> *It is safe to say that a well clocked 270X can easily match/best a 7950/280 at stock speeds.*
> 
> *P.S. You must have one hell of a 270X to hit 1350. Got some pics?*
> 
> *What card/cooler is that to not exceed 75C at that voltage/clocks and only 50% fan speed? My 270X at 1260/1565 1.3V was hitting 72C in 3DMark11 with fans at 80%*


I have a regular MSI 270X Gaming. For some inexplicable reason I'm able to push 1.4 VDDC through the card, so that pretty much explains why I can clock the card so high.



and a 3DM11 result from the run i just did with a very bloated windows that hasn't been rebooted in a few days.
http://www.3dmark.com/3dm11/8229129


----------



## Roboyto

Quote:


> Originally Posted by *Anonizer*
> Just removed the HSF, and it does look like that. So the HSF also cools the VRMs, then?
> 
> Nope, luckily it doesn't have any 'Warranty Void' stickers.
> 
> Thanks for the help!


You're welcome.

That's good the HSF helps cool the VRMs. You can always change the thermal pads on the heatsink and likely improve the cooling of the VRMs. I have had great luck with the Fujipoly pads. They are expensive, but they do work very well.

If you want the best ones here's the link:

http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html


----------



## Roboyto

Quote:


> Originally Posted by *Arkanon*
> 
> I have a regular MSI 270X Gaming. For some inexplicable reason I'm able to push 1.4 VDDC through the card, so that pretty much explains why I can clock the card so high.
> 
> 
> 
> and a 3DM11 result from the run i just did with a very bloated windows that hasn't been rebooted in a few days.
> http://www.3dmark.com/3dm11/8229129


I saw some of your previous posts and tried HIS iTurbo with my Devil 270X; iTurbo allowed another 18mV compared to Trixx. It also allows the power limit to go to 100% as opposed to 20%.

I added the extra 18mV (1318mV total) and increased the power limit to 50%, and got mine to 1280/1550.

You got an unbelievable card there!


----------



## Roboyto

Quote:


> Originally Posted by *diggiddi*
> 
> That is not good for overclocking right? ie leave that unchecked when trying to OC


No, you want that box checked for OC. Disable ULPS in the options


----------



## Arkanon

I suppose you have voltage control in AB? What I did was simply unlock the voltage control in AB (1300mV); after that I installed HIS iTurbo and maxed out the voltage slider there as well. I suspect that the two programs each add about 100mV extra to the card, although I can't confirm that actually got me to the 1.4-ish volts on the card. It would be nice if someone else tried to reproduce the same steps I did and see if it works for their cards as well.


----------



## Horsemama1956

My card allows voltage up to 1350 in AfterBurner, but I can't get much in terms of voltage readings in GPU-Z.


----------



## diggiddi

Quote:


> Originally Posted by *Roboyto*
> 
> No, you want that box checked for OC. Disable ULPS in the options


Got it, repped.


----------



## Roboyto

Quote:


> Originally Posted by *Arkanon*
> 
> I suppose you have voltage control in AB? What I did was simply unlock the voltage control in AB (1300mV); after that I installed HIS iTurbo and maxed out the voltage slider there as well. I suspect that the two programs each add about 100mV extra to the card, although I can't confirm that actually got me to the 1.4-ish volts on the card. It would be nice if someone else tried to reproduce the same steps I did and see if it works for their cards as well.


I will test that theory when I get home from work tonight.

If I can get to 1.4V and push the clocks even further, I would probably put an AIO on it to make sure it doesn't make any noise.


----------



## Roboyto

Quote:


> Originally Posted by *Obsyd*
> 
> As promised here is the card.
> 
> 
> I'm having weird problems with it lately. Sometimes when I browse the web my monitor goes black, as if the signal is lost for a second or two, and then it comes back. While gaming I've never experienced this or any other problem.
> I'm using HDMI and I'm not overclocking the card yet. I'm on the 015.041 BIOS. The problem happens in both Windows and Linux.
> Is this "normal"? Or should I be worried?


That is definitely not normal behavior, but a few simple things may be able to fix your problem.

Do you have a different HDMI cable to substitute, or even a DVI? That is an uncommon issue; from my experience, a cable could be the culprit.

What driver version are you running? The 14.X betas have had all sorts of strange quirks; I would suggest 13.12.

It is always best to do a driver scrub, even if you don't get a different card. Display Driver Uninstaller: http://www.wagnardmobile.com/DDU/download/DDU%20v12.6.4.exe

Is that the original BIOS for the card, an updated one, or from a different card? Changing BIOSes can have unforeseen repercussions.


----------



## Roboyto

Quote:


> Originally Posted by *Recr3ational*
> 
> Has anyone accurately manage to go above 1.3 yet? I want to push my cards more.


I never have, but I have seen more than one person speak of 7950/7970 at 1.4V under water with some very high clocks. Of course this all depends on if you get cards that will allow that voltage and are capable of taking advantage of it. It may require flashing the BIOS to get that kind of voltage. Use the search function for the forums, you should be able to find some information on very high clocks for 7970's.

I bought (2) 290s from NewEgg on a combo deal that were 5 serial numbers apart. One card went to 1080/1350 no matter what I tried. The other card is a freakin' beast running up to 1300/1700...silicon lottery winner


----------



## Recr3ational

Yeah, I've looked around for people with a 280X who have a BIOS with more voltage, but no one has.


----------



## mikemykeMB

Does anyone know of, or has anyone used, a well-soldered flexible PCIe riser cable that isn't cheaply made and is reliable enough not to cause issues?
....and don't say Amazon, Newegg, or DIYmod; been there and not liking the reviews. Then again, maybe this is a bad idea to begin with? I wanted to flip the GPU downwards so the top side is shown while in the case, and also to show off the cooling block and whatnot.


----------



## Roboyto

Quote:


> Originally Posted by *mikemykeMB*
> 
> Does anyone know of, or has anyone used, a well-soldered flexible PCIe riser cable that isn't cheaply made and is reliable enough not to cause issues?
> ....and don't say Amazon, Newegg, or DIYmod; been there and not liking the reviews. Then again, maybe this is a bad idea to begin with? I wanted to flip the GPU downwards so the top side is shown while in the case, and also to show off the cooling block and whatnot.


I read a build log for a computer mounted to a wall, with everything spread out... really sweet looking! 3M risers are the way to go. Can't remember the name of the build log, however.


----------



## Roboyto

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah, I've looked around for people with a 280X who have a BIOS with more voltage, but no one has.


Will the card take a 7970 BIOS?


----------



## mikemykeMB

Quote:


> Originally Posted by *Roboyto*
> 
> I read a build log for a computer mounted to a wall, with everything spread out... really sweet looking! 3M risers are the way to go. Can't remember the name of the build log, however.


I remember that one also; I just wanted to search out a better cable for this so it doesn't crack up on me and do something I'd prefer it not to.

Will always have the GPU OC'd to its limit, and I don't see any reason to trim it back for this.

Did see something on the TPU web with a minion based case mod tho. Kinda sparked an idea I had awhile back ago.


----------



## Devildog83

Quote:


> Originally Posted by *mikemykeMB*
> 
> Does anyone know of, or has anyone used, a well-soldered flexible PCIe riser cable that isn't cheaply made and is reliable enough not to cause issues?
> ....and don't say Amazon, Newegg, or DIYmod; been there and not liking the reviews. Then again, maybe this is a bad idea to begin with? I wanted to flip the GPU downwards so the top side is shown while in the case, and also to show off the cooling block and whatnot.


Search OCN for Mega Man or red1776; they should know. I do know they found very nice ones somewhere.


----------



## mikemykeMB

Quote:


> Originally Posted by *Devildog83*
> 
> Search OCN for Mega Man or red1776; they should know. I do know they found very nice ones somewhere.


Cool, thanks, will do. Found a couple of megamans??? but will send a PM to red1776.


----------



## Devildog83

Quote:


> Originally Posted by *mikemykeMB*
> 
> Cool, thanks, will do. Found a couple of megamans??? but will send a PM to red1776.


Mega Man - http://www.overclock.net/u/314268/mega-man


----------



## eAT5

Some Ultra BF4 1080p framerates on my 280Xs.


----------



## GuestVeea

Quote:


> Originally Posted by *eAT5*
> 
> Some Ultra BF4 1080p framerates on my 280Xs.


Are you using Beta drivers with Mantle, or directx?


----------



## Roboyto

Quote:


> Originally Posted by *Arkanon*
> 
> I suppose you have voltage control in AB? What I did was simply unlock the voltage control in AB (1300mV); after that I installed HIS iTurbo and maxed out the voltage slider there as well. I suspect that the two programs each add about 100mV extra to the card, although I can't confirm that actually got me to the 1.4-ish volts on the card. It would be nice if someone else tried to reproduce the same steps I did and see if it works for their cards as well.


I installed AB (newest beta) and enabled:


Hardware Control & Monitoring
Low-Level Hardware Access Interface
Unlock Voltage Control - Reference/Standard MSI/Extended MSI
Unlock Voltage Monitoring
Extend Official Overclocking Limits
Disable ULPS
Unofficial Overclocking Mode - With/Without PowerPlay Support

No combination allowed the voltage to go any further than 1318mV, which I got from HIS iTurbo previously. No matter what I did, AB would not let me adjust the voltage; the slider is still grayed out.

1318mV is now available in Trixx, where it was previously capped at 1300.

Maybe if I flash to MSI BIOS?


----------



## eAT5

Quote:


> Originally Posted by *GuestVeea*
> 
> Are you using Beta drivers with Mantle, or directx?


In those I was using CCC 14.3 with DirectX.

But later on I switched to Mantle and was getting better FPS: between 100 and 160 FPS with Mantle on Metro and Locker.

I'm trying OBS to record, but the quality blows in the end...


----------



## Arkanon

Quote:


> Originally Posted by *Roboyto*
> 
> I installed AB (newest beta) and enabled:
> 
> Hardware Control & Monitoring
> Low-Level Hardware Access Interface
> Unlock Voltage Control - Reference/Standard MSI/Extended MSI
> Unlock Voltage Monitoring
> Extend Official Overclocking Limits
> Disable ULPS
> Unofficial Overclocking Mode - With/Without PowerPlay Support
> No combination allowed the voltage to go any further than 1318mV, which I got from HIS iTurbo previously. No matter what I did, AB would not let me adjust the voltage; the slider is still grayed out.
> 
> 1318mV is now available in Trixx, where it was previously capped at 1300.
> 
> Maybe if I flash to MSI BIOS?


Have you tried unchecking the low-level hardware access interface?
I need to run AB with it unchecked, otherwise I have no voltage control either.
I strongly doubt my card is a one-off, or 'faulty' for that matter, so I strongly believe it's all in the settings.


----------



## Unknownm

Quote:


> Originally Posted by *GuestVeea*
> 
> Are you using Beta drivers with *Mantle, or directx?*


Sorry, I don't mean to come off as mean.

If someone posts a screenshot of BF4 with an FPS readout in the bottom left (BF4's own FPS counter draws in the top right) and you ask whether that person is using DirectX or Mantle, you should know he's using DirectX, because as far as I know no screenshot application supports Mantle, and if one does, it doesn't capture it correctly (please correct me if I'm wrong).

When I looked at his post I already knew it was DX because you can see Fraps reporting FPS in the bottom left. Type "screenshot.render" in the console inside BF4 to take a screenshot under Mantle.


----------



## eAT5

I'm gonna try 14.4 Mantle tonight...


----------



## Tobe404

So I've managed to get my card up to 1150/1600 at 1.170V. It tops out around 70C now with a custom fan profile. I want to see if I can push the memory to 1700.

The only odd thing is: if the memory is anywhere above 1500, it does worse in Heaven benchmarks.

Yet a memory OC improved my FPS in Crysis 3 by about 2 (min and average). I did multiple run-throughs using Fraps. This could still just be margin of error, however.


----------



## Unknownm

Quote:


> Originally Posted by *Tobe404*
> 
> So I've managed to get my card up to 1150/1600 at 1.170V. It tops out around 70C now with a custom fan profile. I want to see if I can push the memory to 1700.
> 
> The only odd thing is: if the memory is anywhere above 1500, it does worse in Heaven benchmarks.
> 
> Yet a memory OC improved my FPS in Crysis 3 by about 2 (min and average). I did multiple run-throughs using Fraps. This could still just be margin of error, however.


Quote:


> Since the latency can be decreased by increasing the clock frequency (lower time per cycle), it appears that the memory frequency itself has a major impact on the performance. However as long as the VBIOS is calibrated as it should, the performance is identical at 1250MHz (DDR-5000) and at 1500MHz (DDR-6000) for example.
> 
> When the VBIOS works properly the timings are adjusted by the memory controller based on presets defined by the VBIOS.
> For example on 7970 card the VBIOS should contain timing presets for following clock regions:
> 
> 400MHz (0-400MHz)
> 800MHz (401-800MHz)
> 900MHz (801-900MHz)
> 1000MHz (901-1000MHz)
> 1125MHz (1001-1125MHz)
> 1250MHz (1126-1250MHz)
> 1375MHz (1251-1375MHz)
> 1500MHz (1376-1500MHz)
> 1625MHz (1501-1625MHz)
> 1750MHz (1626-VCO Max)
> 
> So when the default memory clocks of a 7970 card is 1500MHz, the memory controller will drive timings defined by the 1500MHz clock profile (if present). As soon as the user sets the memory clock to 1501MHz (e.g.) the memory controller will start driving timings defined by the 1625MHz profile. The same thing happens if the clocks are lowered to 1375MHz for example.
> 
> Since the higher frequency profiles will usually have looser timings, a slight increase in clock frequency might actually increase the memory latency instead of lowering it. For example at 1500MHz the GDDR5 timings for Hynix modules are 16-17-17 (tCL-tRCD-tRP, in MEMCLKs). This means that the actual cycle time is 10.66ns for tCL, 11.33ns for tRCD and tRP (1000 / 1500 * delay in MEMCLKs). For the 1625MHz clock profile the timings are 17-18-18. This means that if you overclock the memory from 1500MHz to 1550MHz the actual memory latency will increase while the bandwidth increases: 10.967ns for tCL and 11.612ns for tRCD and tRP. This is one of the reasons why overclocking the memory is not always desirable.


https://litecointalk.org/index.php?topic=12369.0
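The timing arithmetic in the quote above can be sketched in a few lines of Python. The profile table and the Hynix tCL values are taken from the quote; the "first profile whose ceiling covers the clock" lookup rule is an assumption based on its description:

```python
# Sketch of the VBIOS timing-profile behavior described in the quote.
# Assumption: the memory controller uses the first profile whose ceiling
# is >= the requested memory clock, as the quoted post describes.

PROFILE_CEILINGS_MHZ = [400, 800, 900, 1000, 1125, 1250, 1375, 1500, 1625, 1750]

# tCL (in MEMCLKs) for the two profiles the quote gives for Hynix modules
TCL_BY_PROFILE = {1500: 16, 1625: 17}

def profile_for(clock_mhz):
    """Return the timing-profile ceiling governing a given memory clock."""
    for ceiling in PROFILE_CEILINGS_MHZ:
        if clock_mhz <= ceiling:
            return ceiling
    return PROFILE_CEILINGS_MHZ[-1]

def tcl_latency_ns(clock_mhz):
    """Actual tCL latency in ns: 1000 / clock_MHz * delay_in_MEMCLKs."""
    return 1000 / clock_mhz * TCL_BY_PROFILE[profile_for(clock_mhz)]

print(round(tcl_latency_ns(1500), 2))  # 10.67 -- 1500MHz profile, tCL 16
print(round(tcl_latency_ns(1550), 2))  # 10.97 -- crosses into the 1625MHz profile, latency rises
```

So a 50MHz memory overclock that crosses a profile boundary can raise the actual latency even as bandwidth goes up, which matches the numbers in the quote and explains why memory clocks just past 1500 can bench worse.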


----------



## Roboyto

Quote:


> Originally Posted by *Arkanon*
> 
> Have you tried by unchecking low-level hardware access interface ?
> I need to run AB with it unchecked otherwise i have no voltage control either.
> I strongly doubt my card is a one off or 'faulty' for that matter so i do strongly believe it's all in the settings.


No luck


----------



## bardacuda

Quote:


> Originally Posted by *Unknownm*
> 
> Since the latency can be decreased by increasing the clock frequency (lower time per cycle), it appears that the memory frequency itself has a major impact on the performance. However as long as the VBIOS is calibrated as it should, the performance is identical at 1250MHz (DDR-5000) and at 1500MHz (DDR-6000) for example.
> 
> When the VBIOS works properly the timings are adjusted by the memory controller based on presets defined by the VBIOS.
> For example on 7970 card the VBIOS should contain timing presets for following clock regions:
> 
> 400MHz (0-400MHz)
> 800MHz (401-800MHz)
> 900MHz (801-900MHz)
> 1000MHz (901-1000MHz)
> 1125MHz (1001-1125MHz)
> 1250MHz (1126-1250MHz)
> 1375MHz (1251-1375MHz)
> 1500MHz (1376-1500MHz)
> 1625MHz (1501-1625MHz)
> 1750MHz (1626-VCO Max)
> 
> So when the default memory clocks of a 7970 card is 1500MHz, the memory controller will drive timings defined by the 1500MHz clock profile (if present). As soon as the user sets the memory clock to 1501MHz (e.g.) the memory controller will start driving timings defined by the 1625MHz profile. The same thing happens if the clocks are lowered to 1375MHz for example.
> 
> Since the higher frequency profiles will usually have looser timings, a slight increase in clock frequency might actually increase the memory latency instead of lowering it. For example at 1500MHz the GDDR5 timings for Hynix modules are 16-17-17 (tCL-tRCD-tRP, in MEMCLKs). This means that the actual cycle time is 10.66ns for tCL, 11.33ns for tRCD and tRP (1000 / 1500 * delay in MEMCLKs). For the 1625MHz clock profile the timings are 17-18-18. This means that if you overclock the memory from 1500MHz to 1550MHz the actual memory latency will increase while the bandwidth increases: 10.967ns for tCL and 11.612ns for tRCD and tRP. This is one of the reasons why overclocking the memory is not always desirable.
> 
> https://litecointalk.org/index.php?topic=12369.0


Nice! Thank you! +rep

I knew something fishy was happening every 250MHz....but didn't know why. Now I know!


----------



## Tobe404

So basically there is no point OCing memory on a current GPU then? Or am I still missing something?


----------



## Arkanon

Quote:


> Originally Posted by *Roboyto*
> 
> No luck


That's too bad. Well, in any case I hope that someone figures out how to increase the voltage to about 1.4V, so everyone can enjoy the full potential of their cards given they have proper cooling solutions. As far as I can tell, and from personal experience, it seems fully safe to run the core at 1.4V, at least temp-wise. Only time will tell how much strain is put on the lifespan of the card by pushing that voltage through the core.


----------



## Arkanon

Quote:


> Originally Posted by *Tobe404*
> 
> So basically there is no point OCing memory on a current GPU then? Or am I still missing something?


1500MHz seems to be the sweet spot for the 270/270X. It yields some better performance; anything above that seems to suffer from diminishing returns, at least for the cards with Elpida memory. The few around with Samsung/Hynix memory should be able to push higher without any loss in performance.


----------



## bardacuda

@Tobe404

Well, that was specific to mining, since it is very sensitive to latency. It probably carries over to gaming/benching somewhat as well, but if you can get (close) to 1625 or 1750 without going over, there is probably some benefit to be had... although still not much compared to what OCing the core will get you. Also, this was specific to the 7970/280X and depends on the RAM used. If you read the original post, it explains how some VBIOSes are just not well calibrated anyway, so it would be more beneficial to fix the BIOS first before playing with clocks.


----------



## Tobe404

Quote:


> Originally Posted by *Arkanon*
> 
> 1500MHz seems to be the sweet spot for the 270/270X. It yields somewhat better performance. Anything above that seems to suffer from diminishing returns, at least for the cards with Elpida memory. The few around with Samsung/Hynix memory should be able to push higher without any loss in performance.


And the 280X, so it seems... Ah well. I'll have to try to push the core as much as I can then.
Need to find a BIOS which will allow up to 1.4v though. Currently I max out at 1.2v.
Current max temp in games is 70c, so I've still got room.


----------



## Recr3ational

Quote:


> Originally Posted by *Tobe404*
> 
> And the 280X, so it seems... Ah well. I'll have to try to push the core as much as I can then.
> Need to find a BIOS which will allow up to 1.4v though. Currently I max out at 1.2v.
> Current max temp in games is 70c, so I've still got room.


If you do find a BIOS @ 1.4, do tell me. My card is currently at 1250/1650 @ 40c under load.
It could go WAY higher if the voltage allows it.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Unknownm*
> 
> https://litecointalk.org/index.php?topic=12369.0


Quote:


> Originally Posted by *Recr3ational*
> 
> If you do find a BIOS @ 1.4, do tell me. My card is currently at 1250/1650 @ 40c under load.
> It could go WAY higher if the voltage allows it.


How did you get the core clock so high? Mine will throw artifacts at anything over 1220...


----------



## Tobe404

Quote:


> Originally Posted by *Recr3ational*
> 
> If you do find a BIOS @ 1.4, do tell me. My card is currently at 1250/1650 @ 40c under load.
> It could go WAY higher if the voltage allows it.


What brand of card? What voltage? I'm assuming you're under water?
I found a BIOS that will allow up to 1.3v but that's it so far...
Anyone know what the difference between an F31 and F60 BIOS is?
Was on F31 (1.2v max) and now F60 (1.3v max). Everything else seems the same.
Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> How did you get the core clock so high? Mine will throw artifacts at anything over 1220...


I am currently stable at 1150/1500/1170v with no increase in Power Limit.
Doesn't seem like I can increase the core any more yet. Tried 1200 at 1.3v but no go.
I'm hoping that if I mess around with the Power Limits a bit I can get to at least 1200 on the core (1250 would be nice, but you know... luck of the draw).


----------



## Roboyto

Quote:


> Originally Posted by *Arkanon*
> 
> That's too bad. Well, in any case I hope that someone figures out how to increase the voltage to about 1.4v so everyone can enjoy the full potential of their cards, given they have proper cooling solutions. As far as I can tell, and from personal experience, it seems fully safe to run the core at 1.4v, at least temp-wise. Only time will tell how much strain is put on the lifespan of the card by pushing that voltage through the core.


I think it's probably because you have an MSI card and BIOS, similar to how GPU Tweak doesn't work without an ASUS BIOS. I may try flashing the BIOS on my card.


----------



## lightsout

Has anyone figured out how to get voltage control in Afterburner with a Sapphire Dual-X R9 270 (CrossFire)? I can adjust it with Trixx (GPU-Z reports the change), but Trixx does not seem stable to me and has caused issues every time I install it.

I have tried every combination in the settings of Afterburner to no avail.


----------



## Recr3ational

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> How did you get the core clock so high? Mine will throw artifacts at anything over 1220...


Luck maybe?
Quote:


> Originally Posted by *Tobe404*
> 
> What brand of card? What voltage? I'm assuming you're under water?
> I found a BIOS that will allow up to 1.3v but that's it so far...
> Anyone know what the difference between an F31 and F60 BIOS is?
> Was on F31 (1.2v max) and now F60 (1.3v max). Everything else seems the same.
> I am currently stable at 1150/1500/1170v with no increase in Power Limit.
> Doesn't seem like I can increase the core any more yet. Tried 1200 at 1.3v but no go.
> I'm hoping that if I mess around with the Power Limits a bit I can get to at least 1200 on the core (1250 would be nice, but you know... luck of the draw).


My card is the PowerColor TurboDuo.
My voltage wall is 1.3v; why is yours 1.2v? Try using PowerColor's BIOS?
Yeah, I'm under water.


----------



## JCH979

Quote:


> Originally Posted by *Recr3ational*
> 
> If you do find a BIOS @ 1.4, do tell me. My card is currently at 1250/1650 @ 40c under load.
> It could go WAY higher if the voltage allows it.


I'm curious, what is your Valley score with those clocks?


----------



## Recr3ational

Quote:


> Originally Posted by *JCH979*
> 
> I'm curious, what is your Valley score with those clocks?


I had it up quite a while ago.
I'm willing to run it again for you after I finish work if you like?

Though I can only use one card for the moment, as my second card has a lot of air bubbles inside the block, so it's not being powered.


----------



## Tobe404

Quote:


> Originally Posted by *Recr3ational*
> 
> Luck maybe?
> My card is the PowerColor TurboDuo.
> My voltage wall is 1.3v; why is yours 1.2v? Try using PowerColor's BIOS?
> Yeah, I'm under water.


It's not maxed out at 1.2v anymore. I found a Gigabyte (my brand of 280X) BIOS that allows up to 1.3v.
Doesn't seem like it makes any difference to being able to go above 1150 on the core though, even with the Power Limit set to 20.
Oh well. Better than nothing, and at least at 1150 the voltage isn't that high.


----------



## Recr3ational

Quote:


> Originally Posted by *Tobe404*
> 
> It's not maxed out at 1.2v anymore. I found a Gigabyte (my brand of 280X) BIOS that allows up to 1.3v.
> Doesn't seem like it makes any difference to being able to go above 1150 on the core though, even with the Power Limit set to 20.
> Oh well. Better than nothing, and at least at 1150 the voltage isn't that high.


Wow, that sucks. You should be getting what you're happy with, especially with the money that we put in to buy these GPUs.


----------



## JCH979

Quote:


> Originally Posted by *Recr3ational*
> 
> I had it up quite a while ago.
> I'm willing to run it again for you after I finish work if you like?
> 
> Though I can only use one card for the moment, as my second card has a lot of air bubbles inside the block, so it's not being powered.


That would be nice if it's not too much to ask. I also did not realize you had two cards, but running it with one would be fine. It never occurred to me to check out other scores and see if my own results were OK. I've only compared my everyday 1200/1666 results against my max 1270/1840 from when I benched my card back on the 13.12 drivers.


----------



## Recr3ational

Quote:


> Originally Posted by *JCH979*
> 
> That would be nice if it's not too much to ask. I also did not realize you had two cards, but running it with one would be fine. It never occurred to me to check out other scores and see if my own results were OK. I've only compared my everyday 1200/1666 results against my max 1270/1840 from when I benched my card back on the 13.12 drivers.


Not a problem, mate.
You shouldn't really compare with other people's scores, as everyone has different cards and some are luckier than others, but it's a good way to check.
You are right to compare against your own scores.

Though when I was running my 7950 with one of the 280Xs, sometimes the scores were higher, so there's a factor somewhere that changes things around.


----------



## Arkanon

Quote:


> Originally Posted by *Roboyto*
> 
> I think it's probably because you have an MSI card and BIOS, similar to how GPU Tweak doesn't work without an ASUS BIOS. I may try flashing the BIOS on my card.


Might be so. Also: both the Gaming and Hawk cards come with a BIOS switch, with the second BIOS being optimised for LN2 cooling. I suspect the LN2 BIOS allows for more voltage; you just have to find a way to get it up higher by software, but most programs only allow around 1.3V. So in the end I think that the special BIOS plus upping the voltage slider in both programs allows me to go up to 1.4V. Seems like the only logical conclusion I can draw for now, as I don't know what else can explain getting the card up that high. If you want I could make a copy of the LN2 BIOS and upload it here; I'm just not sure if cards made by other manufacturers would accept the MSI-signed BIOS.


----------



## Roboyto

Quote:


> Originally Posted by *Arkanon*
> 
> Might be so. Also: both the Gaming and Hawk cards come with a BIOS switch, with the second BIOS being optimised for LN2 cooling. I suspect the LN2 BIOS allows for more voltage; you just have to find a way to get it up higher by software, but most programs only allow around 1.3V. So in the end I think that the special BIOS plus upping the voltage slider in both programs allows me to go up to 1.4V. Seems like the only logical conclusion I can draw for now, as I don't know what else can explain getting the card up that high. If you want I could make a copy of the LN2 BIOS and upload it here; I'm just not sure if cards made by other manufacturers would accept the MSI-signed BIOS.


I'll give the flash a shot to see what happens.

Are you able to get 1400 on a single slider? Or do you just move both programs to 1300 and end up with 1.4V?

I didn't test the card under load last night after I couldn't exceed 1320 on the slider; plus I was doing BD rips and didn't want any crashes. I'll try after work and see if under load it exceeds 1.3V


----------



## mcdoubleyou

Hey guys,
I am planning on buying an R9 280X. However, I am gonna put an NZXT Kraken G10 on it for better performance, so I was looking for the cheapest card with a reference design. The cheapest reference card I could find is the Club 3D R9 280X RoyalQueen edition, but I wonder if you can get the same performance as a RoyalKing edition by overclocking. Does the RoyalKing clock better than the RoyalQueen when they use the same cooler, or is the only difference the factory overclock?

Thanks in advance


----------



## Roboyto

Quote:


> Originally Posted by *Tobe404*
> 
> So basically there is no point OCing memory on a current GPU then? Or am I still missing somehting?


It can certainly help your performance. Someone started talking about cryptocurrency mining which can be very sensitive to RAM speeds.

For gaming, generally speaking, an OC on the RAM is going to increase your performance. For these AMD cards it is still most important to get as much out of the core speeds first, then worry about the memory.

When you get to the point of finding the peak/threshold for RAM clocks, you may see a small decrease in performance if you push a little too far. Backing down the clocks slightly at that point will probably give you your best performance.


----------



## Roboyto

Quote:


> Originally Posted by *mcdoubleyou*
> 
> Hey guys,
> I am planning on buying an R9 280X. However, I am gonna put an NZXT Kraken G10 on it for better performance, so I was looking for the cheapest card with a reference design. The cheapest reference card I could find is the Club 3D R9 280X RoyalQueen edition, but I wonder if you can get the same performance as a RoyalKing edition. I know the main difference is that the RoyalKing is factory overclocked, but are there any other differences that would make the RoyalKing perform better?
> 
> Thanks in advance


Club 3D just became available through Newegg in the states, so I don't know much about them.

2 big questions:

Can you remove the factory HSF without voiding the warranty?

Does the Royal King edition have upgraded components; PCB, power plugs, VRMs, etc?

Also, not sure where you are going to get a Kraken G10 presently as NZXT still hasn't shipped orders from as far back as January. I recently contacted FrozenCPU and they're expecting availability some time near the end of May. If you can find one anywhere, they will likely be severely marked up.

The 280X can do just fine with an air cooler, so my suggestion would be to get a very nice air cooled one for the time being and once the Kraken G10 are available again, make the upgrade then.

Friend of mine just bought a Visiontek R9 280X from NewEgg and it is a solid performer and very quiet even with fans at 80%; Clocked to 1185/1700. The visiontek also comes with a backplate and carries a lifetime warranty.

Remember, you get what you pay for in most cases.


----------



## mcdoubleyou

Well, I live in Belgium and I can get a G10 shipped to me for €32,42, which is a pretty decent price IMO. I just checked the site and I can't seem to find any differences besides the factory overclock. Could be I am missing something, but I guess that's the only difference. A little bit stupid that they charge 50 bucks more for just a factory overclock, but that doesn't matter







Also, the water cooling is partly cosmetic too, but I'm kinda new to the whole PC building thing







so far I've been learning a lot though








Oh yeah, and I read from another user that with Club 3D, replacing the stock cooler does not void the warranty


----------



## Arkanon

Quote:


> Originally Posted by *Roboyto*
> 
> I'll give the flash a shot to see what happens.
> 
> Are you able to get 1400 on a single slider? Or do you just move both programs to 1300 and end up with 1.4V?
> 
> I didn't test the card under load last night after I couldn't exceed 1320 on the slider; plus I was doing BD rips and didn't want any crashes. I'll try after work and see if under load it exceeds 1.3V


+100mv in AB (totaled to 1.296v in GPU-Z)
+125mv in iTurbo (totaled to 1.413v in GPU-Z)
So yes, I had to max out the voltage sliders of both programs separately. Good luck, hope it works. I'll keep my fingers crossed.

270xGaming.zip 41k .zip file


----------



## Recr3ational

It's all the Extreme preset; I just had to disable full screen as I have 3 monitors.

Also, 1250/1650 vs 1200/1600 is like a 6 fps difference, so not worth the extra voltage.


----------



## Roboyto

Quote:


> Originally Posted by *mcdoubleyou*
> 
> Well, I live in Belgium and I can get a G10 shipped to me for €32,42, which is a pretty decent price IMO. I just checked the site and I can't seem to find any differences besides the factory overclock. Could be I am missing something, but I guess that's the only difference. A little bit stupid that they charge 50 bucks more for just a factory overclock, but that doesn't matter
> 
> 
> 
> 
> 
> 
> 
> Also, the water cooling is partly cosmetic too, but I'm kinda new to the whole PC building thing
> 
> 
> 
> 
> 
> 
> 
> so far I've been learning a lot though
> 
> 
> 
> 
> 
> 
> 
> 
> Oh yeah, and I read from another user that with Club 3D, replacing the stock cooler does not void the warranty


If you can get the G10 easily, then I say DO IT! :-D


----------



## Devildog83

Today I went to a local computer repair shop, and while there I asked why the top card in my X-Fire setup might be getting hot. I told him that at one point it hit 91c, which is hot, but then he said that the card is toast and I should never go above 50c. At this point I knew he was full of it, because most cards these days will run a lot hotter than 50c. I told him that the card could run at 65c for hours without any damage, and he thought I was nuts. My card is getting too hot, but I am just going to replace the TIM and thermal pads and see if that doesn't help. I am also using Sapphire Trixx now, and for some reason the card seems to get hotter with it; I would never even see 80c before, so I am going back to Afterburner. Trixx also isn't very stable for me.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> Today I went to a local computer repair shop, and while there I asked why the top card in my X-Fire setup might be getting hot. I told him that at one point it hit 91c, which is hot, but then he said that the card is toast and I should never go above 50c. At this point I knew he was full of it, because most cards these days will run a lot hotter than 50c. I told him that the card could run at 65c for hours without any damage, and he thought I was nuts. My card is getting too hot, but I am just going to replace the TIM and thermal pads and see if that doesn't help. I am also using Sapphire Trixx now, and for some reason the card seems to get hotter with it; I would never even see 80c before, so I am going back to Afterburner. Trixx also isn't very stable for me.


50C







AMD/ATI cards can run at 90C all day without issues. I emailed ATI tech support back in the 5XXX days because my buddy had Xfire 5790s and the top card was doing the same thing yours is: in the 80s, occasionally seeing 90. They said it was no problem at all. It will likely reduce life expectancy, but they can do it.

My Vapor-X 4890 would hit 110 with big OC and voltages with the stock cooler.... Then I went water and it never broke 50C.

Upgrade TIM and thermal pads, and maybe switch around your fan setup. Got some pics of how airflow is running through your case?

Seems different cards do better with different tuning programs. I just use whatever works


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> 50C
> 
> 
> 
> 
> 
> 
> 
> AMD/ATI cards can run at 90C all day without issues. I emailed ATI tech support back in the 5XXX days because my buddy had Xfire 5790s and the top card was doing the same thing yours is: in the 80s, occasionally seeing 90. They said it was no problem at all. It will likely reduce life expectancy, but they can do it.
> 
> My Vapor-X 4890 would hit 110 with big OC and voltages with the stock cooler.... Then I went water and it never broke 50C.
> 
> Upgrade TIM and thermal pads, and maybe switch around your fan setup. Got some pics of how airflow is running through your case?


I know, that guy didn't know what he was talking about.

I just replaced Trixx with Afterburner and ran Valley at my everyday clocks and hit 81c which is better but still 14c over the bottom card @ 67c max. I will, as I said, replace the TIM and thermal pads and see if I can get it to come down more. I also ran it at the user defined setting and I normally just ramp it up to a solid 85% fan when benching and it keeps it from heating up that much.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> I know, that guy didn't know what he was talking about.
> 
> I just replaced Trixx with Afterburner and ran Valley at my everyday clocks and hit 81c which is better but still 14c over the bottom card @ 67c max. I will, as I said, replace the TIM and thermal pads and see if I can get it to come down more. I also ran it at the user defined setting and I normally just ramp it up to a solid 85% fan when benching and it keeps it from heating up that much.


I wouldn't lose any sleep over 80C on the top card. It can be better, but it is definitely not going to blow up. If you were in the 95C area, then I would begin to worry.

One of the most popular TIMs for GPUs is Gelid GC Extreme; it has always dropped temps on GPUs whenever I have used it. It has no cure time, so you get max performance immediately, and it is non-conductive, so if you get it on your PCB you have nothing to worry about.


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> I wouldn't lose any sleep over 80C on the top card. It can be better, but it is definitely not going to blow up. If you were in the 95C area, then I would begin to worry.
> 
> One of the most popular TIMs for GPUs is Gelid GC Extreme; it has always dropped temps on GPUs whenever I have used it. It has no cure time, so you get max performance immediately, and it is non-conductive, so if you get it on your PCB you have nothing to worry about.


Thanks, I was just about to ask that question.


----------



## Arkanon

Valley 1.0 run. If I'm not mistaken, those scores are about bang on stock 280X territory.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> Thanks, I was just about to ask that question.


I'm quite fond of ArctiClean: a 2-step cleaner and purifier for removing TIM and other things. Excellent stuff!


----------



## Devildog83

Quote:


> Originally Posted by *Devildog83*
> 
> Thanks, I was just about to ask that question.


What about pads? What's the best to use on a GPU?


----------



## Recr3ational

Quote:


> Originally Posted by *Roboyto*
> 
> I'm quite fond of ArctiClean: a 2-step cleaner and purifier for removing TIM and other things. Excellent stuff!


I love ArctiClean. It's a lifesaver.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> I love ArctiClean. It's a lifesaver.


I meant thermal pads for the GPU.

Good lord I need a drink.


----------



## Devildog83

I have added a link to PSU reviews to the OP because it seems to be right in the ballpark for a 280X and even 2x 270/270X. Thanks to Shilka for the post. I also tried to add the updated 14.4 beta drivers, but I need lessons on how to use the interface better.


----------



## shilka

Quote:


> Originally Posted by *Devildog83*
> 
> I have added a link to PSU reviews to the OP because it seems to be right in the ballpark for a 280X and even 2x 270/270X. Thanks to Shilka for the post. I also tried to add the updated 14.4 beta drivers, but I need lessons on how to use the interface better.


750 watts is enough even for 2x 280X cards, unless you start to volt mod them. Anyway, yes, you are free to link it if you want; it was made to be used.


----------



## Devildog83

Quote:


> Originally Posted by *shilka*
> 
> 750 watts is enough even for 2x 280X cards, unless you start to volt mod them. Anyway, yes, you are free to link it if you want; it was made to be used.


True, my Devils were overclocked to the max and running Prime95 plus Valley at the same time, and drew just over 600w at the wall at max peak, and that was with an overclocked FX 8350, 9 fans, and lights.


----------



## shilka

Quote:


> Originally Posted by *Devildog83*
> 
> True, my Devils were overclocked to the max and running Prime95 plus Valley at the same time, and drew just over 600w at the wall at max peak, and that was with an overclocked FX 8350, 9 fans, and lights.


I have an older 1000 watts thread which I need to rework when I get around to it:
http://www.overclock.net/t/1438987/best-fully-modular-1000-watts-psu


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> What about pads? What's the best to use on a GPU?


Fujipoly Ultra Extreme!

http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html

Check my post here from a few days ago:

http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/10#post_22105025

That was using only the Ultra pad, which has 2/3 the thermal conductance of the Ultra Extreme. I dropped 7C on the VRMs of my 270X, about 10%. If you use the Ultra Extreme, you should get similar or better results.

The one thing to be wary of is how well the heatsinks will stick to the thermal pad. They are tacky, but if your card(s) are installed in a standard fashion where the card is upside down, they may not have enough adhesion to keep the sinks attached. It really depends on your card. If you have one where the VRM cooling is part of the whole heatsink, then you have nothing to worry about. If it is like my PowerColor, where the VRM heatsinks are attached with thermal tape, then the thermal pads may not hold.


----------



## bardacuda

Quote:


> Originally Posted by *Devildog83*
> 
> I just replaced Trixx with Afterburner and ran Valley at my everyday clocks and hit 81c which is better but still 14c over the bottom card @ 67c max. I will, as I said, replace the TIM and thermal pads and see if I can get it to come down more. I also ran it at the user defined setting and I normally just ramp it up to a solid 85% fan when benching and it keeps it from heating up that much.


What if you switch the two cards around? Does the card that's currently on bottom run even worse on top than the card you currently have on top?


----------



## microkid21

Bad news!









After I RMA'ed my RAM, which I thought was the culprit of the BSODs/crashes, the issue is still not solved. I purchased new RAM to use temporarily while waiting for my RMA'ed RAM, and after testing it in my system and playing for 30 or 40 minutes, the BSOD happened again with bug check code *0xa000001 (atikmdag.sys) ATI kernel mode driver,* which is the very same BSOD that I've experienced for months now. Again, my card is a PowerColor R9 270X Devil (it was tested at the shop where I purchased it, and they found no trouble).


----------



## mxfreek09

Well, with the weather starting to warm up a bit and my Gigabyte R9 270X temps slowly starting to climb as well, I decided it was time to see if I could do anything to drop the temps a bit. The stock TIM wasn't as bad as I was expecting as far as being a huge mess, but I ended up dropping the card by 8c just by swapping the paste for some MX-4. Here are some pics if anyone is interested:


Spoiler: Warning: Spoiler!




The tools










The card










Wasn't as bad as I was expecting really.


Before


After


Before


After


It came out a bit fast; the reason it's kind of a mess here is because I removed a bit of excess. May have used a bit too much.


And back into the Core 1000. Please excuse the cable management.











So far I am happy with the results. I have only tested in Valley so far with my OC of 1200/1500, but my before temps were up around 78c with the weather warming up, and the after temps dropped to 70c at the max and were down around 68c for periods of time.


----------



## diggiddi

Quote:


> Originally Posted by *microkid21*
> 
> Bad news!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After I RMA'ed my RAM, which I thought was the culprit of the BSODs/crashes, the issue is still not solved. I purchased new RAM to use temporarily while waiting for my RMA'ed RAM, and after testing it in my system and playing for 30 or 40 minutes, the BSOD happened again with bug check code *0xa000001 (atikmdag.sys) ATI kernel mode driver,* which is the very same BSOD that I've experienced for months now. Again, my card is a PowerColor R9 270X Devil (it was tested at the shop where I purchased it, and they found no trouble).


That is the ATI driver causing the crash, use DDU in safe mode to uninstall and then reinstall it


----------



## microkid21

Quote:


> Originally Posted by *diggiddi*
> 
> That is the ATI driver causing the crash, use DDU in safe mode to uninstall and then reinstall it


I've already done that, to no avail. The OS is freshly installed.
What I've done so far to try to fix/solve the issue:

1. Updated motherboard BIOS
2. Drivers used (13.12, 14.3, 14.1, 14.2, 13.10)
3. Used DDU to uninstall the previous driver
4. Already followed instructions from AMD
5. Already followed this as well.
6. Drivers are up to date.
7. PSU is a Corsair CS650M
8. Windows is up to date


----------



## diggiddi

I think you have a slightly unstable overclock that is messing up your driver.
Do you have WhoCrashed? Download it if you don't.


----------



## microkid21

Quote:


> Originally Posted by *diggiddi*
> 
> I think you have a slightly unstable overclock that is messing up your driver.
> Do you have WhoCrashed? Download it if you don't.


I don't do overclocking; no overclock was applied during gaming and testing. I used BlueScreenView to get an idea of what's causing this weird blue screen; the result is in the link to my previous post.


----------



## diggiddi

Quote:


> Originally Posted by *microkid21*
> 
> I don't do overclocking; no overclock was applied during gaming and testing. I used BlueScreenView to get an idea of what's causing this weird blue screen; the result is in the link to my previous post.


I used to use BlueScreenView too; I find WhoCrashed simpler.


----------



## mcdoubleyou

Nice thanks







Do you recommend a closed water loop to go with it? I was thinking about an H55, but will the rad be big enough or do I need a 240mm rad?


----------






## microkid21

Quote:


> Originally Posted by *diggiddi*
> 
> I used to use bluescreen viewer too, I find whocrashed simpler


Whichever you use, it's still the same. In my case the BSOD points to atikmdag.sys. Do you have an idea about this?


----------



## Unknownm

If anyone with a 280X can get 1.3v+, please send me the BIOS. I want to try flashing it on my Gigabyte card; if it fails I can always switch back to the original BIOS.









Thanks


----------



## diggiddi

Quote:


> Originally Posted by *microkid21*
> 
> Whichever you use, it's still the same. In my case the BSOD points to atikmdag.sys. Do you have an idea about this?


Follow every step:
Step 1: Download and install the latest ATI graphics driver. AND DON'T TURN OFF OR RESTART YOUR COMPUTER AFTER INSTALLATION

Step 2: Then go to C:\Windows\System32\Drivers and rename atikmdag.sys to atikmdag.sys.old.

Step 3: Go to ati directory (usually in C:\ATI) and find the file atikmdag.sy_.

Step 4: Copy the file to your Desktop directory.

Step 5: Open cmd.exe by going to Start -> type cmd in the search box and hit enter.

Step 6: Change the directory to Desktop by typing chdir Desktop.

Step 7: Then, type EXPAND.EXE atikmdag.sy_ atikmdag.sys. Or,

expand -r atikmdag.sy_ atikmdag.sys

Step 8: When the expansion is complete, copy the new atikmdag.sys from your Desktop to

C:\Windows\System32\Drivers

Step 9: Restart your computer and the problem should be resolved.

It's not a faulty motherboard or card problem! The driver is the main culprit here!
Source http://www.tomshardware.com/answers/id-1656824/atikmdag-sys-error-bsod-startup.html

These may help you if the above does not
http://support.amd.com/en-us/kb-articles/Pages/737-27116RadeonSeries-ATIKMDAGhasstoppedrespondingerrormessages.aspx
http://www.overclock.net/t/420875/possible-solution-to-atikmdag-driver-crash
http://www.computertipsfree.com/fix-43029-atikmdag-sys-blue-screen-bsod-in-windows-7/
http://forums.guru3d.com/showthread.php?t=363233
http://windows7themes.net/en-us/fix-atikmpag-sys-atikmdag-sys-blue-screen-errors-bsod/
http://atikmdag.com/

I think I used to have that problem and it was definitely a driver issue; it stopped when I updated drivers.
What was your last good driver?
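If anyone ends up repeating those rename-and-expand steps for every driver install, they can be scripted. A rough Python sketch of the same procedure (the paths and the EXPAND.EXE invocation are taken from the steps above; the packed `.sy_` location varies per driver package, so treat this as a sketch to adapt, not a tested tool, and run it from an elevated prompt):

```python
import shutil
import subprocess

# Paths from the steps above; adjust PACKED to wherever your driver package extracted
DRIVER = r"C:\Windows\System32\Drivers\atikmdag.sys"
PACKED = r"C:\ATI\atikmdag.sy_"

def expand_command(packed, target):
    # Step 7: Windows' EXPAND.EXE decompresses the packed .sy_ into a fresh .sys
    return ["EXPAND.EXE", packed, target]

def replace_driver(packed=PACKED, driver=DRIVER):
    # Step 2: keep the old driver around in case the swap makes things worse
    shutil.move(driver, driver + ".old")
    # Steps 7-8: decompress the packed copy straight into the Drivers directory
    subprocess.run(expand_command(packed, driver), check=True)

# replace_driver()  # uncomment and run on the affected machine, then reboot (Step 9)
```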


----------



## microkid21

Quote:


> Originally Posted by *diggiddi*
> 
> Follow every step:
> Step 1: Download and install the latest ATI graphics driver. AND DON'T TURN OFF OR RESTART YOUR COMPUTER AFTER INSTALLATION
> 
> Step 2: Then go to C:\Windows\System32\Drivers and rename atikmdag.sys to atikmdag.sys.old.
> 
> Step 3: Go to ati directory (usually in C:\ATI) and find the file atikmdag.sy_.
> 
> Step 4: Copy the file to your Desktop directory.
> 
> Step 5: Open cmd.exe by going to Start -> type cmd in the search box and hit enter.
> 
> Step 6: Change the directory to Desktop by typing chdir Desktop.
> 
> Step 7: Then, type EXPAND.EXE atikmdag.sy_ atikmdag.sys. Or,
> 
> expand -r atikmdag.sy_ atikmdag.sys
> 
> Step 8: When the expansion is complete, copy the new atikmdag.sys from your Desktop to
> 
> C:\Windows\System32\Drivers
> 
> Step 9: Restart your computer and the problem should be resolved.
> 
> It's not a faulty motherboard or card problem! The driver is the main culprit here!
> Source http://www.tomshardware.com/answers/id-1656824/atikmdag-sys-error-bsod-startup.html
> 
> These may help you if the above does not
> http://support.amd.com/en-us/kb-articles/Pages/737-27116RadeonSeries-ATIKMDAGhasstoppedrespondingerrormessages.aspx
> http://www.overclock.net/t/420875/possible-solution-to-atikmdag-driver-crash
> http://www.computertipsfree.com/fix-43029-atikmdag-sys-blue-screen-bsod-in-windows-7/
> http://forums.guru3d.com/showthread.php?t=363233
> http://windows7themes.net/en-us/fix-atikmpag-sys-atikmdag-sys-blue-screen-errors-bsod/
> http://atikmdag.com/
> 
> I think I used to have that problem and it was definitely a driver issue; it stopped when I updated drivers.
> What was your last good driver?


I've already done that to no avail. The OS is freshly installed.
What I've done so far to try to fix the issue:

1. Updated the motherboard BIOS
2. Tried drivers *13.12*, 14.3, 14.1, 14.2, 13.10
3. Used DDU to uninstall the previous driver
4. Already followed the instructions from AMD
*5. Already followed this as well.*
6. Drivers are up to date.
7. PSU is a Corsair CS650M
8. Windows is up to date


----------



## Tobe404

microkid21 - This might be a long shot, but have you tried a different GPU BIOS?
I'd only recommend this as a last resort, and only if your GPU has a dual BIOS.


----------



## Arkanon

Rather than flashing the BIOS, I'd just opt to RMA the card.


----------



## bardacuda

Except that he's already said the card was tested in a different machine and worked fine... although it was at the shop where he bought it, so they could be biased.


----------



## Arkanon

AFAIK the problem didn't occur right away either, but only after playing a game for a while. They might have tested it by running one benchmark or so, which wasn't long enough for the problem to manifest.


----------



## bardacuda

That's why I say they could be biased







....He should try it in a friend's machine and put the gears to it. If any crash, or even the slightest flicker or artifact, shows up for the briefest of milliseconds, RMA that sucker.


----------



## Tobe404

Quote:


> Originally Posted by *Unknownm*
> 
> If anyone with a 280X can get 1.3V+, please send me the BIOS. I want to try flashing it on my Gigabyte card; if it fails I can always switch back to the original BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks


What clocks/voltage are you sitting on at the moment?


----------



## mcdoubleyou

Ok, thanks







Also, would an H55 be enough to watercool an R9 280X, or should I get a 240mm rad?


----------



## JCH979

Quote:


> Originally Posted by *Recr3ational*
> 
> 
> 
> It's all the Extreme preset; I just had to disable full screen as I have 3 monitors.
> 
> Also, 1250/1650 vs 1200/1600 is like a 6 fps difference, so it's not worth the extra voltage.




Here's my run at 1270/1840; I did try higher, but there wasn't any difference.

I'm still not sure how good it is, and I also agree that there isn't much of a difference coming from my everyday 1200/1666. I just wanted to see how far I could push this card with my current config and 1.3V.


----------



## Recr3ational

Quote:


> Originally Posted by *Unknownm*
> 
> If anyone with a 280X can get 1.3V+, please send me the BIOS. I want to try flashing it on my Gigabyte card; if it fails I can always switch back to the original BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks


Use the PowerColor Turbo Duo BIOS, if you can find one on TechPowerUp.

Link

It shows up as reference design but says Turbo Duo.


----------



## mikemykeMB

This has probably been answered already, but will enabling both CCC graphics OD and MSI AB cause a game crash?


----------



## Roboyto

Quote:


> Originally Posted by *mikemykeMB*
> 
> This has probably been answered already, but will enabling both CCC graphics OD and MSI AB cause a game crash?


I wouldn't suggest using OverDrive at all, let alone with another overclocking utility.

Not sure if they have fixed this issue with CCC, but it used to save whatever settings you applied and load them on boot. So if you push the card too far and cause crashes, you can end up in a crashing boot loop.


----------



## Roboyto

Quote:


> Originally Posted by *mcdoubleyou*
> 
> Ok, thanks
> 
> 
> 
> 
> 
> 
> 
> Also, would an H55 be enough to watercool an R9 280X, or should I get a 240mm rad?


A 120mm radiator is more than enough, even to cool a 290X with a large OC.

Friend of mine is going to be using H55 on his VisionTek 280X, PM me in a few days and I can let you know how it goes.


----------



## mikemykeMB

Quote:


> Originally Posted by *Roboyto*
> 
> I wouldn't suggest using OverDrive at all, let alone with another overclocking utility.
> 
> _Not sure if they have fixed this issue with CCC,_ but it used to save whatever settings you applied and load them on boot. So if you push the card too far and cause crashes, you can end up in a crashing boot loop.


That might be what is happening; I have to reset after a crash. I'll change up the setting(s), disable OD, and try again later today. Thanks man.


----------



## Tobe404

This is my Valley score at 1200/1500 @ 1.235V.



Valley seems to trip out a bit quicker on OC's compared to Heaven.


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> I wouldn't suggest using OverDrive at all, let alone with another overclocking utility.
> 
> _Not sure if they have fixed this issue with CCC,_ but it used to save whatever settings you applied and load them on boot. So if you push the card too far and cause crashes, you can end up in a crashing boot loop.


+rep.


----------



## Roboyto

Quote:


> Originally Posted by *mikemykeMB*
> 
> That might be what is happening; I have to reset after a crash. I'll change up the setting(s), disable OD, and try again later today. Thanks man.


No problem, let us know how it goes.

Quote:


> Originally Posted by *rdr09*
> 
> +rep.


Thanks!


----------



## Devildog83

Quote:


> Originally Posted by *mcdoubleyou*
> 
> Ok, thanks
> 
> 
> 
> 
> 
> 
> 
> Also, would an H55 be enough to watercool an R9 280X, or should I get a 240mm rad?


I would try the H75; it's much better than the H55.


----------



## Devildog83

Quote:


> Originally Posted by *mikemykeMB*
> 
> This has probably been answered already, but will enabling both CCC graphics OD and MSI AB cause a game crash?


Yes, Graphics OverDrive in CCC combined with any other clocking tool will cause instability. Disable Graphics OverDrive in CCC.


----------



## Devildog83

Quote:


> Originally Posted by *microkid21*
> 
> Whichever you use, it's still the same. In my case the BSOD points to atikmdag.sys. Do you have any idea about this?


Maybe a dumb question, but do you also have Graphics OverDrive enabled? It can cause issues, and I think it has to do with OverDrive and another clocking tool both polling system info at the same time.


----------



## trabunco

Hi. New user here. Mine is Asus R9 270X. Here's my Unigine Valley score.


----------



## mxfreek09

The rig that I have my 270X in right now is built out of all spare/used parts, and I have been slowly upgrading it every paycheck. I don't want to get too far out of control with it, but I can't help but think: how much is my FX 4130 holding my 270X back? I'm on a bare-minimum motherboard (M5A78-M LX Plus) and it can only handle an OC of 4.1, and I don't see much of a difference with that, if any at all. Would it be worth it to step up to an FX 6300, or should I wait to see if there is anything worthwhile in the Intel Summer Deal that's coming up? I don't want to get too far off topic, just looking to get the most out of my 270X.


----------



## Roboyto

Quote:


> Originally Posted by *mxfreek09*
> 
> The rig that I have my 270X in right now is built out of all spare/used parts, and I have been slowly upgrading it every paycheck. I don't want to get too far out of control with it, but I can't help but think: how much is my FX 4130 holding my 270X back? I'm on a bare-minimum motherboard (M5A78-M LX Plus) and it can only handle an OC of 4.1, and I don't see much of a difference with that, if any at all. Would it be worth it to step up to an FX 6300, or should I wait to see if there is anything worthwhile in the Intel Summer Deal that's coming up? I don't want to get too far off topic, just looking to get the most out of my 270X.


With that very old chipset on your current motherboard you would be best to get a new MoBo/CPU to enhance performance of your PC as a whole. The FX4130 is a fair chip, but even an i3 with hyperthreading shows its weaknesses in most scenarios. An entry level Z87/LGA1150 board with an i5 would be ideal for best price/performance ratio now and in the long run, so you don't have to worry about upgrading again too soon.

You could even pick up a used Z77/LGA1155 and an i5, or even i7, for a reasonable price.

I would also keep my eyes peeled in the marketplace here, on eBay, and CraigsList for when the Z97 and Devil's Canyon hit the streets as people will undoubtedly be upgrading to see what these new chips are all about.


----------



## fabiovtec

Best driver for the R9 270X? This is my first AMD card.
BF4 is the main game.


----------



## Roboyto

Quote:


> Originally Posted by *fabiovtec*
> 
> Best driver for the R9 270X? This is my first AMD card.
> BF4 is the main game.


Overall 13.12 is rock solid.

You can dabble in beta drivers (14.4, I believe, is the latest), but be aware that they are beta drivers. Mantle may or may not cause issues, if you want to use it. Depending on your other hardware, Mantle may not be necessary.


----------



## bardacuda

Quote:


> Originally Posted by *mxfreek09*
> 
> The rig that I have my 270X in right now is built out of all spare/used parts, and I have been slowly upgrading it every paycheck. I don't want to get too far out of control with it, but I can't help but think: how much is my FX 4130 holding my 270X back? I'm on a bare-minimum motherboard (M5A78-M LX Plus) and it can only handle an OC of 4.1, and I don't see much of a difference with that, if any at all. Would it be worth it to step up to an FX 6300, or should I wait to see if there is anything worthwhile in the Intel Summer Deal that's coming up? I don't want to get too far off topic, just looking to get the most out of my 270X.


Quote:


> Originally Posted by *Roboyto*
> 
> With that very old chipset on your current motherboard you would be best to get a new MoBo/CPU to enhance performance of your PC as a whole. The FX4130 is a fair chip, but even an i3 with hyperthreading shows its weaknesses in most scenarios. An entry level Z87/LGA1150 board with an i5 would be ideal for best price/performance ratio now and in the long run, so you don't have to worry about upgrading again too soon.
> 
> You could even pick up a used Z77/LGA1155 and an i5, or even i7, for a reasonable price.
> 
> I would also keep my eyes peeled in the marketplace here, on eBay, and CraigsList for when the Z97 and Devil's Canyon hit the streets as people will undoubtedly be upgrading to see what these new chips are all about.


I agree with Rob here. If the 4130 is holding you back, a 6100 is not really going to be much better. A little more IPC and 2 extra cores, sure, but you're still stuck in a dead-end socket with a weak chip, still wondering how much it's holding you back.

Any 3+GHz i5 from Sandy onward is going to be pretty much as good as it gets for a while, since Broadwell might not be available till next year, and even then it's just a die shrink of Haswell, which itself was pretty much the same as Sandy/Ivy with the main improvement being power consumption. If you can wait for the Haswell refresh to come out this summer, you might be able to pick up a cheap used socket 1150 i5 chip/mobo combo. If not, you can probably find a cheap used 1155 i5 chip/mobo right now and still have no real reason to upgrade until Skylake, a year and a half or more from now.


----------



## mxfreek09

I think it's gonna come down to what Intel offers with their summer deal. If I can get an 1150 CPU cheap from there, then I will look into an mATX board that supports CrossFire so I can pick up a second 270X.


----------



## Unknownm

Quote:


> Originally Posted by *Tobe404*
> 
> What clocks/voltage are you sitting on at the moment?


My BIOS loads 1.18V by default; TriXX allows me to apply 1.256V max. The Rev02 BIOS (mine is Rev01) allows up to 1.3V, but that's it. I was wondering if there is any other BIOS I could try in the hope that it pushes more than 1.3V.


----------



## Tobe404

Just did another Valley run at 1210/1600 @ 1.235V.



Proper screenshot this time...


----------



## ricko99

Anyone here with an ASUS R9 280X DC2T V1 (two-slot version) having severe artifacts when playing games? I tried downclocking both the core and memory clocks, but it still gave me artifacts. These are my MSI AB settings. Note the default clocks from ASUS are 1070/1600, while the reference R9 280X is 1000/1500. My memory is Hynix (according to GPU-Z).


----------



## JaredLaskey82

Quote:


> Originally Posted by *ricko99*
> 
> Anyone here with an ASUS R9 280X DC2T V1 (two-slot version) having severe artifacts when playing games? I tried downclocking both the core and memory clocks, but it still gave me artifacts. These are my MSI AB settings. Note the default clocks from ASUS are 1070/1600, while the reference R9 280X is 1000/1500. My memory is Hynix (according to GPU-Z).


I am having no problems with my two ASUS R9 DCU2 Top cards.
Check the bios version of your card.
You could try GPU Tweak as well to see if that works better for your computer configuration.

I am not running the Beta version of the AMD driver but using the 13.215


----------



## Devildog83

Quote:


> Originally Posted by *trabunco*
> 
> Hi. New user here. Mine is Asus R9 270X. Here's my Unigine Valley score.


DCII OC or TOP? Do you have a pic and clocks?


----------



## ricko99

Quote:


> Originally Posted by *JaredLaskey82*
> 
> I am having no problems with my two ASUS R9 DCU2 Top cards.
> Check the bios version of your card.
> You could try GPU Tweak as well to see if that works better for your computer configuration.
> 
> I am not running the Beta version of the AMD driver but using the 13.215


Do you run MSI AB at the same time as GPU Tweak? Do you keep the power limit at just 100% all the time? Also, I tried the 13.12 and 14.x drivers and it still gives me artifacts. I've updated the card's BIOS to the latest one (15.41) as well.


----------



## trabunco

Quote:


> Originally Posted by *Devildog83*
> 
> DCII OC or TOP, do you have a pic and clocks?


Hello. Will this pic do? My clocks were 1300/6000 @ 1.3V.
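A side note for anyone comparing the memory numbers in this thread: GDDR5 does four data transfers per memory-clock cycle, so some tools report the base clock (e.g. 1500) while others report the effective rate (e.g. 6000), and both describe the same setting. A quick sanity check:

```python
# GDDR5 is quad-pumped: effective data rate = 4x the base memory clock.
def effective_gddr5(base_mhz):
    """Convert a GDDR5 base memory clock (MHz) to its effective rate."""
    return base_mhz * 4

print(effective_gddr5(1500))  # 6000: stock 280X memory, quoted both ways in this thread
```

So 1300/6000 here is the same memory setting others would write as 1300/1500.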


----------



## Amph

Is there a way to change the voltage on the new 280X Vapor-X with TriXX?


----------



## mcdoubleyou

Pricier too, and if an H55 is largely good enough, there's no need to spend more, right?


----------



## fabiovtec

Quote:


> Originally Posted by *trabunco*
> 
> Hello. Will this pic do? My clocks were 1300/6000 @ 1.3V.


Which drivers do you use in your 270x?


----------



## Tobe404

@ fabiovtec - Looks like trabunco is using these drivers.

AMD Catalyst 14.1 BETA 1.6 (13.350.1005.0)


----------



## trabunco

Quote:


> Originally Posted by *fabiovtec*
> 
> Which drivers do you use in your 270x?


The 14.3 beta.


----------



## Tobe404

I stand corrected then (sort of). Still the same actual driver version/number though...


----------



## diggiddi

Quote:


> Originally Posted by *JaredLaskey82*
> 
> I am having no problems with my two ASUS R9 DCU2 Top cards.
> Check the bios version of your card.
> You could try GPU Tweak as well to see if that works better for your computer configuration.
> 
> I am not running the Beta version of the AMD driver but using the 13.215


Is that version of GPU Tweak working well? I tried 2.5.9 and got vertical lines across my screen.


----------



## Recr3ational

Hmm, GPU Tweak; I might have to try that to see if it will allow more voltage.

Guys, can you help me list every respectable OC software?

Afterburner
TriXX
HIS iTurbo
GPU Tweak...


----------



## diggiddi

EVGA Precision


----------



## Devildog83

*trabunco* Has been added. Welcome!!

Hey guys I just bought the Corsair 750D and will be switching my system into it next week. I am stoked for the room to do serious water-cooling.


----------



## diggiddi

Does anyone here have the 280? What's a good 280X? I notice the Matrix Platinum is out of stock; are they problematic?

I think I'll end up with either a 280 or a 280X to CrossFire with my 7950; I have to return my 7970s first.


----------



## Recr3ational

Quote:


> Originally Posted by *diggiddi*
> 
> Does anyone here have the 280? What's a good 280X? I notice the Matrix Platinum is out of stock; are they problematic?
> 
> I think I'll end up with either a 280 or a 280X to CrossFire with my 7950; I have to return my 7970s first.


Keep your 7970s rather than a 280X + 7950 crossfire.


----------



## diggiddi

Quote:


> Originally Posted by *Recr3ational*
> 
> Keep your 7970s rather than a 280X + 7950 crossfire.


They are acting up and have to go back


----------



## bardacuda

Quote:


> Originally Posted by *trabunco*
> 
> Hello. Will this pic do? My clocks were 1300/6000 @ 1.3V.
> 
> 
> Spoiler: Warning: Spoiler!


Nice clocks and score! The highest 7870/270/X in the Valley thread is 39.4FPS and that was @ 1350 core (but he overshot on the mem to 1550)


----------



## JaredLaskey82

I am only using ASUS GPU Tweak.
It works fine for single or dual cards.
Quote:


> Originally Posted by *diggiddi*
> 
> Is that version of Gpu Tweak working well? I tried 2.5.9 and got vertical lines across my screen


It is version 2.4.9.2 and works fine. It holds all my saved profiles for single card and crossfire.


----------



## RussSki

I have a new Sapphire Dual-X R9 280X and I was playing some BF4 today.

On the Shanghai map with 64 players and everything set on Ultra, I sometimes dipped down to 28-30 FPS and averaged around 45-50 FPS. I was just wondering: is this below average, or is this generally what people with 280Xs get on a big 64-player map?

I'm running at default speeds (1020MHz core clock and 1500MHz memory clock); I made sure in GPU-Z that the full 1020MHz was being used and that there was no throttling. My temps are completely fine; neither the CPU nor the GPU goes above 60C. Also, I'm using the latest beta driver, 14.3.

The rest of my specs:
i5 2500k at stock
8GB G.Skill RipjawsX at 1600Mhz
ASUS P8Z68-V Pro/Gen3
Coolermaster V700 PSU
120GB Intel 330 SSD
Windows 7 64bit


----------



## trabunco

Quote:


> Originally Posted by *bardacuda*
> 
> Nice clocks and score! The highest 7870/270/X in the Valley thread is 39.4FPS and that was @ 1350 core (but he overshot on the mem to 1550)


Thanks!


----------



## Majentrix

Does anyone know if the MSI R9 270 is voltage locked?
I know that the Gigabyte model is, and that the Asus model isn't.


----------



## Arkanon

Quote:


> Originally Posted by *Majentrix*
> 
> Does anyone know if the MSI R9 270 is voltage locked?
> I know that the Gigabyte model is, and that the Asus model isn't.


My MSI 270X ain't voltage locked. Rumor has it that the newly produced 270s are locked, though.


----------



## boyagin

Hello to all 270, 270x, 280, 280x owners!

Just wondering, does anyone have any issues with your card? I work as a PC technician selling PC hardware, and recently many of my customers have RMA'd their 270X and 280X cards (both by ASUS) because of strange artifacts, tearing, green boxes, and lines on the screen while gaming, or even while watching a movie in 1080p.
I'd like some feedback from ASUS owners; I wonder if the last batch we received was actually a bad one?


----------



## Arkanon

Quote:


> Originally Posted by *boyagin*
> 
> Hello to all 270, 270x, 280, 280x owners!
> 
> Just wondering, does anyone have any issues with your card? I work as a PC technician selling PC hardware, and recently many of my customers have RMA'd their 270X and 280X cards (both by ASUS) because of strange artifacts, tearing, green boxes, and lines on the screen while gaming, or even while watching a movie in 1080p.
> I'd like some feedback from ASUS owners; I wonder if the last batch we received was actually a bad one?


AFAIK a BIOS update solved the flickering, artifacting, etc.
Just head over to the ASUS main site; for the 270X there are 2 new BIOSes available that should solve the issue.


----------



## boyagin

Yeah. I've seen
Quote:


> Originally Posted by *Arkanon*
> 
> AFAIK a BIOS update solved the flickering, artifacting, etc.
> Just head over to the ASUS main site; for the 270X there are 2 new BIOSes available that should solve the issue.


Yeah, the 270X update did solve the problem. Unfortunately, there isn't one for the 280X. =(


----------



## Arkanon

check here:

http://support.asus.com/Download.aspx?SLanguage=en&m=R9280X-DC2T-3GD5&p=9&s=10


----------



## Tobe404

Is it just a few manufacturers with the issue, or is it spread across all R9 270(X)/280(X) cards?
Or, then again, just luck of the draw? I've seen a few people with Toxics have the issue as well.
So far my Gigabyte R9 280X hasn't shown any of the above issues.


----------



## Arkanon

From what I've read, there's something wrong with the memory timings on some of the cards, not all of them, which causes the cards to artifact. Every brand has 270(X)/280X cards out there which are faulty.


----------



## bardacuda

Quote:


> Originally Posted by *boyagin*
> 
> Hello to all 270, 270x, 280, 280x owners!
> 
> Just wondering, does anyone have any issues with your card? I work as a PC technician selling PC hardware, and recently many of my customers have RMA'd their 270X and 280X cards (both by ASUS) because of strange artifacts, tearing, green boxes, and lines on the screen while gaming, or even while watching a movie in 1080p.
> I'd like some feedback from ASUS owners; I wonder if the last batch we received was actually a bad one?


I have an R9270-DC2OC. This is the 2nd one now... I RMA'd the first one, but they both have the same problem. The problem, though, isn't artifacting; it's that the cooler sucks balls and the TDP is set too low. Both would throttle even at stock if I didn't increase the power limit, and when benchmarking/under load (with the power limit set higher so it doesn't throttle, but everything else at stock) the temp just keeps climbing over 90ºC until I manually shut it down.

I haven't bothered to RMA this one because it seems they are all just manufactured this way and the cooler sucks, and I'd just end up in a never-ending RMA feedback loop. I bought it as a 24/7 mining card, so I just undervolted the crap out of it, and that keeps the temps in line. For anyone using it as a normal graphics card, though, unless you plan on watercooling, stay away. And even if you do plan on watercooling, get something better than a 270.


----------



## boyagin

Quote:


> Originally Posted by *Arkanon*
> 
> check here:
> 
> http://support.asus.com/Download.aspx?SLanguage=en&m=R9280X-DC2T-3GD5&p=9&s=10


Yeah, I know, and I tried that version before. But the problem is still there. Thank you by the way =)


----------



## boyagin

Quote:


> Originally Posted by *bardacuda*
> 
> I have an R9270-DC2OC. This is the 2nd one now... I RMA'd the first one, but they both have the same problem. The problem, though, isn't artifacting; it's that the cooler sucks balls and the TDP is set too low. Both would throttle even at stock if I didn't increase the power limit, and when benchmarking/under load (with the power limit set higher so it doesn't throttle, but everything else at stock) the temp just keeps climbing over 90ºC until I manually shut it down.
> 
> I haven't bothered to RMA this one because it seems they are all just manufactured this way and the cooler sucks, and I'd just end up in a never-ending RMA feedback loop. I bought it as a 24/7 mining card, so I just undervolted the crap out of it, and that keeps the temps in line. For anyone using it as a normal graphics card, though, unless you plan on watercooling, stay away. And even if you do plan on watercooling, get something better than a 270.


Thanks for the input =) Appreciate it.


----------



## pitroy

Does anyone have an R9 270X BIOS hex mod guide? Please show me!

// Sorry for my bad English!


----------



## nitroxyl

Does anyone have a Gelid Icy Vision-A aftermarket VGA cooler on their card? If so, how is the performance?


----------



## neurotix

Quote:


> Originally Posted by *bardacuda*
> 
> Nice clocks and score! The highest 7870/270/X in the Valley thread is 39.4FPS and that was @ 1350 core (but he overshot on the mem to 1550)




My 270X Vapor-X @ 1300/1500MHz with an FX-8350 @ 4.7.

Now that I have a Haswell instead of an FX-8350, with PCI-E 3.0 and higher single-thread performance, I could do even better if I took that card out of my sister's computer and put it in my rig.


----------



## cremelo

Anyone with 280X CrossFire???
I want to compare with my CrossFireX.









Asus R9 280X DC2 Top @ 1100Mhz / 6400Mhz and Sapphire R9 280X Toxic @ 1100Mhz / 6400Mhz


----------



## Spade616

Just got to 1320/1500 @ 1.318V stable; that's about the highest my 270X can get. 3059 in the FurMark 1080p test, 760 on Heaven.


----------



## Arkanon

My highest so far, on an i5-750 @ 4GHz with the 270X @ 1360/1500.

I should have print-screened it instead of saving the results file.
It should also yield slightly better results with a better CPU and higher clocks.

Edit: no, my bad. I just noticed that was my previous highest score on my 270X; the highest I've gotten so far is 41.6 FPS. Will upload the result later today when I'm back on my desktop.


----------



## fabiovtec

Can you guys make 1250MHz with stock voltage?


----------



## Roboyto

Quote:


> Originally Posted by *fabiovtec*
> 
> Can you guys make 1250MHz with stock voltage?


Nope; it would be amazing if you could get that far with stock voltage. 1317mV gets me to a 1280 core max on my 270X Devil.


----------



## Luckael

Hi all, I have a PowerColor R9 270X Devil and I plan to watercool it. Will the NZXT G10 fit the 270X Devil? My temp while playing Batman: Arkham Origins reaches 91C, and I don't know if this is normal. My case is a Corsair 250D with 1x 140mm intake and 1x 120mm exhaust.


----------



## Devildog83

Quote:


> Originally Posted by *fabiovtec*
> 
> Can you guys make 1250MHz with stock voltage?


I have done Valley and CatZilla runs with mine at 1250 using Afterburner and +20; the voltage is locked in Afterburner. It's very iffy though, and I am running X-Fire, so my 7870 is on top and seems to help. By itself the best I get is about 1240. The 7870 Devil I can run at 1280, but it can go to 1.3V.


----------



## Devildog83

Quote:


> Originally Posted by *Luckael*
> 
> Hi all, I have a PowerColor R9 270X Devil and I plan to watercool it. Will the NZXT G10 fit the 270X Devil? My temp while playing Batman: Arkham Origins reaches 91C, and I don't know if this is normal. My case is a Corsair 250D with 1x 140mm intake and 1x 120mm exhaust.


91 is too high, so you either need to up the fan profile or lower the clocks. Why the heck would you buy a Devil and then watercool it? They look so smexy. Just going to add that you may need more airflow in the case too.


----------



## Luckael

Yes, it's sexy. Because I'm getting high temps. But when I'm using his 270X, the max temp is 71C.


----------



## Devildog83

Quote:


> Originally Posted by *Luckael*
> 
> Yes, it's sexy. Because I'm getting high temps. But when I'm using his 270X, the max temp is 71C.


What clocks?


----------



## Luckael

@Devildog83
Default clocks. I don't know why my Devil is getting high temps.

Here is my setup:


----------



## Devildog83

Quote:


> Originally Posted by *Luckael*
> 
> @Devildog83
> Default clocks. I don't know why my Devil is getting high temps.
> 
> Here is my setup:


Has it been like that since you got it? Maybe it's just a bad TIM application; have you tried reapplying the thermal paste? My HD 7870 Devil is getting warm too, and I will try some better TIM when it arrives Monday. It was recommended to me by Roboyto, and I researched it and found it to be top-notch stuff.


----------



## Spade616

Quote:


> Originally Posted by *Luckael*
> 
> @Devildog83
> Default clocks. I don't know why my Devil is getting high temps.
> 
> Here is my setup:


If your temps are high despite having the fan on max, it's just because of your ambient temps. That factor always dictates your overall temps.


----------



## Luckael

Quote:


> Originally Posted by *Devildog83*
> 
> Has it been like that since you got it? Maybe it's just a bad TIM application; have you tried reapplying the thermal paste? My HD 7870 Devil is getting warm too, and I will try some better TIM when it arrives Monday. It was recommended to me by Roboyto, and I researched it and found it to be top-notch stuff.


I will try to reapply new TIM by next week.


----------



## neurotix

What TIM are you getting Devildog?

I like PK-3 Nano myself, but apparently I'm in the minority because nobody else really seems to like it.


----------



## Spade616

Quote:


> Originally Posted by *Luckael*
> 
> I will try to reapply new TIM by next week.


Really, the odds that your GPU has a poor factory TIM application are really low, and I can almost guarantee that nothing will change even after reapplying it. What are your ambient temps?


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> What TIM are you getting Devildog?
> 
> I like PK-3 Nano myself, but apparently I'm in the minority because nobody else really seems to like it.


GELID Solutions GC-Extreme Thermal Compound


----------



## Devildog83

Quote:


> Originally Posted by *Spade616*
> 
> Really, the odds that your GPU has a poor factory TIM application are really low, and I can almost guarantee that nothing will change even after reapplying it. What are your ambient temps?


Well, I will try it and get back to you on that. My issue also has to do with it being the top card, I think, because my 270X Devil gets maybe 70C while the HD 7870 gets over 80C right now. The ambient temp is low and I have no airflow issues, as I am running 9 fans in a C70 case and all else is very cool. It's either reapply the TIM or RMA the card.


----------



## bardacuda

Quote:


> Originally Posted by *Luckael*
> 
> @Devildog83
> Default clocks. I don't know why my Devil is getting high temps.
> 
> Here is my setup:


How do you have it in that orientation? Is there some kind of expansion card in your PCI-E slot? Looks like the card is right up next to the window on your case. That could be the problem.


----------



## Luckael

Quote:


> Originally Posted by *Spade616*
> 
> Really, the odds that your GPU has a poor factory TIM application are really low, and I can almost guarantee that nothing will change even after reapplying it. What are your ambient temps?


Ambient temp is 32.2C.


----------



## Devildog83

Quote:


> Originally Posted by *Luckael*
> 
> Ambient temp is 32.2C.


It does look like you lack some airflow in that case.


----------



## Luckael

Quote:


> Originally Posted by *Devildog83*
> 
> Well, I will try it and get back to you on that. My issue also has to do with it being the top card, I think, because my 270X Devil gets maybe 70C while the HD 7870 gets over 80C right now. The ambient temp is low and I have no airflow issues, as I am running 9 fans in a C70 case and all else is very cool. It's either reapply the TIM or RMA the card.


I don't think they'd accept an RMA for the card just because of high temps.

Quote:


> Originally Posted by *bardacuda*
> 
> How do you have it in that orientation? Is there some kind of expansion card in your PCI-E slot? Looks like the card is right up next to the window on your case. That could be the problem.


Because my case is a Corsair 250D mini-ITX.


----------



## diggiddi

Quote:


> Originally Posted by *neurotix*
> 
> What TIM are you getting Devildog?
> 
> I like PK-3 Nano myself, but apparently I'm in the minority because nobody else really seems to like it.


PK3 is good stuff for sure


----------



## neurotix

I love it. It lowered my CPU temps by 7C compared to Arctic Silver 5. No cure time, and it's non-conductive. It's also relatively easy to apply and remove; it's sort of clay-like, but not quite as hard. So I usually use the pea method with my H100i, but use a Q-tip to push the blob down evenly into a little circle so it spreads more uniformly when pressure is applied. It's easy to clean. I tried IC7 Nano Diamond and was within 2C of AS5, and it was incredibly thick and difficult to apply properly. It also stained my IHS.

Some older TIM tests show PK-1 with better temps than AS5 but worse temps than a lot of competing pastes like MX-4, Ceramique, and so on. I've yet to find an up-to-date test comparing PK-3 to other pastes. I've heard some people say the only thing better is Coollaboratory Liquid Ultra, and it's only 2 degrees better or so. From what I've heard, the Liquid Ultra can only be used once per $10 application. I periodically take my water block off and remove the radiator + fans to blow dust out of it. I can't afford to spend $10 on Liquid Ultra each time I do this. I got a huge 30 gram tube of PK-3 for like $30 and have a ton left. It seems to be the next best thing.


----------



## Tobe404

Just did a few Valley runs at 1165/1600 @ 1.165 V...



Really don't think the extra voltage/temps (1.235 V / 70°C)
for 35-45 MHz more core (1200-1210), 1 extra fps, and 4°C hotter is worth it.

Played hours of Crysis 3 on the above clocks last night so I'm hoping it stays stable.


----------



## Roboyto

Quote:


> Originally Posted by *Spade616*
> 
> really, the odds that your gpu has poor factory TIM application is really low, and i can almost guarantee that nothing will change even after re-applying it. what are your ambient temps?


Every GPU I have ever applied new TIM to has lowered its operating temperatures, regardless of it being brand new or 5 years old. They either use too much, too little, or lower performing TIM.

My Devil had WAY too much TIM on it when I opened it up.



Spoiler: Too much TIM









Quote:


> Originally Posted by *neurotix*
> 
> What TIM are you getting Devildog?
> 
> I like PK-3 Nano myself but apparently I'm a minority because nobody else really seems to like it.


I feel you, I love Xigmatek PTI-G4512, but it's not a very popular paste.

Quote:


> Originally Posted by *Luckael*
> 
> Hi all, i have a powercolor r9 270x devil, then i have plan to water cool, does Nzxt G10 will fit for 270x devil? Because my temp while playing batman arkham origins reach 91c i don't know if this is normal. My case is corsair 250d. 1x 140mm intake then 1x 120mm exhaust.


It may fit; the only downside to the Kraken G10 is that it's designed to cool reference-style cards, where the VRMs are at the far end of the board by the PCIe power connectors. The VRMs on the 270X Devil are on the opposite side of the card, nearest the video outputs. You would have to be certain you cooled the VRMs with some good airflow, otherwise you would have problems.

I have the same case and card. I just received fans to fill every spot in the case: 200mm front, (2) 120mm side, (2) 80mm rear. Keep an eye on my build log to see how it affects temperatures.

http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition

Quote:


> Originally Posted by *nitroxyl*
> 
> Does anyone have a Gelid Icy Vision-a aftermarket VGA cooler on their card? If so, how is the performance?


I know some guys have used them on the 290(X) with pretty good results; not as good as the Arctic Accelero Extreme 3, but the Arctic is enormous by comparison. Check the link below:

http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x

The Gelid should work wonders on a 270/280 card









The Accelero Twin Turbo II is another alternative. I have used that on (3) HD4890s and a 7870. In every instance they were absolutely silent while keeping temperatures amazingly low.


----------



## Arkanon

Highest score so far on my 270x @ 1360/1500
Doubt there's more in it. Also coming ridiculously close to mild overclocked 280x's


----------



## microkid21

Quote:


> Originally Posted by *Tobe404*
> 
> microkid21 - This might be a long shot but have you tried a different GPU bios?
> I'd only recommend this as a last resort and only if you have a dual bios on your GPU.


I haven't tried to update or change GPU bios yet.
How would I know if I have dual bios on my GPU?
Quote:


> Originally Posted by *Arkanon*
> 
> Rather than flashing the bios i'd just opt for the choice of RMA'ing the card.


They did not RMA the card because the card is NTF (No Trouble Found).
Quote:


> Originally Posted by *bardacuda*
> 
> Except that he's already said the card has been tested in a different machine and worked fine....although it was in the shop where he bought it so they could be biased.


I watched them with my own eyes while they conducted the test; they tried several games and ran them at the highest settings available, and the BSOD/crash did not occur. So it's not biased.


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> I haven't tried to update or change GPU bios yet.
> How would I know if I have dual bios on my GPU?
> They did not RMA the card because the card is NTF(No Trouble Found)
> I've watched them with my own eye while they conduct the test, they tried several games and run it in highest settings available and the BSOD / crash did not occur. So Its not biased.


At this point I would begin to suspect an issue with your MoBo since the RAM didn't fix the problem, AND using the card in another computer did. It would be highly unlikely to get another bad set of RAM.

Do you have another computer you can put your card in?


----------



## Devildog83

Quote:


> Originally Posted by *Arkanon*
> 
> 
> 
> Highest score so far on my 270x @ 1360/1500
> Doubt there's more in it. Also coming ridiculously close to mild overclocked 280x's


That's awesome, man those are impressive clocks for a 270x.


----------



## Devildog83

My income tax was approved, so not only is my 750D going to be here Monday but I will be ordering water-cooling components next week too. Is it just me or does it seem like X-Mas around here?


----------



## Anonizer

@Roboyto

Did you use the spread technique when applying TIM?


----------



## Roboyto

Quote:


> Originally Posted by *Anonizer*
> 
> @Roboyto
> 
> Did you use the spread technique when applying TIM?


Nah, just a little dot in the middle of the GPU die. Set the HSF down on it, give a little pressure, and then check to see how it spreads. The GPU die is so smooth and flat it should spread really well with little to no effort. You could always add a little extra TIM if there isn't enough.


----------



## bardacuda

Quote:


> Originally Posted by *Arkanon*
> 
> 
> 
> Highest score so far on my 270x @ 1360/1500
> Doubt there's more in it. Also coming ridiculously close to mild overclocked 280x's


Noice


----------



## darkelixa

My reference Sapphire R9 290 is leaking oil out of a screw, so I'm looking to buy an R9 280X now. What would be a decent vendor's card to buy? Asus, Gigabyte?


----------



## neurotix

Quote:


> Originally Posted by *darkelixa*
> 
> My r9 290 ref sapphire is leaking oil out of a screw, so im looking to buy an r9 280x now. What would be a decent vendors card to buy? Asus , gigabyte??


Get the Sapphire Vapor-X or Toxic. You won't be disappointed.


----------



## darkelixa

So Sapphire is a good brand? Just a bit burned because my R9 290 is leaking some sort of weird oil out of the back. But if it's a good brand I'll buy another. So Sapphire is better than the Asus and Gigabyte versions?


----------



## neurotix

Have you considered RMAing the card? You should be able to RMA it with Althon Micro and get a new card within 2-3 weeks.

I'm not sure how their reference cards are, but their custom cards are excellent. They have very good cooling. Coolers like the Tri-X, Toxic and Vapor-X consistently outperform competing coolers, like the Windforce and DCUII. Here you can see that the Tri-X has the best temps compared to the MSI Gaming and ASUS DCUII TOP. If you're getting a 280X you should look into reviews of the Toxic and Vapor-X models. The Vapor-X in particular runs very cool due to the Vapor chamber, which most other cards don't have.


----------



## darkelixa

I will be RMAing the card, but here in Australia it usually takes weeks if not months to get a card back from the vendor if they repair it, so I need to buy a new one for while it's away, plus I'll probably use the new R9 290 in an even newer build down the track.

My budget does not really allow for an R9 290, and the games I play aren't demanding enough to really warrant buying one, so I will be purchasing an R9 280X because they are better than the GTX 770, are they not?

Yes, I had a quick look on Newegg and there were quite a lot of complaints about the Sapphire R9 280X Vapor-X and the Toxic; I don't know if Newegg reviews are something I should really rely on though?

I do love the look of the R9 280X Toxic as it will go really well with my ASRock Z87 OC Formula board.


----------



## Majentrix

Are the premium 270xs (msi's hawk, sapphire's toxic) worth the additional cash? Do they overclock better?
I'm looking at picking up two of them now that the mining hype has died down


----------



## Devildog83

Quote:


> Originally Posted by *Majentrix*
> 
> Are the premium 270xs (msi's hawk, sapphire's toxic) worth the additional cash? Do they overclock better?
> I'm looking at picking up two of them now that the mining hype has died down


Some of the higher-end cards have a lot of room to overclock and some are already close to their max. The Hawk and Toxic seem to be good overclocking cards.


----------



## neurotix

Afaik the Hawk, Toxic, Lightning, Devil, and so on use the highest-binned chips. You have a good chance of getting a chip with a high ASIC that will overclock well on air.

These are also all premium cards and should have no trouble staying cool even when overclocked.
Quote:


> Originally Posted by *darkelixa*
> 
> I will be RMA the card, but here in Australia it usually takes weeks if not months to get a card back from the vendor if they repair it so i need to buy a new one for when its down plus ill prob use the new r9 290 on a even newer build down the track.
> 
> My budget does not really alow for an up to r9 290 and the games i play arent demanding enough to really warrant buying one so i will be purchasing a r9 280x because they are better than the 770gtx, are they not?
> 
> Yes i had a quick look on newegg and there were quite alot of complaints on the sapphire r9 280x vapor-x and the toxic, i dont know if newegg reviews are something that I should really rely on thou?
> 
> I do love the look of the r9 280x toxic as it will go really well with my asrock z87 oc formula board


Ah, I see, so it will take a while to RMA.

As far as the 280X being better than the GTX 770, I think that's kind of subjective. Some people really like the 770. The 770 is basically just a 680. The advantage to having a 280X is that the 280X has 3GB VRAM on a 384-bit bus with high memory bandwidth. The GTX 770 only has 2GB of VRAM on a 256-bit bus. This means that in really demanding titles or at high resolution, the 280X comes out on top.
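To put rough numbers on that bandwidth claim, here's a quick back-of-the-envelope sketch. Note the 6.0 and 7.0 Gbps effective memory data rates assumed below are the reference specs; board partners ship variations.

```python
# Peak memory bandwidth comparison: R9 280X vs GTX 770.
# bandwidth (GB/s) = bus width in bytes x effective data rate (Gbps per pin)

def bandwidth_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * effective_gbps

r9_280x = bandwidth_gbs(384, 6.0)  # 384-bit bus at reference 6.0 Gbps
gtx_770 = bandwidth_gbs(256, 7.0)  # 256-bit bus at reference 7.0 Gbps

print(f"R9 280X: {r9_280x:.0f} GB/s, GTX 770: {gtx_770:.0f} GB/s")
```

Peak bandwidth alone doesn't decide fps, but the wider bus is a big part of why the 280X pulls ahead at high resolutions with demanding texture loads.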

The Toxic does look good. The Sapphire logo lights up in yellow.





It also has a sick looking backplate. I'm really disappointed that my two Tri-X don't have these features. The cooler is the same afaik so you'd think the Tri-X could have the light up logo and backplate. My only complaints about my cards.

EDIT: Oh, and Newegg reviews might not matter much. I've read reviews before buying stuff, and there's always a few people that get dead cards or have serious problems. Yet I've NEVER received a dead card from Newegg or had any issues with them that weren't my own fault. I also buy primarily Sapphire stuff and have yet to have a dead card from them. If you know what you're doing with computers, you're careful and ground yourself before putting cards in, make sure you install drivers correctly, and know how to optimize Windows and overclock properly, you aren't going to have problems. Some people THINK they know what they're doing but don't, and I guarantee a lot of those bad reviews are from idiots.


----------



## darkelixa

Yes, I really love the look of that card. I'm going to buy it with a brand new PSU just to be on the safe side, most likely a Silverstone 750W Gold. I have never had a DOA from any manufacturer, to be honest.


----------



## End3R

As a recently new r9 owner, I will say that while the card is a beast, it has been a rollercoaster of frustration trying to resolve that flickering issue. (I *THINK* I currently have it resolved, but I've thought that a few times already, only time will tell now.)

FYI the steps I've taken to resolve the r9 flicker:
Refreshed Windows, completely re-installed Windows, updated/cleaned my drivers with pretty much every version since 2012, switched from an HDMI to a DVI cable, and finally set my minimum clock speeds in Afterburner to 250/400 instead of the 150/300 it keeps downclocking to when idle. Strangely though, last I checked on my lunch break, while streaming Netflix, I didn't notice anything wrong, but Afterburner showed the memory clock jumping between 400 and 1400 over and over. Again, there was nothing wrong performance-wise so I'm not sure if it mattered; I'll be testing more once I get off work tonight.

I really hope I can get it working because I don't want to deal with sending it back, and like I said it is a total beast, when it does work, it plays anything and everything at their best settings.


----------



## darkelixa

Can't say I have had a flickering issue. What brand and model GPU is it?


----------



## End3R

Quote:


> Originally Posted by *darkelixa*
> 
> Cant say I have had a flickering issue, what brand and model gpu is it


It's a PowerColor R9 270X, not sure the exact model at the moment, but if you just google "r9 flickering" you'll get tons of pages about it; it seems to be widespread on a lot of R9 cards. From what I can gather it's a problem with the drivers and the clock speeds being underclocked too far when the card is displaying 2D applications. Since changing my minimum clock speeds I haven't had the issue, but like I said I did notice the memory clock jumping around, and without seeing anything obviously wrong on the screen it's hard to tell that anything was actually wrong with it.


----------



## darkelixa

sapphire fixed it with a bios update


----------



## End3R

Quote:


> Originally Posted by *darkelixa*
> 
> sapphire fixed it with a bios update


Yea from what I hear Sapphire and MSI released their own drivers that fixed it, I checked PowerColor's site and downloaded their drivers but it did nothing. Assuming my clock change fixed the issue for me, is the memory clock jumping like I described something to worry about? I haven't noticed any decrease in performance when playing my games. And *knocks on wood* worst case scenario if I see it pop up again, would it be the end of the world if I tried to install Sapphire or MSI drivers? From what I can tell AMD drivers seem to be fairly universal between their cards, way more than NVidia.


----------



## microkid21

Quote:


> Originally Posted by *Roboyto*
> 
> At this point I would begin to suspect an issue with your MoBo since the RAM didn't fix the problem, AND using the card in another computer did. It would be highly unlikely to get another bad set of RAM.
> 
> Do you have another computer you can put your card in?


I don't have one. I wouldn't say the mobo has an issue here, because before this I was using the same mobo with a card from the green team, a GTX 460, and I did not encounter this kind of problem. It all started when I upgraded/switched to red, to the 270X I'm currently using. Is there a possibility that the PSU is the culprit here? Or is it just a driver-related issue?


----------



## boot318

Is this good for an R9 280x? 1150 core & 1700 memory.....


----------



## Roboyto

Quote:


> Originally Posted by *microkid21*
> 
> I don't have. I would not say that the mobo has an issue here, because before I'm using this mobo with another card from the green team which is gtx 460 and I did not encounter this kind of problem. It all started when I upgrade / switch to red and which I'm currently using right now the 270x. Is there a possibility that the PSU is the culprit here? Or it is just a driver related issue?


My Devil 270X with i7 3770k runs on a 450W PSU, and my R9 290 and 4770k run on a 650W; so you have plenty of power to spare. Unless your PSU is faulty then it is unlikely that it is causing the problem. It couldn't hurt to test with a different PSU if you have one lying around.

You have stated multiple times that you have re-installed Windows and used numerous drivers, so I don't think that is the issue. For best stability you should use 13.12. Use Display Driver Uninstaller to scrub all previous drivers.

Here is something else to try because we are really running out of options. Download Sapphire Trixx and try downclocking the GPU core a little to say 1150. Also, try giving it a little extra voltage. My card at default 1180/1400 runs at 1200mV. Try bumping it up to 1225mV, or even a little higher, and see what happens.

Just because your GTX 460 worked prior to you switching cards, doesn't mean that a problem couldn't have arisen with your MoBo now. Try reinstalling the GTX 460 if you still have it and see if it exhibits the same issues. Make sure to scrub drivers if you do so.

Every ASRock board I have owned has had issues with PCIe or RAM slots. My ASRock 890FX Extreme4 PCIe failed, Z77 Extreme6 PCIe failed, Z77 Extreme6 replacement RAM DIMMs failed, the 2nd replacement Z77 Extreme6 wouldn't clock RAM over 2000MHz, and my ASRock Z77 E-ITX PCIe slot failed.

ASRock's RMA process has gotten much worse since I first started purchasing their products back in 2009. I don't really suggest them as a choice for motherboards anymore. My RMA for the ITX board was a nightmare. I went around in circles via e-mail for 2 weeks, then didn't hear a response for 1 month. I finally sent one last e-mail complaining about the ridiculously long wait and the RMA was approved the next day.


----------



## darkelixa

Have not had one issue with my ASRock Z87 OC Formula yet.


----------



## Devildog83

Quote:


> Originally Posted by *boot318*
> 
> 
> 
> Is this good for an R9 280x? 1150 core & 1700 memory.....


Not bad at all, I am still waiting for a 280x to hit 50 fps average. Has anyone seen that yet?


----------



## Tobe404

Quote:


> Originally Posted by *boot318*
> 
> 
> 
> Is this good for an R9 280x? 1150 core & 1700 memory.....


Seems about right...



I get FPS: 48.1/Score: 2014/Min FPS 23.7/Max FPS: 93.7.

1205 core/1600 mem/1175 volts. Gigabyte R9 280x Rev 2.

What brand of 280x is yours and what voltage?


----------



## End3R

Quote:


> Originally Posted by *End3R*
> 
> Yea from what I hear Sapphire and MSI released their own drivers that fixed it, I checked PowerColor's site and downloaded their drivers but it did nothing. Assuming my clock change fixed the issue for me, is the memory clock jumping like I described something to worry about? I haven't noticed any decrease in performance when playing my games. And *knocks on wood* worst case scenario if I see it pop up again, would it be the end of the world if I tried to install Sapphire or MSI drivers? From what I can tell AMD drivers seem to be fairly universal between their cards, way more than NVidia.


Here is what I was talking about: the left side of the graph is from when I was streaming Netflix. Is that rapid jumping in my clock speed something to worry about? The right-hand side of the graph is from when I launched Far Cry 3, so no issues there. Thankfully I've still had no ugly flicker issues since changing my minimum clock speeds.


----------



## M1kuTheAwesome

Does anyone here know where I could get Fujipoly pads in Europe? Shipping from the states costs twice as much as the pads themselves.
As for the flickering issue, that explains what I saw the other day, though that was with overclocking applied. At stock settings it's fine.


----------



## neurotix

I think we should clarify. When you say they released their own drivers, do you mean a VBIOS? If so, there should be no problem flashing a new, updated VBIOS as long as it matches your card model. Theoretically, you should be able to use ANY manufacturer's VBIOS for any card as long as it matches your card's model (270X bios for a 270X, I don't think ATIWinflash will even let you flash a different cards bios to your card). However, in practice this doesn't always work. One of my 290s won't work with any bios other than the one it came with, I tried a newer Sapphire Tri-X bios and a few others and it would only display video in a low resolution, even after installing the Catalyst drivers. My other 290, however, seems to work fine with any 290 bios I've tried on it. I haven't tried flashing my 270X because I've had no reason to.

I guess I'm not sure what the problem is? My 290s, my 7970, my 270X all jump around in clock speed when watching video or playing certain old games (very old, DX8). If it's flickering, it could be a codec issue or it could be a problem with your overclock. In general, it's not good to watch videos while overclocked, sometimes you WILL see flickering. Set your card to base clocks and voltage and try.

I'm just gonna throw this out there, but try installing this: http://www.divx.com/en/software/divx I had an issue with my 290 when I first got it where my entire system would freeze every time I tried to watch MPEG videos. I would have to power my machine off via the power button and turn it back on. This happened with all video and I thought my card was defective until my girlfriend found that installing DivX fixed it. You might also want to try getting the combined community codec pack and installing that.


----------



## neurotix

Sorry for double post guys.
Quote:


> Originally Posted by *Devildog83*
> 
> Not bad at all, I am still waiting for a 280x to hit 50 fps average. Has anyone seen that yet?


@ Devildog

I had to dig to find this.



That's my 7970 @ 1200/1600mhz with my FX at 4.7.

7970/280X should be able to easily break 50 fps if you know what you're doing. You should also get 10k+ in 3dmark11 regardless of processor.

I'm not gonna do it because I'm lazy but I bet if I took my 7970 out of my girlfriend's machine and put it in here, now that I have Intel I might be able to break 60 fps.


----------



## End3R

Quote:


> Originally Posted by *neurotix*
> 
> I think we should clarify. When you say they released their own drivers, do you mean a VBIOS? If so, there should be no problem flashing a new, updated VBIOS as long as it matches your card model. Theoretically, you should be able to use ANY manufacturer's VBIOS for any card as long as it matches your card's model (270X bios for a 270X, I don't think ATIWinflash will even let you flash a different cards bios to your card). However, in practice this doesn't always work. One of my 290s won't work with any bios other than the one it came with, I tried a newer Sapphire Tri-X bios and a few others and it would only display video in a low resolution, even after installing the Catalyst drivers. My other 290, however, seems to work fine with any 290 bios I've tried on it. I haven't tried flashing my 270X because I've had no reason to.
> 
> I guess I'm not sure what the problem is? My 290s, my 7970, my 270X all jump around in clock speed when watching video or playing certain old games (very old, DX8). If it's flickering, it could be a codec issue or it could be a problem with your overclock. In general, it's not good to watch videos while overclocked, sometimes you WILL see flickering. Set your card to base clocks and voltage and try.
> 
> I'm just gonna throw this out there, but try installing this: http://www.divx.com/en/software/divx I had an issue with my 290 when I first got it where my entire system would freeze every time I tried to watch MPEG videos. I would have to power my machine off via the power button and turn it back on. This happened with all video and I thought my card was defective until my girlfriend found that installing DivX fixed it. You might also want to try getting the combined community codec pack and installing that.


Sorry for the confusion, when I said this is what I was talking about, I was talking specifically about seeing the clocks jump around when playing videos, it was something I never noticed before and was wondering if it was something I should worry about.

Hopefully I resolved the flickering issue by changing the minimum clock speeds, I haven't experienced it since so everything seems ok so far. If however I come back in a few days because my screen started to have a seizure again (the flickering can be intense), then I will probably be asking for help on how to update the bios of my card with the msi/sapphire fix. But hopefully I won't need to do that.


----------



## neurotix

Quote:


> Originally Posted by *End3R*
> 
> Sorry for the confusion, when I said this is what I was talking about, I was talking specifically about seeing the clocks jump around when playing videos, it was something I never noticed before and was wondering if it was something I should worry about.
> 
> Hopefully I resolved the flickering issue by changing the minimum clock speeds, I haven't experienced it since so everything seems ok so far. If however I come back in a few days because my screen started to have a seizure again (the flickering can be intense), then I will probably be asking for help on how to update the bios of my card with the msi/sapphire fix. But hopefully I won't need to do that.


It's really easy to update your VBIOS. Let us know if you have issues again and someone can help you do it. Nowadays, you don't even need to use DOS to do it, you can do it from within Windows using ATIWinflash. Some people still recommend you use DOS, I'm not sure why. I've flashed my 290s as well as a 7970 using ATIWinflash and never had any problems.

Yeah, the clocks jumping around happens. I just played a few MPEG videos. According to my AIDA64 sidebar gadget, my clocks jump between 825 (minimum), 1100 and 1328 mhz (I'm sure the 1328 is probably wrong). The memory clock seems to stay the same, at 1500mhz. All of my AMD cards have always done this when playing video. My 6870 did it, my 270X, my 7970s did, and my 290 does. It's totally normal.
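For anyone curious what the flash looks like from the command line, the usual sequence with the atiflash CLI is roughly the following. This is a sketch: the adapter index 0 and the file names are placeholders for your own setup, and ATIWinflash exposes the same save/program steps from its GUI.

```shell
# List detected adapters and their indices (make sure you target the right card)
atiflash -i

# Save a backup of adapter 0's current VBIOS before touching anything
atiflash -s 0 backup.rom

# Program the new VBIOS onto adapter 0 (only use a BIOS for the same card model)
atiflash -p 0 new_vbios.rom

# Reboot afterwards so the card re-initializes with the new VBIOS
```

Saving the backup first is what makes reverting painless if the new image misbehaves.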


----------



## Unknownm

Quote:


> Originally Posted by *Tobe404*
> 
> Seems about right...
> 
> 
> 
> I get FPS: 48.1/Score: 2014/Min FPS 23.7/Max FPS: 93.7.
> 
> 1205 core/1600 mem/1175 volts. Gigabyte R9 280x Rev 2.
> 
> What brand of 280x is yours and what voltage?


I was almost able to get to 50 fps with Extreme HD (8x AA), using the Rev02 BIOS to allow 1.3 V @ 1180 MHz (artifact free). The shots are from my DSLR using high ISO, only because I didn't have Fraps, Bandicam or Dxtory open at the time.


----------



## End3R

Well, spoke too soon; the flickering came back, so I went ahead and flashed my card. Looks like it was successful: the core clock before was 1080, not 1070, and it thinks it's a Sapphire card now. Hopefully this resolves the issue once and for all.










Edit: Yeeeeea so that didn't work, seems fine until I launch a game, then it crashes the system, reverting back to my previous bios now. (Don't worry it's not bricked, it's back to saying PowerColor vendor and my games are playing again.)


----------



## neurotix

RMA it.


----------



## Tobe404

@ neurotix - Given our clocks are the same, yet your score is 4-5 fps higher,
would you attribute that to driver version, or the higher clock/more cores on your CPU?
Or is it just one of those things that can't really be pinpointed? Too many variables.


----------



## End3R

Quote:


> Originally Posted by *neurotix*
> 
> RMA it.


Gonna try. I bought the system from CyberPowerPC, and their parts all have a 3-year warranty, but I'm not sure exactly what that entails; I'll be contacting their support for sure. The question is, though, should I ask for another R9 but specify it be a Sapphire, or just jump back to Nvidia and see if they'd give me the closest thing to it, which I think would be a GTX 760.


----------



## boot318

Quote:


> Originally Posted by *Devildog83*
> 
> Not bad at all, I am still waiting for a 280x to hit 50 fps average. Has anyone seen that yet?




I almost got there. My GPU seems to like 1200. Tried 1225 and artifacts appear. I could try 1215 but I like to add +25 to my GPU. I'm weird or something..... I'll try later today though.
Quote:


> Originally Posted by *Tobe404*
> 
> Seems about right...
> 
> 
> 
> I get FPS: 48.1/Score: 2014/Min FPS 23.7/Max FPS: 93.7.
> 
> 1205 core/1600 mem/1175 volts. Gigabyte R9 280x Rev 2.
> 
> What brand of 280x is yours and what voltage?


Sapphire Vapor-X. 1150 core and 1700 memory. 1.250 V on core and 1.560 V on memory (this card has separate voltage for core and memory). The core is holding me back a lot on this card.


----------



## alancsalt

To get those @ mentions to work, type the member's name, highlight, and click on the @ symbol in your reply toolbar.

Otherwise it is:

Code:


[@]membername[/@]


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Sorry for double post guys.
> @ Devildog
> 
> I had to dig to find this.
> 
> 
> 
> That's my 7970 @ 1200/1600mhz with my FX at 4.7.
> 
> 7970/280X should be able to easily break 50 fps if you know what you're doing. You should also get 10k+ in 3dmark11 regardless of processor.
> 
> I'm not gonna do it because I'm lazy but I bet if I took my 7970 out of my girlfriend's machine and put it in here, now that I have Intel I might be able to break 60 fps.


That's nice!! I broke 9,000 in 3DMark11 with a 7870, so a 7970 should get 10,000 easy. I think Intel is paying Futuremark to rig the benches, TBH. As they get updated my scores get worse; my CPU is not getting worse, but the Physics and Combined scores do. I only pay attention to the Graphics score on them.


----------



## DiceAir

Why is my Club3D R9 280X being detected as 7900 series and not R9 200 series, while my friend's PowerColor R9 270X Devil is detected as R9 200 series? We both have the latest 14.3 drivers installed.


----------



## bardacuda

Quote:


> Originally Posted by *Tobe404*
> 
> @ neurotix - Given our clocks are the same, yet your score is 4-5fps higher.
> Would you atribute that to driver version, higher clocked/more cores on your CPU?
> Or just one of those things that can't really be pin pointed as to why? Too many variables.


CPU does not really make a difference for single cards. Driver version can make a difference, as well as Win7 vs Win8 (but both seem to be Win7 in this case), and an ~2ish FPS difference for the texture filtering driver tweaks discussed earlier. The OP in the Valley thread goes into detail about the tweaks.

@DiceAir

Probably has to do with Club3D's BIOS. I wouldn't worry about it since the card is really just a 7970 anyway.


----------



## DiceAir

Quote:


> Originally Posted by *bardacuda*
> 
> CPU does not really make a difference for single cards. Driver version can make a difference, as well as Win7 vs Win8 (but both seem to be Win7 in this case), and an ~2ish FPS difference for the texture filtering driver tweaks discussed earlier. The OP in the Valley thread goes into detail about the tweaks.
> 
> @DiceAir
> 
> Probably has to do with Club3D's BIOS. I wouldn't worry about it since the card is really just a 7970 anyway.


Yeah, but it detected it as R9 200 series before.


----------



## bardacuda

@DiceAir

Hmm....strange. In that case I dunno...but still nothing to worry about. When did it change? Did you change OS, drivers, add a 2nd card? Or did you just reboot one day and it was different?


----------



## JCH979

Quote:


> Originally Posted by *Devildog83*
> 
> Not bad at all, I am still waiting for a 280x to hit 50 fps average. Has anyone seen that yet?


56.9 fps



P11601

http://www.3dmark.com/3dm11/7783545



Still not really sure if any of my results are good, since it was my first time benching.


----------



## Devildog83

Yes, very nice!!


----------



## electrocabeza

Hi everyone. I have a question for 270X owners: what is the max voltage you recommend for this card? I'm using 1.3V for 1350/1600MHz in 3DMark11 and 1340/1600MHz in Fire Strike Extreme.

My card is a Sapphire R9 270X Toxic. Thank you, and sorry for my English!


----------



## DiceAir

Quote:


> Originally Posted by *bardacuda*
> 
> @DiceAir
> 
> Hmm....strange. In that case I dunno...but still nothing to worry about. When did it change? Did you change OS, drivers, add a 2nd card? Or did you just reboot one day and it was different?


It has happened since the 14.1 drivers, and I'm on Windows 8.1.


----------



## Devildog83

Quote:


> Originally Posted by *electrocabeza*
> 
> Hi everyone. I have a question for 270X owners: what is the max voltage you recommend for this card? I'm using 1.3V for 1350/1600MHz in 3DMark11 and 1340/1600MHz in Fire Strike Extreme.
> 
> My card is a Sapphire R9 270X Toxic. Thank you, and sorry for my English!


For most 1.3 is about it but the Toxic could probably go a bit higher if you can keep it cool enough.


----------



## DiceAir

Can someone guide me on how to overclock my R9 280X? I'm at stock: 1.263V, 1100MHz core and 1500MHz memory. I really want to overclock this card but don't know whether I should increase the voltage or what. I tried increasing the clock speeds with the power limiter and some voltage, but I get crashes and freezes in my games.

My card is a Club3D RoyalKing R9 280X. I think my card might not even overclock that well. I'm using MSI Afterburner to overclock it. Temps in BF4 are 72°C on an aggressive fan profile.


----------



## diggiddi

Quote:


> Originally Posted by *DiceAir*
> 
> Can someone guide me as to how to overclock my R9-280x. I'm at stock 1.263V 1100MHz and 1500mhz memory. I really want to overclock this card but don't know if I should increase voltage or what. I tried increasing clock speeds with power limiter and some voltage but I get crashes and freezes in my games.
> 
> My card is club3d royalking r9-280x. I think my card might not even overclock that well. I'm using MSI afterburner to overclock my card. Temps on BF4 = 72C on a aggresive fan profile.


http://www.overclock.net/t/633816/how-to-overclock-your-amd-ati-gpu


----------



## Roboyto

Quote:


> Originally Posted by *darkelixa*
> 
> Have not had one issue yet with my asrock z87 oc formula yet


Since it is Z87 it probably isn't very old, likely under a year, and every time a PCIe slot quit on me it was within the first 12 months. I have purchased 3 ASRock boards in the last 5 years, which I have had to RMA 4 times between them. I first chose ASRock because they were a subsidiary of ASUS, but they have since broken away and are now a standalone company.

I'm not saying it will fail, but 3 different motherboards all having the PCIe slot fail, with different GPUs in each case, is not a good track record. I wouldn't be so negative if the RMAs weren't so aggravating these days. The service for my first 2 RMAs was so excellent that I *almost* didn't mind the hardware failure, because it was fast and easy. The latter 2 RMAs, for the Extreme6 and the ITX board, were a nightmare and a far cry from the service I received the first 2 times.

I am still using an ASRock board in the machine I'm typing on now. I may consider their products in the future, but I wouldn't do so without purchasing at my local MicroCenter with a walk-in warranty to bypass the RMA procedure.

The purpose of my post was to point out issues I have had with ASRock PCIe slots, since MicroKid is having a horrible time pinpointing his problem.



Spoiler: Warning: ASUS *thumbs down*



So you don't think I am just an ASRock hater, I will now tell you of my disgust for ASUS

I needed to RMA a board because I damaged it, and apparently ASUS didn't want my money because I couldn't get them to repair the board and take my money. ASUS tech/customer support & RMA process is the worst thing I have ever experienced in my life. *According to ASUS tech support, a PCIe slot is not on, nor is it part of, the motherboard.* They angered me to the point of selling/disposing of nearly every ASUS product that I own. I sold my: K55A Ivy Bridge i5 laptop, eeePC 1215P Atom N550 netbook, RT-N56U Router, (2) external disc drives, and my Z77M-PRO motherboard for parts because I couldn't get *customer pay* RMA approval; Still have a GT640 GPU and a bluetooth dongle to dump. I will likely be keeping my Z87 Gryphon until the MicroCenter warranty expires in ~2 years. I won't ever be purchasing, or recommending, anything from ASUS again.


----------



## Justinbaileyman

Quote:


> Originally Posted by *electrocabeza*
> 
> Hi everyone. I have a question for 270X owners: What is the max voltage that you recomend for this card? Im using 1.3V for 1350/1600MHz 3DMark11 and 1340/1600MHz Fire Strike Extreme.
> 
> My card is a Sapphire R9 270X Toxic. Thank you and sorry for my english!


Wow, that is way too much voltage for that little bit of an overclock.
I have 2 XFX R9 270X DDs overclocked to the same settings as you, but they only use 1.25V max, and 0.875V at stock.
If you are planning on keeping those volts then you should have some good cooling, maybe even water, and bump the speeds up a tad more.


----------



## darkelixa

I think it's really luck of the draw when it comes to mainboards. I have had an ASUS and a Gigabyte board blow up on me just from turning them on. Never had an ASRock board fail on me yet, to be honest; even my Z77 Extreme6 ran perfectly.


----------



## Devildog83

Quote:


> Originally Posted by *Justinbaileyman*
> 
> Wow, that is way too much voltage for that little bit of an overclock.
> I have 2 XFX R9 270X DDs overclocked to the same settings as you, but they only use 1.25V max, and 0.875V at stock.
> If you are planning on keeping those volts then you should have some good cooling, maybe even water, and bump the speeds up a tad more.


I think you are going to need at least 1.3V to run at 1360 on a 270X. 1350 is just about as high as I have seen on a 270X. Can you get 1350 on your DDs at 1.25V? That would be crazy good if you can.


----------



## bardacuda

Quote:


> Originally Posted by *Justinbaileyman*
> 
> Wow, that is way too much voltage for that little bit of an overclock.
> I have 2 XFX R9 270X DDs overclocked to the same settings as you, but they only use 1.25V max, and 0.875V at stock.
> If you are planning on keeping those volts then you should have some good cooling, maybe even water, and bump the speeds up a tad more.


1350 is not a little overclock; that's 300MHz over stock, a 28.6% increase. He's lucky to be able to get his card stable at those clocks at any voltage. If you have two that can bench at 1350 with only 1.25V then you've hit the silicon lottery, but 1.3V for his OC is not bad by any means.
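For reference, the percentage works out directly from the clocks (a quick check, assuming the 270X's 1050MHz reference boost clock as the stock baseline):

```python
# Percent overclock over stock; assumes the reference R9 270X
# boost clock of 1050 MHz as the baseline.
stock_mhz = 1050
oc_mhz = 1350

delta = oc_mhz - stock_mhz
percent = delta / stock_mhz * 100
print(f"{delta} MHz over stock, a {percent:.2f}% increase")
```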


----------



## End3R

Quote:


> Originally Posted by *Devildog83*
> 
> I think you are going to need at least 1.3V to run at 1360 on a 270X. 1350 is just about as high as I have seen on a 270X. Can you get 1350 on your DDs at 1.25V? That would be crazy good if you can.


Quote:


> Originally Posted by *bardacuda*
> 
> 1350 is not a little overclock...that's 300MHz over stock...a 28.5% increase. He's lucky to be able to get his card stable at those clocks at any voltage. If you have two that can bench at 1350 with only 1.25V well you've hit the silicon lottery, but 1.3V for his OC is not bad by any means.


I'm new to the AMD/ATI scene, but it was my understanding all 270Xs came stock overclocked at 1400MHz (that's what the X is for, isn't it?). That's what mine is at (while gaming) and my voltage never goes over 1.2. My temps never go over the mid 60s either (again, while gaming), with stock cooling.

Edit: NVM, I see now y'all were talking about the core clock, not the memory clock. (My core clock is at 1080, not 1400.)

Curious though, could increasing the volts from 1.2 possibly resolve that flickering I've been talking about? I have to wait till tomorrow to contact their CS to request a replacement, so I'm still troubleshooting.

Edit: Tried upping to 1.25 then 1.3; the flickering persisted. Looks like it's time to send this card back, because I've tried everything.


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> I'm new to the amd/ati scene, but it was my understanding all 270Xs came stock overclocked at 1400mhz (that's what the X is for isn't it?), that's what mine is at (while gaming) and my voltage never goes over 1.2. My temps never go over the mid 60s either (again, while gaming), with stock cooling.


No, they come with different clocks. The 1400MHz yours came with is the memory clock, not the core clock.


----------



## End3R

So I noticed in GPU-Z that even though I edited my profile.xml to raise my minimum 2D clock speeds to 450/300MHz, they won't go above 300/150MHz. This is the last thing I can try before having to get this card replaced. Does anyone know why they aren't changing?


----------



## Luckael

@Devildog83
Quote:


> Originally Posted by *Devildog83*
> 
> It does look like you lack some airflow in that case.


I ended up exchanging my old Devil 270X for a new Devil 270X, and guess what: the temperature of the new card is very cool, 38°C idle and 75°C max.


----------



## Arkanon

I would be surprised to see a 270X hit 1350MHz @ 1.3V.
Most people tend to top out around 1270-1300MHz at that voltage.
My 270X at 1.3V tops out at around 1275.

When carefully testing my card I came to the conclusion that you need incremental increases in voltage to get these cards fully stable at clocks above 1300MHz.
I needed:
1.340V for 1320MHz
1.360V for 1340MHz
1.412V for 1360MHz

I don't doubt there are better cards out there of a higher bin than mine, but I really don't see any 270X hitting 1350MHz at 1.3V. AMD knows what they can get out of their cards, and that would put them straight into 280 territory, cutting into their own sales, since a card overclocked to those speeds can go head to head with stock 280Xs for about two thirds of the price.
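The steepening is easier to see as the marginal voltage cost of each step. A quick sketch using the stable points reported above (the 1275MHz @ 1.300V starting point is taken from the same post; this is one card's sample, not a general rule):

```python
# Marginal voltage cost per MHz between reported stable
# (MHz, volts) points for one particular R9 270X sample.
points = [(1275, 1.300), (1320, 1.340), (1340, 1.360), (1360, 1.412)]

steps = []
for (f0, v0), (f1, v1) in zip(points, points[1:]):
    mv_per_mhz = (v1 - v0) * 1000 / (f1 - f0)
    steps.append(round(mv_per_mhz, 2))
    print(f"{f0} -> {f1} MHz: {mv_per_mhz:.2f} mV per extra MHz")
```

Each step costs noticeably more voltage per MHz than the last, which is why the last 50MHz or so is rarely worth the heat.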


----------



## ghost_z

Would anyone be kind enough to reply to this simple question?
I have been searching for a long time but could not get a concrete answer :\

Is the memory voltage modifiable through MSI Afterburner, or any other tool, on the ASUS R9 280X DCUII TOP card?

I was having a hard time with artifacts, so I RMA'ed my GPU; now even the new GPU does the same, although the frequency and occurrence of the artifacts has reduced to a significant degree.

So I would like to undervolt and underclock my memory to experiment with it.


----------



## ricko99

Quote:


> Originally Posted by *ghost_z*
> 
> Would anyone be kind enough to reply to this simple question?
> I have been searching for a long time but could not get a concrete answer :\
> 
> Is the Memory Voltage modifiable through MSI after burner or any tool on ASUS R9 280X DCUII TOP card ?
> 
> I was having a hard time with Artifacts so i rma'ed my GPU now even the new GPU does the same although the frequency and occurance of artifacts has reduced to a significant degree.
> 
> So would like to undervolt and underclock my memory to experiment with it.


I have exactly the same card with the same problem. Sadly, there is no way you can lower the memory voltage; it's always stuck at 1.6V. I tried with Afterburner and it didn't work, and the GPU tools from ASUS and Sapphire don't have memory voltage sliders. But lately I've experienced no artifacts when playing, by lowering the GPU memory clock to 1550 instead of the original 1600 with the power limit at +20%. Use Catalyst 13.12 or 14.3 beta. Hope this helps.


----------



## Anonizer

Had some atikmdag.sys BSODs on 14.4 when using FF. Rolled back to 14.3; I was able to overclock higher and I'm not having any of the solid red screen issues I had on 14.4. Haven't tried 13.12 yet, though. I noticed my temps were getting higher than previously while running stress tests/benchmarks (70-80°C), and I remembered that I removed the heatsink without re-applying TIM.







Searched some TIMs that are available here.

Zalman STG2 , 1
Deepcool Z3 , Z5 , Z9
Prolimatech PK-1
Noctua NT-H1
Tuniq TX4 , TX2
Thermaltake TG-2
Antec Formula 7
Cooler Master Extreme Fusion X1
Cooler Master IC Value V1
Compound IC Diamond 24 Carat , 7 Carat
Arctic Silver 5
Arctic Cooling MX-2
Gelid GC-Extreme
Gelid GC-2
Thermalright Chill factor 3

Couldn't find any of the TIMs you mentioned on the previous pages (PK-3 Nano, Xigmatek PTI-G4512); not sure if I missed some. No ArctiClean either,
but I'll try asking the store when I buy the TIM.
Has anyone here tried any of these out? Which do you think is the best of them?

The TIM applied to my card was so thin that when I removed the heatsink I just pulled it straight off without twisting; luckily the core didn't get ripped off.

Is it okay to just wipe it off? Is a 90-99% alcohol solution really needed, or is 70% okay? Can I use either ethyl or isopropyl? Sorry for asking so many questions.


----------



## GoLDii3

Quote:


> Originally Posted by *Anonizer*
> 
> Had some atikmdag.sys BSOD on 14.4 when using FF. Rolled back to 14.3, I was able to overclock it higher and not having any solid red screen issue I had on 14.4. Haven't tried out 13.12 yet though. I noticed my temps were getting higher than it was previously while running stress test/benchmarks (70c~80c), and I remembered that I removed the heatsink without re-applying TIM.
> 
> 
> 
> 
> 
> 
> 
> Searched some TIMs that are available here.
> 
> Zalman STG2 , 1
> Deepcool Z3 , Z5 , Z9
> *Prolimatech PK-1*
> *Noctua NT-H1*
> Tuniq TX4 , TX2
> Thermaltake TG-2
> Antec Formula 7
> Cooler Master Extreme Fusion X1
> Cooler Master IC Value V1
> Compound IC Diamond 24 Carat , 7 Carat
> *Arctic Silver 5*
> *Arctic Cooling MX-2*
> *Gelid GC-Extreme*
> Gelid GC-2
> Thermalright Chill factor 3
> 
> Couldn't find any of the TIMs you mentioned on the previous pages (PK-3 Nano, Xigmatek PTI-G4512) not sure if I missed some. No Arcticlean too
> 
> 
> 
> 
> 
> 
> 
> but I'll try asking the store when I'll be buying TIM.
> Anyone here by chance tried one of these out? What do you think is the best among those?
> 
> The TIM applied to my card was too thin that when I removed the heatsink I just pulled it without twisting luckily the core didn't get ripped off.
> 
> 
> 
> 
> 
> 
> 
> 
> Is it okay to just wipe it off? Is the alcohol 90%~99% solution is really needed or it's okay to use 70% solution? Can I use either Ethyl or Isopropyl? Sorry for asking too many questions.


The ones in bold should do well; the GC-Extreme is probably the best one.

I don't know about GPUs, but I've always used cheap ethyl alcohol and never had any problems.


----------



## diggiddi

Quote:


> Originally Posted by *ricko99*
> 
> I have exactly the same card with the same problem, sadly there is no way you can underclock the memory voltage. it will always stuck at 1.6v i tried with afterburner and it didn't work. other gpu tools from asus and sapphire don't have memory voltage sliders. but lately i've experienced no artifacts when playing by lowering down the gpu memory clock to 1550 instead of the original 1600 and power limit at +20%. Use catalyst 13.12 or 14.3 beta. Hope this helps


From my understanding, you need GPU Tweak for the ASUS cards.


----------



## neurotix

I use 91% isopropyl alcohol on a q-tip to clean my IHS.

It seems to work better than the common 70% isopropyl alcohol. I have no idea about ethyl alcohol.


----------



## Devildog83

OK, somebody tell me again how replacing the TIM with better stuff makes very little difference. I was getting up to 85°C on the top card (7870 before) on a Valley run @ 1200/1400. Here are my temps with the same clocks, same Valley run: now the cards are only 5°C apart, 65°C max. Love that Gelid stuff.


----------



## lightsout

Quote:


> Originally Posted by *Devildog83*
> 
> OK, somebody tell me again how replacing the TIM and using better stuff will make very little difference. I was getting to 85C on the top card (7870 before) on a valley run @1200/1400. Here are my temps with the same clocks, same valley run. Now there are only 5c apart and 65c max. Love that Gelid stuff.


It depends how poor the original paste/application was though. I remember in the gtx 580 days people were doing it for 2-3 C improvement. I do not believe your results are common but congrats anyways.


----------



## Devildog83

Quote:


> Originally Posted by *lightsout*
> 
> It depends how poor the original paste/application was though. I remember in the gtx 580 days people were doing it for 2-3 C improvement. I do not believe your results are common but congrats anyways.


It was applied poorly, and after 6 months or so it was just getting hot, but a 20°C difference is huge.


----------



## microkid21

AMD Catalyst 14.4 RC is now available for download!
Website download Link

I'll try this one and let's see if it solves my problems.


----------



## Majentrix

Just pulled the trigger on two Sapphire R9 270X Toxics.
What are the highest clocks people are getting with them?


----------



## neurotix

Quote:


> Originally Posted by *Majentrix*
> 
> Just pulled the trigger on two Sapphire R9 270x toxics.
> What are the highest clocks people are getting with them?


Usually around 1200/1500MHz for an average OC. 1600MHz or more on the RAM is uncommon; 1300MHz or more on the core is rare.

Expect somewhere around 1200/1500, maybe a little more (1250/1500), with a very negligible performance increase over 1200MHz.


----------



## Majentrix

That's great! I've been looking at reviews of them and they seem to scale fairly well. I see it has a second CrossFire finger too; can you really hook up a third or fourth card?


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Usually around 1200/1500mhz for an average oc. 1600mhz or more on the RAM is uncommon. 1300mhz or more on the core is rare.
> 
> Expect somewhere around 1200/1500, maybe a little bit more (1250/1500) with a very negligible performance increase over 1200mhz.


Didn't you have the Toxic 270x? I thought the core could go a lot higher than that.


----------



## End3R

So I'm assuming the Sapphire Toxic is the community favorite for the 270X? I'm asking because my RMA got approved, but before I send it in, I'm going to talk to their CS and see if I can specify what kind, rather than them just sending me one from a "major brand" like before.


----------



## Roboyto

Quote:


> Originally Posted by *End3R*
> 
> So I'm assuming the Sapphire Toxic is the community favorite for the 270X? I'm asking because my RMA got approved, but before I send it in, I'm going to talk to their CS and see if I can specify what kind, rather than them just sending me one from a "major brand" like before.


Very pleased with my Powercolor Devil 270X.


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> Very pleased with my Powercolor Devil 270X.


Me too!!


----------



## End3R

Quote:


> Originally Posted by *Roboyto*
> 
> Very pleased with my Powercolor Devil 270X.


PowerColor is the brand I'm having issues with, although I think it's the TurboDuo, not the Devil. Are they really that different?


----------



## Recr3ational

Quote:


> Originally Posted by *End3R*
> 
> PowerColor is the brand I'm having issues with, although I think it's the TurboDuo not the Devil, are they really that diff?


I've got two TurboDuos and I haven't had any issues (apart from heat, but that's something to do with my loop). What problems are you having?


----------



## End3R

Quote:


> Originally Posted by *Recr3ational*
> 
> i got two turbo duos, i haven't had any issues (apart from heat but thats something to do with my loop ) what problems are you having?


The R9 270X flickering problem. My screen will start flickering until the driver eventually crashes and recovers, or it locks up and I have to reboot. I've already tried cleaning/re-installing every version of the drivers since 2012, completely re-installed Windows, switched from HDMI to DVI cables, flashed the vBIOS, and attempted to adjust my minimum speeds (because it seems related to the 2D clocks being too low). I've never seen my temps go over 70 so it's not a heating issue either.

What's really aggravating is that it comes and goes as it pleases. I can play games for a couple of days without seeing the issue (when I first got the card, I didn't notice it for at least a few weeks), and then other times I'll have to reboot my machine 10 times in a row because my game will lock up about 10 seconds after booting.

Also, weirdly enough, in the process of one of my driver updates/cleanouts I seem to have half-broken my Xbox controller. Some games still register it fine, like Assassin's Creed, but others, like BioShock, don't register it at all. Then I have games like Remember Me that register it during menus only, and DmC where it's constantly moving my menu/camera "up".

Once I get off work tonight I'm going to put a GTS 450 in as a temporary replacement while I wait for my RMA, so if the issue persists I'll know then whether it could be another component, like the power supply.


----------



## Devildog83

Hey Rec, I solved my heat issue on the top card. I just replaced what was a very poor TIM job with Gelid GC-Extreme and lowered my load temps by 20°C. Still haven't figured yours out? I assume you will be redoing your loop, or at least flushing and refilling it. Hope you get it solved.


----------



## Recr3ational

Yeah dude, I think it's something to do with my loop.


----------



## End3R

Needless to say I'm iffy about asking for another PowerColor, but if I really did just have bad luck then maybe....


----------



## Recr3ational

It's probably a dud dude. Hope you get it sorted though.


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> Needless to say I'm iffy about asking for another Powercolor, but if I really did just have bad-luck then maybe....


I love my Devils from PowerColor, but the Toxic is also a good choice; both are a bit more spendy. Have you thought about some programs causing trouble? I did notice that ones like 3DMark have their own system info, and so do clocking tools like Afterburner and Trixx. If you have Graphics Overdrive in CCC running at the same time, all of these could be conflicting. I had minor issues with mine until I turned off Graphics Overdrive, and they went away.

Just a thought, but what are you running for a PSU?


----------



## Roboyto

Quote:


> Originally Posted by *End3R*
> 
> PowerColor is the brand I'm having issues with, although I think it's the TurboDuo not the Devil, are they really that diff?


The Devil edition has a way better cooler and higher out-of-the-box clocks; 1180 is pretty high for not having to do anything. The Devil likely has better power regulation than the TurboDuo as well.

Quote:


> Originally Posted by *Devildog83*
> 
> I love my Devil's from Powercolor but the Toxic is also a good choice but both are a bit more spendy. Have you thought about maybe some programs causing trouble. I did notice that ones like 3DMark have there own system info and so do clocking tools like Aterburner and Trixx. *If you have graphics overdrive in CCC running at the same time all of these could be conflicting.* I had minor issues with mine until I turned off graphics overdrive and they went away.
> 
> *Just a thought but what are you running for a PSU?*


Devil or Toxic are both solid choices.

It's unlikely that you would get 2 bad cards back to back so you should check out other things thoroughly.

If you are using CCC and Overdrive, stop immediately...Nothing but problems, especially if you are using Overdrive plus another utility.

Make sure you have scrubbed all old drivers with Display Driver Uninstaller.

A fresh windows install doesn't hurt sometimes either.

Try removing the card, blowing out the PCIe slot, and reseating it; a small piece of hair or dust can ruin things. Do the same with your RAM.

Have you tried another PCIe slot?

Check all other cables/connections as well.

If anything is overclocked, return all settings to stock to be certain that isn't conflicting or causing issues as well.

I don't see a PSU in your sig; do you have something halfway decent? Especially considering you're using a power-hungry FX-8320: those things overclocked in the 4.5GHz range need ~200W alone, and pushed to 5GHz+ it's more like 300W. It doesn't take much to run one of these 270Xs, as mine is running off a 450W Rosewill Capstone, but a crappy PSU can ruin everything. The Capstones are very nice PSUs at a very reasonable price.


----------



## End3R

Quote:


> Originally Posted by *Roboyto*
> 
> Devil edition has way better cooler and higher out of the box clocks; 1180 is pretty high for not having to do anything. Devil likely has better power regulation than the turbo duo as well.
> 
> Devil or Toxic are both solid choices.
> 
> It's unlikely that you would get 2 bad cards back to back so you should check out other things thoroughly.
> 
> If you are using CCC and Overdrive, stop immediately...Nothing but problems, especially if you are using Overdrive plus another utility.
> 
> Make sure you have scrubbed all old drivers with Display Driver Uninstaller.
> 
> A fresh windows install doesn't hurt sometimes either.
> 
> Try removing the card, blowing out the PCIe slot, and reseat the card. A small piece of hair/dust can make things no good; Do the same with your RAM.
> 
> Have you tried another PCIe slot?
> 
> Check all other cables/connections as well.
> 
> If anything is overclocked, return all settings to stock to be certain that isn't conflicting or causing issues as well.
> 
> I don't see a PSU in your sig, do you have something halfway decent? Especially considering you're using a power hungry FX-8320. Those things overclocked in the 4.5GHz range need ~200W alone; push them to 5GHz+ and it's more like 300W. It doesn't take much with one of these 270X, as mine is running off a 450W Rosewill Capstone, but a crappy PSU can ruin everything; Capstone are very nice PSU and very reasonable price.


I've already tried a full re-install of Windows. I'm not overclocking the card or using Overdrive in CCC; in fact, I've used MSI Afterburner to underclock it a little, as some people indicated the problem had something to do with overclocking. I've tried reseating it and changed my cable from HDMI to DVI. I've also fully cleaned/re-installed every version of the drivers since 2012 and flashed the vBIOS. As for my PSU, I don't actually know what brand it is specifically, but it's a 600W. Tonight when I get off work I'm going to swap in a temporary card (an Nvidia GTS 450) while I send the R9 back. The PSU being the issue did enter my mind, though, so it'll be interesting to see if the problem persists when I swap the card.


----------



## Roboyto

Quote:


> Originally Posted by *End3R*
> 
> I've already tried a full re-install of windows, I'm not overclocking the card or using overdrive in CCC, Infact I've used MSI afterburner to underclock it a little as some people indicated that it had something to do with overclocking. I've tried reseating it and changed my cable from hdmi to dvi. I've also fully cleaned/re-installed every version of drivers since 2012 and flashed the vbios. As for my PSU, I don't actually know what brand it is specifically, but it's a 600W. Tonight when I get off work I'm going to be swapping in a temporary card (nvidia gts 450) while I send the r9 back. The PSU being the issue did enter my mind though so it'll be interesting to see if the issue persists when I swap the card.


Possibly an issue with the mobo.

Does the GTS 450 require PCIe power? You may not be able to test the PSU if it doesn't need the same power connectors.

Find out exactly what PSU it is, test with the Nvidia card, and let us know what happens.


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> I've already tried a full re-install of windows, I'm not overclocking the card or using overdrive in CCC, Infact I've used MSI afterburner to underclock it a little as some people indicated that it had something to do with overclocking. I've tried reseating it and changed my cable from hdmi to dvi. I've also fully cleaned/re-installed every version of drivers since 2012 and flashed the vbios. As for my PSU, I don't actually know what brand it is specifically, but it's a 600W. Tonight when I get off work I'm going to be swapping in a temporary card (nvidia gts 450) while I send the r9 back. The PSU being the issue did enter my mind though so it'll be interesting to see if the issue persists when I swap the card.


A 600W unit should be fine as long as it's stable and, if it's multi-rail, has at least 25 amps per rail. If it doesn't, it could be a problem. Also, just make sure you have disabled Graphics Overdrive, because whether you use it or not, having it enabled will cause issues. I am sure you have already done this but thought I would throw it in.
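For anyone wondering where that amperage lands in watts, 12V rail capacity is just volts times amps (the numbers here are illustrative, not a specific PSU's spec):

```python
# Usable power on a 12 V rail: W = V * A.
RAIL_VOLTS = 12.0
MIN_AMPS = 25.0  # the suggested per-rail minimum

watts = RAIL_VOLTS * MIN_AMPS
print(f"{MIN_AMPS:.0f} A on a {RAIL_VOLTS:.0f} V rail = {watts:.0f} W")
```

A single 270X (around 180W typical board power) fits within one such rail, but an overclocked FX chip sharing the same rail eats the headroom quickly.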


----------



## End3R

Quote:


> Originally Posted by *Roboyto*
> 
> Possibility of issue with mobo.
> 
> does the GTS450 require PCIe power? may not be able to test PSU if it doesn't need same power connectors.
> 
> Find out what PSU exactly, and test with the Nvidia and let us know what happens.


Yea it should work it's a PCIe card, I'll be sure to update when I know what brand of PSU and have swapped the card.
Quote:


> Originally Posted by *Devildog83*
> 
> A 600w should be fine as long as it's a stable unit and if it's multi rail has at least 25 amps per rail. If it doesn't it could be a problem. *Also just make sure you have disabled Graphics overdrive because whether you use it or not having it enabled will cause issues.* I am sure you have already done this but thought I would just throw that in.


How do you disable it? Other than not having it checked in CCC I'm not sure how I'd do that.


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> Yea it should work it's a PCIe card, I'll be sure to update when I know what brand of PSU and have swapped the card.
> How do you disable it? Other than not having it checked in CCC I'm not sure how I'd do that.


If the box is empty and it says "enable graphics overdrive" next to it you are good to go.


----------



## End3R

Quote:


> Originally Posted by *Devildog83*
> 
> If the box is empty and it says "enable graphics overdrive" next to it you are good to go.


Ah ok then yea it is disabled.


----------



## Devildog83

I am ordering new parts for a CPU loop tomorrow, and no, I am not crazy enough to take the beautiful coolers off of my Devils!! It will be going in here -


----------



## runelotus

Wanted to share:
I tried lapping the base of my HIS R9 270X IceQX2TBC.

In comparison, the old surface is the one not reflecting, while the lapped base is the one reflecting back.
I used 1200, 1500, and 2000 grit sandpaper.
The temp went from 92°C to 80°C when running FurMark,
and from 78°C to 66°C while gaming.

I am quite disappointed that the culprit is the awful base of the stock VGA heatsink.

A few sheets of sandpaper and lapping the base provided a massive temperature drop.

Note: I used DeepCool Z5 thermal paste.


----------



## Mackem

I really like my Asus R9 280X DirectCUII TOP but I'm sick of the artefacting in games..


----------



## End3R

Quote:


> Originally Posted by *Roboyto*
> 
> Possibility of issue with mobo.
> 
> does the GTS450 require PCIe power? may not be able to test PSU if it doesn't need same power connectors.
> 
> Find out what PSU exactly, and test with the Nvidia and let us know what happens.


Alright, so I got my card switched out with a GTS 450 and everything has been running smoothly. I also have to say, I'm somewhat shocked at how good my performance still is. It's obviously a weaker card, and I get lower FPS, but I'm still able to run the games at the exact same (Ultra) settings in almost all my games and still have a smooth framerate, and this is a 4-year-old card.

I only had one existing benchmark from Fraps with my R9 installed; it's for BioShock Infinite.

r9:
Frames, Time (ms), Min, Max, Avg
14814, 120000, 90, 196, 123.450

gts 450:
Frames, Time (ms), Min, Max, Avg
5736, 120000, 35, 74, 47.800

Again, clearly weaker, but still completely smooth.
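For anyone checking those Fraps summary lines, the Avg column is just total frames divided by the run length in seconds; a quick sketch (a hypothetical helper, not part of Fraps):

```python
def avg_fps(frames: int, duration_ms: int) -> float:
    """Average FPS from a Fraps summary line: total frames / seconds."""
    return frames / (duration_ms / 1000)

# The two runs above, both 120,000 ms long:
print(avg_fps(14814, 120000))  # R9 270X run -> 123.45
print(avg_fps(5736, 120000))   # GTS 450 run -> 47.8
```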

Other self-gauged benchmarks, all with the same (ultra) settings
AC4 Single Player
r9 - avg 35
450 - avg 20

AC4 MP
r9 - avg 60
450 - avg 30

Spiderman (funny enough even with the higher fps with the r9, it was less stuttery on the 450)
r9 - avg 40
450 - avg 20

I had to turn down my settings in Tomb Raider to keep a smooth framerate, but I was still able to maintain an average of 26 with the 450, with settings between High/Ultra (anisotropic 8x, FXAA, SSAO normal, Ultra shadows/detail, and tessellation). I was getting an average of 45 with the R9 on Ultra settings.

Hopefully the next R9 I get is better, but based on what a workhorse the 450 is, it's making me wonder if I should sweet-talk CyberPower's CS into sending me a GTX 760 instead of the R9.

Oh and that weird issue with my controller is fixed too, no idea why.

Also here is the specs on my psu


Spoiler: Warning: Spoiler!


----------



## Marty99

Quote:


> Originally Posted by *Mackem*
> 
> I really like my Asus R9 280X DirectCUII TOP but I'm sick of the artefacting in games..


Sorry, but I don't understand how you can like it when it's artefacting.








Nothing new really - the R9 series is the king of artefacting, flickering, tearing and so on...
I'm starting to lose hope in AMD. Not only their GPUs.


----------



## Roboyto

Quote:


> Originally Posted by *End3R*
> 
> Alright, so I got my card switched out with a GTS 450 and everything has been running smoothly. I also have to say, I'm somewhat shocked at how good my performance still is. It's obviously a weaker card, and I get lower FPS, but I'm still able to run the games at the exact same (Ultra) settings in almost all my games and still have a smooth framerate, and this is a 4-year-old card.
> 
> I only had 1 existing benchmark from fraps with my r9 installed, it's for bioshock infinite
> 
> r9:
> Frames, Time (ms), Min, Max, Avg
> 14814, 120000, 90, 196, 123.450
> 
> gts 450:
> Frames, Time (ms), Min, Max, Avg
> 5736, 120000, 35, 74, 47.800
> 
> Again, clearly weaker, but still completely smooth.
> 
> Other self-gauged benchmarks, all with the same (ultra) settings
> AC4 Single Player
> r9 - avg 35
> 450 - avg 20
> 
> AC4 MP
> r9 - avg 60
> 450 - avg 30
> 
> Spiderman (funny enough even with the higher fps with the r9, it was less stuttery on the 450)
> r9 - avg 40
> 450 - avg 20
> 
> I had to turn down my settings in Tomb Raider to keep a smooth framerate, but I was still able to maintain an average of 26 with the 450, with settings between High/Ultra (anisotropic 8x, FXAA, SSAO normal, Ultra shadows/detail, and tessellation). I was getting an average of 45 with the R9 on Ultra settings.
> 
> Hopefully the next R9 I get is better, but based on what a workhorse the 450 is, it's making me wonder if I should sweet-talk CyberPower's CS into sending me a GTX 760 instead of the R9.
> 
> Oh and that weird issue with my controller is fixed too, no idea why.
> 
> Also here is the specs on my psu
> 
> 
> Spoiler: Warning: Spoiler!


I can't find much information on ATNG, or that specific PSU model number, so I'm not sure what to make of it. Most of the links that involve ATNG bring me to posts that are pretty old; how long have you had that PSU? Just because it supplies the GTS 450, which uses one 6-pin connector, doesn't mean it can reliably supply a 270X that needs two 6-pin connectors. If the PSU is causing your problem, installing another GPU (GTX 760) that needs roughly the same amount of power as the 270X is going to give you the same problems, regardless of whether it's AMD or Nvidia.

I don't want to condemn your PSU, but without being able to find any information for it specifically it's hard to say it's OK.

The 270X should be outperforming the 450 by a larger margin than it is. If it doesn't have ample power, then it definitely wouldn't benchmark/perform properly.

What settings were you using when you were running BioShock? My 270X and 3770K can't pull those frames on Ultra settings at 1080p in the benchmark.

*With DDOF: Average just under 60 FPS*

*
Without DDOF: Average just under 80 FPS*


"Self-Gauged"? I need definitive numbers from a benchmark/utility so every setting is the same for a fair comparison. I don't think your settings were maxed for Bioshock when you recorded with Fraps, if you were getting those numbers, with either card.


----------



## Roboyto

Quote:


> Originally Posted by *Marty99*
> 
> Sorry, but I don't understand how can you like it when it's artefacting
> 
> 
> 
> 
> 
> 
> 
> 
> Nothing new really - R9 series is the king of artefacting, flickering, tearing and so on..
> I'm starting to lose hope in AMD. Not only GPU's.


A generalized negative statement about the entire R9 line of GPUs coming from the guy with an Nvidia card...interesting.

My GTX 670 4GB Superclock was a total PoS, but I don't go into the Nvidia section to say all 600, and consequently most 700, series GPUs are all turds because mine had problems and suffered a premature death.

Artifacting, flickering, and tearing on these R9 (or related GCN) GPUs are, from my experience, most likely caused by unsuccessful overclocking. Moreover, it is more than likely caused by GPU core clocks being too high without enough voltage. I have had my share of experience with GCN cards such as the 7750, 7770, 7870, 7950, 7970, 270X, 280X, and 290.

Beta drivers can also be responsible...and for some reason people have this delusion that beta drivers are supposed to work perfectly, but ultimately forget they are BETA drivers. The 13.12 drivers are rock solid. If you don't want to have questionable performance then steer clear of beta drivers.

I haven't been active in this thread very long, since I've only had my 270X for a couple weeks, but I have seen people commenting on flickering issues with 270X. However, people seem to have been getting good results with BIOS flash, or have been getting replacement cards as a result.

Yes, the FX CPUs are weak *and* power hungry in comparison to their Intel counterparts; if you had read any review of the 81xx or 83xx before purchasing one, you would have known this. This is also why the FX "8 core" CPUs are cheaper than the i5. Few people were bigger AMD fanboys than myself...that was up until the 8350 came out and was still struggling to keep up with the i5's. I switched to Intel for CPUs and probably won't look back unless AMD does something glorious with their next gen...which could be possible since AMD got this guy back:

http://www.macrumors.com/2012/08/01/key-apple-chip-designer-jim-keller-returns-to-amd/

Both the Red and Green team have issues from time to time..but your generalized negativity for AMD isn't helping anyone/anything in this thread.

You should be happy AMD is still around to keep prices in check, otherwise you would probably need a home equity loan to purchase a decent Nvidia card.


----------



## Devildog83

Yes I agree about the negativity here. Doesn't belong.

On the FX series CPUs: if you have an FX-8350 or an i7-3770K, you will notice hardly any difference between the two in anything but benching. A frame or two here and there is not noticeable at all, and that's the only difference in gaming; in some games the 8350 performs better. In rendering or other CPU-intensive applications the difference is not that much either, so is the extra $100 or more for the chip, and another $100 or more for the motherboard, worth it? Not in my opinion, but that's up to you. I run benches too, but I don't worry about comparing the two rigs with Intel chips because it's seriously an apples-to-oranges comparison. If benches are extremely important to you then yes, by all means drop the cash and get an Intel; if not, there is nothing wrong with the real-world performance of an FX chip. You can lose faith in AMD if you wish, but the rest of us don't have to.


----------



## Marty99

Quote:


> A generalized negative statement about the entire R9 line of GPUs coming from the guy with an Nvidia card...interesting.


Nothing interesting in here, I've had R9 280x for a while.
Quote:


> Artifacting, flickering, tearing on these R9(or related GCN) GPUs from my experience, is most likely caused by unsuccessful overclocking.


No. From what I've read on forums, a lot of cards have all those issues right out of the box.
Quote:


> More over it is more than likely caused by GPU core clocks being too high with not enough voltage.


Yeah, I really "love" tweaking the voltage and clocks of a brand-new graphics card.

http://linustechtips.com/main/topic/108284-huge-list-of-failure-rates-on-pc-components-french-but-i-translated-nearly-everything/

How does this look compared to Nvidia? And I am sure that the R9 series failure rate is even higher than the HD78xx/HD79xx. Oh, we will see. Nvidia just seems more reliable than AMD.
Of course you can RMA the card, but who the f* likes doing that? And there are those stories that after RMA'ing you get the same issues again and again.
Quote:


> if you would have read any review of the 81xx, or 83xx, before purchasing one, you would have known this.


lol, I did. The only reason I bought an FX-8320 instead of an i5 is that I already had the motherboard. And maybe I thought that "8 cores" would be more future-proof... but now I really doubt it.
Generally, a 4.0-4.5 GHz Vishera octo-core is good enough for most games, but some of them run really badly. For example, AC4 and Thief feel a lot smoother on my friend's i5-3350P than on my CPU.
Quote:


> You should be happy AMD is still around to keep prices in check


I am. We need another competitor but that is hardly possible.


----------



## Marty99

Quote:


> Originally Posted by *Devildog83*
> 
> Yes I agree about the negativity here. Doesn't belong.
> 
> On the FX series CPU's. If you have and FX 8350 or a I7 3770K you will notice hardly any difference in the two in anything but benching. A frame or two here and there is not noticeable at all and that's the only difference in gaming and in some games the 8350 performs better.


You will notice difference in some games pretty easy







What? A frame or two? Yeah... earlier my thoughts were exactly the same, but reality is different. What about minimum frame rates, which are more important than averages? What about people with more than one high-end VGA?
You can find information supporting the fact that the i5 is noticeably faster in games than Vishera on oc.net.
And then there's the heat dissipation and cooling requirements...
Quote:


> You can lose faith in AMD if you wish, but the rest of us don't have to


Agree, go AMD!


----------



## End3R

Quote:


> Originally Posted by *Roboyto*
> 
> I can't find much information on ATNG, or that specific PSU model number...so I'm not sure what to make of it. Most of the links that involve ATNG bring me to posts that are pretty old, how long have you had that PSU? Just because it supplies the GTS450 which is using (1) 6-pin connector with power, doesn't mean it can reliably supply a 270X that needs (2) 6-pin connectors. If the PSU is causing your problem, installing another GPU(GTX 760), that needs roughly the same amount of power as the 270X is going to give you the same problems regardless if it's AMD or Nvidia.
> 
> I don't want to condemn your PSU, but without being able to find any information for it specifically it's hard to say it's OK.
> 
> The 270X should be outperforming the 450 by a larger margin than what it is. If it doesn't have ample power than it definitely wouldn't benchmark/perform properly.


I bought it about 3 months ago. I suppose if they send me back another card and it starts to do this again, I'll know the cards aren't getting enough power and can get them to replace that part; it should be covered under the warranty as well.
Quote:


> Originally Posted by *Roboyto*
> 
> What settings were you using when you were running Bioshock? My 270X and 3770k can't pull those frames on Ultra Settings 1080P for the benchmark


So you're partially right here. I just relaunched BioShock to check, and the settings defaulted back to High for some reason instead of Ultra, so the bench for the R9 was using Ultra and the bench for the 450 was using High. Both benches were run at the same spot (near the end, when you're walking along the docks with Elizabeth).
Quote:


> Originally Posted by *Roboyto*
> 
> "Self-Gauged"? I need definitive numbers from a benchmark/utility so every setting is the same for a fair comparison. I don't think your settings were maxed for Bioshock when you recorded with Fraps, if you were getting those numbers, with either card.


Heh, yeah, sorry, but I only ran my Fraps bench on BioShock before swapping out the card. As I mentioned above, you are right about the benches with the 450; those got reset to High. But they were definitely on Ultra when I ran the R9 bench.

If it was a problem with my PSU, wouldn't this only have happened when it was drawing a lot of power? As my video on the other page shows, it would sometimes happen while staring at the desktop, not running anything.


----------



## Roboyto

Quote:

> Originally Posted by *Marty99*
> 
> Yeah, I really "love" tweaking the voltage and clocks of a brand-new graphics card.

*OVERCLOCK*.net. You are 100% on the wrong website if you don't enjoy tweaking your computer hardware.

Quote:
> Originally Posted by *Marty99*
> 
> Agree, go AMD!









Choose a side lol

Quote:
> Originally Posted by *End3R*
> 
> I bought it about 3 months ago. I suppose if they send me back another card and it starts to do this again, I'll know the cards aren't getting enough power and can get them to replace that part; it should be covered under the warranty as well.
> So you're partially right here. I just relaunched BioShock to check, and the settings defaulted back to High for some reason instead of Ultra, so the bench for the R9 was using Ultra and the bench for the 450 was using High. Both benches were run at the same spot (near the end, when you're walking along the docks with Elizabeth).


> Heh yea sorry but I only ran my fraps bench on bioshock before swapping out the card. As I mentioned above you are right about the benches with the 450, those got reset to high. But they were definitely on ultra when I ran the r9 bench.


What did it cost? The PSU is practically the worst place to skimp on PC parts.

Run the official benchmark, on Ultra at 1080p, with the GTS 450 and see where you end up; I have a feeling it will be crawling. Running Fraps in-game, near the same spot, with the two cards is not a good test.


----------



## End3R

Quote:


> Originally Posted by *Roboyto*
> 
> What did it cost? PSU is practically the worst place to skimp for PC parts.
> 
> Run the official benchmark, on Ultra 1080P, with the GTS450 and see where you end up; I have a feeling it will be crawling. Running fraps in game, near the same spot, with the 2 cards, is not a good test.


Well, I find benches from actually playing the game more reliable than built-in benchmarks, and yes, the 450 can't handle Ultra without dropping to 10-15 FPS, but High settings are still completely smooth, as that bench indicates.

And I don't remember exactly what the power supply cost, I bought the computer as a whole for $680 ($755 after shipping).

Wouldn't this only be a PSU issue if I was getting the flickering when my r9 wasn't getting enough power? Because it would happen while just staring at a desktop sometimes.

These are screens taken with the 450 on High settings; still shocked this 4-year-old card I was just gonna use temporarily can run so smooth. Will probably build an entire new rig around this card.


Spoiler: Warning: Spoiler!


----------



## Recr3ational

Hey guys,
I've done some stuff to my rig:
I've added an extra 120mm rad to the existing 240mm + 200mm rads.

Added a new, better pump.

Now my temps are crazy. Under load it gets 40-45C at 1250/1650 @ 1.3V (I think it's more than that; the slider is just at 1.3).
That's during the day. At night it goes as low as 30C. That's some CRAZY temps.

Though my house is pretty cold at night, so that's why it's that low. Still, though.


----------



## HAVOKNW

Glad I found this thread.

I'm the proud owner of 2 x PowerColor R9 270X Devil cards and 2 x PowerColor R9 280X cards. Love them!


----------



## Recr3ational

Quote:


> Originally Posted by *HAVOKNW*
> 
> Glad I found this thread.
> 
> I'm the proud owner of 2 x PowerColor R9 270X Devil cards and 2 x PowerColor R9 280X cards. Love them!


Sweet PowerColor,
Haha. What clocks are you gettin with the 280x?


----------



## HAVOKNW

Quote:


> Originally Posted by *Recr3ational*
> 
> Sweet PowerColor,
> Haha. What clocks are you gettin with the 280x?


I've got them at stock right now. I've got the two 280X's in a case right now that doesn't have great airflow. Once I get them in a rig with better cooling, I'll raise those clocks!


----------



## Recr3ational

Quote:


> Originally Posted by *HAVOKNW*
> 
> I've got them at stock right now. I've got the two 280X's in a case right now that doesn't have great airflow. Once I get them in a rig with better cooling, I'll raise those clocks!


Sweet!
That's the problems with my 280x the fan isn't the greatest. Though once underwater it's a beast


----------



## HAVOKNW

Quote:


> Originally Posted by *Recr3ational*
> 
> Sweet!
> That's the problems with my 280x the fan isn't the greatest. Though once underwater it's a beast


I noticed the same thing with the fans. I have them on a micro-ATX board. When the fan on the bottom card would start to spin faster, the fan would raise up a bit and grind against the top card. I had to install little rubber spacers between the cards to prevent this rubbing. Nothing like small little hacks to get your gear in top performance.


----------



## Recr3ational

Quote:


> Originally Posted by *HAVOKNW*
> 
> I noticed the same thing with the fans. I have them on a micro-ATX board. When the fan on the bottom card would start to spin faster, the fan would raise up a bit and grind against the top card. I had to install little rubber spacers between the cards to prevent this rubbing. Nothing like small little hacks to get your gear in top performance.


Haha yeah, I had the same problem when I was running dual Twin Frozrs. Those cards take 2 1/2 slots, just because of the fan.

Good hack though


----------



## M1kuTheAwesome

Just thought I'd share a nasty experience with you people, might end up saving someone from my misery...
Almost 2 weeks ago this happened:
http://www.overclock.net/t/1481239/psu-moody-or-graphics-card-dead
No matter what I did, my PC wouldn't start up with my 280X plugged in. Then suddenly it did, and the problems disappeared. I tried figuring out what happened, but everything was as normal as possible. This morning, though, disaster struck: the PC didn't wake up from sleep mode, with those same symptoms. I went to school to let it settle down, but this time the problem didn't fix itself. After school I started testing with my roommate's PC, and after several different setups tested on both of our motherboards, the problem got narrowed down to my PowerColor 280X TurboDuo. I had had some problems with the VRM running too hot on stock fan speeds, and since I didn't find any decent thermal pads I replaced the original pad under the VRM heatsink with some MX4 paste, but that didn't really improve the cooling. Knowing I keep forgetting to set up my custom fan profile, and that it was a power-delivery-related issue, I feared the worst. Pulling the card apart confirmed my fears: a carbon-like black substance mixed with the paste, and black marks on the VRM chips. The VRMs were fried. The card is bricked.
I wiped off the paste and put the card back together, hoping that if I try to RMA it they will believe the pad was never there and that caused the problem. I don't feel good about lying like that, but what choice do I have? I can't exactly pull money for a new one out of my back pocket.
So remember, guys: they use thermal pads for a reason, and you shouldn't replace them with paste. And if you know you have issues with temps on any part of the graphics card, set a fan profile and run it on startup, or even better, customize your BIOS with a new fan curve. I learned all that the hard way. Even if they replace my card, it will take the store 4 weeks again, like with both my motherboard and CPU in autumn 2012. 4 weeks without my own computer... I don't have a secondary, only my useless phone and my roommate's PC whenever he's AFK, which happens rarely...
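A fan profile, whether set at startup or flashed into the BIOS, boils down to a set of temperature/duty breakpoints with linear interpolation between them. A rough sketch of the idea (the breakpoints below are made up for illustration, not vendor defaults):

```python
def fan_speed(temp_c, curve=((40, 30), (60, 50), (75, 80), (85, 100))):
    """Interpolate fan duty (%) from a (temp C, duty %) breakpoint curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: idle duty
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:            # inside this segment: interpolate
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # above the top point: full duty

print(fan_speed(50))  # -> 40.0 (halfway between the 40C and 60C points)
print(fan_speed(90))  # -> 100
```

The whole point of a custom curve is to pull the breakpoints left so the fans ramp up before the VRMs get anywhere near cooking.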


----------



## Roboyto

Quote:


> Originally Posted by *End3R*
> 
> Well I find benches from actually playing the game more reliable than the built-in benchmarks, and yes the 450 can't handle ultra without dropping to 10-15 fps, but high settings is still completely smooth as that bench indicates.
> 
> And I don't remember exactly what the power supply cost, I bought the computer as a whole for $680 ($755 after shipping).
> 
> Wouldn't this only be a PSU issue if I was getting the flickering when my r9 wasn't getting enough power? Because it would happen while just staring at a desktop sometimes.


That's pretty cheap collectively if you consider the CPU is $160, the GPU is $200, the board is $85, similar RAM $70, a cheap 2TB drive $85, and that doesn't include the case, PSU, mouse/keyboard, or a Windows license. Granted, they don't pay retail, but it's a business, so they have to make money.

I have no experience with CyberPower PCs, and I haven't purchased a pre-built desktop in over 15 years... I don't trust them.

I didn't know it would happen at the desktop. With a little luck you will get a functioning card back; I would find it nearly impossible to get 3 bad ones consecutively, assuming nothing else is contributing to the problem.


----------



## End3R

Quote:


> Originally Posted by *Roboyto*
> 
> That's pretty cheap collectively if you consider the CPU is 160, GPU is 200, board is $85, similar RAM $70, cheap 2TB $85, and that doesn't include the case, PSU, mouse/keyboard, and a windows license. Granted they don't pay retail, but it's a business so they have to make money.
> 
> I have no experience with CyberpowerPCs, and I haven't purchased a pre-built desktop in over 15 years...don't trust them.
> 
> Didn't know it would happen at the desktop. With a little luck you will get a functioning card back, I would find it nearly impossible to get 3 bad ones consecutively assuming nothing else is contributing to the problem.


Yeah, the video I posted a few pages back is at the desktop. And this is the 2nd comp I've bought from CyberPower; I think they are pretty good, but as you can see with the PSU, if you don't specify the component they usually use something unheard of. The laptop I bought from them had a BSOD issue I had to download Russian WiFi drivers to fix, lol. But you can't really beat their prices when they have sales. I got mine during their Super Bowl sale, which is why it was so cheap, but it did not include Windows; I had to buy that separately. It did, however, come with a mouse/keyboard (lowest-tier Razer), a free copy of Thief, Hitman: Absolution, and an aftermarket Logisys heatsink for the CPU (not the best, but way better than stock). I could have upgraded to water cooling for like $40, I think, but I'm too paranoid about that breaking, so I stuck with air.

My case has 2x 120mm fans in the front, 2x 120mm fans on top, 1x 120mm on the back, and a raised bottom for the PSU fan. It gets very nice cooling.


Spoiler: Warning: Spoiler!







Hope the next card I get is better; assuming I get another R9, I'll be sure to update again once I get it back.


----------



## DiceAir

OK, so I can only do 1150 on the core and 1500 MHz on memory. My core overclock is fine, but my memory is still at stock; when I overclock it, everything is unstable. Is there anything I can do to overclock my memory? I already have my power limit at +20%.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Hey guys,
> I've done some stuff to my rig,
> I've added an extra 120 to the existing 240 + 200mm rads.
> 
> Added a new pump that's better,
> 
> Now my temps are crazy. Under load, it gets 40-45c at 1250/1650 @1.3v (I think it's more than that just the slider is at 1.3)
> That's during the day. At night it goes as low as 30c. That's some CRAZY temps.
> 
> Though my house is pretty cold at night so that's why it's that low. Still though.


Glad to hear you got it all worked out. Those temps are crazy low.


----------



## Devildog83

Quote:


> Originally Posted by *HAVOKNW*
> 
> Glad I found this thread.
> 
> I'm the proud owner of 2 x PowerColor R9 270X Devil cards and 2 x PowerColor R9 280X cards. Love them!


I will add you if you wish, do you have pics? I love Dual Devil's!!


----------



## microkid21

OK, an update: I tried underclocking my card (*PowerColor R9 270X Devil*) from 1180 to 1150 using the 14.4 driver, and the result was awesome. No crash/BSOD occurred, no artifacts or anything, just some screen tearing which is unnoticeable while gaming in BF4 (Ultra) and Skyrim (Ultra) for 3-5 hours, and also while watching YouTube. It's kinda weird that I need to underclock my card in order to play the intense games that I like for a long run. I'll try reverting back to the default clock speed and see if the problem still exists while gaming in The Witcher 2 or StarCraft 2. Upon monitoring the temps of both CPU and GPU, I noticed that my CPU, on the stock Intel cooler, reaches 70C, while the GPU temps are only 66-68C. Is this normal for the CPU? BTW, my CPU is a 2nd-gen i3. I know it's old, but I really don't mind, since I can still play games on maximum settings using only a dual-core CPU.


----------



## fabiovtec

I just bought an Asus R9 270X TOP today, but I have a question:
Is it normal that the fan speed always stays at 20% until the temperature reaches 60°C?
Driver 14.4


----------



## uroshnish

Yes, I think the auto profile is set to increase fan speed when temperatures get over 60C. These cards are made to run at high temperatures, so it's not a problem; it's just being quiet.


----------



## uroshnish

I have one question myself too: has anyone noticed that game performance is better when you don't have CCC on? I disabled it, along with AMD Fuel and the AMD Extensive service. Games play smoother this way.


----------



## Horsemama1956

Quote:


> Originally Posted by *uroshnish*
> 
> I have one question myself too, has anyone noticed that game performance is better when you don't have CCC on ? I disabled it, amd fuel and amd extensive service aswell. Games play smoother this way.


What are your specs?


----------



## uroshnish

AMD FX-6300 @4.4Ghz
Sapphire R9 270X Vapor-X (stock settings atm)
4GB Kingston HyperX Blu

Definitely better gameplay when I've got CCC and those things off. Just that bit more smoothness I needed. No gain in FPS; it just plays better.


----------



## Devildog83

Quote:


> Originally Posted by *uroshnish*
> 
> AMD FX-6300 @4.4Ghz
> Sapphire R9 270X Vapor-X (stock settings atm)
> 4GB Kingston HyperX Blu
> 
> Definitely better gameplay when I got CCC and those things off. Just that bit more smoothness I needed. No gain in FPS, just plays better


Yes, if you run graphics overdrive and a clocking tool at the same time you may see instability. I did.


----------



## Horsemama1956

Quote:


> Originally Posted by *uroshnish*
> 
> AMD FX-6300 @4.4Ghz
> Sapphire R9 270X Vapor-X (stock settings atm)
> 4GB Kingston HyperX Blu
> 
> Definitely better gameplay when I got CCC and those things off. Just that bit more smoothness I needed. No gain in FPS, just plays better


I think I remember doing something similar back when the 1950XTs came out.
Quote:


> Originally Posted by *Devildog83*
> 
> Yes, if you run graphics overdrive and a clocking tool at the same time you may see instability. I did.


I think he's just talking about the CCC itself, as well as some of the AMD processes that run in the background and not because he has 2 programs trying to apply clocks.


----------



## Majentrix

Ordered and paid!


----------



## JaredLaskey82

Well I finally got the EK backplates installed for my two ASUS R9 280X DCU2 Top cards and I think my rig looks complete now.


----------



## NBAasDOGG

Hi gents,

I have a question about the MSI R9 280X Gaming series. Do you guys know how to change the memory voltage? I have tried all the overclocking utilities, but none gave access to the memory voltage. Is there any way to change the memory voltage via the BIOS or something? These cards have 1.5V, but I want 1.6V like the old HD 7970's.

Thankssssss


----------



## lightsout

Quote:


> Originally Posted by *JaredLaskey82*
> 
> Well I finally got the EK backplates installed for my two ASUS R9 280X DCU2 Top cards and I think my rig looks complete now.


Looks great!! Just need to paint/wrap that crossfire cable







:thumb:


----------



## JaredLaskey82

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Hi gens,
> 
> I have a question about MSI R9 280X Gaming series. Do you guys know how to change memory voltage? I have tried alle overclocking ulities, but none gave acces tot the memory voltage. Is there any kind of way to change memory voltage via bios or something. These cards have 1.5v, but i want 1.6v like the old HD7970's
> 
> Thankssssss


If you are using MSI Afterburner, go to Settings in the bottom-right corner. The MSI Afterburner properties window should open up. In 'General' there should be a box with the words 'Unlock voltage control'; also make sure you have chosen the correct card type in the selector next to it. The three options are 'Reference design', 'Standard MSI', or 'Extended MSI'.


----------



## BackwoodsNC

Quote:


> Originally Posted by *JaredLaskey82*
> 
> If you are using MSI Afterburner, go to Settings in the bottom-right corner. The MSI Afterburner properties window should open up. In 'General' there should be a box with the words 'Unlock voltage control'; also make sure you have chosen the correct card type in the selector next to it. The three options are 'Reference design', 'Standard MSI', or 'Extended MSI'.


Shoot, I still can't get MSI AB to apply voltage after setting and applying. I am having to use sapphire trixx to apply voltage.


----------



## JaredLaskey82

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Shoot, I still can't get MSI AB to apply voltage after setting and applying. I am having to use sapphire trixx to apply voltage.


It works fine for me and I am not even using MSI cards.

Make sure you save the profile and then apply the profile.
Don't forget to click the tab next to the 'Core voltage' slider to set your Mem voltage and Aux voltages as well.


----------



## electrocabeza

Quote:


> Originally Posted by *Majentrix*
> 
> Just pulled the trigger on two Sapphire R9 270x toxics.
> What are the highest clocks people are getting with them?


1360/1600 is my max 3DM11 frequency, with 1.308V...


----------



## Majentrix

Sounds great, can't wait for mine to arrive.
How much voltage do you get to play with in AB with that card?


----------



## BackwoodsNC

Quote:


> Originally Posted by *JaredLaskey82*
> 
> It works fine for me and I am not even using MSI cards.
> 
> Make sure you save the profile and then apply the profile.
> Dont forget to click the tab next to the 'Core voltage' slider to set your Mem voltage and Aux Voltages as well.


That's the funny thing: I can apply more memory voltage. It's just that when I go to apply core voltage, it won't take.


----------



## JaredLaskey82

Quote:


> Originally Posted by *BackwoodsNC*
> 
> That's the funny thing: I can apply more memory voltage. It's just that when I go to apply core voltage, it won't take.


I know that the MSI cards have a switch on the card for Quiet, Gaming and OC mode. Check the switch on the side of the card to make sure you are in the right mode for overclocking. Silent or Gaming mode may limit the amount of voltage you can apply to the card.

It's just a guess, but if not, check your card's manual to see what each of the modes does.


----------



## Arkanon

Quote:


> Originally Posted by *JaredLaskey82*
> 
> I know that the MSI cards have a switch on the card for Quiet, Gaming and OC mode. Check the switch on the side of the card to make sure you are in the right mode for overclocking. Silent or Gaming mode may limit the amount of voltage you can apply to the card.
> 
> It's just a guess, but if not, check your card's manual to see what each of the modes does.


Eh.. no. The switch on the card is actually the BIOS switch, for selecting the stock or LN2-optimised BIOS. Silent/Gaming and OC mode are selected through software, which you can download from the MSI site.


----------



## JaredLaskey82

Quote:


> Originally Posted by *Arkanon*
> 
> Eh.. no. The switch on the card is actually the BIOS switch, for selecting the stock or LN2-optimised BIOS. Silent/Gaming and OC mode are selected through software, which you can download from the MSI site.


Well, I am sure that Silent mode limits the voltage you can add to the card so it does not overheat the VRM and the GPU.


----------



## NBAasDOGG

Quote:


> Originally Posted by *JaredLaskey82*
> 
> If you are using MSI Afterburner, go to Settings in the bottom right corner. The MSI Afterburner properties window should open. Under 'General' there should be a box labelled 'Unlock voltage control'; also make sure you have chosen the correct card type in the selector next to it. The three options are 'reference design', 'standard MSI' and 'extended MSI'.


Thanks for the help sir, but the thing is that there is no memory voltage slider, only core voltage, which works pretty well up to +100. I am even able to raise or lower my voltage by +200 or -200 with the ArtMoney hack software. The only problem is that Afterburner doesn't have AUX and memory sliders. Any solution for that? Thanks


----------



## JaredLaskey82

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Thanks for the help sir, but the thing is that there is no memory voltage slider, only core voltage, which works pretty well up to +100. I am even able to raise or lower my voltage by +200 or -200 with the ArtMoney hack software. The only problem is that Afterburner doesn't have AUX and memory sliders. Any solution for that? Thanks


What version of AB are you using?

If nothing else, just use trixx.


----------



## NBAasDOGG

Quote:


> Originally Posted by *JaredLaskey82*
> 
> What version of AB are you using?
> 
> If nothing else, just use trixx.


The only AB version we can use is beta 19, and Trixx does not allow me to change any voltage; there are no AUX and memory voltage sliders either.


----------



## JaredLaskey82

Quote:


> Originally Posted by *NBAasDOGG*
> 
> The only AB version we can use is beta 19, and Trixx does not allow me to change any voltage; there are no AUX and memory voltage sliders either.


Check out this page I found. It may help you out.

http://www.hardocp.com/article/2013/10/28/msi_r9_280x_gaming_video_card_review/3#.U1tmq_mSxQU


----------



## NBAasDOGG

Hi
Quote:


> Originally Posted by *JaredLaskey82*
> 
> Check out this page I found. It may help you out.
> 
> http://www.hardocp.com/article/2013/10/28/msi_r9_280x_gaming_video_card_review/3#.U1tmq_mSxQU


Hi sir,

You are my hero. God bless, I can finally change memory voltage. Thanks soooo ******* much.


----------



## JaredLaskey82

Quote:


> Originally Posted by *NBAasDOGG*
> 
> Hi
> 
> Hi sir,
> 
> You are my hero. God bless, I can finally change memory voltage. Thanks soooo ******* much.


You sir, are most welcome.


----------



## fabiovtec

Using Mantle with the R9 270X on 14.4 RC1 and it works 100% with any settings


----------



## JaredLaskey82

Quote:


> Originally Posted by *fabiovtec*
> 
> Using Mantle with the R9 270X on 14.4 RC1 and it works 100% with any settings


The only problem with Mantle is that it does not work with FRAPS, as FRAPS only hooks DirectX.
But oh well.

My cards are sitting at around 89% core usage at 78°C in BF4.


----------



## Majentrix

Have you tried using DXTORY? It's similar to FRAPS and might work with Mantle.


----------



## JaredLaskey82

Quote:


> Originally Posted by *Majentrix*
> 
> Have you tried using DXTORY? It's similar to FRAPS and might work with Mantle.


Nah, you just use the console command 'perfoverlay.drawfps 1' and that logs your FPS during play.

Or use the commands:
Show FPS = perfoverlay.drawfps true
Hide FPS = perfoverlay.drawfps false
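To avoid retyping these every session: Frostbite titles like BF4 read a plain user.cfg from the game's install folder at launch, so the overlay can be made permanent. A minimal sketch (the frame-cap value here is just an illustration):

```ini
; user.cfg in the Battlefield 4 install directory
PerfOverlay.DrawFps 1         ; FPS counter in the corner
PerfOverlay.DrawGraph 1       ; frame-time graph, handy for spotting stutter
GameTime.MaxVariableFps 120   ; optional frame cap
```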


----------



## Devildog83

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> Ordered and paid!


There's a bit of bling!!


----------



## diggiddi

Quote:


> Originally Posted by *JaredLaskey82*
> 
> Nah, you just use the console command 'perfoverlay.drawfps 1' and that logs your FPS during play.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> or use the commands:
> Show FPS = perfoverlay.drawfps true
> Hide FPS = perfoverlay.drawfps false


Where does it save the logs to?


----------



## fabiovtec

What is the max OC for the R9 270X at stock voltage? I'm trying 1200 but 3DMark crashed


----------



## anubis1127

Quote:


> Originally Posted by *fabiovtec*
> 
> What is the max OC for the R9 270X at stock voltage? I'm trying 1200 but 3DMark crashed


It will vary from GPU to GPU, there is no set max for the 270x in general, just whatever your card can do.


----------



## Roaches

Hey Devildog, how do you like your Devils in CFX? Are frametimes and microstutter improved at 1080p? I'm contemplating getting a second one to play with... If not, I'll just toss it in my ITX build.


----------



## Dasboogieman

Heya all

I've got a question for owners of the early model ASUS 270X DirectCU II (or TOP).
I've got my hands on a late model 7870 DirectCU II V2 and I've been scouring the PCB. It seems the layout is identical to the 270X, right down to the little VRMs, the memory chips, even the core voltage.
In fact, the only real difference seems to be the cooling shroud (even the cooler is the same heatsink/heatpipe layout).

It stands to reason that it's the same recycled PCB and GPU but with a different BIOS. I've heard that the 270X BIOS has better memory timings; does anyone think it's possible to flash the ASUS 270X BIOS onto this 7870? I'm predicting 200 MHz of headroom in memory OC potential (mine chokes at 1350 MHz).
I've read that the few who attempted the 7970-to-280X flash ended up bricking their cards, and there was one chap who tried a 270X flash too, but he also bricked his. However, I believe those failures were because the 280X/270X PCBs are subtly different (even reference) from their 7xxx counterparts, whereas this ASUS PCB is the first example I've seen that uses the exact same layout across the generations.

Also, have any of the ASUS owners attempted to use a VRM heatsink? The VRM assembly can shoot up to 95 degrees even on a slight overvolt to 1250 mV (compared to 1221 mV stock) unless the fan is manually ramped to 80% (in which case the VRMs drop to 75 degrees).
There seems to be a real excess of vertical space between the VRMs and the cooler heatsinks in that area.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Hey Devildog, how do you like your Devils in CFX? Are frametimes and microstutter improved at 1080p? I'm contemplating getting a second one to play with... If not, I'll just toss it in my ITX build.


I love it; I don't really have any issues at all apart from the occasional flicker in Firefox.

BTW my build has a bit of an upgrade. Can't wait till next week to get the CPU loop in.


----------



## Roaches

Hmm, I'm not aware of this flicker. Any video or image to show what it looks like? In that case I'll have to see for myself in the coming week, thanks. Nice system btw.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Hmm, I'm not aware of this flicker. Any video or image to show what it looks like? In that case I'll have to see for myself in the coming week, thanks. Nice system btw.


No, I don't have any way to show you. For just a split second, every once in a while, a white line flashes across the middle of the screen. It only happens in the Firefox browser and it's not very often, so it doesn't bug me at all.


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> No, I don't have any way to show you. For just a split second, every once in a while, a white line flashes across the middle of the screen. It only happens in the Firefox browser and it's not very often, so it doesn't bug me at all.


That's exactly what I wanted to hear; as long as it isn't something major to worry about. Thanks. +rep


----------



## Unknownm

An issue I'm facing:

This thread offers a Rev02 BIOS that allows 1.3 V vcore. The Rev02 BIOS works with a Rev01 card, but I'm having issues staying stable when applying 1.3 V. Sometimes Sapphire Trixx (the tool that lets me apply the higher voltage) freezes up Windows when I have the Rev02 BIOS flashed. With the Rev02 BIOS flashed I can run 1.2 GHz core in FurMark for 4 hours fine (no crash), but sometimes at random, even at startup, I'll get a blank screen.

When the Rev01 BIOS is flashed to my card nothing ever crashes, BUT I can only get 1.256 V max vcore, which only allows around 1160 MHz.

My question is: can anyone here customize the Rev01 BIOS to allow 1.3 V like the Rev02 BIOS found in the thread?
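For reference, whichever BIOS ends up being used, the usual save-before-flash sequence with the ATIFlash command-line tool looks like the sketch below. The adapter index and filenames are placeholders, and the commands are only echoed rather than executed here, since a bad flash can brick the card; always keep the Rev01 backup.

```shell
# Reference-only sketch of an ATIFlash session; adapter index and
# filenames are placeholders. Commands are echoed, not executed.
ADAPTER=0                  # adapter index, confirm with 'atiflash -i'
BACKUP=backup_rev01.rom    # copy of the known-good BIOS, keep this safe
NEWBIOS=rev02_13v.rom      # the 1.3 V Rev02 BIOS from the thread

echo "atiflash -i"                     # 1. list adapters, note the index
echo "atiflash -s $ADAPTER $BACKUP"    # 2. save the current BIOS first
echo "atiflash -p $ADAPTER $NEWBIOS"   # 3. program the new BIOS, then reboot
```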


----------



## Devildog83

Quote:


> Originally Posted by *Unknownm*
> 
> An issue I'm facing:
> 
> This thread offers a Rev02 BIOS that allows 1.3 V vcore. The Rev02 BIOS works with a Rev01 card, but I'm having issues staying stable when applying 1.3 V. Sometimes Sapphire Trixx (the tool that lets me apply the higher voltage) freezes up Windows when I have the Rev02 BIOS flashed. With the Rev02 BIOS flashed I can run 1.2 GHz core in FurMark for 4 hours fine (no crash), but sometimes at random, even at startup, I'll get a blank screen.
> 
> When the Rev01 BIOS is flashed to my card nothing ever crashes, BUT I can only get 1.256 V max vcore, which only allows around 1160 MHz.
> 
> My question is: can anyone here customize the Rev01 BIOS to allow 1.3 V like the Rev02 BIOS found in the thread?


I can't use Sapphire Trixx or any overclocking tool other than Afterburner; with the others I can get higher voltage, but it's never stable. I have resigned myself to the fact that Afterburner is it, and I won't be able to get higher than 1.3 V on the HD 7870 or above stock on the 270X.


----------



## itomic

I can buy one of these two cards for a very good price: the Club3D R9 280X royalQueen 3GB or the VTX R9 280X 3GBD5-2DHEV2. I know these are not the cream of the crop in the R9 280X family, but they are very cheap in Croatia now; they are on sale. I would like to hear impressions if someone has them, especially about the cooler: is it loud, and how does it cope with the R9 280X's TDP?

http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-280x-royalqueen.html

http://www.vtx3d.com/products_features.asp?id=183


----------



## mAs81

Quote:


> Originally Posted by *itomic*
> 
> I can buy one of these two cards for a very good price: the Club3D R9 280X royalQueen 3GB or the VTX R9 280X 3GBD5-2DHEV2. I know these are not the cream of the crop in the R9 280X family, but they are very cheap in Croatia now; they are on sale. I would like to hear impressions if someone has them, especially about the cooler: is it loud, and how does it cope with the R9 280X's TDP?
> 
> http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-280x-royalqueen.html
> 
> http://www.vtx3d.com/products_features.asp?id=183


Definitely get the Club3D.. better than the other one IMHO


----------



## fabiovtec

How can I check if my BIOS is the newest? The ASUS site only has BIOS 1.0 and 2.0 -.-


----------



## mAs81

Your card is the R9 270X, right? If so, checking out here might give you an answer as to whether this is your card..
Also, check out this thread on how to put your rig specs in your signature, so it's easier for people to help out


----------



## trabunco

Quote:


> Originally Posted by *fabiovtec*
> 
> How can I check if my BIOS is the newest? The ASUS site only has BIOS 1.0 and 2.0 -.-


Yours is 103. Mine is 104.


----------



## BackwoodsNC

When I overclock my memory and watch a YouTube video, my PC locks up, and I'm only talking an overclock of like 50 MHz. I can take it all the way to 1700 MHz in-game and it does just fine. Does anybody else get this issue?


----------



## ricko99

Anyone here using 14.4 WHQL? How is the performance and does it increase the card's temperature?


----------



## Anonizer

@BackwoodsNC

Did you try disabling hardware acceleration in your browser?

@ricko99

I'm using it, and my monitor properly wakes up now, unlike with the other drivers.
I haven't noticed any increase in temperatures. Clocks now stay at 1100 when playing Dota 2 or other not-so-intensive games; with the earlier 14.x drivers they would jump 300-1050-1100 randomly.


----------



## creationsh

They both look the same. The VTX could be a Club3D card underneath; I don't know for sure. I'd get whichever is cheaper.


----------



## Roboyto

Quote:


> Originally Posted by *fabiovtec*
> 
> Which is the max oc for r9 270x with stock voltage? Im trying 1200 but 3dmark crashed


To get 1200 you will probably need to up the voltage. Get an OC utility and slowly add voltage until you find stability at 1200


----------



## anubis1127

May I join? I picked up a couple used MSI R9 270s from the ocn mp.





They seem pretty good for 1080p/1200p gaming. I am running them with +20% Power Limit at 1100/1500 on the stock voltage, which for these is 1.18V. Both cards stay pretty cool, under 70C for the most part. They will do.



Thanks.


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> May I join? I picked up a couple used MSI R9 270s from the ocn mp.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> They seem pretty good for 1080p/1200p gaming. I am running them with +20% Power Limit at 1100/1500 on the stock voltage, which for these is 1.18V. Both cards stay pretty cool, under 70C for the most part. They will do.
> 
> 
> 
> 
> 
> Thanks.


You have been added. Nice cards. I find that 2 x 270X and/or 7870s do quite well for just about anything.


----------



## anubis1127

Quote:


> Originally Posted by *Devildog83*
> 
> You have been added. Nice cards. I find that 2 x 270X and/or 7870s do quite well for just about anything.


Thanks, one small thing though: they are R9 270s, not the 270X. They only have a single 6-pin power connector; I guess you could have seen that if I had shown a pic of them installed (well, maybe, it's a pretty crappy pic).


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> Thanks, one small thing though, they are R9 270s, not the 270x. They only have single 6-pin power, guess you could have seen that if I had shown a pic of them installed (well maybe, its a pretty crappy pic).


Fixed, if I could just read.


----------



## BackwoodsNC

@Anonizer Well, it still did it with hardware acceleration off in the browser. Using Firefox.

So I switched over to Chrome and no more of that nasty screen, YAY!!!


----------



## fabiovtec

Quote:


> Originally Posted by *trabunco*
> 
> Yours is 103. Mine is 104.


How do I update my BIOS? The ASUS site only has two BIOSes.
ASUS GPU Tweak tells me that there is a new BIOS for my card, but I don't know if this program is safe


----------



## neurotix

Quote:


> Originally Posted by *anubis1127*
> 
> Thanks, one small thing though, they are R9 270s, not the 270x. They only have single 6-pin power, guess you could have seen that if I had shown a pic of them installed (well maybe, its a pretty crappy pic).


Those look really good anubis.


----------



## Unknownm

Quote:


> Originally Posted by *Devildog83*
> 
> I can't use Sapphire Trixx or any overclocking tool other than Afterburner; with the others I can get higher voltage, but it's never stable. I have resigned myself to the fact that Afterburner is it, and I won't be able to get higher than 1.3 V on the HD 7870 or above stock on the 270X.


Afterburner doesn't even give me a voltage option. Only Trixx does, and my card is a Gigabyte.


----------



## anubis1127

Quote:


> Originally Posted by *neurotix*
> 
> Those look really good anubis.


Thanks @neurotix, they do pretty well for gaming too. I wonder how they will fold, though.

Quote:


> Originally Posted by *Unknownm*
> 
> Afterburner doesn't even give me a voltage option. Only Trixx does and my card is a gigabyte.


I'm having a similar issue with Afterburner as well; even with 'Unlock voltage control' checked, the voltage slider is still greyed out for me. I've tried uninstalling/reinstalling to no avail. What I find odd is that these cards are MSI and it doesn't work, yet it had been working fine on the Sapphire HD 7850 I was using before I installed the 270s. I've been using a combo of Trixx and Afterburner for now: I set my voltage and OC in Trixx, then close it, then leave AB open for fan control and monitoring.


----------



## Devildog83

Quote:


> Originally Posted by *Unknownm*
> 
> Afterburner doesn't even give me a voltage option. Only Trixx does and my card is a gigabyte.


Same for me, but when I use the others I am unstable, and I can get higher clocks with Afterburner even with the voltage locked. I don't know if you have the same issue, but that's just my experience.


----------



## trabunco

Quote:


> Originally Posted by *fabiovtec*
> 
> How do i update my bios? in asus site only have two bios.
> Asus gpu tweak tells me that there are new bios for my card but i dont know if this program is safe


I just used GPU Tweak to update.


----------



## DarthBaggins

Right now I'm excitedly awaiting the mail carrier to deliver my Sapphire R9 270X that will accompany my 7870.


----------



## mAs81

Quote:


> Originally Posted by *fabiovtec*
> 
> How do i update my bios? in asus site only have two bios.
> Asus gpu tweak tells me that there are new bios for my card but i dont know if this program is safe


You can always use this tutorial if you don't want to use ASUS GPU Tweak


----------



## anubis1127

Quote:


> Originally Posted by *DarthBaggins*
> 
> Right now I'm excitedly awaiting the mail carrier to deliver my sapphire R9 270x that will accompany my 7870,


Nice.







I'm seeing about 155k PPD between both my 270s right now, just FYI.


----------



## mikemykeMB

How's this for a superficial backplate, with some manufacturer logo whatnots.



Just an idea



I'd probably "tone down" the white.. splatter it brown-grey.. match the mobo, ya know


----------



## aman27deep

Got an XFX DD R9 270X. Very sexy card - very good performance for the price.


----------



## Devildog83

Quote:


> Originally Posted by *aman27deep*
> 
> Got an XFX DD R9 270X. Very sexy card - very good performance for the price.


Do you want to join the club? Show us a pic and maybe what clocks you're running and I will add you!!


----------



## DarthBaggins

Quote:


> Originally Posted by *anubis1127*
> 
> Nice.
> 
> 
> 
> 
> 
> 
> 
> I'm seeing about 155k PPD between both my 270s right now, just FYI.


Only thing is I'm having issues booting Ubuntu with the two in Crossfire


----------



## Devildog83

What's Ubuntu?

Congrats on the baby on the way!! Is it y'all's 1st?


----------



## DarthBaggins

Linux based OS.

Thanks yeah it's my first and my gf/fiancé's 2nd


----------



## Devildog83

Quote:


> Originally Posted by *DarthBaggins*
> 
> Linux based OS.
> 
> Thanks yeah it's my first and my gf/fiancé's 2nd


Awesome, good luck.









Never used Linux.


----------



## mikemykeMB

Some time ago I noticed a backplate idea and then wondered about making up a mock-up
Quote:


> Originally Posted by *aman27deep*
> 
> Got an XFX DD R9 270X. Very sexy card - very good performance for the price.


Join up.. XFX has a hidden secret... Hmmm.. like the movie with the slimy snail... Hiccup


----------



## mikemykeMB

Ohhh, damn.. help.. shoot, bang me in the head.. I did re-do that superficial backplate.. new paint >>> if you like it then I can make more with your logo/mfr... smooch











And.....


----------



## Devildog83

That's cool!


----------



## Unknownm

Quote:


> Originally Posted by *Devildog83*
> 
> Same for me but when I use the others I am unstable and I can get higher clocks with Afterburner even locked. I don't know if you have the same issue but that's just my experience.


You know, I never tried this. I use Trixx to apply max voltage because the card only does 1.18 V under load without any 3rd-party programs, but after flashing to the F3 BIOS it seems to load 1.256 V without the software.

Regardless, BF4 won't make my card hit its OC settings even if I apply them using software, unless I manually enter my OC settings in the BIOS and then flash it. L4D2 pushes my card to use the OC applied from the software just fine, so idk what's going on there.

Even if I do maxvariablefps 999 (which tells BF4 to render as many frames as possible) the card still runs at stock core settings; maybe something to do with Mantle?


----------



## Kayfanx

Hi guys.

I have a pair of ASUS Radeon R9 270X DCII TOPs in Crossfire, and I'm having major problems with Mantle.

When using Mantle in BF4, the game locks up randomly, and it continues to the point where it'll crash with a "DEVICE LOST GPU" error that points to one of my cards.

I have done a clean Catalyst 14.4 install.

Can anyone help with this?


----------



## anubis1127

Quote:


> Originally Posted by *Kayfanx*
> 
> Hi guys.
> 
> I have a pair of ASUS Radeon R9 270X DCII TOP In crossfire, and im having major problems with Mantle.
> 
> When using mantle in BF4, the game locks up randomly and continues to the point it'll crash with a "DEVICE LOST GPU" error which points to one of my cards.
> 
> I have done a clean Catalyst 14.4 install.
> 
> Can anyone help with this?


Hi, welcome to the forum. Is Mantle ready for Pitcairn? I was under the impression it is not. I tried it on 14.4 RC1 and it was still garbage. I'm just using DX11 and it's fine, 100+ FPS on 2x 270s without even OCing.

I have not tried Mantle since I got my 2x R9 270s; I only tried it on a single 7850, and a 7950, but on both GPUs it was terrible.


----------



## Devildog83

Is there a Mantle thread where these questions can be discussed? It sounds like a good one to start if there isn't. Either that or we need a serious GPU guru to pop in and help, because with all of the issues popping up I am beginning to get confused about what's happening to whom, for what, when and why. Sorry, my brain hurts.


----------



## anubis1127

There is this thread: http://www.overclock.net/t/1429303/amd-mantle-discussion-thread/

and this one: http://www.overclock.net/t/1464054/mantle-bug-thread/ <- That one hasn't seen much action in a while though.


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> There is this thread: http://www.overclock.net/t/1429303/amd-mantle-discussion-thread/
> 
> and this one: http://www.overclock.net/t/1464054/mantle-bug-thread/ <- That one hasn't seen much action in a while though.


Thanks *anubis1127*, I should link those in the OP for folks looking for help.

These links have been added to the OP so if anyone needs help and it can't be figured out here we can point them in the right direction.


----------



## TheLAWNOOB

Anybody have any R9 280X Sapphire Dual-X BF4 Edition OC ?

I have some major issues with the RMA process, please PM me I need your help.


----------



## itomic

Hi, I have the MSI R9 280X and it is loud and hot !!! I bought it because it was supposed to be cool and quiet. Does anyone else have problems like I have with this card, and how do I solve them ??


----------



## Roboyto

Quote:


> Originally Posted by *itomic*
> 
> Hi, I have the MSI R9 280X and it is loud and hot !!! I bought it because it was supposed to be cool and quiet. Does anyone else have problems like I have with this card, and how do I solve them ??


Which 280X?

Are you overclocking/overvolting the card?

Got some pictures of your rig? Lack of proper airflow will ruin GPU temperatures.

What's cooling your CPU? Is it overclocked?

Should update your Rig info since it's still showing a GTX 760.


----------



## itomic

Exactly this one: http://www.msi.com/product/vga/R9_280X_GAMING_3G_LE.html#overview

No, on stock settings. Airflow is very good, that isn't the problem. The CPU cooler isn't the problem either.

The GTX 760 and the R9 280X Twin Frozr IV have the same fans, and on the R9 280X (at least on mine) they are much louder when I set them to the same RPMs. I'm starting to regret switching from a dead-silent and cool GTX 760 to a noisy and hot R9 280X. It's a month old so the BIOS should be good; the LE version didn't exist in the early days when they had problems with the BIOS.


----------



## Roboyto

Quote:


> Originally Posted by *itomic*
> 
> Exactly this one: http://www.msi.com/product/vga/R9_280X_GAMING_3G_LE.html#overview
> 
> No, on stock settings. Airflow is very good, that isn't the problem. The CPU cooler isn't the problem either.
> 
> The GTX 760 and the R9 280X Twin Frozr IV have the same fans, and on the R9 280X (at least on mine) they are much louder when I set them to the same RPMs. I'm starting to regret switching from a dead-silent and cool GTX 760 to a noisy and hot R9 280X. It's a month old so the BIOS should be good; the LE version didn't exist in the early days when they had problems with the BIOS.


What do you have the fans set to? If you're not overclocking then the auto settings should work just fine. The 280X will run hotter than a 760, since it is a more powerful card, but regardless of that fact the cooler should not be loud.

Can't help you completely if you don't at least post a picture or two.


----------



## itomic

On auto fan it's loud and hot. I will post some pics tomorrow. I know that it should run cool and quiet, and that's why I said that I have a problem.


----------



## Arkanon

Maybe a bad TIM application from the factory?
On a side note: I noticed MSI released an LE version of their existing R9 cards and I can't seem to figure out what the LE stands for. Should I look at it as a second revision, or is something else at hand here?


----------



## itomic

I don't know either. The only difference I can tell is slightly lower core clocks.


----------



## rdr09

Wake up Red Team! Competition ends today.

http://www.overclock.net/t/1476601/3d-fanboy-overclocking-competition-2014-500-in-prizing

just post your 24/7 clocks. Make sure to turn off tessellation in CCC.


----------



## Tempest2000

Looking to get an MSI 280 (non-X) and wondering if anyone knows if any of the EK full blocks fit it. They do not have the card listed on coolingconfigurator.


----------



## DarthBaggins

Hell, I wish they made water blocks for the 270X, and I'm not sure if one would be truly compatible with my 7870


----------



## mikemykeMB

Quote:


> Originally Posted by *DarthBaggins*
> 
> Hell, I wish they made water blocks for the 270X, and I'm not sure if one would be truly compatible with my 7870


They.. being EK? I have an XSPC Rasa (not full cover) on an XFX 270X; the holes line up exactly and it has kept temps below 40°C during a burn-in test of over 15 min.


----------



## DarthBaggins

Yeah, sorry, they = EKWB, and I was hoping for a full cover; if not, I'll end up putting something together.


----------



## Devildog83

At least the CPU is going under water -


----------



## mikemykeMB

Quote:


> Originally Posted by *Devildog83*
> 
> At least the CPU is going under water -


Hmmm.. isn't it already!?! How do you like the Photon? It has an awesome appearance with the reflectivity of the LEDs


----------



## DarthBaggins

Just enjoyed BF4 with the new Crossfire, and wow, huge improvement.. now I wonder what a 3rd would do.. lol


----------



## kpo6969

Quote:


> Originally Posted by *DarthBaggins*
> 
> Hell, I wish they made water blocks for the 270X, and I'm not sure if one would be truly compatible with my 7870


http://www.techpowerup.com/200342/alphacool-announces-its-gpu-cooler-customization-service.html


----------



## End3R

Quote:


> Originally Posted by *kpo6969*
> 
> http://www.techpowerup.com/200342/alphacool-announces-its-gpu-cooler-customization-service.html


Why would anyone waste money on water cooling a low-end card?

OK, that article makes no sense; they say both:

"With new patented technology Alphacool have the ability to fill this gap in the market by producing full cover waterblocks for any non-reference or lower end graphics card."

-AND-

"Alphacool will be launching this range based on any Nvidia card 7XX or above or any AMD card 2XX or above"

These are the opposite of low-end.


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> Why would anyone waste money on water cooling a low-end card?
> 
> Ok that article makes no sense, they say both:
> 
> With new patented technology Alphacool have the ability to fill this gap in the market by producing full cover waterblocks for any non-reference or lower end graphics card.
> 
> -AND-
> 
> Alphacool will be launching this range based on any Nvidia card 7XX or above or any AMD card 2XX or above
> 
> These are the opposite of low-end.


I would pull the coolers off my Devils for this in a heartbeat.


----------



## End3R

Quote:


> Originally Posted by *Devildog83*
> 
> I would pull the coolers off my Devils for this in a heartbeat.


Yeah, I'm not saying anything bad about the cooling itself; it's just that as I read the article, it made no sense to me that they were claiming this was targeted at low-end cards.


----------



## runelotus

Did an illuminated GPU backplate for my HIS R9 270X IceQX2 TBC


----------



## DarthBaggins

Looks like I'll be going Alphacool on my cards then, lol


----------



## anubis1127

Quote:


> Originally Posted by *runelotus*
> 
> Did a Illuminated GPU backplate for my HIS R9 270X IceQX2 TBC
> 
> 
> Spoiler: Warning: Spoiler!


Nicely done on the backplate, looks great.
Quote:


> Originally Posted by *DarthBaggins*
> 
> Looks like I'll be going Alphacool on my cards then, lol


Are your 7870/270X running that hot? My little 270s don't see 70°C on auto fan. I guess your cards do have another 6-pin, and more OC headroom, though.


----------



## DarthBaggins

Under folding they hit 55°C tops, and last night in BF4 multiplayer they only hit 45°C, and I have the 7870 clocked to match the R9 270X. I'd just like to have more than my CPU under water, lol. Also, I have a custom fan setting, so they haven't peaked past 55°C so far.


----------



## Devildog83

Quote:


> Originally Posted by *DarthBaggins*
> 
> Under folding they hit 55c tops and last night on bf4 multiplayer they only hit 45c and I have the 7870 clocked to match the r9270x. Just would like to have more than my CPU underwater lol.


I hear ya there; mine stay cool enough now that I've replaced the TIM, but I keep getting pushed to add my cards to the loop I am doing.


----------



## drieg500

I would like to update the clocks on my 270X HAWK to 1265/1400; I tried overclocking again after being bored for a while.

Just screwing with core clocks & voltages atm; the mems will come later if I can push them.

here's the validation:
http://www.techpowerup.com/gpuz/fhg3m/

and my valley/heaven results:


would like to try the alphacool custom block program if I was only in europe. I can't tell for now if the Kraken G10 would be a good investment since there are literally no full blocks for the 270X HAWK since EKWB have no plans on making full blocks (or any block manufacturer for that matter). or would I just settle for a universal block? (although they won't looks as appealing as full blocks.)

Hopefully I can somehow cool this card under water so I can try overclocking with the LN2 BIOS to see if it overclocks better than the default one.


----------



## H01ym0ses

Just got the 280X at a steal from TigerDirect... apparently they made a mistake on the pricing and dropped it to 269.99. That lasted for about 10 minutes before it was pulled and repriced at 309. So far the card is roughly double the speed of my old GTX 560 Ti, and I must say it's wicked fast; for the price, I can't complain. Power requirements are a bit steep on the R9 line in general, but hey, the 700-watt PSU is handling its business (Gold+ PSU), so I'm very happy with it thus far. Any suggestions, recommendations, or tips for a new ATI owner? I've been an NV owner for nearly 15 years, since they dropped the old 8800 GTS 640MB version. Loved that card; it was a true monster in its day.


----------



## shilka

Quote:


> Originally Posted by *H01ym0ses*
> 
> Just got the 280X at a steal from tigerdirect... Apparently they made a mistake on the pricing and dropped it to 269.99.. that lasted for about 10 minutes and was pulled and repriced at 309. so far the card is roughly double the speed of my old GTX 560 Ti card and I must say it's wicked fast and for the price can't complain. Power requirements are a bit steep on the R9 line in general but, hey 700 watt psu handling its business (gold + PSU) so I'm very happy with it thus far. Any suggestions, recommendations or tips for a new ATI owner? I've been an NV owner for nearly 15 years since they dropped the old 8800 GTS 640 MB version. Loved that card, it was a true monster in the day.


Which PSU?

Welcome to OCN by the way


----------



## DarthBaggins

I would think with 650W+ there should be no issues with a 280X.


----------



## shilka

Quote:


> Originally Posted by *DarthBaggins*
> 
> I would think over 650+ there should be no issues with a 280x,


500 watts is enough for a 280X as long as you don't overvolt it
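A rough way to sanity-check that: add up nominal board powers and leave margin for transients. The wattage figures below are assumptions for illustration (AMD rates the 280X around 250W board power; the CPU and "everything else" numbers will vary by build), not measured values.

```python
# Hedged PSU headroom sketch: nominal draws are assumed, not measured.
def psu_headroom(psu_watts, draws, margin=0.2):
    """Watts left over after component draw plus a transient/OC margin."""
    budget = sum(draws.values()) * (1 + margin)
    return psu_watts - budget

# Assumed figures: 280X ~250W, mid-range quad-core ~95W, rest ~60W.
draws = {"gpu_280x": 250, "cpu": 95, "board_ram_drives_fans": 60}
print(psu_headroom(500, draws))  # small but positive on a quality 500W unit
```

Overvolting pushes the GPU figure up fast, which is why the margin disappears first on smaller units.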


----------



## boot318

Quote:


> Originally Posted by *DarthBaggins*
> 
> I would think over 650+ there should be no issues with a 280x,


Crossfire? I have a 280X and have never seen power usage go over 350W when gaming (it usually hovers around 280-330W).


----------






## DarthBaggins

Under CrossFire that would be cutting it close on a 700W supply, I would think; kinda the reason I personally went from the TX650M I had to the V850 I currently run. You always have to factor in the power draw of the other components in the case too; in my case I have my loop components, etc.


----------



## Roboyto

Quote:


> Originally Posted by *shilka*
> 
> 500 watts is enough for a 280x as long as you dont overvolt it


Quote:


> Originally Posted by *DarthBaggins*
> 
> I would think over 650+ there should be no issues with a 280x,


Yup. Got a 290 in my HTPC with a 3770K, both at stock settings, with a 120mm AIO on each, and my 450W does just fine. Wife is very pleased playing Tomb Raider maxed out on the TV :-D


----------



## fabiovtec

I have a question about the voltage of the R9 270X.
My Asus at default has the voltage at 1.20V in programs like ASUS GPU Tweak or Trixx, but at full load it only uses 1.16-1.17V.

I'm now overclocking the card and I set the voltage to the max, "1.30V", but at full load it only uses 1.24-1.25V, and this is holding back my overclock.
Currently stuck at 1270MHz 1.24V... trying 1300 but the card needs more juice...

Help please


----------



## Roboyto

Quote:


> Originally Posted by *fabiovtec*
> 
> I have a question about the voltage of r9 270x
> My asus at default have the voltage at 1.20v in programs like asus gpu tweak or trixx but in full load only uses 1.16v 1.17v
> 
> Im now overclocking the card and i set voltage to max "1.30v" but in full load only uses 1.24v 1.25v and this is holding back my overclock.
> Currently stuck on 1270mhz 1.24v... trying 1300 but the card needs more juice...
> 
> Help please


That's a high overclock already for one of these cards. The little bit further you could push with more voltage wouldn't give you much extra performance.

The voltage you're seeing is likely after vdroop. My Devil 270X exhibited similar voltages and maxed out at 1280 with 1320mV in Trixx. Not much you can do except maybe flash the BIOS, but it's not really worth the risk IMHO.

If you need more processing power, I'd suggest a bigger GPU. 280Xs are as low as $309 for a VisionTek with a lifetime warranty. My buddy just got one and it's a solid performer that is extremely quiet; the cooler impressed me.
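For anyone wondering where those sagging numbers come from, here's a toy vdroop model: the voltage at the die falls below the set value roughly in proportion to load current through the VRM's effective loadline resistance. The current and resistance figures are purely illustrative assumptions, not measurements from a 270X.

```python
# Toy vdroop sketch: all numbers are illustrative assumptions.
def loaded_voltage(vset, load_amps, loadline_mohm):
    """Approximate die voltage under load for a resistive loadline."""
    return vset - load_amps * (loadline_mohm / 1000.0)

# Setting 1.30V with ~60A of load through ~1 milliohm sags to ~1.24V --
# the same kind of gap between set and reported voltage described above.
print(round(loaded_voltage(1.30, 60, 1.0), 3))
```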


----------



## fabiovtec

Vdroop on a GPU? I thought that was only a CPU thing








Anyway, it's a good overclock; my fps in Tomb Raider increased from 47 to 54 at ultimate settings

The only thing that I hate about this card is that I can't get 1200MHz stable at stock voltage
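A quick check on those numbers (assuming a 1050MHz stock boost as the baseline, since the exact starting clock isn't stated): the fps gain trails the core-clock gain a bit, which is normal when memory clocks don't scale along.

```python
# Compare relative gains; the 1050MHz baseline is an assumption.
def pct_gain(new, old):
    return (new / old - 1) * 100

clock_gain = pct_gain(1270, 1050)  # core clock increase, percent
fps_gain = pct_gain(54, 47)        # Tomb Raider fps increase, percent
print(round(clock_gain, 1), round(fps_gain, 1))  # ~21.0 vs ~14.9
```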


----------



## Roboyto

Quote:


> Originally Posted by *fabiovtec*
> 
> vdroop in gpu? I thought it was only for cpu
> 
> 
> 
> 
> 
> 
> 
> 
> anyway, its a good overclock, my fps in tomb raider increased from 47 to 54 in ultimate settings
> 
> The only thing that i hate in this card is that i cant get 1200mhz stable with stock voltage


Yup, there is vdroop on GPUs as well.

Those are good clocks, especially when a stock 7870 is 1GHz. You're at the limit of the average 7870/270X. If you want/need more horsepower you'll need more cores.


----------



## anubis1127

I think vdroop in GPUs gets overlooked rather frequently. My old Tahiti GPUs had hella droop, so did the 780 Lightnings I had.


----------



## Spade616

Quote:


> Originally Posted by *runelotus*
> 
> Wanted to share
> Tried smoothing off the base of my HIS R9 270X IceQX2 TBC
> 
> 
> In comparison, the old surface is the one not reflecting, while the lapped base is the one reflecting back
> Used 1200, 1500, and 2000 grit sandpaper
> The temp went from 92C to 80C when running Furmark
> And from 78C to 66C while gaming
> 
> 
> 
> I am quite disappointed that the culprit is the awful base of the stock VGA HSF
> 
> A bit of sandpaper and lapping of the base provided a massive temperature drop
> 
> Note: I used DeepCool Z5 thermal paste


My HIS 270X does get pretty hot as well, especially now that it's summer for us. On stock clocks I hit around 55C just playing Dota 2, over 75C on benchmarks, and in BF4 I hit around ~60C.







will try that mod later.


----------



## mikemykeMB

The best and fastest 280x hands down is? Just wanting to get a few well rounded answers/chimes from the peeps here..have read from several sites- articles, reviews..etc. Leaning towards the Toxic, even tho' haven't had probs with the 2 Xfx branded units.


----------



## Roboyto

Quote:


> Originally Posted by *mikemykeMB*
> 
> The best and fastest 280x hands down is? Just wanting to get a few well rounded answers/chimes from the peeps here..have read from several sites- articles, reviews..etc. Leaning towards the Toxic, even tho' haven't had probs with the 2 Xfx branded units.


The better question is which has the best cooler and power regulation; both of those factors will help *aid* the card's ability to be the fastest. The silicon lottery still plays an important role: chip yields are better overall, hence the high out-of-box clocks, but getting a great chip comes down to a bit of luck. You could get a Toxic and it may not budge past its rated 1150 boost speed, or it could be a beast. It is a nice card for the money either way.

I have had good luck with many different brand GPUs, especially the 78XX and 79XX lines; HIS, XFX, ASUS, VisionTek, and PowerColor.

Friend of mine just bought a VisionTek 280X from The Egg for $309. It clocked to 1195/1725 without much fuss and it is a reference design. The bland shroud and typical dual fan setup are deceiving in pictures, as the heatpipes are enormous. It does a fine job of keeping the Core/VRMs cool as well as being astonishingly quiet even at 80% fan speed. It lacks the pizzazz of some of the other cards, but at $309 with a backplate and lifetime warranty it's a pretty good deal.

The MSI gaming is always a solid choice. The TF IV cooler is nothing shy of awesome.

I don't see very many people using HIS cards, and I'd like to throw them a bone. A friend and I have owned several HIS 78XX and 79XX cards between us, and every one of them was an outstanding performer. Don't know if this is just pure dumb luck, but he just got an HIS 290 in the mail today and from his texts so far it is sounding like another winner; 1100 core clock with no additional voltage. The IceQ X2 cooler is solid, and it has a large heatsink for the VRMs and RAM like the MSI Gaming.

XFX also has a special place in my heart due to the wonderful 290 in my rig presently; 1300/1700 beast mode! Plus with a DD cooler you get a lifetime warranty...what's not to like about that?

I wish PowerColor had a Devil edition 280X, because that would be high on my list as well. I was very pleased with my 270X's performance in the short run that I had it for.

I am mostly negative with ASUS these days, but the DC2 Top 280X is a pretty nice card all things considered.

Then there is Gigabyte who sort of blends into the crowd. That Windforce cooler does a solid job though, there's no doubt about that.

The Toxic looks like it has the best boost clock out of all the 280X, so without touching anything it is the winner...but you're probably going to fiddle with things so your options are vast!


----------



## mikemykeMB

Definitely on the power management; as for cooling, strip it and block it. I do like the MSI TF, and have a good track record with XFX, and the Toxic might not be a true winner OOB, but I can "fiddle" with it, as you say. I just don't want a letdown where stepping up from the current card only gains minor improvements, ahhh... Good opinions, and I like the info about the VisionTek; like all decisions, along comes another impressive insight from others. Might look a little deeper into it.


----------



## Roboyto

Quote:


> Originally Posted by *mikemykeMB*
> 
> Definitely on the power mgmnt, as for cooling..strip it and block it, do like the MSI-TF, and have a good track with Xfx, and the Toxic might not be a true winner OOB, but can "fiddle" with as you say. Just don't want a let down when the expectations of a stepping up from the current card to gain only minor improvements ahhh...Good opinions and like the info about the VisionTek...like all decisions, comes another impressive insight from others. Might look a little deeper into it.


If you're going for one of the custom cards, be sure there is a block available to fit them. EK's CoolingConfigurator.com is always a great place to check PCB layout.

If your rig sig is accurate with a 270X, there will be a vast improvement going to a 280X just considering the core count alone: 1280 vs 2048... so you really can't lose no matter which 280X you choose, as long as there is a waterblock for it.
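That core-count argument can be put in back-of-envelope numbers: theoretical shader throughput is stream processors times clock times two FLOPs per cycle (FMA). Reference boost clocks are assumed below; real-game gains also depend on memory bandwidth and ROPs, so treat this as a paper ratio only.

```python
# Theoretical single-precision throughput; clocks are reference boost values.
def gflops(stream_processors, core_mhz):
    return stream_processors * 2 * core_mhz / 1000.0  # 2 FLOPs/SP/cycle

r9_270x = gflops(1280, 1050)  # ~2.7 TFLOPS
r9_280x = gflops(2048, 1000)  # ~4.1 TFLOPS
print(round(r9_280x / r9_270x, 2))  # ~1.52x on paper
```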


----------



## Roboyto

Quote:


> Originally Posted by *Tempest2000*
> 
> Looking to get an MSI 280 (non-X) and wondering if anyone knows if any of the EK full blocks fit it. They do not have the card listed on coolingconfigurator.


280 = 7950 and I doubt they changed the PCB much, if at all, so I would imagine the reference 7950/7970/280X blocks would fit it without issue.

*Here is the 7950 PCB*



*Here is the 280X PCB*



Looks the same to me









Odds are you could use reference 7950/7970/280X full cover blocks. If you must be 100% certain you can always e-mail EK, they usually respond pretty quickly from my experience.


----------



## mikemykeMB

Quote:


> Originally Posted by *Roboyto*
> 
> If you're going for one of the custom cards, be sure there is a block available to fit them. EK's CoolingConfigurator.com is always a great place to check PCB layout.
> 
> If your rig sig is accurate with a 270X, there will be a vast improvement going to a 280X just by considering the core count alone; 1280 ~VS 2048...so you really can't lose no matter which 280X you choose as long as there is a waterblock for it.


For sure, the step up is noticeable. I have an XSPC Rasa uni-block on the 270X currently, so I'd probably need to know the hole dimensions/spacing on whichever card I choose; that's kinda where I'm on the fence.


----------



## Roboyto

Quote:


> Originally Posted by *mikemykeMB*
> 
> For sure, the step up is noticeable..Have a XSPC Rasa uni-block on the 270x currently,..so would probably need to know what the hole dimensions/spacing on the one to be chosen, that's kinda where I am at on the fence about.


Bah! Ditch that universal thing and get a FC block! You can probably even find a used one for a reasonable price, check the OCN Marketplace!


----------



## mikemykeMB

Quote:


> Originally Posted by *Roboyto*
> 
> Bah! Ditch that universal thing and get a FC block! You can probably even find a used one for a reasonable price, check the OCN Marketplace!


Haha, that may be true, but now a backplate to add rigidity??


----------



## Roboyto

Quote:


> Originally Posted by *mikemykeMB*
> 
> Haha..that maybe true, but now a backplate to add rigidity??


And functionality in most cases. A strip of thermal pad between the VRMs and the backplate turns it into a large passive heatsink to help keep them even cooler.


----------



## mikemykeMB

Quote:


> Originally Posted by *Roboyto*
> 
> And functionality in most cases. A strip of thermal pad between the VRMs and the backplate turns it into a large passive heatsink to help keep them even cooler.


You'd make a good salesman..lol Let the venture begin.


----------



## Roboyto

Quote:


> Originally Posted by *mikemykeMB*
> 
> You'd make a good salesman..lol Let the venture begin.


That's more like it! Let me know how far the rabbit hole goes :-D


----------



## H01ym0ses

Quote:


> Originally Posted by H01ym0ses
> 
> Just got the 280X at a steal from tigerdirect... Apparently they made a mistake on the pricing and dropped it to 269.99.. that lasted for about 10 minutes and was pulled and repriced at 309. so far the card is roughly double the speed of my old GTX 560 Ti card and I must say it's wicked fast and for the price can't complain. Power requirements are a bit steep on the R9 line in general but, hey 700 watt psu handling its business (gold + PSU) so I'm very happy with it thus far. Any suggestions, recommendations or tips for a new ATI owner? I've been an NV owner for nearly 15 years since they dropped the old 8800 GTS 640 MB version. Loved that card, it was a true monster in the day.


Quote:


> Which PSU?
> 
> Welcome to OCN by the way


http://www.newegg.com/Product/Product.aspx?Item=N82E16817116024

OK, it's only Bronze.. my mistake. That's the one I have; it's about 1.2 years old right now.


----------



## H01ym0ses

Quote:


> Originally Posted by *DarthBaggins*
> 
> Under xfire that would be cutting it close on a 700w supply I would think, kinda the reason I personally went with the v850 I currently run from the tx650m I had. Always have to factor in the voltages of other components in the case too, in my case I have my loop components etc.


Right now I've only got 2 HDDs and a CD drive, which I rarely use, plus a kbd, mouse, printer (don't think it draws any power), Nostromo keypad, and Logitech gamepad.

The system is under 2 years old as well: MSI Z77-G43 board with an i5-3570K and 2x4GB Corsair Vengeance at 2133. Corsair H80 cooler and only 3 fans in the system; I don't need a lot of internal cooling with the case open. I have a HUGE old Lian Li 70 model full tower server case: 3.5 ft tall, approx 1 ft wide, and 3 ft deep.

I'm at work right now hard to really post any pics and such.


----------



## BruceB

Just thought I'd give some advice on PowerColor brand 280Xs:
The good:
The cooler is sick; it keeps the GPU under 70C in most cases and is pretty quiet too (I can't hear it at all until it reaches about 50C)!
Customer service is top-notch; the RMA process is fast and easy.

The bad:
On the first PowerColor 280X I had, the thermal paste wasn't applied properly and it would overheat and go into 'panic mode'. RMA'd it, and the new GPU has worked absolutely problem-free out of the box (as it should for a 270EUR GPU).
The PowerColor GPUs don't seem to OC as high or as stably as other brands (I think because the PowerColor cards are pretty much stock).

My 2c:
If OC'ing potential isn't top priority, get a PowerColor; it's quiet, cool, and won't break the bank (as much as some other 280Xs do) while performing well.
If max performance is all you care about and money isn't a thing, get a Toxic.


----------



## DarthBaggins

Quote:


> Originally Posted by *H01ym0ses*
> 
> right now I've only got 2 hdd's and a cd drive which I rarely use and kbd, mouse, printer ( don't think it draws any power) nostromo keypad, and logitech game pad.
> 
> The system is under 2 years old as well. Z77-G43 MSI board with I5 -3570k, 2x4GB Corsair vengance at 2133. H80 corsair cooler only 3 fans in the system, don't need a lot of internal cooling with the case open. Have a HUGE old lian li 70 model full tower server case. 3.5 ft tall approx 1ft wide and 3 ft deep.
> 
> I'm at work right now hard to really post any pics and such.


Yeah, you're not pulling much power really other than the GPU, and I'm guessing that's rarely at full load. I know mine are at full blast daily just due to folding 24/7.


----------



## mikemykeMB

Quote:


> Originally Posted by *BruceB*
> 
> Just thought I'd give some advice on Powercolor brand 280X's:
> The good:
> The cooler is sick, keeps the GPU under 70C in most cases and is pretty quiet too (I can't hear it at all untill it reaches about 50C)!
> Customer services are top-notch, the RMA process is fast and easy.
> 
> The bad:
> On the first Powercolor 280X I had the thermal paste wasn't applied properly and it would overheat and go into 'panic mode'. RMA'd it and the new GPU has worked absolutely problem free out of the box (as it should be for a 270EUR GPU).
> The powercolor GPUs don't seem to OC as high/stable as other brands (I think because the powercolor GPUs are pretty much stock).
> 
> My 2c:
> If OC'ing potential isn't top prioirty get a powercolor, it's quiet, cool and won't break the bank (as much as some other 280Xs) while performing well.
> If max performance is all you care about and money isn't a thing, get a Toxic.


After the mention of the bad TIM application, it kinda makes sense to make replacing it mandatory; I've noticed people have reached better results. The $ thing is always a factor; I want the most for the least, so it makes for an interesting choice. The advice/opinions are much appreciated.


----------



## Megadrone

Simple question: what are the stock clocks of the 280X?


----------



## Roboyto

Quote:


> Originally Posted by *Megadrone*
> 
> simple question, whats the stock values of 280x?


Core @ 850MHz with boost to 1GHz
RAM @ 6GHz

http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx
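The "6GHz" RAM figure is the effective GDDR5 data rate per pin; multiplied by the 280X's 384-bit memory bus it gives the card's theoretical memory bandwidth:

```python
# Memory bandwidth from effective data rate and bus width.
def bandwidth_gbs(effective_gbps_per_pin, bus_width_bits):
    return effective_gbps_per_pin * bus_width_bits / 8  # bits -> bytes

print(bandwidth_gbs(6, 384))  # GB/s for a reference 280X
```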


----------



## H01ym0ses

Quote:


> Originally Posted by *DarthBaggins*
> 
> Yeah not pulling much power really other than the gpu but guessing that's rarely at full load. I know mine are at full blast daily just due to folding 24/7.


I ran 3DMark to see if it would hold up under full load, and it held fine. More than double the performance of my 560 Ti; majorly impressed thus far. Just hope they do something with the Mantle API to really turn up the performance. The only really odd thing was the physics performance: both cards are almost identical, within 0.2-0.4 frames. Not sure what handles the physics on an ATI card, or if the physics duties are offloaded to the CPU.


----------



## BruceB

Quote:


> Originally Posted by *H01ym0ses*
> 
> I ran 3DMark to see if it would hold up under full load and it held fine. More then double performance from my 560Ti, majorly impressed thus far just hope they do something with the mantle API to really turn up the performance. Only really odd thing was the physics performance from both cards are almost identical within .2-4 frames. Not sure what handles the physics on ATI card or if the CPU is offloaded the physics duties.


The physics will be handled by your CPU, that's why your scores are so similar!








What's your setup/scores?


----------



## 904bangingsys

I've got this http://www.newegg.com/Product/Product.aspx?Item=N82E16814202046

What is the safest max overclock and overvoltage I can set?


----------



## H01ym0ses

Quote:


> Originally Posted by *904bangingsys*
> 
> I've got this http://www.newegg.com/Product/Product.aspx?Item=N82E16814202046
> 
> what is safest max overclock and overvoltage I can set?


Quote:


> Originally Posted by *H01ym0ses*
> 
> I ran 3DMark to see if it would hold up under full load and it held fine. More then double performance from my 560Ti, majorly impressed thus far just hope they do something with the mantle API to really turn up the performance. Only really odd thing was the physics performance from both cards are almost identical within .2-4 frames. Not sure what handles the physics on ATI card or if the CPU is offloaded the physics duties.


Fire Strike w/ 280X
3DMark Score: 6737
Graphics Score: 7949
Physics Score: 6607
Combined Score: 3188
Graphics Test 1: 37.46 fps
Graphics Test 2: 32.08 fps
Physics Test: 20.98 fps
Combined Test: 14.83 fps

Fire Strike w/ 560 Ti
3DMark Score: 3249
Graphics Score: 3486
Physics Score: 6489
Combined Score: 1440
Graphics Test 1: 16.14 fps
Graphics Test 2: 14.29 fps
Physics Test: 20.6 fps
Combined Test: 6.7 fps
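Taking the scores as posted, a quick ratio check backs up the "more than double" claim, and shows the physics test barely moving, since both runs used the same CPU:

```python
# Fire Strike ratios, 280X vs 560 Ti, using the scores from the post above.
scores_280x = {"graphics": 7949, "physics": 6607, "combined": 3188}
scores_560ti = {"graphics": 3486, "physics": 6489, "combined": 1440}

for test, new in scores_280x.items():
    print(f"{test}: {new / scores_560ti[test]:.2f}x")
```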


----------



## 904bangingsys

@H01ym0ses what does your reply have to do with my question?
Quote:


> Originally Posted by *904bangingsys*
> 
> I've got this http://www.newegg.com/Product/Product.aspx?Item=N82E16814202046
> 
> what is safest max overclock and overvoltage I can set?


----------



## Alanthor

An answer to the above: I don't have that GPU, I have an R9 270X Dual-X OC Edition 2GB. And I'm currently running this setup stable (I can probably go higher):

Core clock - 1110MHz (1020MHz original)
Memory clock - 1450MHz (1400MHz original)
Voltage - 1.242V


----------



## fabiovtec

That's a pretty weak OC for an R9 270X at that voltage


----------



## H01ym0ses

Sorry quoted the wrong message.


----------



## Arkanon

Quote:


> Originally Posted by *Alanthor*
> 
> A answer to above. I dont have that GPU, I have a R9 270X Dual-X OC Edition 2GB. And im currently running these setups stable. (I can probably go higher)
> 
> Core clock - 1110MHz (1020MHz original)
> Memory clock - 1450MHz (1400MHz original)
> Voltage on 1.242


Stock voltage should allow for at least 1200MHz.
1.3V should allow for around 1250-1300MHz depending on the quality of your chip, apart from the very highly binned ones that go straight past 1300MHz. But those are rather rare.


----------



## ricko99

Anyone here change the original thermal paste on Asus r9 280x? How is the temperature?


----------



## DarthBaggins

I just changed it out on my Asus 7870. What a PitA, but I got it cleaned and put new paste on. Just going to put the 7870 in the gf's rig while I get CrossFire in Ubuntu fixed, since I don't feel like removing a card when I switch back to folding in Linux from W8.1.


----------



## creationsh

Most manufacturers over-apply thermal paste. I decreased the amount of paste on my 280X w/ Gelid Vision-A and saw a 10C difference. Anyway, this is my first post in this thread; I thought I'd join by posting to thank those before me for helping me reach a satisfactory overclock. I'm running a Diamond 280X card. So far I have pushed it to 1175/1500 @ 1.212V at a 64-degree core, and I thought I'd quit right there. I find it strange that my card's stock voltage was 1.131V, as most mentioned that their stock voltage was 1.2V.


----------



## Unknownm

Quote:


> Originally Posted by *ricko99*
> 
> Anyone here change the original thermal paste on Asus r9 280x? How is the temperature?


I thought it was like the law to change the thermal paste on any GPU you buy.....

Every single GPU bought = new thermal paste before installing it. I even did this to my ATi Radeon 9550, which didn't have a temp sensor (well, the 9600 had a temp sensor, so you could see the changes in ATITool)


----------



## DiceAir

Quote:


> Originally Posted by *Unknownm*
> 
> I thought it was like the law to change the thermal paste of any GPU you buy.....
> 
> Every single GPU bought = New thermal paste before installing it. Even did this to my ATi Radeon 9550 which didn't have a temp sensor (well 9600 had temp sensor so you can see the changes in ATi Tool)


LOL, so you have like no warranty on any of your GPUs. Thermal paste might not improve anything much unless you spend a fortune on it, but heck, that's why we're here; we are extreme overclockers anyway. So I'd say if your temps are going above 80C at 100% fan speed, then you should think about changing the thermal paste.


----------



## Unknownm

Quote:


> Originally Posted by *DiceAir*
> 
> LOl so you have like no warranty on any GPU's. Thermalpaste might not improve anything much unless you spend a fortune on it but heck that's why we here. We are extreme overclockers anyway. So I should say if your temps is going above 80C on 100% fan speed then you should think about changing thermal paste


Well, I must be lucky, because I've never had to send back any product I've owned. Thermal paste might not improve much most of the time, but I've seen some pretty bad thermal paste on stock GPUs; maybe not today, but a good example was the BFG 6600 GT I bought from NCIX. With a good thermal paste, and after waiting for the AS5 to cure, it would drop 5-10C, which could mean those extra MHz in the colder times of the year.

This is why we are on Overclock.net: to push the overclocking limits. It's not semi-overclock.net or if_temps_are_over_80c_I'll_change_the_thermal_paste.net


----------



## Dasboogieman

Quote:


> Originally Posted by *Unknownm*
> 
> well I must be lucky because every product I owned never had to return back. Thermal paste might not improve much most of the time but I've seen some pretty bad thermal paste on stock gpus, maybe not today but good example was the BFG 6600 GT I bought from NCIX. With a good thermal paste and having to wait for AS5 to cure it would drop down 5c to 10c which could mean those extra MHz in colder times of the year.
> 
> This is why we are on Overclock.net to push the overclock limits. It's not semi-overclock.net or if_temps_are_over_80c_I'll_change the_thermal_paste.net


Actually, I've found recently that the stock thermal paste used by some semi-custom or custom cards has been pretty good. It has certainly come a long way since the 8800GT era.
After my G10 watercooling adventures with the 290 Tri-X, I've remounted the original air cooler while waiting for more parts. I used supposedly superior Arctic MX-4 and found that my temperatures were similar to or worse than with the original paste job. Presumably the superior mounting and paste distribution at the factory trump any thermal-resistance disadvantage of the stock paste vs the MX-4.

A similar thing happened when I re-pasted my MSI GTX 570 Twin Frozr III.

That being said, you can never really be sure the best possible paste job has been done on your card without repasting.

Will probably attempt a repaste when my Phobya Nanogrease arrives; I don't really wanna use the spare IC Diamond (great for laptops, though).


----------



## nitrubbb

does MSI Gaming App work for anyone with win 8.1 ?

I get a crash every time I select OC mode, and silent mode doesn't lower the clocks either


----------



## Devildog83

*creationsh* has been added. Welcome!!!

I replaced the TIM on both my HD 7870 and 270X Devils; the 270X showed no difference and the 7870 was nearly a 20C difference. I guess it just depends on who applies it and how good a job they do.


----------



## Alanthor

Quote:


> Originally Posted by *fabiovtec*
> 
> Its very bad that oc for a r9 270x with that voltage


What, really?







But I get BSOD if I lower the VDDC.. :s

The exact model is Sapphire Radeon R9 270X 2GB GDDR5 DUAL-X OC Edition w/battlefield 4


----------



## DarthBaggins

Quote:


> Originally Posted by *Devildog83*
> 
> *creationsh* has been added. Welcome!!!
> 
> I replaced the TIM on both my HD7870 and 270x Devil's, he 279X showed no difference and the 7870 was nearly 20C difference. I guess it just depends on who applies it and how good they do.


Just redid mine on the 7870; I get to find out tonight, when I fire up the Source reboot, how well I did.

Well, running Folding@Home, the 7870 is running 15C cooler under full load (normally around 55C, now at 40C), so I guess I did good, lol


----------



## BruceB

I'd like to reapply the thermal paste on my card; even if the temps don't change, I just like taking things apart








but I don't want to void my warranty, so I'll be sticking with the 70C temps I have atm for a while...


----------



## Devildog83

Quote:


> Originally Posted by *DarthBaggins*
> 
> Just redid mine on the 7870; I get to find out tonight, when I fire up the Source reboot, how well I did.
> 
> Well, running Folding@Home, the 7870 is running 15C cooler under full load (normally around 55C, now at 40C), so I guess I did good, lol


40C!! Holy cow, are you running stock? Mine under full load will get into the 60s and I'm happy with that, but I run benches with a huge overclock.


----------



## DarthBaggins

Bumped the clocks up a little but not much,


----------



## fabiovtec

Which program are you guys using to change the GPU voltage?


----------



## DarthBaggins

I'm using Trixx on my Source 210 Re-boot build, but I'm still at stock voltage on the 7870 and haven't really messed with the clocks on my 270X, since I'm running Ubuntu for team competition folding.


----------



## Mackem

I don't know whether to return my 280X and get a refund or replacement. I got mine from one of the first batches of cards and I get artifacts in games but I don't want to get a replacement if it's just going to do the same thing. I've tried multiple stable and beta driver versions as well as trying various other things and it still happens. I have the DirectCUII TOP model.


----------



## fabiovtec

Did you update the GPU BIOS?


----------



## DarthBaggins

I'd return it for a new model; I'd think by now ASUS has probably revised the card


----------



## Mackem

Quote:


> Originally Posted by *fabiovtec*
> 
> Do you updated the gpu bios?


Yep

Quote:


> Originally Posted by *DarthBaggins*
> 
> I'd return it for a new model, think by now Asus probably revised the card


There are quite a few people, though, who have returned the card, got a new one, and are having the same issues. I really like the 280X, as I feel it gives very good bang for the buck, but I'd rather not deal with the possibility of artifacting again.


----------



## raven86

Hey guys, does changing the TIM on my Devil R9 270X void its warranty?


----------



## Devildog83

Quote:


> Originally Posted by *raven86*
> 
> Heys guys does changing the TIM on my devil r9 270x voids its warranty?


No.


----------



## DarthBaggins

this is what the 7870 is now set at, also Folding at full settings:


----------



## BruceB

Quote:


> Originally Posted by *Mackem*
> 
> I don't know whether to return my 280X and get a refund or replacement. I got mine from one of the first batches of cards and I get artifacts in games but I don't want to get a replacement if it's just going to do the same thing. I've tried multiple stable and beta driver versions as well as trying various other things and it still happens. I have the DirectCUII TOP model.


If you're running stock clocks then you should never see artifacts; if you do, you should return your card for a new one! At the end of the day you have paid for a product; if it doesn't work as it should, you're entitled to a replacement.









Plus, if you're getting artifacts at stock, then a properly working card should let you OC for more performance


----------



## BruceB

Quote:


> Originally Posted by *Devildog83*
> 
> No.


O rly?! I thought I read in my manual (yes, I read the manual for my GPU, I'm a geek) that taking off the cooler voided the warranty? Or is it different from brand to brand?


----------



## Majentrix

Depends on the brand.

I know that MSI will honor your warranty provided you treat the card and cooler properly. Powercolor might too; shoot them an email if you really want to know.


----------



## DarthBaggins

As I see it, as long as you don't strip the threads or damage any components, you should be fine.


----------



## Devildog83

Quote:


> Originally Posted by *BruceB*
> 
> O rly?! I thought I read in my manual (yes, I read the manual for my GPU, I'm a geek) that taking off the cooler voided the warranty? Or is it different from brand to brand?


Yes, Powercolor's warranty says that removing the serial or part number stickers (S/N, P/N) will void it, but it says nothing about replacing the TIM. I'm sure that if you sent the card back burned up because the buyer did a poor job of the repaste they would not honor the warranty, but you can replace the TIM. Altering the card voids the warranty too, but since TIM is a normal part of a GPU, replacing it doesn't count as an alteration and is OK.

http://powercolor.com/us/support_warranty.asp


----------



## BruceB

I sent Powercolor an email earlier today but they haven't got back to me yet. I'll let you all know what they say when I get a response.


----------



## Devildog83

Quote:


> Originally Posted by *BruceB*
> 
> I sent Powercolor an email earlier today but they haven't got back to me yet. I'll let you all know what they say when I get a Response.


I would hope they don't say anything different than what's in writing.


----------



## BruceB

Quote:


> Originally Posted by *Devildog83*
> 
> I would hope they don't say anything different than what's in writing.


I doubt it; I think the answer will be that TIM replacement is OK but any damage resulting from a bad TIM replacement would not be covered. But I'd like to have it in writing before I do it!


----------



## Devildog83

Quote:


> Originally Posted by *BruceB*
> 
> I doubt it, I think the answer will be that TIM replacement is ok but any damage resulting from bad TIM replacement would not be covered. But I'd like to have it in writing before I do it!


That sounds like a plan to me. I don't blame you a bit.


----------



## Mackem

Quote:


> Originally Posted by *BruceB*
> 
> If you're running stock clocks you should never see artifacts; if you do, you should return your card for a new one! At the end of the day you have paid for a product, and if it doesn't work as it should you're entitled to a replacement.
> 
> Plus, if you're getting artifacts at stock, a properly working card should let you OC for more performance.


Yeah, everything's at stock but I was debating whether to stick with the Asus DirectCUII TOP model, get another 280X or just switch to a GTX 770. I don't want to get a replacement card and have to deal with artifacting again.


----------



## BruceB

Quote:


> Originally Posted by *Mackem*
> 
> Yeah, everything's at stock but I was debating whether to stick with the Asus DirectCUII TOP model, get another 280X or just switch to a GTX 770. I don't want to get a replacement card and have to deal with artifacting again.


I understand. I'd just get it RMA'd, you shouldn't get any artefacting, you just had the bad luck to get a duff card!


----------



## Mackem

Quote:


> Originally Posted by *BruceB*
> 
> I understand. I'd just get it RMA'd, you shouldn't get any artefacting, you just had the bad luck to get a duff card!


The Asus forums are plagued with people having the exact same problem even after multiple RMAs, which doesn't instill too much confidence. I'm beginning to think there's something wrong with the R9 series of cards as a whole rather than just the card I have.


----------



## BruceB

Quote:


> Originally Posted by *Mackem*
> 
> The Asus forums are plagued with people having the exact same problem even after multiple RMAs, which doesn't instill too much confidence. I'm beginning to think there's something wrong with the R9 series of cards as a whole rather than just the card I have.


That's strange; I thought Asus was one of the higher-quality brands. My Powercolor works fine, and I'd recommend it to anyone looking for an R9.


----------



## Recr3ational

My PowerColor is perfectly fine too. Maybe it's Asus?


----------



## BruceB

Bad news guys, just got the email back from Powercolor:

_Hello Bruce,

*Yes, that will void the warranty.* If the card is reaching temperature of *95 degree Celsius* and above, please send it to us and we will repair it for you.

Best Regards,
Raymond,
North American Division,
Tech Support/RMA Manager,
Tul, Inc.
1216 John Reed Court, City of Industry,
CA, 91745, United States_

Also, he says _95C and above_ is grounds for an RMA, but Google tells me that the max temp for a 280X is 85C; is that just a typo, or are Powercolor GPUs super heat-tolerant or something?


----------



## Roboyto

Quote:


> Originally Posted by *BruceB*
> 
> Bad news guys, just got the email back from powercolor:
> 
> _Hello Bruce,
> 
> *Yes, that will void the warranty.* If the card is reaching temperature of *95 degree Celsius* and above, please send it to us and we will repair it for you.
> 
> Best Regards,
> Raymond,
> North American Division,
> Tech Support/RMA Manager,
> Tul, Inc.
> 1216 John Reed Court, City of Industry,
> CA, 91745, United States_
> 
> Also, he says _95C and above_ is grounds for an RMA, but Google tells me that the max temp for a 280X is 85C; is that just a typo, or are Powercolor GPUs super heat-tolerant or something?


AMD/ATi cards have long been able to tolerate operating temperatures close to 100C. I emailed them about my 4890 Vapor-X when I saw temps over 110C with the Vapor-X cooler and a large overclock; a waterblock fixed that problem posthaste :-D

I also contacted them when my friend was running Crossfire 6790s, and they said the cards can safely run near 100C regularly. He was seeing mid-80s on the top card.

The 290(X) cards, I believe, are the first whose BIOS lets them hit 94C before throttling.

It's definitely better to have them run cooler, but they are capable of handling high temps on a regular basis.


----------



## BruceB

Quote:


> Originally Posted by *Roboyto*
> 
> AMD/ATi cards have long been able to tolerate operating temperatures close to 100C. I emailed them about my 4890 Vapor-X when I saw temps over 110C with the Vapor-X cooler and a large overclock; a waterblock fixed that problem posthaste :-D
> 
> I also contacted them when my friend was running Crossfire 6790s, and they said the cards can safely run near 100C regularly. He was seeing mid-80s on the top card.
> 
> The 290(X) cards, I believe, are the first whose BIOS lets them hit 94C before throttling.
> 
> It's definitely better to have them run cooler, but they are capable of handling high temps on a regular basis.


Thanks for the info! I sent in my first card when it hit 96C while I was just looking at the desktop and I went into panic mode. Good to know what the limits are.


----------



## Roboyto

Quote:


> Originally Posted by *BruceB*
> 
> Thanks for the info! I sent in my first card when it hit 96C while I was just looking at the desktop and I went into panic mode. Good to know what the limits are.


If you're running stock clocks and it's getting that hot, then there is definitely a problem. It's too bad they need the card back to fix the overheating, as it could be an easy fix of just reseating the cooler and applying new paste.


----------



## creationsh

I finally got around to fine-tuning my 24/7 Diamond 280X card.

For some strange reason my VDDC (set at 1.225v) reads 1.182v at 98% load (GPU-Z) but will occasionally spike up to 1.213v. This seems to be the best I can do before any artifacts or higher temperatures. Good luck.


----------



## eAT5

Quote:


> Originally Posted by *Recr3ational*
> 
> My PowerColor is perfectly fine too. Maybe it's Asus?


Mine are fine, love them. They do get hot, so I have one bottom fan next to the PSU fan; two front fans (with the extra HDD bay removed and the HDD relocated to the lowest point, allowing direct airflow to the GPUs from the front); a top fan; a radiator fan; and a back fan. When playing BF4 I run the GPU fans at 75% and they stay at 70C in game or benchmark.

I use this for fan control:

http://www.nzxt.com/product/detail/78-sentry-lx-fan-control


----------



## creationsh

Quote:


> Originally Posted by *BruceB*
> 
> That's strange, I thought asus were one of the higher quality brands. My Powercolor works fine, I'd recommend it to anyone looking for an R9.


They are; it's just that Asus 280X owners abuse them a lot more than other owners do.


----------



## Mackem

Quote:


> Originally Posted by *creationsh*
> 
> They are; it's just that Asus 280X owners abuse them a lot more than other owners do.


Yep, I abuse the heck out of my card. I'm that extreme that I leave it at stock clocks and everything! Crazy, right?


----------



## creationsh

Quote:


> Originally Posted by *Mackem*
> 
> Yep, I abuse the heck out of my card. I'm that extreme that I leave it at stock clocks and everything! Crazy, right?


Wanna switch cards? XD


----------



## Megadrone

Quote:


> Originally Posted by *Roboyto*
> 
> Core @ 850MHz with boost to 1GHz
> RAM @ 6GHz
> 
> http://www.anandtech.com/show/7400/the-radeon-r9-280x-review-feat-asus-xfx


In Afterburner, which do I change? Do I set the core clock to 850 or 1000? And what's the factory default for the memory clock? My 280X likes to artifact.


----------



## Mackem

I picked up the Asus card for its cooler and whatnot. Are there any 280Xs that have equally good cooling but are perhaps less plagued with issues than the Asus cards?


----------



## BruceB

Quote:


> Originally Posted by *Mackem*
> 
> I picked up the Asus card due to its cooler and whatnot. Are there any 280X's that have good cooling as well but are perhaps less plagued with issues than the Asus cards?


I've only owned a Powercolor and seen a Toxic in the 'flesh'; I can tell you that the Toxic is without a doubt louder (but also without a doubt faster!). It's unlikely that anyone here has personal experience with _all_ the 280Xs out there, so your best bet is to read as many reviews (or, even better, comparisons) as you can before you buy!


----------



## Roboyto

Quote:


> Originally Posted by *Megadrone*
> 
> In Afterburner, which do I change? Do I set the core clock to 850 or 1000? And what's the factory default for the memory clock? My 280X likes to artifact.


You can set it to whichever you like; the card will automatically run up to 1000MHz without you having to change anything.

Memory clocks for AMD cards are usually displayed at 1/4 of the effective speed, so 1500MHz in AB equals 6000MHz effective (you can set most OC utilities to display the effective speed if you prefer).

From my experience with these cards, artifacting is usually caused by core clocks or insufficient voltage when overclocking.
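That quarter-rate display is just the GDDR5 quad-pumping convention, and the conversion is plain arithmetic; a quick sketch for anyone who wants to double-check their numbers (function names are mine, nothing card-specific):

```python
# GDDR5 transfers data four times per clock, so AMD utilities that show the
# "real" memory clock are displaying 1/4 of the effective (marketing) rate.
def effective_mem_clock(displayed_mhz):
    """Effective GDDR5 data rate in MHz from the displayed quarter-rate clock."""
    return displayed_mhz * 4

def displayed_mem_clock(effective_mhz):
    """Inverse: what an AMD utility would display for a given effective rate."""
    return effective_mhz // 4

print(effective_mem_clock(1500))   # → 6000
print(displayed_mem_clock(6000))   # → 1500
```

So when AB reads 1500, the memory is at the 280X's stock 6000MHz effective rate, and bumping the slider to 1550 displayed would mean 6200MHz effective.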


----------



## creationsh

I wanna say HIS cards. I like their card internals and I like their cooler. It may not be the biggest, but it's definitely the longest: 11.8 inches. http://i57.tinypic.com/2jbo0i.jpg They have good reviews, but very few of them.


----------



## End3R

Got my R9 back today and it seems to be working smoothly so far. I'm having trouble pinpointing exactly which model I got, though. The fan/heatsink says HIS, but GPU-Z lists the subvendor as ATI-AIB, and the device ID is 6810, which Google says is a Sapphire. The core/memory clocks came set at 1050/1400. The shipping invoice just says 'Radeon R9 270X', so no help there. Any ideas?

Looks like this:


Spoiler: Warning: Spoiler!

I'll run some benches when I get off work, to compare it to the GTS 450 I was using while waiting for the RMA. During testing the temps never went over 69C and the fan never went over 45%.


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> Got my R9 back today and it seems to be working smoothly so far. I'm having trouble pinpointing exactly which model I got, though. The fan/heatsink says HIS, but GPU-Z lists the subvendor as ATI-AIB, and the device ID is 6810, which Google says is a Sapphire. The core/memory clocks came set at 1050/1400. The shipping invoice just says 'Radeon R9 270X', so no help there. Any ideas?
> 
> Looks like this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I'll run some benches to compare it to the 450 gts I was using while waiting for the RMA when I get off work. During testing the temps never went over 69c and the fan never went over 45%.


It's probably a reference PCB with a HIS heatsink on it.


----------



## DarthBaggins

Reference is very good; EK still has a waterblock for the 7870s, which in most cases matches up with the 270Xs.


----------



## DiceAir

I haven't tried any other games, but I was running 1150 on the core just fine for weeks; now, all of a sudden, my PC freezes when I play BF4 or BF3. I haven't checked the stock clock (1100MHz) yet to see if it still gives issues. The funny thing is that it will freeze and I have to reboot, and then after the restart it will work for a long time. It could be something to do with Battlelog or Battlefield in general.

Just a side note: I was running 14.4 and it was stable for weeks, but now it's not.


----------



## End3R

Here are some benches for the replacement R9 I got today. So far things seem OK (fingers crossed); I haven't seen any of the flickering or crashing drivers I had with my last R9.

All of these were at 1680x1050 resolution.

R9 270X


Spoiler: Warning: Spoiler!

And in case anyone would like to see, here are the benches for the GTS 450 I was using while waiting on my R9. Gotta say, I was impressed with this 4-year-old beast.


Spoiler: Warning: Spoiler!


----------



## H01ym0ses

Quote:


> Originally Posted by *DiceAir*
> 
> I haven't tried any other games, but I was running 1150 on the core just fine for weeks; now, all of a sudden, my PC freezes when I play BF4 or BF3. I haven't checked the stock clock (1100MHz) yet to see if it still gives issues. The funny thing is that it will freeze and I have to reboot, and then after the restart it will work for a long time. It could be something to do with Battlelog or Battlefield in general.
> 
> Just a side note: I was running 14.4 and it was stable for weeks, but now it's not.


Check thermals first: make sure your cooler is seated solidly, and run a temperature-monitoring program to watch the CPU/GPU while you play if you have a dual-monitor setup. The GPU would be further down my list; I'd look at the HDD next, then play with the memory. The HDD is somewhat unlikely, but I've had loads of issues with heatsink contact and crappy thermal paste on memory. If all that checks out OK, it's time to break out the voltmeter and start probing the PSU. Just my suggestions, nothing set in stone, but I've got enough dead parts lying around my house to attest that things eventually give out.


----------



## DiceAir

Quote:


> Originally Posted by *H01ym0ses*
> 
> Check thermals first: make sure your cooler is seated solidly, and run a temperature-monitoring program to watch the CPU/GPU while you play if you have a dual-monitor setup. The GPU would be further down my list; I'd look at the HDD next, then play with the memory. The HDD is somewhat unlikely, but I've had loads of issues with heatsink contact and crappy thermal paste on memory. If all that checks out OK, it's time to break out the voltmeter and start probing the PSU. Just my suggestions, nothing set in stone, but I've got enough dead parts lying around my house to attest that things eventually give out.


My temps are under 80C at full load. It just randomly crashes, but funnily enough it doesn't look like a driver crash, as I can alt-tab out and back into the game, and then it runs for a few minutes before the whole PC freezes. Maybe it's to do with my monitor and 120Hz.

I have my QNIX QX2710 at 120Hz with custom tight timings, so maybe the timings are too low; even though I'm not getting green lines, it might not be stable.

Let me play a little at 120Hz and see. I was playing some other games at 60Hz, because they don't support anything above 60Hz, and they ran just fine, so maybe 120Hz is too demanding on my GPUs. If I play at 96Hz and it doesn't crash, I might then try 110Hz.


----------



## H01ym0ses

Quote:


> Originally Posted by *DiceAir*
> 
> My temps are under 80C at full load. It just randomly crashes, but funnily enough it doesn't look like a driver crash, as I can alt-tab out and back into the game, and then it runs for a few minutes before the whole PC freezes. Maybe it's to do with my monitor and 120Hz.
> 
> I have my QNIX QX2710 at 120Hz with custom tight timings, so maybe the timings are too low; even though I'm not getting green lines, it might not be stable.
> 
> Let me play a little at 120Hz and see. I was playing some other games at 60Hz, because they don't support anything above 60Hz, and they ran just fine, so maybe 120Hz is too demanding on my GPUs. If I play at 96Hz and it doesn't crash, I might then try 110Hz.


I've had similar issues; it turned out to be a problem with my PSU underpowering my board. It was losing a 12V rail and would randomly hang.


----------



## DiceAir

Quote:


> Originally Posted by *H01ym0ses*
> 
> I've had similar issues; it turned out to be a problem with my PSU underpowering my board. It was losing a 12V rail and would randomly hang.


Any way I can check the 12V rail under load? In HWiNFO my lowest reading is around 12V, but sometimes on the desktop it will drop to 0.8V or something; since I don't get any issues then, I suppose it's a glitch in HWiNFO. I also ran a test with AIDA64 stressing my CPU and GPU and all seems OK, and I ran my CPU overnight and it's fine. Now I've played BF3 and BF4 after changing my monitor to 96Hz and all seems fine so far.


----------



## Mackem

Just trying to figure out if there's a brand of 280X that gives me a higher chance of not getting one that artifacts, or if they all do it.


----------



## JeremyFenn

Hey, I just got my CPU to 5.16GHz and I'm ready to do some serious gaming!! I've got a Sapphire Vapor-X Tri-X R9 280X with Catalyst version 14.4 (NOT the beta). I'm curious about the "AMD Optimized" settings for tessellation and surface format optimization; I want everything to look as nice as possible. Also, I have the AA mode and samples set to "Enhance application settings".


----------



## P-39 Airacobra

Mine is the HIS R9 270 iPower IceQ X² Boost Clock. I paid $179 and got two free games and free delivery. My clocks are stock, 925/1400 (I'm still in "I am going to baby it" mode).

Looks like I am not exclusive enough to be added. LOL, oh well, I will live.


----------



## DarthBaggins

Hearing about temps close to or at 100C makes me feel comfy with the 40s temps I see in mine under air. I also just added a fan on the HDD cage in my case, so that should add more cool airflow to the cards (103CFM BGears Blaster 120).


----------



## Recr3ational

Quote:


> Originally Posted by *Mackem*
> 
> Just trying to figure out if there's a brand of 280X that gives me a higher chance of not getting one that artifacts, or if they all do it.


My Powercolors haven't got any artifacts. Both of them run at 1250/1650 under water, so clocks aren't the issue there.

Many people who have Powercolor cards haven't had any problems with theirs either.


----------



## H01ym0ses

Quote:


> Originally Posted by *DiceAir*
> 
> Any way I can check the 12V rail under load? In HWiNFO my lowest reading is around 12V, but sometimes on the desktop it will drop to 0.8V or something; since I don't get any issues then, I suppose it's a glitch in HWiNFO. I also ran a test with AIDA64 stressing my CPU and GPU and all seems OK, and I ran my CPU overnight and it's fine. Now I've played BF3 and BF4 after changing my monitor to 96Hz and all seems fine so far.


You can get an inexpensive voltmeter from Walmart or somewhere and, depending on your case, put the lead through the back of the pin connector and see how much it fluctuates. It seems strange that a refresh rate would cause a system to lock up; I could see it causing graphics distortion or an "unable to display current resolution" message.
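Alongside probing with a meter, you can also sift a logged sensor file for dips. A rough sketch, assuming you've exported a CSV log (HWiNFO and similar tools can write one) with a column literally named `+12V`; the column name and log layout here are assumptions, so adjust them to whatever your tool actually writes:

```python
import csv
import io

def min_rail_reading(csv_text, column="+12V"):
    """Return the lowest value logged in the given sensor column, or None."""
    reader = csv.DictReader(io.StringIO(csv_text))
    values = [float(row[column]) for row in reader if row.get(column)]
    return min(values) if values else None

# Made-up three-sample log just to show the idea:
log = "Time,+12V\n10:00:01,12.096\n10:00:03,11.808\n10:00:05,12.000\n"
print(min_rail_reading(log))  # → 11.808
```

Anything sagging much below ~11.4V under load (the ATX spec allows roughly ±5% on the 12V rail) would point at the PSU rather than a sensor glitch.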


----------



## Tobe404

Anyone else have an issue where an undervolt no longer sticks when gaming?

My Gigabyte R9 280X Rev 2 has gone back to 1.163v in games, randomly, seemingly out of nowhere...

This is despite it being undervolted through Trixx, and yes, it is a modified BIOS. A restart doesn't fix the issue either.


----------



## DiceAir

Quote:


> Originally Posted by *H01ym0ses*
> 
> You can get an inexpensive voltmeter from Walmart or somewhere and, depending on your case, put the lead through the back of the pin connector and see how much it fluctuates. It seems strange that a refresh rate would cause a system to lock up; I could see it causing graphics distortion or an "unable to display current resolution" message.


Just one note: I had my GPU overclocked to 1150 and then 1125, and now maybe I can't run any overclock on my GPU at all. I will stick to 1100 for now and test with that as well. I just hope I'm not unstable when I go back to Crossfire.


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> Just one note: I had my GPU overclocked to 1150 and then 1125, and now maybe I can't run any overclock on my GPU at all. I will stick to 1100 for now and test with that as well. I just hope I'm not unstable when I go back to Crossfire.


Are your temps ok? It could be thermal throttling.


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> Are your temps ok? It could be thermal throttling.


Temps = 75C on full load


----------



## H01ym0ses

Quote:


> Originally Posted by *BruceB*
> 
> Are your temps ok? It could be thermal throttling.


Quote:


> Originally Posted by *DiceAir*
> 
> Just one note: I had my GPU overclocked to 1150 and then 1125, and now maybe I can't run any overclock on my GPU at all. I will stick to 1100 for now and test with that as well. I just hope I'm not unstable when I go back to Crossfire.


Yeah, I usually put things back to stock settings one at a time to see which change stops the problem.


----------



## DiceAir

Quote:


> Originally Posted by *H01ym0ses*
> 
> Yeah, I usually put things back to stock settings one at a time to see which change stops the problem.


BF4 crashed even at stock clocks, so maybe the card is unstable even at stock.

OK, so this is my issue: on the desktop, when the card downclocks to 300MHz core and 150MHz memory and I drag my mouse across the screen to make a blue selection box, I get green lines inside that box. When I play a video file it also gives me green vertical lines. I installed my other card and, what do you know, not a single green line; it downclocks nicely to 300MHz/150MHz.

I had this green-line issue even on a fresh install of Windows with only the drivers installed and everything at stock, so maybe my card is faulty and needs to be swapped out. I will test it on my PC at work, which runs a 1080p 60Hz monitor, if it fits in the case.

The one BIG issue I have is how to prove it to the guys at the PC shop, as they don't have a 2560x1440 monitor, so I hope it shows the issue at 1080p as well.


----------



## JeremyFenn

Take a video on your phone and email it to them showing the green lines. Show the resolution you're using as well as the GPU and driver version you're on. They might have a fix, or they might tell you to mail it in.


----------



## Roboyto

Quote:


> Originally Posted by *DiceAir*
> 
> The one BIG issue I have is how do I prove it to the guys at the pc shop as they don't have a 2560x1440 monitor so I hope it does give me issues on 1080p as well.


Just bring your monitor with you? When you say PC shop, I presume you mean a local store?


----------



## DiceAir

lol
Quote:


> Originally Posted by *Roboyto*
> 
> Just bring your monitor with you? When you say PC shop i presume a local store?


Not that local a store, but if I must then I will, though I'd rather avoid it.
Quote:


> Originally Posted by *JeremyFenn*
> 
> Take a video on your phone and email it to them showing the green lines. Show the resolution you're using as well as the GPU and driver version you're on. They might have a fix, or they might tell you to mail it in.


Sounds like a brilliant idea, dude. I might just do that: I'll show them my clocks, the card in use, etc., make a small video, and put it together. They might just see that the card is faulty and take it back.


----------



## bartledoo

What do I do exactly to unlock voltage with VBE7? Do I just check Manual Power Adjustment?

edit: Never mind, I got it; I changed the VDDC to 1268mV from 1219mV.


----------



## H01ym0ses

Quote:


> Originally Posted by *DiceAir*
> 
> BF4 crashed even at stock clocks, so maybe the card is unstable even at stock.
> 
> OK, so this is my issue: on the desktop, when the card downclocks to 300MHz core and 150MHz memory and I drag my mouse across the screen to make a blue selection box, I get green lines inside that box. When I play a video file it also gives me green vertical lines. I installed my other card and, what do you know, not a single green line; it downclocks nicely to 300MHz/150MHz.
> 
> I had this green-line issue even on a fresh install of Windows with only the drivers installed and everything at stock, so maybe my card is faulty and needs to be swapped out. I will test it on my PC at work, which runs a 1080p 60Hz monitor, if it fits in the case.
> 
> The one BIG issue I have is how to prove it to the guys at the PC shop, as they don't have a 2560x1440 monitor, so I hope it shows the issue at 1080p as well.


Do you have anything else you can use to test the monitor? If it's HDMI you can use a DVD player or something. Green lines and blue artifacts do sound like a video problem for sure, but it's weird if it doesn't show up on their monitor at the tech shop. You can try using screen capture to record the anomalies, then play the file back on another computer and see if the same artifacts show up in the encode. If they do, it's not your monitor; if not, you need to check your cable and monitor.


----------



## DiceAir

Quote:


> Originally Posted by *JeremyFenn*
> 
> Take a video on your phone and email it to them showing the green lines. Show the resolution you're using as well as the GPU and driver version you're on. They might have a fix, or they might tell you to mail it in.


Quote:


> Originally Posted by *H01ym0ses*
> 
> Do you have anything else you can use to test the monitor? If it's HDMI you can use a DVD player or something. Green lines and blue artifacts do sound like a video problem for sure, but it's weird if it doesn't show up on their monitor at the tech shop. You can try using screen capture to record the anomalies, then play the file back on another computer and see if the same artifacts show up in the encode. If they do, it's not your monitor; if not, you need to check your cable and monitor.


So I tested my other graphics card, the same R9 280X, and I get no green lines or anything; everything works fine. I think the card is faulty; maybe it was faulty all along.


----------



## Megadrone

Quote:


> Originally Posted by *Tobe404*
> 
> Anyone have an issue where undervolting does not remain when gaming anymore?
> 
> Gigabyte R9 280x Rev 2 has gone back to 1.163v in games, randomly, seeminly out of no where...
> 
> Despite being undervolted through Trixx. And yes, it is a modified BIOS. Restart doesn't fix the issue either.


I have the same card; why a modified BIOS?


----------



## Tobe404

Quote:


> Originally Posted by *Megadrone*
> 
> I have the same card; why a modified BIOS?


Because without a modified BIOS I can't undervolt...

I re-flashed the modified BIOS and all is working again now, though. Not sure what caused it to revert to the stock BIOS out of nowhere, but anyway.

Also, it doesn't seem like Far Cry 3 likes overclocks or underclocks: I can get about 50-60MHz more either way in every other game compared to Far Cry 3.

Not that it really matters; it's more just an annoyance.


----------



## bartledoo

Here is the OC I got after modding the BIOS on my Toxic 270X:


----------



## H01ym0ses

Quote:


> Originally Posted by *DiceAir*
> 
> So I tested my other graphics card, the same R9 280X, and I get no green lines or anything; everything works fine. I think the card is faulty; maybe it was faulty all along.


Cool. It kind of sucks that the card is likely the culprit, but at least now you know what to focus on. Is the card under warranty, so you can at least get a factory replacement?


----------



## DiceAir

Quote:


> Originally Posted by *H01ym0ses*
> 
> Cool. It kind of sucks that the card is likely the culprit, but at least now you know what to focus on. Is the card under warranty, so you can at least get a factory replacement?


The card is still under warranty. Actually, it's a good thing that the card is the faulty part, because now I can get it swapped or get credit from the store. Better the card than my monitor or something else.

Maybe with the replacement card I'll be able to overclock the bad boy much further than I thought I could.


----------



## Devildog83

Quote:


> Originally Posted by *bartledoo*
> 
> Here is the OC I got after modding my bios on the toxic 270x
> 
> 
> Spoiler: Warning: Spoiler!


Those are monster clocks.


----------



## JeremyFenn

So I own a Sapphire Vapor-X Tri-X R9 280X OC with Boost. When using MSI Afterburner I can't use the slider to change the voltage. I guess I'd have to download a modded BIOS for it to let me change it, huh? I can't really complain TOO much; at stock it runs 1100 on the GPU and 1500 on the RAM.


----------



## Devildog83

*JeremyFenn* has been added!!! Welcome! Could you post a pic for us to see?

Try some other clocking tools like Trixx and see if they allow a voltage increase. I haven't found them stable for me, but others have.


----------



## JeremyFenn

Sure, I'm pretty sure I have a pic of it.

I'm at the office, so I'll have to post GPU-Z sensor readings during a render when I get home. I've tried the Trixx app but still couldn't boost the voltages.


----------



## Devildog83

Nice, that's a very long card.


----------



## Roaches

The feel when no backplate. Come on, Sapphire, you can do better than that, considering the 7970 Vapor-X had one.


----------



## JeremyFenn

Yeah, I agree a backplate would have been nice, especially given how long and heavy this card is. I have to say, there was an issue where it would throttle down to 300 core / 150 mem after rendering for a while, once it got up to 53C (I remember this well). I pushed the little BIOS button on the side (not sure if that turns its UEFI BIOS on or off) and it hasn't done it since. It only happened with DX11 content too, which was weird: it would run DX9 forever at 1100, but as soon as I'd bench something like OCCT on DX11 it would throttle down. Anyway, it's pretty beastly, and I definitely like the blue Vapor-X LED on the side; it matches my color scheme very nicely. The card itself is just a hair under 12" btw, but it fits perfectly in my Corsair Carbide 540 case.


----------



## Megadrone

Quote:


> Originally Posted by *Tobe404*
> 
> Because without a modified BIOS I can't undervolt...
> 
> I re-flashed the modified BIOS and all is working again now, though. Not sure what caused it to revert to the stock BIOS out of nowhere, but anyway.
> 
> Also, it doesn't seem like Far Cry 3 likes overclocks or underclocks: I can get about 50-60MHz more either way in every other game compared to Far Cry 3.
> 
> Not that it really matters; it's more just an annoyance.


Can one under/overvolt from Afterburner? I get artifacts on mine, maybe that's the issue. Where did you get that modified BIOS, though?


----------



## H01ym0ses

Quote:


> Originally Posted by *DiceAir*
> 
> Card is still under warranty. Actually it's a good thing that the card is faulty so now I can get it swapped or get credit from store. Rather the card than my monitor or something else.
> 
> Maybe if I get the other card i will be able to overclcock the bad boy much further than I thought I can.


I would suggest running stock for a week or two before tweaking the new card; give it some time to burn in. I usually run heavy benches or a burn-in program when I get new hardware. I want to make sure it ain't gonna belly up on me later on.


----------



## bartledoo

Quote:


> Originally Posted by *Devildog83*
> 
> [/SPOILER]
> 
> Those are monster clocks.


Thanks!! I think it was at 1319 mV, so I'm going to see what I can do at 1350 mV.


----------



## NoobOCerz

Hello,

I would really like to know the safe voltage, memory, power limit, and core clock for overclocking an R9 280X Toxic edition.


----------



## diggiddi

Quote:


> Originally Posted by *NoobOCerz*
> 
> Hello,
> 
> I would really like to know the safe voltage, memory, power limit, and core clock for overclocking an R9 280X Toxic edition.


I think you are looking for this http://www.overclock.net/t/633816/how-to-overclock-your-amd-ati-gpu


----------



## neurotix

Quote:


> Originally Posted by *roaches*
> 
> The feel when no backplates, come on sapphire, you can do better than that...Considering 7970 Vapor X had one.


I have a 7970 Vapor-X and it has no backplate. The one I had before I RMA'ed it had no backplate either.

The 280X Vapor-X should look like this:



http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-280x-vapor-x-review/3/ More images here.

The shroud itself is completely different and has a lit-up "Sapphire" logo, not a lit up Vapor-X logo. The 280X Vapor-X from Kitguru has the exact same shroud as my 7970 Vapor-X, except my 7970 is missing the backplate.

As far as JeremyFenn's card goes, is that a new model of 280X Vapor-X or something? The cooler looks almost identical to my 270X Vapor-X, and it has the same shroud, but the Vapor-X logo is lit up, and my 270X doesn't do that.



As you can see, the cooler and shroud are very similar and the logo is in the same font.

EDIT: Just searched newegg, apparently Sapphire has taken the Vapor-X cooler from the new 290/X and slapped it on the 280X, so now there's *two* "280X Vapor-X". http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=280x+vapor-x&N=-1&isNodeId=1

I'm the world's biggest Sapphire fan but even I don't see the point in this, and why no backplate on it?


----------



## Tobe404

Quote:


> Originally Posted by *Megadrone*
> 
> can one under/overvolt from Afterburner? I get artifacts on mine, maybe thats the issue. where did you get that modified bios though?


Depends on the card I think. I can't adjust voltage in Afterburner at all. Only Trixx or GPU Tweak.

What's your card?


----------



## JeremyFenn

I dunno, I liked the LED and the Vapor-X technology, and the tri-fan design was a good thing too, so I figured why not!!


----------



## diggiddi

Quote:


> Originally Posted by *neurotix*
> 
> I have a 7970 Vapor-X and it has no backplate. The one I had before I RMA'ed it had no backplate either.
> 
> The 280X Vapor-X should look like this:
> 
> 
> 
> http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-280x-vapor-x-review/3/ More images here.
> 
> The shroud itself is completely different and has a lit-up "Sapphire" logo, not a lit up Vapor-X logo. The 280X Vapor-X from Kitguru has the exact same shroud as my 7970 Vapor-X, except my 7970 is missing the backplate.
> 
> As far as JeremyFenn's card goes, is that a new model of 280X Vapor-X or something? The cooler looks almost identical to my 270X Vapor-X, and it has the same shroud, but the Vapor-X logo is lit up, and my 270X doesn't do that.
> 
> 
> 
> As you can see, the cooler and shroud are very similar and the logo is in the same font.
> 
> EDIT: Just searched newegg, apparently Sapphire has taken the Vapor-X cooler from the new 290/X and slapped it on the 280X, so now there's *two* "280X Vapor-X". http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=280x+vapor-x&N=-1&isNodeId=1
> 
> I'm the world's biggest Sapphire fan but even I don't see the point in this, and why no backplate on it?


How well do your Vapor-X cards overclock? Do they have Hynix or Elpida memory? Also, how do they compare against the quality of a Matrix Platinum?
Thx


----------



## JeremyFenn

I haven't OC'd mine yet (and I stress the word yet). I haven't found a program where I can change the voltage; neither MSI Afterburner nor Trixx lets me alter the core voltage, and I really don't want to try OCing this card until I can, because it's already stock OC'd (1100 core / 1500 mem).


----------



## Roboyto

Quote:


> Originally Posted by *JeremyFenn*
> 
> I haven't OC'd mine yet (and I stress the word yet). I haven't found a program where I can change the voltage. MSI Afterburner or trixx none of them let me alter the core voltage, and I really don't want to try OCing this card until I can because it's already stock OC'd. (1100core 1500mem)


Sorry if you answered this already, but what card do you have?

I had a 7970 XFX DD that was hardware locked for voltage, so it is a possibility on those cards.


----------



## JeremyFenn

I'm using a:

SAPPHIRE VAPOR-X 100363VX-2SR Radeon R9 280X PCI Express 3.0 TRI-X OC w/ Boost Video Card (UEFI)

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202095


----------



## neurotix

Quote:


> Originally Posted by *diggiddi*
> 
> How well do your Vapor-X cards overclock? Do they have Hynix or Elpida memory? Also, how do they compare against the quality of a Matrix Platinum?
> Thx


Both of the 7970s I had would do 1200/1600mhz, and I'm pretty sure both had Elpida memory.

Afaik the only cards that have guaranteed Hynix memory are the 290/290X Tri-X from Sapphire.


----------



## JeremyFenn

Ok here's what I got guys:






Need some help here, because I can't find where to OC the GPU voltage, and VDDA is maxed @ 1.2 with no way to raise it.


----------



## neurotix

The card might be too new to be supported by the current Trixx.

You might have to wait until they release a new version that allows full voltage control for that card.

Either that or it's simply voltage locked.

If you can do 1200/1500mhz stably with that voltage, I wouldn't complain though, because that's very good and pretty close to the clock limit for a 7970/280X (most cards won't do much over 1200, and some won't even do that.)


----------



## JeremyFenn

Would disabling ULPS help me any? Maybe increasing my PCIe Freq in BIOS?


----------



## JeremyFenn

screw it, I'm gonna try for 1200/1600. Why not.


----------



## rdr09

Quote:


> Originally Posted by *JeremyFenn*
> 
> screw it, I'm gonna try for 1200/1600. Why not.


Hell yeah, but do not mess with the PCIe settings in BIOS.


----------



## Devildog83

Quote:


> Originally Posted by *JeremyFenn*
> 
> screw it, I'm gonna try for 1200/1600. Why not.


Do not change anything with the PCI-E frequency in the BIOS. You could really screw stuff up doing that. Yes, disable ULPS.


----------



## JeremyFenn

So
Quote:


> Originally Posted by *Devildog83*
> 
> Do not change anything with the PCI-E frequency in the BIOS. You could really screw stuff up doing that. Yes, disable ULPS.


So far 1140 on core, haven't touched mem; tried the 1200/1600 and it wasn't havin' it!! Gonna tweak some more after work tomorrow. PCIe VDDA in BIOS set to 1.92V, maxed @ 1.2 in Trixx, but no GPU voltage slider...


----------



## diggiddi

Quote:


> Originally Posted by *JeremyFenn*
> 
> So
> 
> So far 1140 on core, haven't touched mem, tried the 1200/1600 and it wasn't havin it!! Gonna tweak some more after work tomorrow, PCIE VDDA in bios set to 1.92v full @ 1.2 on trixx but no gpu voltage slider...


Try v4.5.0; mine goes up to 1300, on a 7950 though.


----------



## Megadrone

Quote:


> Originally Posted by *Tobe404*
> 
> Depends on the card I think. I can't adjust voltage in Afterburner at all. Only Trixx or GPU Tweak.
> 
> What's your card?


Same one as yours, rev 2 Gigabyte. It shows artifacts for me without changing anything, but I find that tweaking the 3D settings in Catalyst by turning off the optimizations helps alleviate it. Looks like AMD's optimizations may have something to do with it.


----------



## BruceB

Quote:


> Originally Posted by *Megadrone*
> 
> same one as yours, rev 2 gigabyte. It shows artifacts for me without changing anything. But i find that tweaking the 3D settings in Catalyst by turning off the optimizations helps alleviate it. Looks like AMD's optimizations may have something to do with it.


What's the performance like without optimisations?


----------



## ILLmatik94

Sapphire R9-280 on sale for $209 w/promo code on Newegg http://www.newegg.com/Product/Product.aspx?Item=N82E16814202099


----------



## JeremyFenn

Quote:


> Originally Posted by *Megadrone*
> 
> same one as yours, rev 2 gigabyte. It shows artifacts for me without changing anything. But i find that tweaking the 3D settings in Catalyst by turning off the optimizations helps alleviate it. Looks like AMD's optimizations may have something to do with it.


I've noticed on my R9 280x when benching in Heaven 4.0 to test stability, if I turn on ANY tessellation I get minecraft trying to break through (artifacting). Anyone else get that with these cards? I've tried going back to stock everything, even uninstalling Trixx, and still artifacts like crazy with Tessellation on (on any level, moderate, normal, extreme, etc.)


----------



## JeremyFenn

Definitely thinking about RMAing this video card and going NVIDIA, getting a GTX 770.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125462

Looking at the benchmarks this card is a beast and actually outperforms my 280X for only $40 more. 'Course I'll be missing out on the Mantle features, but if this card can't even do tessellation then what's the point?


----------



## TheN00bBuilder

I have an R9 270 I just got off eBay. Gigabyte OC edition.


----------



## rdr09

Quote:


> Originally Posted by *JeremyFenn*
> 
> Definitely thinking about RMAing this video card and going NVIDIA, getting a GTX 770.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125462
> 
> Looking at the benchmarks this card is beast and actually outperforms my 280x for only $40 more. Course I'll be missing out on the mantle features, but if this card can't even do tessellation then what's the point?


this is about 30% faster . . .

http://www.amazon.com/Gigabyte-GDDR5-4GB-2xDVI-Graphics-GV-R929OC-4GD/dp/B00HS84DFU/ref=sr_1_1?s=electronics&ie=UTF8&qid=1400073166&sr=1-1&keywords=r9+290

actually, faster than a stock 290X reference.


----------



## JeremyFenn

I dunno, tbh I've always gone AMD CPUs, NVIDIA GPUs. Just the way I've always rolled. Going to this 8-core and looking at Mantle, I just wanted a beefy AMD card. I can't even use tessellation, so I doubt I'll be going to another AMD card. I'll keep you all updated.


----------



## rdr09

Quote:


> Originally Posted by *JeremyFenn*
> 
> I dunno, tbh I've always gone AMD CPU's NVIDIA GPU's. Just the way I've always rolled. Going to this 8 core and looking at mantle, I just wanted a beefy AMD card. I can't even use Tessellation so doubt I'll be going to another AMD card. I'll keep you all updated.


770 will serve you well. get the 2GB cause that 4GB is a waste. cheaper, too.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130921

or just get a 780.


----------



## JeremyFenn

Quote:


> Originally Posted by *rdr09*
> 
> 770 will serve you well. get the 2GB cause that 4GB is a waste. cheaper, too.


Why would 4GB be a waste?


----------



## rdr09

Quote:


> Originally Posted by *JeremyFenn*
> 
> Why would 4GB be a waste?


that card will run out of grunt before using 2GB, so they say. ask at the 770 club.


----------



## Antoniv

R9 280x Dead? No Display.
Ok, so there's an issue. Yesterday I received an Asus R9 280X DC2T. Today, a few minutes after I started setting up my mining rig with cgminer (for experimenting), my PC crashed (I don't think it's a cgminer problem, because the crash could occur while playing BF4, for example). I was surprised that my screen was white with black lines: https://discussions.apple.com/servlet/JiveServlet/showI... It looked like this, as far as I remember.

After I got over the shock, I decided to see what the real issue is. I reseated all the cables and turned it on again, and my motherboard shows code 62, which means "Installation of the PCH runtime services". I'm like, ***? I turned the PC off again, put my old GTX 560 Ti in the mobo, and everything boots up without any problem. I thought maybe it was a temporary issue, so I swapped the R9 back in, and after I boot up (note that everything works: fans are spinning (not at full speed), I see both green lights, and everything behaves as it should), I still get error 62.

I noticed that as soon as I touch the card, it starts to heat up really quickly and gets hot as hell. I thought maybe the thermal paste wasn't applied that well and I should replace it. There was a screw I had to break open, which looked like this: http://i61.tinypic.com/20f4860.jpg I didn't even think about it and started unscrewing without realizing it might break my warranty (that screw is a warranty seal, right? Hope not).

The thermal compound was applied really badly, smeared all over the chip. I cleaned it up and applied my MK-2, reassembled everything, and powered it up. Nothing happens: still code 62, and the GPU still heats up. So, is it a faulty card? Btw, I've tried reflashing my mobo BIOS etc., but I don't think it's a BIOS problem, as it worked before. The thing is, did I get a faulty, broken card and break my warranty on top of it? Please tell me that screw didn't break my warranty.

My PC main specs: i5 4670K
Mobo: Gigabyte Z87X-UD3H
RAM: Kingston HyperX 2133MHz CL11
PSU: Fortron 650W 80+ Bronze


----------



## DarthBaggins

Yup bad card from what I'm reading.


----------



## Antoniv

Did I break the warranty or not?


----------



## DarthBaggins

It's a possibility, or just go with "it was like that when you got it."


----------



## Antoniv

****..... I don't think it'll work that way lol.


----------



## JeremyFenn

Usually unscrewing the heatsink, applying a better thermal compound, replacing things, etc. will void a warranty. If it's NOT that noticeable, then you can always play dumb and just say you wanted your games to run better because they were slow, and when you got around to putting the card in, it didn't work. You usually can't make geeks think you're NOT a geek if you're speaking their language, so play dumb; that usually gets you pretty far with a tech when RMAing hardware.

I, on the other hand, am selling out back to NVIDIA. I had a replacement-only deal with Newegg on that video card, BUT since I spend obscene amounts of cash in that place, they told me they'd hook me up with a refund so I could get a 770 at the same price. Only bad thing is I can't find the GD CrossFire bridge. How is it I have all that stupid little paperwork and the power cables, etc., but can't for the life of me find the GD CrossFire bridge??!!!







I need a drink...


----------



## Antoniv

Maybe removing that red circle thingy will make it a deal, like it wasn't even there in the first place?


----------



## Antoniv

Sorry for the double post, but I removed that "red circle" and put the screws back together; maybe they won't notice it was even there, because I've watched reviews and some cards came with the circle and some didn't (a Newegg review, for example). I told the local shop that I have issues with the card, but didn't say anything about removing the heatsink and reapplying the thermal paste.

Thing is, if they say they can't exchange the card or return the money because they noticed I changed the thermal paste, I'll reply that there was no sign or information saying "warranty void if removed". There's no information about warranty voiding in the manual either, nor is there info on the Asus site that says you can't change the thermal compound. I'll try to play dumb and say, for example, "on MSI boards there's a sticker that says 'warranty void if removed', but on your card there was none," and I hope that will work out. The local shop has sent the card off to Poland. Hope the techs there won't notice. I hope....


----------



## Marty99

What do you think that "red circle" sticker is for? Yes, for knowing if the user has touched the card.
You have voided the warranty. But of course you can try the RMA and see what happens.


----------



## Antoniv

Really? Where does it say that you can't remove that sticker? I think it's Asus's fault for not mentioning it in the manual. I've already RMA'd the card; I hope I'll get a replacement or a refund...


----------



## JeremyFenn

I hope so too!! I have to say I won't be an official R9 280X owner anymore; I've sent mine off for RMA/refund for an EVGA 770. Can't call me a sellout though, I've always been NVIDIA (GTS 450 TOP edition; the 550 Ti was the latest one). I just thought AMD had some buff muscle for less cash, so why not give them a chance... PLUS Mantle was all hyped up, so yay Mantle. BUT after reading a lot of reviews about the R9s and artifacting issues, and everyone just having a rough time with not just the hardware but the Catalyst drivers (like CFX stuttering, etc.), I think I'm just gonna stick to what I know: AMD for CPUs, NVIDIA for GPUs. I have to say, without tessellation on in Heaven 4.0, everything maxed out, I was getting over 60fps...


----------



## Antoniv

Do an atheist pray with me that I'll get one!


----------



## Devildog83

Quote:


> Originally Posted by *JeremyFenn*
> 
> I hope so too!! I have to say I won't be an official R9 280x owner anymore. I've sent mine off for RMA/Refund for a EVGA 770. Can't call me a sellout though, I've always been NVIDIA (gts450 TOP edition, 550ti was the latest one) I just thought AMD had some buff muscle for less cash, why not give them a chance...PLUS Mantle was all hyped up so yay mantle. BUT reading a lot of reviews about the R9's and artifacting issues and everyone just having a rough time with not just hardware but the catalyst drivers (like CFX stuttering, etc) I think I'm just gonna stick to what I know. AMD for CPU's NVIDIA for GPU's. I have to say, without tessellation on in Heaven 4.0 everything maxed out, I was getting over 60fps...


Sorry you were pushed into leaving. I don't know about the artifacting and X-Fire stutter because I have not experienced these issues with mine but I understand the frustration. Live long and prosper.


----------



## JeremyFenn

Thanks man. Yeah, my wife's tower (the one I had to RMA the ASRock Extreme3 board from) has 2 R7 260Xs in CFX. I notice in some things it stutters like mad, even with a 750W PSU and frame pacing on. I did make sure to put those cards (I have 3 of them) in my tower and test Heaven with tessellation, and all seemed to work, THANK GOD!! But like I said, I've been reading that the R9 series has been having issues due to the cores not accepting the new clocks (since they're basically re-badged 79xxs). Anyway, I'm going to stay in here and keep this thread in my watch list just coz I likes you guys lol


----------



## rdr09

Quote:


> Originally Posted by *JeremyFenn*
> 
> Thanks man. Yeah my wife's tower (which I had to RMA the *ASRock Extreme3* board) has 2 R7 260x's in CFX. I notice in some things it stutters like mad even with a 750w PSU and frame pacing on. I did make sure to put those cards (I have 3 of them) in my tower and test the heaven with tessellation, all seemed to work THANK GOD !!! But like I said, I've been reading where the R9 series have been having issues due to the cores not accepting the new clocks (since they're basically re-badged 79xx's).. Anyway, I'm going to stay in here and keep this thread in my watch list just coz I likes you guys lol


AMD board? If it is . . . it is a bad idea to CrossFire with it, even if the cards are low end. I think it gets worse with low-end cards.


----------



## JeremyFenn

Well its a 990fx AM3+ board, only 4+2 VRM design, but I only had a 6300 (stock) in it and it popped the VRM so.... but anyway yeah could be because the CPU is bottlenecking it, who knows.


----------



## rdr09

Quote:


> Originally Posted by *JeremyFenn*
> 
> Well its a 990fx AM3+ board, only 4+2 VRM design, but I only had a 6300 (stock) in it and it popped the VRM so.... but anyway yeah could be because the CPU is bottlenecking it, who knows.


My bad. That board is a good one, but not sure about the 6300 at stock.


----------



## JeremyFenn

Well I figured for what she does on it, 3.5-4.1 should be fine, and 2 260x's would be more than enough muscle for what game she plays. (dcuo)


----------



## Devildog83

Quote:


> Originally Posted by *JeremyFenn*
> 
> Well I figured for what she does on it, 3.5-4.1 should be fine, and 2 260x's would be more than enough muscle for what game she plays. (dcuo)


Not really that up on ASRock boards, but I don't think a stock 6300 and 2 x 260Xs should pop the VRMs; I could be wrong, though, with only 4+2 phase. I am glad I spent the money to get the best board I could, so I don't have to worry about stuff like that. It was more than $200 but well worth the investment, and I have zero mobo issues.


----------



## JeremyFenn

Well, I had an LGA775 board from them because I had a Q6600 that was just collecting dust, and for cheap I could get it up and running with an ASRock board (I forget right now which one). It also had DDR2/DDR3 slots, so I could upgrade when I wanted. So far it's got 8GB of DDR3 1066 (max for that CPU and board), a 120GB SSD, and 1 R7 260X. My kids play on that and it's good for what they do.

When I built my wife's tower, I just wanted a decent 990FX board with a 6300, since like I said she only plays 1 game and doesn't do much else but surf around, so I wasn't going to spend $$$ on a CPU she wouldn't really use. Plus the turbo on the 6350 is only 100MHz more, and I figured if she was gaming the 6300 would more than likely shoot up to the 4.1. I didn't think about the VRMs popping either, especially on a 95W TDP chip, but it worked just fine for over a month doing what she was doing, and then one afternoon it wouldn't POST. I thought it was a bad PSU, but after exchanging it with the kids' 550W it still wouldn't boot, while hers would boot their tower just fine. After taking out all the components (even the CPU) and leaving only a fan in the CPU1 fan header, it still wouldn't spin up, so I knew it was the board. I should have known after hearing that 'ping' sound like a spring shooting off, but I had to be sure before starting an RMA.

Before that happened, though, I ordered my 990FX Extreme9, because from what I have experienced (so far at least) ASRock seemed like a solid alternative to the ASUS boards for less cash, not to mention that 12+2 VRM setup looked really nice... Too bad it really isn't what it's all hyped up to be. I modified the VRM/NB heatsinks with PK3 TIM and fans; I didn't care about voiding anything or burning it up, since if it did, I wouldn't really want another one, I'd just get something like a Saber R2.

Actually, you know what, on a side note here: ever since I did that modification, I haven't really had a lick of trouble off that board. I've had cooler temps off the VRMs and got the CPU up to 5.16, so I'd say I feel lucky to have that board not give me more issues than what I had before, not to mention the clarification that ASRock's boards have backwards LLC settings (75% ASRock = 25% on ASUS) and their funky RAM compatibility... Imo, though, if you're going to OC a CPU, you might as well OC everything including RAM, and most of the time the XMP won't be stable over what it's meant to be anyway.


----------



## Devildog83

Quote:


> Originally Posted by *JeremyFenn*
> 
> Well I had a LGA775 board from them because I had a Q6600 that was just collecting dust and for cheap could get it up and running with an ASRock board (forget right now what kind). It also had DDR2/DDR3 slots so I could upgrade when I wanted. So far it's got 8GB of DDR3 1066 (max for that CPU and board) 120GB SSD and 1 R7 260x. My kids play on that and it's good for what they do. When I went with my wife's tower, I just wanted a decent 990FX board with a 6300 since like I said she only plays 1 game and doesn't do much else but surf around, so I wasn't going to spend $$$ on a CPU that she wouldn't really use plus the turbo on the 6350 is only 100Mhz more on the turbo end and I figured if she was gaming the 6300 would more than likely shoot up to the 4.1... I didn't think about the VRMs popping either, especially on a 95w TDP chip, but it worked for over a month just fine doing what she was doing and then one afternoon it wouldn't POST. I thought it was a bad PSU, but exchanging it with the kids 550w it still wouldn't boot and hers would boot their tower just fine. After taking out all the components (even the CPU) and only having a fan in the CPU1 fan header, it still wouldn't spin up so I knew it was the board. Should have known after hearing that 'ping' sound like a spring shooting off, but I had to be sure before starting an RMA. Before that happened though I ordered my 990FX Extreme9 because from what I have experienced (so far at least) ASRock seemed like a solid alternative to the ASUS boards for less cash, not to mention that 12+2 VRM setup looked really nice......Too bad it really isn't what it's all hyped up to be. I modified the VRM/NB heat sinks with PK3 TIM and fans, didn't care about voiding anything or burning it up since if it would, I don't really want another one, I'd just get something like a saber r2.
> 
> Actually you know what, on a side note here, ever since I did that modification, I have to say I haven't really had a lick of trouble off that board. I've had cool(er) temps off the VRMs and got the CPU up to 5.16 so, I'd say I feel lucky to have that board not give me more issues than what I had before, not to mention the clarification that ASRock's boards have backwards LLC settings (75% ASRock = 25% on ASUS) and their funky RAM compatibility... Imo though, if you're going to OC a CPU, you might as well OC everything including RAM and most the time the XMP won't be stable over what it's meant to be anyway.


Yep, like I said, I don't know that much about ASRock, except that the AM2+ board I bought for my stepson's rig works fine; it was a bit quirky to work with but has been solid. It does seem strange that it burnt out; it may have nothing to do with the CPU and GPUs, just a bit of bad luck. As far as my CHVFZ purchase, my thinking was just to get the best I could so I wouldn't have to upgrade when I went from the FX-4100 to the 8350, and it has been the best investment in my PC. Plus the colors matched my theme, and I am a bit OCD about the looks of my rigs.


----------



## JeremyFenn

Quote:


> Originally Posted by *Devildog83*
> 
> Yep, like I said I don't know that much about Asrock except the AM2+ board I bought for my stepsons rig works fine, it was a bit querky to work with but has been solid. It does seem strange that it burnt and may have nothing to do with the CPU and GPU's but just a bit of bad luck. As far as my CHVFZ purchase, it was just my thinking to get the best I could get so I wouldn't have to upgrade when I went from the FX4100 to the 8350 and has been the best investment in my PC. Plus the colors matched my theme and I am a bit OCD about the looks of my rigs.


Hey, no I totally hear ya on that!! I have a white/blue scheme on mine, since the extreme 9 is black & gold (black & gold like the song LOL sorry) it fit pretty well & made it look classy. This EVGA card is also black & gold, but doesn't have a blue LED that says Vapor-X on it, which was just friggin' awesome!! I think when I slap it in my mobo though, I'll see where the look:quality ratio will be in the quality/performance side lol


----------



## diggiddi

Quote:


> Originally Posted by *Antoniv*
> 
> Do an atheist pray with me that I'll get one!


Huh what, atheist pray?? to whom or what???


----------



## DarthBaggins

Quote:


> Originally Posted by *diggiddi*
> 
> Huh what, atheist pray?? to whom or what???


The Flying Spaghetti Monster lol.


----------



## JeremyFenn

I think Homer Simpson said it best

"SAVE ME JEBUS!!"










can -o- spam


----------



## Devildog83

Quote:


> Originally Posted by *JeremyFenn*
> 
> Hey, no I totally hear ya on that!! I have a white/blue scheme on mine, since the extreme 9 is black & gold (black & gold like the song LOL sorry) it fit pretty well & made it look classy. This EVGA card is also black & gold, but doesn't have a blue LED that says Vapor-X on it, which was just friggin' awesome!! I think when I slap it in my mobo though, I'll see where the look:quality ratio will be in the quality/performance side lol


If you had some blue LED's could you mod the GPU housing to light up EVGA?


----------



## JeremyFenn

Not sure? I don't think the side has the EVGA as a diffuser or cut-out, and I certainly don't want to do TOO much to it (in case I want to RMA it if it's defective). Like I said before my Extreme9 board is black and gold, black and gold, black and gold.....sorry.. LOL so this card matches the scheme pretty well, just not as awesomesauce as the blue VAPOR-X Led ya know. I can deal with some...well you know...black and gold..


----------



## Devildog83

Quote:


> Originally Posted by *JeremyFenn*
> 
> Not sure? I don't think the side has the EVGA as a diffuser or cut-out, and I certainly don't want to do TOO much to it (in case I want to RMA it if it's defective). Like I said before my Extreme9 board is black and gold, black and gold, black and gold.....sorry.. LOL so this card matches the scheme pretty well, just not as awesomesauce as the blue VAPOR-X Led ya know. I can deal with some...well you know...black and gold..


Black and gold is actually going to be the color scheme for my next build. I may try Intel just for kicks. I was trying to figure out how I could replace the Devil logos on these, have new plates cut with Devildog83 in them, and have RED LEDs inside the cards, like this.


----------



## JeremyFenn

If they're plastic and you can take them off without damaging the cooling guts, just use a dremel tool to cut out the logo that's there and maybe try something behind it to diffuse the LED?


----------



## Devildog83

Quote:


> Originally Posted by *JeremyFenn*
> 
> If they're plastic and you can take them off without damaging the cooling guts, just use a dremel tool to cut out the logo that's there and maybe try something behind it to diffuse the LED?


They are metal. I was thinking of having somebody make it for me at a metal shop. And then I have some translucent white plexi I could use to diffuse them.


----------



## BruceB

Quote:


> Originally Posted by *Devildog83*
> 
> They are metal. I was thinking of having somebody make it for me at a metal shop. And then I have some translucent white plexi I could use to diffuse them.


Doing this will probably void your warranty anyway, so why not take the coolers off? It'd make it a damn sight easier, and you could change the TIM too!


----------



## Devildog83

Quote:


> Originally Posted by *BruceB*
> 
> Doing this will probably void your warranty anyway, so why not take the coolers off? It'd make it a damn sight easier, and you could change the TIM too!


I already changed the TIM and it lowered my temps a lot. The warranty is already void: one, because it was a refurb (a review sample with only a 90-day warranty), and the other because I replaced the TIM. I love the cards and have no reason to return them anyhow. They work great. I did RMA the 270x because the X-Fire connector was bad, but I won't need to again.

The plates just screw off, so if I had some others made it would not void a warranty anyhow, as I could just put them back. At this point, if the cards get fried then they get fried; I will just buy new ones.


----------



## NoobOCerz

Hi

is it safe to max the power limit when overclocking?


----------



## Tobe404

Quote:


> Originally Posted by *NoobOCerz*
> 
> Hi
> 
> is it safe to max the power limit when overclocking?


Yep... It's fine.

Usually if an OC is unstable it either needs more voltage and/or a bump in the power limit.


----------



## Roboyto

Quote:


> Originally Posted by *NoobOCerz*
> 
> Hi
> 
> is it safe to max the power limit when overclocking?


If you have a good PSU to supply the additional wattage, and good cooling/airflow so nothing overheats.


----------



## TheN00bBuilder

Yep. I'm that good. It is the OC edition, so I'd think these clocks are normal.


----------



## TheN00bBuilder

Bit of an update; I think I might have a golden chip. 1200 on core clock and 1620 on memory. Only an increase of 1C with fans at 25%.


----------



## End3R

Quote:


> Originally Posted by *TheN00bBuilder*
> 
> Bit of an update; I think I might have a golden chip. 1200 on core clock and 1620 on memory. Only an increase of 1C with fans at 25%.


You aren't going to see an increase when they're idle.


----------



## TheN00bBuilder

Okay, so what do I do to check temps?


----------



## End3R

Quote:


> Originally Posted by *TheN00bBuilder*
> 
> Okay, so what do I do to check temps?


Play a graphically intense game or a benchmark. Baby steps though; don't burn out your card.


----------



## TheN00bBuilder

Yeah. I ran TF2 for 10 minutes, no problem. Then I got to L4D2 and instant BSOD. Ah well. It went up pretty high.


----------



## fabiovtec

Quote:


> Originally Posted by *TheN00bBuilder*
> 
> 
> 
> Yep. I'm that good. It is the OC edition, so I'd think these clocks are normal.


Your gpu?


----------



## TheN00bBuilder

Yep.


----------



## Tobe404

I reckon you could lower the voltage on your card at those clocks and still be stable.

Don't have to. Would lower temps and power usage. Just an idea.
(Edit: Just scrolled up and noticed you had Bsod in L4D2. My bad.)

Does your card allow voltage adjustment out of the box or is it locked?


----------



## TheN00bBuilder

It seems to be locked.


----------



## fabiovtec

Quote:


> Originally Posted by *TheN00bBuilder*
> 
> Yep.


What card is it?


----------



## HBizzle

I recently built a new comp with a Powercolor R9 280X TD OC edition. Before building the comp I did not realize this card had a recommended PSU of 750W, and instead I purchased a 600W PSU. Will I see much of a difference if I upgrade to a 750W PSU?


----------



## Devildog83

Quote:


> Originally Posted by *HBizzle*
> 
> I recently built a new comp with a Powercolor R9 280X TD OC edition. Before building the comp I did not realize this card had a recommended PSU of 750W, and instead I purchased a 600W PSU. Will I see much of a difference if I upgrade to a 750W PSU?


What else is in your system?


----------



## HBizzle

Quote:


> Originally Posted by *Devildog83*
> 
> What else is in your system?


i5-4570
ASUS Z87M-Plus Mobo
120GB SSD
1TB WD Blue
DVD Burner
8GB Ram


----------



## BruceB

Quote:


> Originally Posted by *HBizzle*
> 
> I recently built a new comp with a Powercolor R9 280X TD OC edition. Before building the comp I did not realize this card had a recommended PSU of 750W, and instead I purchased a 600W PSU. Will I see much of a difference if I upgrade to a 750W PSU?


I've got that card with a 660W PSU, it runs fine. If it runs at all you won't see any difference. If you're getting power cuts or the card is crashing under load then you need more power.

IIRC this card only draws 250W ('only', lol) so as long as you've got your system calculated at under your 600W you'll be fine*.

*Increasing the voltage takes way more power than the default voltages. If you plan to overvolt you should calculate extra capacity into your system power requirements.










[EDIT]
Using this site: PSU Calculator (click on 'lite version' at the bottom and you can use it for free over the web)
I calc'd your power requirements to be ~415W at 90% load (*not OC'd*).
*I don't know exactly what's in your system* though; I suggest you go to that site and input all the correct data to calculate your own power requirements.
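The same back-of-envelope math the calculator does can be sketched in a few lines. Every wattage below is a rough TDP-style guess on my part (not the calculator's exact figures, and not measurements), so treat the output as ballpark only:

```python
# Ballpark PSU sizing, same idea as the calculator linked above.
# All wattages are rough TDP-style guesses, not measurements.
components = {
    "R9 280X (board power)": 250,
    "i5-4570 (TDP)": 84,
    "motherboard + RAM": 50,
    "SSD + HDD + DVD": 25,
    "fans + misc": 20,
}

total_draw = sum(components.values())   # worst-case simultaneous draw
recommended = total_draw / 0.9          # keep the PSU at or below ~90% load

print(f"Estimated draw: {total_draw} W")    # 429 W with these guesses
print(f"Minimum PSU: {recommended:.0f} W")  # ~477 W
```

That lands in the same neighborhood as the calculator's ~415W figure; the exact number depends entirely on what you plug in for each component.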


----------



## Devildog83

Quote:


> Originally Posted by *HBizzle*
> 
> i5-4570
> ASUS Z87M-Plus Mobo
> 120GB SSD
> 1TB WD Blue
> DVD Burner
> 8GB Ram


Yep, you should be fine. I have a 270x and a 7870 in X-fire plus an FX 8350 @ 4.8 and I run a 660W platinum. It does however depend on the PSU. Which one do you have?


----------



## HBizzle

Quote:


> Originally Posted by *HBizzle*
> 
> i5-4570
> ASUS Z87M-Plus Mobo
> 120GB SSD
> 1TB WD Blue
> DVD Burner
> 8GB Ram


Quote:


> Originally Posted by *Devildog83*
> 
> Yep, you should be fine. I have a 270x and a 7870 in X-fire plus an FX 8350 @ 4.8 and I run a 660W platinum. It does however depend on the PSU. Which one do you have?


A Corsair CX600M.


----------



## HBizzle

Quote:


> Originally Posted by *BruceB*
> 
> I've got that card with a 660W PSU, it runs fine. If it runs at all you won't see any difference. If you're getting power cuts or the card is crashing under load then you need more power.
> 
> IIRC this card only draws 250W ('only', lol) so as long as you've got your system calculated at under your 600W you'll be fine*.
> 
> *Increasing the voltage takes way more power than the default voltages. If you plan to overvolt you should calculate extra capacity into your system power requirements.
> 
> [EDIT]
> Using this site: PSU Calculator (click on 'lite version' at the bottom and you can use it for free over the web)
> I calc'd your power requirements to be ~415W at 90% load (*not OC'd*).
> *I don't know exactly what's in your system* though; I suggest you go to that site and input all the correct data to calculate your own power requirements.


That's a cool tool. Thanks for linking to it.


----------



## roudabout6

For Eyefinity at 5760x1080: do I get another Sapphire 280X Vapor-X next week, or sell my 280X, get one 290, and crossfire much later? I really only want enough performance to run games in Eyefinity at high settings at 60fps.


----------



## Roboyto

Quote:


> Originally Posted by *roudabout6*
> 
> For Eyefinity at 5760x1080: do I get another Sapphire 280X Vapor-X next week, or sell my 280X, get one 290, and crossfire much later? I really only want enough performance to run games in Eyefinity at high settings at 60fps.


Whether you get 60fps at 5760x1080 with a 290 will depend on the game. I'm running that setup and am very happy with the results.

But you should be aware that you will need a card that overclocks fairly well to get respectable frame rates at that resolution with one card; stock clocks won't cut it. My gaming clocks are 1200/1500 on my 290.

In a less demanding game like Borderlands 2, I can run all settings maxed and frames stay pegged at 72 even with a full game battling The Warrior.

In a more demanding game like FFXIV, I have all settings maxed except for AA (4x), and frames never drop under 40 no matter what I'm doing. FPS is usually in the high 40s to low 50s.

I have played a few other games like BioShock Infinite, Crysis 3, Tomb Raider and Far Cry 3, but I can't tell you exactly where the frames were during those games. I can tell you that they looked glorious and played smooth as butter.

You can likely get 60fps in many games at 5760*1080 with a single 290, you will just have to adjust settings per game accordingly.

If you need to know most anything about the 290(X) cards you can check my post here: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


----------



## diggiddi

Quote:


> Originally Posted by *Antoniv*
> 
> Do an atheist pray with me that I'll get one!


Quote:


> Originally Posted by *roudabout6*
> 
> For Eyefinity at 5760x1080: do I get another Sapphire 280X Vapor-X next week, or sell my 280X, get one 290, and crossfire much later? I really only want enough performance to run games in Eyefinity at high settings at 60fps.


280X in crossfire is at least 15% more powerful than a 290


----------



## Tobe404

A 280X is around 25% slower than a 290/X (from what I've seen/read; give or take, it varies depending on game/bench).
So if you crossfired two of them, you'd likely get 50 to 75% better performance than a 290/X.
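The rough arithmetic behind that estimate, sketched out (the 0.75 relative-speed figure is the assumption from reviews above, and the "+50%" case assumes perfect 2x scaling, which you rarely get):

```python
# Rough arithmetic behind the crossfire-vs-290 estimate.
# relative_280x: assumed speed of one 280X relative to a single 290 (~25% slower).
relative_280x = 0.75

def crossfire_estimate(single_card, scaling):
    """Two cards: the first at full speed, the second scaled by `scaling`."""
    return single_card * (1 + scaling)

ideal = crossfire_estimate(relative_280x, 1.0)     # perfect 100% scaling
typical = crossfire_estimate(relative_280x, 0.85)  # more realistic ~85% scaling

print(f"Ideal:   {ideal:.2f}x a 290 (+{(ideal - 1) * 100:.0f}%)")      # 1.50x, +50%
print(f"Typical: {typical:.2f}x a 290 (+{(typical - 1) * 100:.0f}%)")  # 1.39x, +39%
```

So the upper end of the 50-75% range really only happens in titles with near-perfect crossfire scaling.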


----------



## diggiddi

Quote:


> Originally Posted by *Tobe404*
> 
> A 280X is around 25% slower than a 290/X (from what I've seen/read; give or take, it varies depending on game/bench).
> So if you crossfired two of them, you'd likely get 50 to 75% better performance than a 290/X.


It varies widely from game to game and resolution to resolution, from 0% (see Hitman: Absolution) on upwards: http://anandtech.com/bench/product/1058?vs=1068


----------



## BruceB

While we're on the subject of 280Xs in Xfire: how much power is needed to run 280Xs in Xfire?


----------



## diggiddi

750 watts should be enough


----------



## Tobe404

Quote:


> Originally Posted by *BruceB*
> 
> While we're on the subject of 280Xs in Xfire: how much power is needed to run 280Xs in Xfire?


http://www.overclock.net/t/1477611/antec-hcg-620-watt-enough-for-r9-280x-crossfire


----------



## Tobe404

Quote:


> Originally Posted by *Tobe404*
> 
> http://www.overclock.net/t/1477611/antec-hcg-620-watt-enough-for-r9-280x-crossfire


I should be able to power two 280x's on my rig (no OC on GPUs or CPU though)

Edit: Didn't mean to double post. Apologies.


----------



## Unknownm

Quote:


> Originally Posted by *BruceB*
> 
> While we're on the subject of 280Xs in Xfire: how much power is needed to run 280Xs in Xfire?


I ran 2x 280X with a 750W PSU; the CPU was overclocked, with CPU watercooling and 4x 80mm fans, and it ran fine. I couldn't overclock the GPUs due to the heat, but I can't imagine getting much further with a 750W anyway.

However, each card is rated at 250W max, and in real life the GPUs are not 100% used in games in a CF setup, so figure maybe around 400W average.


----------



## BruceB

Quote:


> Originally Posted by *Tobe404*
> 
> http://www.overclock.net/t/1477611/antec-hcg-620-watt-enough-for-r9-280x-crossfire


Quote:


> Originally Posted by *Unknownm*
> 
> I ran 2x 280X with a 750W PSU; the CPU was overclocked, with CPU watercooling and 4x 80mm fans, and it ran fine. I couldn't overclock the GPUs due to the heat, but I can't imagine getting much further with a 750W anyway.
> 
> However, each card is rated at 250W max, and in real life the GPUs are not 100% used in games in a CF setup, so figure maybe around 400W average.


Thanks guys! I just wanted to double check what I read online before I blew so much money on a second 280X!


----------



## Devildog83

Quote:


> Originally Posted by *BruceB*
> 
> Thanks guys! I just wanted to double check what I read online before I blew so much money on a second 280X!


The 280X in X-fire should be far superior to a 290. My 270x and 7870 will outperform most 290's. It should easily outperform a 290x also.

If you had a high quality, high efficiency PSU, 750W would be OK, but I would trade up for an 850W gold or even better platinum. Search for "shilka" on the forum, he's the PSU guru.


----------



## shilka

Quote:


> Originally Posted by *Devildog83*
> 
> The 280X in X-fire should be far superior to a 290. My 270x and 7870 will outperform most 290's. It should easily outperform a 290x also.
> 
> If you had a high quality, high efficiency PSU, 750W would be OK, but I would trade up for an 850W gold or even better platinum. Search for "shilka" on the forum, he's the PSU guru.


You called?


----------



## TheN00bBuilder

I swear, shilka's everywhere at once!!


----------



## DarthBaggins

Quote:


> Originally Posted by *Devildog83*
> 
> The 280X in X-fire should be far superior to a 290. My 270x and 7870 will outperform most 290's. It should easily outperform a 290x also.
> 
> If you had a high quality, high efficiency PSU, 750W would be OK, but I would trade up for an 850W gold or even better platinum. Search for "shilka" on the forum, he's the PSU guru.


Yup, I know my 7870 & 270x Xfired perform beautifully.


----------



## MoraisGT

Currently I have a 280X from VTX3D, and despite what people may think about the brand it's actually a good card, very quiet and with reasonable temperatures.

But I only had it on air for about a week, while I was waiting for some watercooling supplies to arrive.

And it's a good overclocker as well. It comes out of the factory with 1000MHz on the core and 1500MHz on the memory.
I can play BF4 100% stable at 1265/1800MHz. For benchmarks I was able to get it up to 1300/1880MHz (those clocks only for Catzilla).

Here's a pic of the system, hope you guys like it


----------



## F3ERS 2 ASH3S

Woot

I can get full waterblocks for my cards

http://www.coolingconfigurator.com/waterblock/3831109856697#DB_inline?height=260&width=530&inline_id=comp_table

XFX 280x = 7970 GHz edition design


----------



## mikemykeMB

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Woot
> 
> I can get full waterblocks for my cards
> 
> http://www.coolingconfigurator.com/waterblock/3831109856697#DB_inline?height=260&width=530&inline_id=comp_table
> 
> XFX 280x = 7970 GHz edition design


That's awesome. I have been contemplating an upgrade from their 270x, and now this pushes it a bit further.


----------



## shilka

Quote:


> Originally Posted by *MoraisGT*
> 
> Currently I have a 280X from VTX3D, and despite what people may think about the brand it's actually a good card, very quiet and with reasonable temperatures.
> 
> But I only had it on air for about a week, while I was waiting for some watercooling supplies to arrive.
> 
> And it's a good overclocker as well. It comes out of the factory with 1000MHz on the core and 1500MHz on the memory.
> I can play BF4 100% stable at 1265/1800MHz. For benchmarks I was able to get it up to 1300/1880MHz (those clocks only for Catzilla).
> 
> Here's a pic of the system, hope you guys like it


What PSU is that? I can't see a name or brand or anything, really.


----------



## MoraisGT

Quote:


> Originally Posted by *shilka*
> 
> What PSU is that? I can't see a name or brand or anything, really.


I'm afraid to tell you based on your status

It's a Nox Hummer 700W, which is the first thing I'm going to change when I get the $'s

I always drool when I see a SuperFlower PSU (the white ones) but unfortunately they are way out of my reach


----------



## shilka

Quote:


> Originally Posted by *MoraisGT*
> 
> I'm afraid to tell you based on your status
> 
> It's a Nox Hummer 700W, which is the first thing I'm going to change when I get the $'s
> 
> I always drool when I see a SuperFlower PSU (the white ones) but unfortunately they are way out of my reach


Can't find much info on it, so yeah, replace it when you can


----------



## lolwatpear

Can someone recommend me either the Dual-X or Vapor-X R9 270X? Price difference doesn't matter. I can't find any comparisons between these cards. I know the Vapor-X is supposed to be a step above, but I've still seen some people suggest the Dual-X (for different cards such as the 7950).


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> The 280x in X-fire should be far superior than a 290. My 279x and 7870 will outperform most 290's. It should easily out perform a 290x also.
> 
> If you had a high quality, high efficiency PSU, 750w would be OK but I would trade up for an 850W gold or even better platinum. Search for "Shilka" on the forum, he's the PSU Guru.


7870 Crossfire and an R9 290 both have 2560 shaders, 64 ROPs and 160 TMUs. The performance should be right about the same, theoretically. The 290 has many factors in its advantage, though. It has 4GB of memory on a 512-bit bus and almost double (if not double) the memory bandwidth. Memory does not double in Crossfire, nor does the bandwidth. 7870s have to deal with Crossfire scaling and the fact that not all games support Crossfire. Crossfire scales well, so you should get an extra 80%-90% of a single card's performance from the second card, and of course this has diminishing returns the more cards you add. Finally, a 290 is a single-card solution compared to a multi-card solution, so you need less room in your case, fewer PCI-E power connectors, and it will probably draw less power at similar clocks.

When you factor in Crossfire scaling and the slower memory performance of the 7870, the 290 comes out on top in most scenarios, especially overclocked. Crossfire scaling limits the performance of dual 7870s, and when you overclock the 290's RAM to 400GB/sec this gives it an edge. In practical use, I will admit that a 290 and 7870 Crossfire bench quite close, but in the overwhelming majority of cases a single 290 is a *little* bit faster due to the reasons I mentioned.

7870 CF does not "easily" outperform a 290X. You would probably have to overclock BOTH 7870s to 1250MHz+ just to keep up with a 290X at 1100MHz. A 290X installed properly with correctly installed drivers and optimizations should outperform 7870 Crossfire pretty easily, at lower clocks too.
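To put some rough numbers on the "right about the same, theoretically" claim, here's a quick sketch. The shader counts and clocks are reference-card figures as I recall them, and the 85% scaling factor is an assumption, so treat the output as ballpark, not benchmarks:

```python
# Ballpark theoretical shader throughput, using assumed reference-card specs.
def gflops(shaders, clock_ghz):
    # each shader can do one fused multiply-add (2 FLOPs) per clock
    return shaders * 2 * clock_ghz

hd7870 = gflops(1280, 1.000)   # reference 7870: 2560 GFLOPS
r9_290 = gflops(2560, 0.947)   # reference 290: ~4849 GFLOPS

cf_scaling = 0.85              # assumed second-card scaling
hd7870_cf = hd7870 * (1 + cf_scaling)

print(f"7870 CF (~85% scaling): {hd7870_cf:.0f} GFLOPS")  # 4736
print(f"R9 290:                 {r9_290:.0f} GFLOPS")     # 4849
```

The raw shader math lands close, which is exactly why the memory bandwidth and scaling caveats above end up deciding it in the 290's favor.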


----------



## BruceB

Quote:


> Originally Posted by *MoraisGT*
> 
> Currently I have a 280X from VTX3D, and despite what people may think about the brand it's actually a good card, very quiet and with reasonable temperatures.
> 
> But I only had it on air for about a week, while I as waiting for some watercooling supplies to arrive.
> 
> And it's a good overclocker aswell. It comes out of the factory with 1000MHz on the core and 1500MHz on the memory.
> I can play BF4 100% stable with 1265/1800MHz. For benchmarks I was able to get it up to 1300/1880MHz (these clocks only for catzilla).
> 
> Here's a pic of the system, hope you guys like it


Looks good! Have you run a FutureMark benchmark with that card yet? If so, who is listed as manufacturer?


----------



## MoraisGT

Thanks!

I ran 3DMark11 recently and it said the vendor was VisionTek


----------



## BruceB

Quote:


> Originally Posted by *MoraisGT*
> 
> Thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I ran 3DMark11 recently and it said the vendor was VisionTek


Thanks, I always think it's interesting to see who actually made the card. Have you got a build log for your PC?


----------



## MoraisGT

Not on this forum I don't.

I have a build log-ish on a forum from my country (Portugal); there are little updates throughout the topic since before I got into WC.

You can check it out if you want, but it's all in Portuguese

http://www.portugal-tech.pt/showthread.php?t=724


----------



## diggiddi

Quote:


> Originally Posted by *neurotix*
> 
> 7870 Crossfire and an R9 290 both have *2560 shaders, 64 ROPs and 160 TMUs.* The performance should be right about the same, theoretically. The 290 has many factors in its advantage, though. It has 4GB of memory on a 512-bit bus and almost double (if not double) the memory bandwidth. Memory does not double in Crossfire, nor does the bandwidth. 7870s have to deal with Crossfire scaling and the fact that not all games support Crossfire. Crossfire scales well, so you should get an extra 80%-90% of a single card's performance from the second card, and of course this has diminishing returns the more cards you add. Finally, a 290 is a single-card solution compared to a multi-card solution, so you need less room in your case, fewer PCI-E power connectors, and it will probably draw less power at similar clocks.
> 
> When you factor in Crossfire scaling and the slower memory performance of the 7870, the 290 comes out on top in most scenarios, especially overclocked. Crossfire scaling limits the performance of dual 7870s, and when you overclock the 290's RAM to 400GB/sec this gives it an edge. In practical use, I will admit that a 290 and 7870 Crossfire bench quite close, but in the overwhelming majority of cases a single 290 is a *little* bit faster due to the reasons I mentioned.
> 
> 7870 CF does not "easily" outperform a 290X. You would probably have to overclock BOTH 7870s to 1250MHz+ just to keep up with a 290X at 1100MHz. A 290X installed properly with correctly installed drivers and optimizations should outperform 7870 Crossfire pretty easily, at lower clocks too.


I'm assuming shaders are the most important in determining performance, followed by TMUs, then ROPs, and lastly memory bus, right?


----------



## neurotix

Quote:


> Originally Posted by *diggiddi*
> 
> I'm assuming shaders are the most important in determining performance, followed by TMUs, then ROPs, and lastly memory bus, right?


Shaders are probably most important, then ROPs (render output units, or raster operator units), then TMUs (texture mapping units), then memory bus. Someone else correct me if I'm wrong; I'm familiar with the terminology and the general idea of these things, but I'm not a computer engineer and I don't work for AnandTech. Someone else like mcocod would know more.

Where this becomes important is at resolutions higher than 1080p. ROPs affect your pixel fillrate, which is how fast your card can update pixels, and TMUs affect your texture fillrate, which is how fast your card can load and manipulate textures. Cards with a higher ROP count perform better at high resolutions. For example, a 290X has 64 ROPs and 176 TMUs. A 780 Ti has 48 ROPs and 224 TMUs (but it also has more, and slightly faster, shaders).

This is why (in general) at very high resolutions the 290X outperforms a 780 Ti in a lot of games and benchmarks. Look up some of the 4K benches done vs the 780 Ti and you'll see what I mean. At lower resolutions, and especially 1080p, the 780 Ti is actually a more powerful card and benchmarks reflect this, but when run at 4K the 290X pulls ahead. We also have to consider that the 780 Ti overclocks much higher in general, and this is why it can beat the 290X in benchmarking. At the same clocks, the 290X is probably faster. Even so, the cards are probably within 5% of each other at high resolutions, although the 290X usually costs much less.
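The fillrate arithmetic here is just unit count times clock. A quick sketch using the ROP/TMU counts above (the clocks are assumed reference values, so the numbers are approximate):

```python
# Fillrate = unit count * clock. Clocks below are assumed reference values.
def pixel_fillrate(rops, clock_ghz):
    """Gpixels/s the ROPs can write."""
    return rops * clock_ghz

def texture_fillrate(tmus, clock_ghz):
    """Gtexels/s the TMUs can sample."""
    return tmus * clock_ghz

# R9 290X: 64 ROPs / 176 TMUs at up to 1.0 GHz
# GTX 780 Ti: 48 ROPs / 224 TMUs at ~0.928 GHz boost
cards = [("290X", 64, 176, 1.000), ("780 Ti", 48, 224, 0.928)]
for name, rops, tmus, clk in cards:
    print(f"{name}: {pixel_fillrate(rops, clk):.1f} Gpix/s, "
          f"{texture_fillrate(tmus, clk):.1f} Gtex/s")
```

The 290X's pixel fillrate advantage and the 780 Ti's texture fillrate advantage fall straight out of those unit counts, which matches the high-resolution behavior described above.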


----------



## diggiddi

Quote:


> Originally Posted by *neurotix*
> 
> Shaders are probably most important, then ROPs (render output units, or raster operator units), then TMUs (texture mapping units), then memory bus. Someone else correct me if I'm wrong; I'm familiar with the terminology and the general idea of these things, but I'm not a computer engineer and I don't work for AnandTech. Someone else like mcocod would know more. Where this becomes important is at resolutions higher than 1080p. ROPs affect your pixel fillrate, which is how fast your card can update pixels, and TMUs affect your texture fillrate, which is how fast your card can load and manipulate textures. Cards with a higher ROP count perform better at high resolutions. For example, a 290X has 64 ROPs and 176 TMUs. A 780 Ti has 48 ROPs and 224 TMUs (but it also has more, and slightly faster, shaders). This is why (in general) at very high resolutions the 290X outperforms a 780 Ti in a lot of games and benchmarks. Look up some of the 4K benches done vs the 780 Ti and you'll see what I mean. At lower resolutions, and especially 1080p, the 780 Ti is actually a more powerful card and benchmarks reflect this, but when run at 4K the 290X pulls ahead. We also have to consider that the 780 Ti overclocks much higher in general, and this is why it can beat the 290X in benchmarking. At the same clocks, the 290X is probably faster. Even so, the cards are probably within 5% of each other at high resolutions, although the 290X usually costs much less.


Cool +rep


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> The 280X in X-fire should be far superior to a 290. My 270x and 7870 will outperform most 290's. It should easily outperform a 290x also.
> 
> If you had a high quality, high efficiency PSU, 750W would be OK, but I would trade up for an 850W gold or even better platinum. Search for "shilka" on the forum, he's the PSU guru.


Put your Xfire up against my 290 in a few benches. My 4770k will smoke the 8350 in physics, so we can compare graphics scores.

Quote:


> Originally Posted by *neurotix*
> 
> 7870 Crossfire and an R9 290 both have 2560 shaders, 64 ROPs and 160 TMUs. The performance should be right about the same, theoretically. The 290 has many factors in its advantage, though. It has 4GB of memory on a 512-bit bus and almost double (if not double) the memory bandwidth. Memory does not double in Crossfire, nor does the bandwidth. 7870s have to deal with Crossfire scaling and the fact that not all games support Crossfire. Crossfire scales well, so you should get an extra 80%-90% of a single card's performance from the second card, and of course this has diminishing returns the more cards you add. Finally, a 290 is a single-card solution compared to a multi-card solution, so you need less room in your case, fewer PCI-E power connectors, and it will probably draw less power at similar clocks.
> 
> When you factor in Crossfire scaling and the slower memory performance of the 7870, the 290 comes out on top in most scenarios, especially overclocked. Crossfire scaling limits the performance of dual 7870s, and when you overclock the 290's RAM to 400GB/sec this gives it an edge. In practical use, I will admit that a 290 and 7870 Crossfire bench quite close, but in the overwhelming majority of cases a single 290 is a *little* bit faster due to the reasons I mentioned.
> 
> 7870 CF does not "easily" outperform a 290X. You would probably have to overclock BOTH 7870s to 1250MHz+ just to keep up with a 290X at 1100MHz. A 290X installed properly with correctly installed drivers and optimizations should outperform 7870 Crossfire pretty easily, at lower clocks too.


Yup.


----------



## mikemykeMB

OOPsss..some one got bit N chewey....SPIT


----------



## mikemykeMB

Quote:


> Originally Posted by *mikemykeMB*
> 
> OOPsss..some one got bit N chewey....SPIT


And I did 'Q' that..never pit a score against something that will totally smash marks


----------



## bardacuda

Quote:


> Originally Posted by *TheN00bBuilder*
> 
> Bit of an update; I think I might have a golden chip. 1200 on core clock and 1620 on memory. Only an increase of 1C with fans at 25%.


Quote:


> Originally Posted by *TheN00bBuilder*
> 
> Yeah. I ran TF2 for 10 minutes, no problem. Then I got to L4D2 and instant BSOD. Ah well. It went up pretty high.


Quote:


> Originally Posted by *Tobe404*
> 
> I reckon you could lower the voltage on your card at those clocks and still be stable.
> 
> Don't have to. Would lower temps and power usage. Just an idea.
> (Edit: Just scrolled up and noticed you had Bsod in L4D2. My bad.)
> 
> Does your card allow voltage adjustment out of the box or is it locked?


Quote:


> Originally Posted by *TheN00bBuilder*
> 
> It seems to be locked.


Have you tried adding voltage by editing the VBIOS? The voltage on my 270s can't be changed in software, but on the ASUS I am able to adjust the voltage with a VBIOS edit.


----------



## Dasboogieman

Even though I have a 290 Tri-X, this information may apply to 270X and 280X Tri-X users.

I finally fixed the second of the major Sapphire Tri-X design flaws.
(The first being the lack of an integrated backplate to cool the VRMs, and the use of rare M1.5 screws, which preclude the use of aftermarket backplates with the Tri-X cooler.)

I don't know if anybody with a Tri-X has noticed, but at specific fan speeds (mine were at 47%, 50%, 55% and 60%) you get this rhythmic rattling sound. Mine was more of a pulsating buzz. The noise goes away completely at higher or lower fan speeds of >70% or <30%. The noise can also come on when the fan quickly ramps speed (e.g. steep fan curves).
Now normally this buzz wouldn't be a problem if the fans were loud, but it is extremely prominent due to the sheer silence of the Tri-X design.

I first thought the issue was caused by the VRMs, so I opened up the cooler and placed a small square of Sekisui 5760 on each choke. The coil buzz definitely went away but the rattling remained. I next thought it might be due to shoddy fans, so I opened an RMA with Sapphire. While waiting for the RMA ticket, I happened upon this YouTube video:
http://www.youtube.com/watch?v=EomL1oysVXs

So I got some rubber garden gloves from Bunnings, cut them up into rectangles, and placed them so that the fan frame would no longer contact the heatsink. Not only has the rattling gone away, the fan transitions are now buttery smooth and the whole assembly even sounds slightly less loud at full speed.

Basically, this issue is a design flaw: there is a lack of padding between the fan assembly and the heatsink assembly, leading to resonance of the metal.

Thought it would be worth posting for anyone with a Tri-X; try this mod and the VRM mod before RMAing the card.


----------



## HBizzle

What can I do to get better performance out of my PowerColor 280X OC edition? Any small tweaks to make it run better? Don't want to void the warranty or risk breaking it. Thanks.


----------



## BruceB

Quote:


> Originally Posted by *HBizzle*
> 
> What can I do to get better performance out of my PowerColor 280X OC edition? Any small tweaks to make it run better? Don't want to void the warranty or risk breaking it. Thanks.


The only tweaks I know of (new TIM or an aftermarket cooler) are only going to help you OC further. I don't think there are any tweaks you can do to the card per se that will help.

You can however mess around in CCC and do things like turning off tessellation (+8% in _Fire Strike_ for me). Play with the options there and see what helps


----------



## HBizzle

Quote:


> Originally Posted by *BruceB*
> 
> The only tweaks I know of (new TIM or an aftermarket cooler) are only going to help you OC further. I don't think there are any tweaks you can do to the card per se that will help.
> 
> You can however mess around in CCC and do things like turning off tessellation (+8% in _Fire Strike_ for me). Play with the options there and see what helps


So it is not advisable to turn it up any higher than the stock loadout?


----------



## diggiddi

Quote:


> Originally Posted by *HBizzle*
> 
> What can I do to get better performance out of my PowerColor 280X OC edition? Any small tweaks to make it run better? Don't want to void the warranty or risk breaking it. Thanks.


Do you need overclocking help? http://www.overclock.net/t/633816/how-to-overclock-your-amd-ati-gpu


----------



## BruceB

Quote:


> Originally Posted by *HBizzle*
> 
> So it is not advisable to turn it up any higher than the stock loadout?


If you mean overclocking, that will void your warranty and you do risk breaking it. But yes, OCing is by far the best way to get more out of it


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> Put your Xfire up against my 290 in a few benches. My 4770k will smoke the 8350 in physics, so we can compare graphics scores.
> 
> Yup.


Here is a 3DMark11 and Firestrike run from a while back. I will run some new tests soon.




By the way, now that my CPU is under water I should be able to push way past the 5.0 GHz threshold. It will help some, I am sure.


----------



## DarthBaggins

Quote:


> Originally Posted by *Devildog83*
> 
> Here is a 3DMark11 and Firestrike from a while back. I will run some new tests soon.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> By the way, now that my CPU is under water I should be able to push way past the 5.0 GHz threshold. It will help some, I am sure.


But does it fold? lol You might see a bump in your scores.


----------



## Devildog83

Quote:


> Originally Posted by *DarthBaggins*
> 
> But does it fold? lol You might see a bump in your scores.


No but it serves milk with a PB&J.


----------



## diggiddi

Quote:


> Originally Posted by *Devildog83*
> 
> Here is a 3DMark11 and Firestrike from a while back. I will run some new tests soon.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> By the way, *now that my CPU is under water* I should be able to push way past the 5.0 GHz threshold. It will help some, I am sure.
> 
> *That, my friend, is not water!
> 
> 
> 
> 
> 
> 
> 
> *
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *DarthBaggins*
> 
> But does it fold? lol *You might see a bump in your scores*.


After that tall, cold glass of milk, any processor will see a bump in scores for sure.


----------



## Kokumotsu

So I just bought this 280X Dual-X for $162. What am I in for?


----------



## diggiddi

Quote:


> Originally Posted by *Kokumotsu*
> 
> So I just bought this 280X Dual-X for $162. What am I in for?


A whole lotta goodness, enjoy your purchase


----------



## Kokumotsu

Quote:


> Originally Posted by *diggiddi*
> 
> A whole lotta goodness, enjoy your purchase


Though about this card in particular: I'm seeing a lot of bad stuff about it on Newegg. The guy I'm getting it from stated the temps are good on it, though: 65° maxed in Heaven, while my 7870 Hawk goes past 70 in Valley on Extreme.

Also, it has a 2-year warranty if I'm correct. Is that still valid? I don't remember when the custom-cooler R9s came out.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> Here is a 3DMark11 and Firestrike from a while back. I will run some new tests soon.
> 
> 
> 
> 
> By the way, now that my CPU is under water I should be able to push way past the 5.0 GHz threshold. It will help some, I am sure.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Water loop looks nice.







Is that acrylic tubing perchance? If so, may need to ask you for some pointers...been contemplating it for some time now.

Just need a card in there now that will take a FC block









On to the benches!









*XFX R9 290 BE @ 1300/1700 - 4770k @ 4.5GHz - 16GB Dominator Platinum @ 2400MHz - Win7 Ultimate - 13.12*



Spoiler: OC'd GPU Settings



*Tessellation ON - 3DMark11 Performance Presets - Graphics 18687*

http://www.3dmark.com/3dm11/8347743



*Tessellation OFF - 3DMark11 Performance Presets - Graphics 21240*

http://www.3dmark.com/3dm11/8347445



*Tessellation ON - Firestrike - Graphics 13616*

http://www.3dmark.com/3dm/3100692?



*Tessellation OFF - Firestrike - Graphics 14557*

http://www.3dmark.com/3dm/3100715?





*XFX R9 290 BE @ Stock 980/1250*



Spoiler: Stock GPU Settings



*Tessellation ON - 3DMark11 Performance Presets - Graphics 14420*

http://www.3dmark.com/3dm11/8347525



*Tessellation OFF - 3DMark11 Performance Presets - Graphics 16513*

http://www.3dmark.com/3dm11/8347531



*Tessellation ON - Firestrike - Graphics 10343*

http://www.3dmark.com/3dm/3100822?



*Tessellation OFF - Firestrike - Graphics 11361*

http://www.3dmark.com/3dm/3100796?
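Taking the graphics scores quoted above at face value, the tessellation on/off deltas work out like this (a minimal Python sketch; the run labels are just shorthand for the benchmarks listed in this post):

```python
# Graphics scores quoted above, as (tessellation ON, tessellation OFF) pairs.
scores = {
    "OC 3DMark11":      (18687, 21240),
    "OC Firestrike":    (13616, 14557),
    "Stock 3DMark11":   (14420, 16513),
    "Stock Firestrike": (10343, 11361),
}

for run, (tess_on, tess_off) in scores.items():
    # Percentage gained in graphics score by disabling tessellation in CCC.
    gain = (tess_off / tess_on - 1) * 100
    print(f"{run}: +{gain:.1f}% with tessellation off")
```

So roughly +13.7% and +14.5% in 3DMark11 and +6.9% and +9.8% in Firestrike, which lines up with tessellation being a bigger factor in 3DMark11's test scenes.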





Went through all of this and just realized I only have one DIMM of my RAM installed...took one out to temporarily use in my HTPC and forgot to put it back in.











Should be roughly a 15% improvement in physics score judging by some old 3DMark11 scores I have saved...







We shall see!


----------



## Devildog83

Thanks Roboyto!!

I can answer any questions you have. It's 16mm EK tubing and fittings. The bending takes a bit of practice.
I can't match the physics or combined scores, but the Devils sure do put out. My next build may be Intel just for a change; I haven't made up my mind yet.
Here is a shot of the build, more complete now.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> Thanks Roboyto!!
> 
> I can answer any questions you have. It's 16mm EK tubing and fittings. The bending takes a bit of practice.
> I can't match the physics or combined scores, but the Devils sure do put out. My next build may be Intel just for a change; I haven't made up my mind yet.
> Here is a shot of the build, more complete now.


I knew the physics/combined scores wouldn't be comparable, but you should run benches for graphics scores anyhow.

Mighty fine rig you have there, extra clean









What're you using to guide your power cables? I tried the Lutro0 things from Frozen and they were crap.


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> I knew the physics/combined scores wouldn't be comparable, but you should run benches for graphics scores anyhow.
> 
> Mighty fine rig you have there, extra clean
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What're you using to guide your power cables? I tried the Lutro0 things from Frozen and they were crap.


They were from lutro0.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> They were from lutro0.


I tried these:



Do you have these?


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> I tried these:
> 
> 
> 
> Do you have these?


Yes, the full-cover ones. I tried the others too and got the new ones right away.


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> Yes, the full-cover ones. I tried the others too and got the new ones right away.


Ahh, very nice, those weren't available when I ordered mine a few months back. I will have to order some post haste! Thanks!


----------



## neurotix

Your build looks great Devildog, good work. I would definitely recommend doing Intel for your next build, I'm not disappointed with it at all and the experience overclocking them has been invaluable. So much to learn, I'm still learning about my system all the time. I feel like I can overclock anything now, and I'm not just limited to one platform or the other.

Also, @Roboyto, I don't even think the graphics scores are comparable, sadly. Devildog should remember: when I switched from AMD to Intel my graphics scores went up 6000 points using the same two cards (R9 290). Of course, the physics and combined scores were higher, but so was the graphics score. 6000 points is well out of the margin of error. Also, this was universal with all Futuremark products. Mind you, I'm top 10 on the enthusiast HWBOT team, so I'm not talking out of my ass.

I think you guys would probably have to have the same platform to realistically compare scores.


----------



## Roboyto

Quote:


> Originally Posted by *neurotix*
> 
> Your build looks great Devildog, good work. I would definitely recommend doing Intel for your next build, I'm not disappointed with it at all and the experience overclocking them has been invaluable. So much to learn, I'm still learning about my system all the time. I feel like I can overclock anything now, and I'm not just limited to one platform or the other.
> 
> Also, at Roboyto, I don't even think the graphics scores are comparable, sadly. Devildog should remember, when I switched from AMD to Intel my graphics scores went up 6000 points using the same two cards (R9 290). Of course, the physics and combined scores were higher, but so was the graphics score. 6000 points is well out of the margin of error. Also, this was universal with all Futuremark products. Mind you, I'm top 10 on the enthusiast Hwbot team so I'm not talking out of my ass.
> 
> I think you guys would probably have to have the same platform to realistically compare scores.


I also saw a large improvement coming from a Phenom II X6 @ 4.2GHz on an 890FX to a 3570k on a Z77. My graphics power/scores increased ~25% on the 7870 I was using at the time. However, the difference between an FX chip in a 990FX and an Ivy/Haswell in a Z77/Z87 board is much smaller graphics wise.

Two GPUs in Xfire that match one GPU in every spec except for bus width aren't going to miraculously perform better by a large margin. The 7870 and 270X have Xfire scaling and bus width slowing them down compared to the 290. It is a close battle no doubt, but I still don't see the 2 cards outperforming the one even if he had the same/similar platform.

Xfire 270X is also a much different story than Xfire 290; Not surprising you saw a large leap switching to an i7 with 2 of these monstrous GPUs. No matter how you slice it, an 8-core FX is on a similar playing field to an i5.


----------



## End3R

Quote:


> Originally Posted by *Roboyto*
> 
> No matter how you slice it, an 8-core FX is on a similar playing field to an i5.


Yea no, http://www.cpubenchmark.net/common_cpus.html


----------



## neurotix

Quote:


> Originally Posted by *End3R*
> 
> Yea no, http://www.cpubenchmark.net/common_cpus.html


Yeah. No. Passmark is worthless and nobody uses it for anything because it's been reporting information like that ever since it's been around. Nobody uses Passmark.

Benchers and serious system builders (who you know, actually BUILD their own systems), as well as pretty much every review site out there, will tell you that an 8 core FX is close to an i5 in performance. The reality is that for gaming, the i5 is a better chip, and frequently outperforms the FX series slightly in many games.


----------



## Roboyto

Quote:


> Originally Posted by *End3R*
> 
> Yea no, http://www.cpubenchmark.net/common_cpus.html










That is only 1 benchmark from the 1st site that came up in your google search...

I don't want to get into this debate with you or anyone else for that matter, which is why I said SIMILAR playing field...but you asked for it, so here you go...

The i5s are right behind the FXs in your link, which have *twice as many threads* and a clock speed advantage out of the box...I would hope they would have an advantage. Overclock both chips to the same level and what happens? And let's not discuss the outrageous inefficiency of the FX 8-core...they can draw as much wattage as my 290...

Let's check the 3DMark HoF for what kind of CPUs people are using...wherever could AMD be?

Yea...sorry, that's just how it is.

I used to be on the other side of the fence; Then I bought a laptop with an i5 in it. Which clowned my X6 @ 4.2GHz in many respects. I then bought a 3570k for my desktop and haven't looked back.

Still love AMD GPUs though!


----------



## End3R

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> 
> 
> 
> 
> 
> That is only 1 benchmark from the 1st site that came up in your google search...
> 
> I don't want to get into this debate with you or anyone else for that matter, which is why I said SIMILAR playing field...but you asked for it, so here you go...
> 
> The i5s are right behind the FXs in your link, which have *twice as many threads* and a clock speed advantage out of the box...I would hope they would have an advantage. Overclock both chips to the same level and what happens? And let's not discuss the outrageous inefficiency of the FX 8-core...they can draw as much wattage as my 290...
> 
> Let's check the 3DMark HoF for what kind of CPUs people are using...wherever could AMD be?
> 
> Yea...sorry, that's just how it is.


I figured I'd keep things simple for you since clearly you don't understand how things work. Yes, there are some i5s there, that is a chart of the most common cpus so you'd be a moron if you thought there wouldn't be i5s on there. The point is FX cpus are stronger than i5s. The lowest FX on there benched around 4k, compared to the lowest i5 on there, which was around 2k. Now compare that to the fastest FX on there which is benched around 9k, and the fastest i5 benching 7.7k.

I'm not saying i5s are bad, far from it, but FX CPUs are pretty much the AMD equivalent of i7s. If you want the full chart, here ya go http://www.cpubenchmark.net/high_end_cpus.html


----------



## neurotix

The reality is that if Passmark meant anything at all 1) It would be used by review sites, 2) It would probably be used by Anandtech and 3) It would be on HWBOT. Passmark is frequently totally wrong in many other examples too. It's a garbage benchmark.

EDIT: HERE is another benchmark. This is assuming all the GPUs are at stock clocks. A GTX 780 gets 8061. An R9 290X gets 6754. EVERY OTHER SINGLE REVIEW, BENCHMARK AND GAME WILL PERFORM BETTER ON A 290X. What a joke. Also, the 780 somehow manages to score better than the Titan even though the Titan has more shaders. Wow.

Don't use Passmark, kiddies.


----------



## Roboyto

Quote:



> Originally Posted by *End3R*
> 
> I figured I'd keep things simple for you since clearly you don't understand how things work. Yes, there are some i5s there, that is a chart of the most common cpus so you'd be a moron if you thought there wouldn't be i5s on there. The point is FX cpus are stronger than i5s. The lowest FX on there benched around 4k, compared to the lowest i5 on there, which was around 2k. Now compare that to the fastest FX on there which is benched around 9k, and the fastest i5 benching 7.7k.
> 
> I'm not saying i5s are bad, far from it, but FX cpus are pretty much the amd equivalent of i7s. IF you want the full chart, here ya go http://www.cpubenchmark.net/high_end_cpus.html










blocked


----------



## End3R

Hmm, guess it makes no sense that Tom's Hardware and 3DMark back it up then? Look at all those FX CPUs surrounded by i7s.... But sure, you go ahead thinking i5s are more powerful. (Not saying some of them aren't; there are good and bad in every series.)
http://www.tomshardware.com/charts/cpu-charts-2013/-04-3DMark11,3157.html


----------



## neurotix

Tom's Hardware isn't reputable either. Tom's Hardware is the Yahoo! News of computer hardware websites.

EDIT: Take a real good look at the chart you posted from them. The FX is being beaten by i7s that are 5 years old. It is also actually beaten by the i5 3570K. The i5 4670K, which is a very relevant chip, isn't even on the chart. Also, the ORIGINAL i5 750 Lynnfield from like 2008 is beating both the FX 6200 and the FX 4200. Your own chart shows your chip losing. The reality is that I don't know what version of 3DMark11 they're even running; in every version I've seen, the Intel chips consistently score a few thousand points higher than AMD FX. If you *really* think your FX is the "equivalent of an i7", let's see your 3DMark11 results vs my i7 and 270X. Note, I'm almost a professional bencher at this point.


----------



## End3R

Quote:


> Originally Posted by *neurotix*
> 
> Tom's Hardware isn't reputable either. Tom's Hardware is the Yahoo! News of computer hardware websites.
> 
> EDIT: Take a real good look at the chart you posted from them. The FX is being beaten by i7s that are 5 years old. It is also actually beaten by the i5 3570K. The i5 4670K, which is a very relevant chip, isn't even on the chart. Also, the ORIGINAL i5 750 Lynnfield from like 2008 is beating both the FX 6200 and the FX 4200. Your own chart shows your chip losing. The reality is that I don't know what version of 3DMark11 they're even running; in every version I've seen, the Intel chips consistently score a few thousand points higher than AMD FX. If you *really* think your FX is the "equivalent of an i7", let's see your 3DMark11 results vs my i7 and 270X. Note, I'm almost a professional bencher at this point.


Sure thing just drop your overclock







Obviously I'm not saying all FXs are equal to i7s, but the statement that FXs are i5s is wrong; if you had to say they were closer to an i5 or an i7, it would be an i7. I've provided 2 benchmarks showing FX-series chips shoulder to shoulder with i7s. Where are these mythical benches where all i7s are above FX and FX is shoulder to shoulder with i5s? Because I've yet to see a chart like that anywhere.


----------



## neurotix

lol.

Take a look on HWBOT. You'll see that in almost every relevant 3D benchmark, *especially* Futuremark, Intel chips score anywhere from a few thousand to tens of thousands of points higher with the same graphics cards. For example, in 3DMark06, with an FX-8350 at 5GHz and a 290 at 1200MHz, I got 23000. After switching to Intel, that score with my i7 at 4.5GHz and my 290 at 1200MHz inflated to nearly 38000. My 3DMark11 went from 12000 to 16000. My Fire Strike went from 12000 to almost 19000.
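Taking those round numbers at face value, the platform swap works out to uplifts like this (a quick Python sketch using only the scores quoted in this post):

```python
# Score pairs quoted above: (FX-8350 @ 5GHz, i7 @ 4.5GHz), same 290 @ 1200MHz.
benches = {
    "3DMark06":    (23000, 38000),
    "3DMark11":    (12000, 16000),
    "Fire Strike": (12000, 19000),
}

for name, (fx_score, i7_score) in benches.items():
    # Percentage change attributable to the CPU/platform swap.
    uplift = (i7_score / fx_score - 1) * 100
    print(f"{name}: +{uplift:.0f}%")
```

Roughly +65%, +33%, and +58%; the older, more CPU-bound 3DMark06 shows the biggest swing, which is what you'd expect.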

Here's a good example.

Valley, 2 290s @ 1100/1500MHz, FX-8350 @ 5GHz



Valley, 2 290s @ 1100/1500MHz, i7 4770k @ 4.5GHz



All of my games run 10-20 fps higher depending on the game at 5760x1080p. Areas like the Whiterun entrance in Skyrim that used to dip to below 30 fps, now run at a steady 60 fps. It's not just benchmarks.

There are many, many, many other examples. It doesn't matter if it's an i7 or an i5 because games and most benchmarks don't use more than 4 threads.

I used to bench and promote AMD, but I got sick of the pathetically low benchmark scores and switched. The difference is real. Intel is on a whole other level. The stuff AMD has now just barely matches what Intel put out 5 years ago. Why would I switch if I had an FX-8350 at 5GHz under liquid cooling? Even my i7 at *stock* trounces anything a 5GHz FX-8350 could do, in nearly every benchmark and game.

Do some research and educate yourself. And stop trying to be an apologist and explain away your inferior hardware to make yourself feel better when you haven't even done research, and the stuff you presented actually proves the opposite.


----------



## End3R

Quote:


> Originally Posted by *neurotix*
> 
> lol.
> 
> Here's a good example.
> 
> Valley, 2 290s 1100/1500mhz, FX-8350 5ghz
> 
> 
> 
> Valley, 2 290s 1100/1500mhz, i7 4770k 4.5ghz


(speaking in generalities) The argument was never that FX series chips are better than i7s, it's that they're better than i5s. I use an i5 at work and it's a great chip (not hating on anything here) but my fx at home blows it out of the water.


----------



## Dasboogieman

Quote:


> Originally Posted by *End3R*
> 
> (speaking in generalities) The argument was never that FX series chips are better than i7s, it's that they're better than i5s. I use an i5 at work and it's a great chip (not hating on anything here) but my fx at home blows it out of the water.


Blow out of the water how though? Both chips had different design paradigms.
The i5 was never meant to be an absolute performance leader; it's a very general-purpose workhorse chip, designed to excel in all non-parallel workloads (due to its powerful single-core performance) with respectable performance in parallel loads. The FX chips leverage more resources in parallel to boost applications that can take advantage of that programming paradigm. However, the more specialized nature of the FX chips exposes their weakness in simple, non-parallel workloads, due to the weaker cores; I don't think anybody can deny this.

Unfortunately, it is the year 2014 and most games and non-professional applications are simply not parallel enough to fully utilize an FX chip. Thus it is no surprise that the i5 retains the advantage here.

The gap becomes even more prominent once factors such as power and overclocking become included. The i5 uses much less power to achieve superior per core performance. The FX may have the edge in multithreaded apps (and thus most benchmarks) but it will always trail in lightly threaded applications.

At the end of the day, it comes down to cost, the FX is a real bargain (performance wise) if you can utilize its parallel architecture but the i5 is much more predictable and flexible in its operating field. Obviously, the i7 outclasses both but is much much more expensive.


----------



## DiceAir

So my card was faulty. I was getting green lines on my desktop, so I decided to RMA the card back to the PC shop. I'm running 1440p and the PC shop doesn't have a 1440p monitor; it's also too far away to just drive there with my monitor. I did take a video, but they still need to test it. I get crashes in BF4 with Mantle. It's like it's freezing, and like I said, green lines on the desktop. I have my other R9 280X here and with that card it's fine. Now I don't know what to do.

I'm thinking of selling both cards for a reasonable amount, even if I don't get much back. I then want to buy an MSI R9 290X Gaming. Expensive card, but cheaper than a 780 Ti, though the 780 Ti will also support DX12, is the fastest card on the market now, and with recent drivers the performance is even better. So what should I do?

Thanks

Edit: I can also see if I can get my hands on a non-reference R9 290 and Crossfire later


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Your build looks great Devildog, good work. I would definitely recommend doing Intel for your next build, I'm not disappointed with it at all and the experience overclocking them has been invaluable. So much to learn, I'm still learning about my system all the time. I feel like I can overclock anything now, and I'm not just limited to one platform or the other.
> 
> Also, at Roboyto, I don't even think the graphics scores are comparable, sadly. Devildog should remember, when I switched from AMD to Intel my graphics scores went up 6000 points using the same two cards (R9 290). Of course, the physics and combined scores were higher, but so was the graphics score. 6000 points is well out of the margin of error. Also, this was universal with all Futuremark products. Mind you, I'm top 10 on the enthusiast Hwbot team so I'm not talking out of my ass.
> 
> I think you guys would probably have to have the same platform to realistically compare scores.


Thanks *neurotix*...... I would not go Intel just because of benchmarks; if you're comparing an FX to an Intel chip, benchmarks are not the way to do it anyway. There is an increased cost with Intel so I will be considering that, but for me it's the motherboard features, compatibility with what I want to build, the cost, and of course the colors. The setup I have now does everything I need it to do and more, so bragging rights in a few benches don't come into the equation. That is a thing for a lot of people and that's good for them. When I say that my Devils are at or near the performance of a 290, that's real world; no matter which chip is pushing them, the GPU performance is there. Visually, 99% of folks will never notice the difference between an 8350 and a 4770K in a game; the few FPS you might gain is not so huge that it makes much of a difference. *That doesn't mean that the 4770K isn't better, because it is*; it's just not so much better that it makes a huge difference in real-world applications.

Having said all of that, I really am thinking of building with Intel next; the main driving force behind it is not a major performance increase, just switching things up.


----------



## Devildog83

Quote:


> Originally Posted by *DiceAir*
> 
> So my card was faulty. I was getting green lines on my desktop, so I decided to RMA the card back to the PC shop. I'm running 1440p and the PC shop doesn't have a 1440p monitor; it's also too far away to just drive there with my monitor. I did take a video, but they still need to test it. I get crashes in BF4 with Mantle. It's like it's freezing, and like I said, green lines on the desktop. I have my other R9 280X here and with that card it's fine. Now I don't know what to do.
> 
> I'm thinking of selling both cards for a reasonable amount, even if I don't get much back. I then want to buy an MSI R9 290X Gaming. Expensive card, but cheaper than a 780 Ti, though the 780 Ti will also support DX12, is the fastest card on the market now, and with recent drivers the performance is even better. So what should I do?
> 
> Thanks
> 
> Edit: I can also see if I can get my hands on a non-reference R9 290 and Crossfire later


For the price I would go with a 290; the difference between that and a 290X is not enough to warrant the cost increase unless you can find a deal on a 290X or a 780 Ti. Whether you go NVIDIA or AMD comes down to which features you think will help you in the games or applications you use. They both have their bright spots and lack some in others.


----------



## alani4837

Hello guys, I bought an XFX Black DD R9 280X 3GB and I want a water block for this card. Anyone know where I can find it?

thank you


----------



## mikemykeMB

Quote:


> Originally Posted by *alani4837*
> 
> Hello guys, I bought an XFX Black DD R9 280X 3GB and I want a water block for this card. Anyone know where I can find it?
> 
> thank you


See http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/5940#post_22288575


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *alani4837*
> 
> Hello guys, I bought an XFX Black DD R9 280X 3GB and I want a water block for this card. Anyone know where I can find it?
> 
> thank you
> Quote:
> 
> 
> 
> Originally Posted by *mikemykeMB*
> 
> See http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/5940#post_22288575
Click to expand...

It is the same one as for the 7970 GHz Edition; I matched the PCB image with my PCB image.

Glad I found that.


----------



## Recr3ational

Herro everybody!

Haven't been here in a while! Been off my game lately!

How's everyone's rigs?


----------



## mikemykeMB

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> It is the same one as for the 7970 GHz Edition; I matched the PCB image with my PCB image.
> 
> Glad I found that.


Not trying to take the props...just pushed it to show.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *mikemykeMB*
> 
> Not trying to take the props...just pushed it to show.


no worries just wanted to add some to it


----------



## Roboyto

Quote:


> Originally Posted by *Devildog83*
> 
> Thanks *neurotix*...... I would not go Intel just because of benchmarks; if you're comparing an FX to an Intel chip, benchmarks are not the way to do it anyway. There is an increased cost with Intel so I will be considering that, but for me it's the motherboard features, compatibility with what I want to build, the cost, and of course the colors. The setup I have now does everything I need it to do and more, so bragging rights in a few benches don't come into the equation. That is a thing for a lot of people and that's good for them. When I say that my Devils are at or near the performance of a 290, that's real world; no matter which chip is pushing them, the GPU performance is there. Visually, 99% of folks will never notice the difference between an 8350 and a 4770K in a game; the few FPS you might gain is not so huge that it makes much of a difference. *That doesn't mean that the 4770K isn't better, because it is*; it's just not so much better that it makes a huge difference in real-world applications.
> 
> Having said all of that, I really am thinking of building with Intel next; the main driving force behind it is not a major performance increase, just switching things up.


There is an increased cost, but you can hold onto the hardware longer, sell it for more $ later, and if you really want to get down to the nitty gritty, you have the power consumption factor over the time you own the CPU.

270X Xfire is at/near the performance of a 290, depending on what you're doing. Once the resolution goes up, they fall behind. I couldn't use them for 5760x1080 and have the same experience I do with my 290.

It's not just gaming performance, or benchmarks, because your computer as a whole will simply run better than it does now. How can I quantify or properly explain this? I can't. I was firmly trenched in the AMD CPU camp for 15 years...until I got a laptop with an i5.

You will just have to take the leap and see for yourself. If you do, you probably won't be looking back.

Quote:


> Originally Posted by *Devildog83*
> 
> For the price I would go with a 290; the difference between that and a 290X is not enough to warrant the cost increase unless you can find a deal on a 290X or a 780 Ti. Whether you go NVIDIA or AMD comes down to which features you think will help you in the games or applications you use. They both have their bright spots and lack some in others.


Spot on advice from @Devildog83. The 290X is usually about 5% faster than a 290, BUT, some 290s clock better than 290X so they can pull ahead. 290X isn't worth the $150+ price tag unless you can find a deal.

Quote:


> Originally Posted by *Recr3ational*
> 
> Herro everybody!
> 
> Haven't been here in a while! Been off my game lately!
> 
> How's everyone's rigs?


Rigs are good, how about yours? Started in this thread with my first PowerColor product, a Devil 270X. I was very impressed with its capabilities overall, but it didn't fit the bill in terms of noise for my Mini-ITX HTPC/gaming rig. Unfortunately I sold it, but "stole" a PowerColor reference 290 for $299 from Newegg open box. You can see that progress in my sig if you'd like.


----------



## Devildog83

Quote:


> Originally Posted by *Roboyto*
> 
> There is an increased cost, but you can hold onto the hardware longer, sell it for more $ later, and if you really want to get down to the nitty gritty, you have the power consumption factor over the time you own the CPU.
> 
> 270X Xfire is at/near the performance of a 290, depending on what you're doing. Once the resolution goes up, they fall behind. I couldn't use them for 5760*1080 and have the same experience I do with my 290.
> 
> It's not just gaming performance, or benchmarks, because your computer as a whole will simply run better than it does now. How can I quantify or properly explain this? I can't. I was firmly trenched in the AMD CPU camp for 15 years...until I got a laptop with an i5.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> You will just have to take the leap and see for yourself. If you do, you probably won't be looking back.
> 
> Spot on advice from @Devildog83
> . The 290X is usually about 5% faster than a 290, BUT, some 290s clock better than 290X so they can pull ahead. 290X isn't worth the $150+ price tag unless you can find a deal.
> 
> Rigs are good, how about yours? Started in this thread with my first PowerColor product, a Devil 270X. I was very impressed with its capabilities overall, but it didn't fit the bill in terms of noise for my Mini-ITX HTPC/gaming rig. Unfortunately I sold it, but "stole" a PowerColor reference 290 for $299 from Newegg open box. You can see that progress in my sig if you'd like.


I don't need to run those resolutions right now and actually did think about selling my 7870 and getting a 290 when they were first released but just could not pass up running 2 Devil's for some reason.

By the way, I am not firmly entrenched in either camp; I kind of get tired of defending my CPU choice all of the time. I have all kinds of stuff I could say, but I just don't want to anymore. If folks choose to think AMD sucks, that's OK; I do not, and frankly this is the last time I even want to discuss it. It's like trying to convince a conspiracy theorist that the conspiracy is not true: no matter what you say, there is always another reason why it's true. I never said AMD was better, and frankly I don't care anymore; these debates are pointless.


----------



## End3R

Quote:


> Originally Posted by *Devildog83*
> 
> [/SPOILER]
> 
> I don't need to run those resolutions right now and actually did think about selling my 7870 and getting a 290 when they were first released but just could not pass up running 2 Devil's for some reason.
> 
> By the way, I am not firmly entrenched in either camp; I kind of get tired of defending my CPU choice all of the time. I have all kinds of stuff I could say, but I just don't want to anymore. If folks choose to think AMD sucks, that's OK; I do not, and frankly this is the last time I even want to discuss it. It's like trying to convince a conspiracy theorist that the conspiracy is not true: no matter what you say, there is always another reason why it's true. I never said AMD was better, and frankly I don't care anymore; these debates are pointless.


I agree completely. At home I have my FX in my desktop and my i7 (i7-2630QM) in my laptop, and at work I alternate between two machines, one with an i5, the other with an i7 (i7-3770K). I'm not running at resolutions higher than 1080p regardless of which machine I'm on. Honestly, between all four I think my FX performs the best, and there hasn't been a single real-world scenario where my FX has bottlenecked me, including while I'm streaming (with my CPU preset on slow to enhance quality on a low upload speed) or recording. I haven't been bottlenecked by my i7-3770K at work either, but I most certainly have by my laptop's i7-2630QM.


----------



## neurotix

Y'know, the nature of the beast is that you're on a computing forum. People are going to have different opinions and experiences. Hopefully, you get people who are objective about their experience, and back up their experiences through knowledge, research and tangible proof. Not everyone does this, and not everyone is willing to admit to themselves the objective truth. I'd like to think that I'm objective and support my viewpoint with facts, and apparently other people see this and appreciate it (e.g. the rep).

AMD vs Intel is a difficult debate. At this point, it's one that's gone on for at least 20 years, and as long as both companies are around and producing chips, it will continue. If you don't like it, you don't have to post.

I never said AMD "sucks", I don't think Roboyto or anyone else did either. For the price, AMD's performance is great. I mean, my Vishera is still in use in another computer and likely will be for a long time to come. If someone only had $200 to spend on a CPU and motherboard, I would definitely recommend a cheap/used 8320 and a 990FX board. Some people get Microcenter deals and get 8320s for > $120. That's crazy and well worth the money.

However, the fact remains that Intel outpaces AMD by a good margin in performance. If someone could afford it, I'd recommend Intel every time. The benchmarks prove the difference. My *own* testing and benchmarks prove the difference, using the same cards. Some people just can't accept that. The difference in most of my games is more than "a few fps", in fact my fps in most games has gone up 10-20 fps depending on the game. That's huge. At 5760x1080 I need all the power I can get, and that extra fps makes a big difference.

This article does a good job at showing the differences.

I respect and understand other people's choice, but I'm glad I switched. I love AMD, my first real computer was a K6-2, I still have 3 AMD based systems in the house, and I want to see them on top again. However, Vishera is old tech by now, and there are no more plans to support the AM3+ socket with updates. They have left the enthusiast segment high and dry for over 2 years. AMD needs to scrap the Bulldozer architecture and come out with something completely different, but they won't. If AMD came out with something that rivaled Haswell performance, I'd make the switch in an instant. I highly doubt that will happen though.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Y'know, the nature of the beast is that you're on a computing forum. People are going to have different opinions and experiences. Hopefully, you get people who are objective about their experience, and back up their experiences through knowledge, research and tangible proof. Not everyone does this, and not everyone is willing to admit to themselves the objective truth. I'd like to think that I'm objective and support my viewpoint with facts, and apparently other people see this and appreciate it (e.g. the rep).
> 
> AMD vs Intel is a difficult debate. At this point, it's one that's gone on for at least 20 years, and as long as both companies are around and producing chips, it will continue. If you don't like it, you don't have to post.
> 
> I never said AMD "sucks", I don't think Roboyto or anyone else did either. For the price, AMD's performance is great. I mean, my Vishera is still in use in another computer and likely will be for a long time to come. If someone only had $200 to spend on a CPU and motherboard, I would definitely recommend a cheap/used 8320 and a 990FX board. Some people get Microcenter deals and get 8320s for > $120. That's crazy and well worth the money.
> 
> However, the fact remains that Intel outpaces AMD by a good margin in performance. If someone could afford it, I'd recommend Intel every time. The benchmarks prove the difference. My *own* testing and benchmarks prove the difference, using the same cards. Some people just can't accept that. The difference in most of my games is more than "a few fps", in fact my fps in most games has gone up 10-20 fps depending on the game. That's huge. At 5760x1080 I need all the power I can get, and that extra fps makes a big difference.
> 
> This article does a good job at showing the differences.
> 
> I respect and understand other people's choice, but I'm glad I switched. I love AMD, my first real computer was a K6-2, I still have 3 AMD based systems in the house, and I want to see them on top again. However, Vishera is old tech by now, and there are no more plans to support the AM3+ socket with updates. They have left the enthusiast segment high and dry for over 2 years. AMD needs to scrap the Bulldozer architecture and come out with something completely different, but they won't. If AMD came out with something that rivaled Haswell performance, I'd make the switch in an instant. I highly doubt that will happen though.


I know all of that, and I am sure that if or when I get an Intel chip I will be as happy as a fat kid in a ding dong factory. I just get tired of the same old debate; there are many misconceptions going around, and it gets old trying to dispel them and justify what I have. I hear that the FX chips bottleneck high-end GPUs, and they don't; I hear that the power consumption makes Intel just as cost-effective, and it doesn't; I hear that 2500Ks outperform an 8350, and they don't. Debate is good, but knowing what you are talking about helps. There is an Intel fanboy right now in RED1776's build log who is just being a complete idiot, and I know of few folks who know as much or more about this than RED1776. The trolls are annoying. The whole debate is pointless; everyone knows that if you spend more for a chip with better tech it will perform better, but it always gets rammed down the throats of everyone running AMD, and frankly it's passe. I will not spend over $300 on a chip and buy a mid-range board, so when I do go Intel it will be all out and will run $600 or so. Of course it will outperform my current rig, it freakin' better, but for just over $400 I have top-of-the-line AMD and I like it.

I hope that explains how I feel about this without ruffling any feathers; that's not what I mean to do.


----------



## End3R

Quote:


> Originally Posted by *Devildog83*
> 
> I know all of that, and I am sure that if or when I get an Intel chip I will be as happy as a fat kid in a ding dong factory. I just get tired of the same old debate; there are many misconceptions going around, and it gets old trying to dispel them and justify what I have. I hear that the FX chips bottleneck high-end GPUs, and they don't; I hear that the power consumption makes Intel just as cost-effective, and it doesn't; I hear that 2500Ks outperform an 8350, and they don't. Debate is good, but knowing what you are talking about helps. There is an Intel fanboy right now in RED1776's build log who is just being a complete idiot, and I know of few folks who know as much or more about this than RED1776. The trolls are annoying. The whole debate is pointless; everyone knows that if you spend more for a chip with better tech it will perform better, but it always gets rammed down the throats of everyone running AMD, and frankly it's passe. I will not spend over $300 on a chip and buy a mid-range board, so when I do go Intel it will be all out and will run $600 or so. Of course it will outperform my current rig, it freakin' better, but for just over $400 I have top-of-the-line AMD and I like it.
> 
> I hope that explains how I feel about this without ruffling any feathers; that's not what I mean to do.


Yes, I think you misunderstood what Devil and I were saying, Neuro. Nobody was debating that at the top of the line, Intel outperforms AMD. I was simply saying that Rob's statement that an FX is the same as an i5 "no matter how you slice it" is flat out wrong.


----------



## Dasboogieman

Quote:


> Originally Posted by *End3R*
> 
> Yes, I think you misunderstood what Devil and I were saying, Neuro. Nobody was debating that at the top of the line, Intel outperforms AMD. I was simply saying that Rob's statement that an FX is the same as an i5 "no matter how you slice it" is flat out wrong.


Sigh, yea, I wish the FX was as powerful as the i5 all around. Then I could've actually gotten a decent motherboard at the time for under $200 instead of the ridiculously overpriced Z68 chipset based board with lacking features (which AMD had in abundance).


----------



## Recr3ational

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> There is an increased cost, but you can hold onto the hardware longer, sell it for more $ later, and if you really want to get down to the nitty gritty, you have the power consumption factor over the time you own the CPU.
> 
> 270X Xfire is at/near the performance of a 290, depending on what you're doing. Once the resolution goes up, they fall behind. I couldn't use them for 5760*1080 and have the same experience I do with my 290.
> 
> It's not just gaming performance, or benchmarks, because your computer as a whole will simply run better than it does now. How can I quantify or properly explain this? I can't. I was firmly entrenched in the AMD CPU camp for 15 years...until I got a laptop with an i5.
> 
> You will just have to take the leap and see for yourself. If you do, you probably won't be looking back.
> 
> Spot on advice from @Devildog83.
> The 290X is usually about 5% faster than a 290, BUT, some 290s clock better than a 290X so they can pull ahead. The 290X isn't worth the $150+ price tag unless you can find a deal.
> 
> 
> Rigs are good, how bout yours? Started in this thread with my first PowerColor product, a Devil 270X. I was very impressed with its capabilities overall, but it didn't fit the bill in terms of noise for my Mini-ITX HTPC/Gaming rig. Unfortunately I sold it, but "stole" a PowerColor reference 290 for $299 from NewEgg open box. You can see that progress in my sig if you'd like


Nice job dude, how's the 290? Good?
Mine's good; I haven't changed much. Added an extra 120mm rad to the loop, and I'm thinking about going triple 280X.
Also bought 16GB of HyperX Beasts.

Haven't had the time to get into computers/gaming recently; I've been either working or too busy with friends.


----------



## Tempest2000

Just ordered an MSI 280 gaming from newegg for $199 (sale plus promo code plus rebate card). The current AMD free games selection isn't that great though...


----------



## End3R

Quote:


> Originally Posted by *Tempest2000*
> 
> The current AMD free games selection isn't that great though...


Murdered, Thief, and then either: Tomb Raider, Deus Ex: HR, Hitman, or Sleeping Dogs. Can't speak for Murdered as it isn't out yet, but it looks cool and is the newest/most expensive game on the list, and Thief is very beautiful/fun.


----------



## Tempest2000

Quote:


> Originally Posted by *End3R*
> 
> Murdered, Thief, and then either: Tomb Raider, Deus Ex: HR, Hitman, or Sleeping Dogs. Can't speak for Murdered as it isn't out yet, but it looks cool and is the newest/most expensive game on the list, and Thief is very beautiful/fun.


Yeah I'm thinking Murdered and Thief for sure. I'm not too interested in them but of all options, those are what I'm leaning towards. Third pick will probably be Deus Ex. I already have Tomb Raider.


----------



## End3R

Quote:


> Originally Posted by *Tempest2000*
> 
> Yeah I'm thinking Murdered and Thief for sure. I'm not too interested in them but of all options, those are what I'm leaning towards. Third pick will probably be Deus Ex. I already have Tomb Raider.


Deus Ex and Thief are both great games, so I don't think you'll be disappointed. I'm still not sure about Murdered; I like the premise, solving your own murder, but the trailers I've seen make it look kinda campy. We'll see.


----------



## omari79

Is there a point in getting a 270X over a non-X 270 if I am going to overclock it anyway?


----------



## fabiovtec

Quote:


> Originally Posted by *omari79*
> 
> Is there a point in getting a 270X over a non-X 270 if I am going to overclock it anyway?


Yes and no.
If you have the money, the X version is better... If you want to save some money, buy the R9 270.


----------



## shilka

Quote:


> Originally Posted by *Tempest2000*
> 
> Yeah I'm thinking Murdered and Thief for sure. I'm not too interested in them but of all options, those are what I'm leaning towards. Third pick will probably be Deus Ex. I already have Tomb Raider.


Deus Ex is one of the best games I have played in many years, so there is no reason not to pick it.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *shilka*
> 
> Deus Ex is one of the best games I have played in many years, so there is no reason not to pick it.


I second that notion..


----------



## omari79

Quote:


> Originally Posted by *fabiovtec*
> 
> Yes and no.
> If you have the money, the X version is better... If you want to save some money, buy the R9 270.


Why? What's the difference in performance between the two when both are OC'd to their max?


----------



## Faster_is_better

Just a PSA for anyone with the ASUS DCII coolers (at least the 280X version, but I think all the newer ones use the same build style). The fans will eventually spit out all their oil and seize up if run at high RPM in a vertical orientation, such as for benches or mining. I suspect the cards would be fine in a typical case with the fans pointing down, but set up vertically they actually leak out the oil, and once it's all gone they just stop working. I don't know if this affects older DCII coolers or any others, but it's quite possible, since I don't think most of these fans use any type of sealed bearing. They may also be fine if you keep fan speed at 60-70% max; the rattle is typically gone at those speeds, but it comes back at 80%+ on at least one of my cards.

One of my cards' fans finally gave up (mining), and at least one or two more have had bad rattling for a long time. I haven't had a chance to look at the dead one yet; it may be as simple as dropping a bit more oil into it, but maybe not.

Just an FYI for you guys: this mining is tough on hardware.


----------



## End3R

Quote:


> Originally Posted by *Faster_is_better*
> 
> Just FYI for you guys, this mining is tough on hardware


Words to take to heart before you think about buying one of the used R9s that miners are selling on eBay now.


----------



## fabiovtec

Quote:


> Originally Posted by *omari79*
> 
> Why? What's the difference in performance between the two when both are OC'd to their max?


1100 MHz is a good OC for the R9 270.
1250 MHz is a good OC for the R9 270X.

The X version has better cooling.
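For a rough sense of what those clocks mean relative to each other, here's a quick sanity-check calculation. The stock boost clocks used below (925 MHz for the 270, 1050 MHz for the 270X) are AMD's reference figures; partner cards often ship higher, so treat the exact percentages as illustrative only.

```python
# Rough OC-headroom comparison for the R9 270 vs. 270X, using the
# "good OC" clocks quoted above and AMD's reference boost clocks.

def oc_headroom(stock_mhz: float, oc_mhz: float) -> float:
    """Return the overclock as a percentage above the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

r9_270_headroom = oc_headroom(925, 1100)    # ~18.9% above stock
r9_270x_headroom = oc_headroom(1050, 1250)  # ~19.0% above stock

# Relative headroom is nearly identical, but the 270X lands ~13.6%
# higher in absolute clock speed once both are maxed out:
final_clock_gap = (1250 - 1100) / 1100 * 100

print(f"270 headroom:    {r9_270_headroom:.1f}%")
print(f"270X headroom:   {r9_270x_headroom:.1f}%")
print(f"Final clock gap: {final_clock_gap:.1f}%")
```

So both cards gain a similar percentage over stock; the 270X's advantage comes from its higher starting point, and games rarely scale one-to-one with core clock anyway.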


----------



## Kokumotsu

Quote:


> Originally Posted by *End3R*
> 
> Words to take to heart before you think about buying one of the used r9s that miners are selling on ebay now.


Should I be scared? I just got a 280X off eBay for $162.


----------



## neurotix

Graphics cards are made to withstand high temperatures under load.

I've been folding for years and never had a card die due to high temperatures.

As long as the card has been kept relatively cool (under 100C), it should be fine. If you put it in your system and it works, and you can get the card to display video properly and render 3D games, it's probably fine.


----------



## Marty99

The R9 270X has 6+6 pin power connectors while the R9 270 has only one 6-pin.
Maybe that's another reason why the 270X OCs better.


----------



## DarthBaggins

Quote:


> Originally Posted by *Marty99*
> 
> The R9 270X has 6+6 pin power connectors while the R9 270 has only one 6-pin.
> Maybe that's another reason why the 270X OCs better.


I've had no issues so far when I've turned the clocks up on mine; the V850 pumps plenty of power to it when it demands it.


----------



## omari79

Quote:


> Originally Posted by *fabiovtec*
> 
> 1100 MHz is a good OC for the R9 270.
> 1250 MHz is a good OC for the R9 270X.
> 
> The X version has better cooling.


Cheers, that's what I wanted to know. Reviews indicate a ~6% difference in performance between the two in most games, but I knew that extra 6-pin on the X had to be good for something, and you just confirmed it. That's a ~10-13% difference between the two right there, if not more with better custom cooling.

Thanks and +rep


----------



## omari79

Quote:


> Originally Posted by *DarthBaggins*
> 
> I've had no issues so far when I've turned the clocks up on mine; the V850 pumps plenty of power to it when it demands it.


But what's your best stable OC on that card?


----------



## DarthBaggins

I'll reclock it once I'm done folding this month and see how high I can push the clocks while staying stable at the end of the month (team folding is going very well this month).


----------



## End3R

Anyone else with a 270X play The Secret World? On ultra settings I get between 50-60 fps in DX9, but turning on DX11 drops me all the way down to a staggering 3 fps. Yup, not 30: 3. I'm fairly certain there is nothing wrong with my drivers, as Deus Ex on max settings in DX11 got a constant 60 fps, and my BioShock Infinite DX11 Ultra benchmark averaged 80 fps.


----------



## Faster_is_better

Quote:


> Originally Posted by *Kokumotsu*
> 
> Quote:
> 
> 
> 
> Originally Posted by *End3R*
> 
> Words to take to heart before you think about buying one of the used r9s that miners are selling on ebay now.
> 
> 
> 
> > Should I be scared? I just got a 280X off eBay for $162.

Depends on how it was run. Obviously, if the card is being sold as in "good working condition", it should have no fan or other obvious problems and will likely be fine. You really have to rely on the seller to sell you a decent card.
Quote:


> Originally Posted by *neurotix*
> 
> Graphics cards are made to withstand high temperatures under load.
> 
> I've been folding for years and never had a card die due to high temperatures.
> 
> As long as the card has been kept relatively cool (under 100C), it should be fine. If you put it in your system and it works, and you can get the card to display video properly and render 3D games, it's probably fine.


Agreed, GPUs are actually really tough, but heatsinks themselves can range from great to shoddy. These DCII coolers (on the 280X at least) are actually really good; it's just that they can't survive constant high fan RPM in a vertical layout.


----------



## DarthBaggins

Love the DCII cooler on my 7870; it keeps it under 55C even under heavy gaming and folding loads. Changing the TIM also helped tremendously when I ran it in Crossfire alongside the 270X, playing BF4 on ultra and Titanfall on insane. Debating sending my 270X off to Alphacool to be fitted with a waterblock.


----------



## Dasboogieman

Quote:


> Originally Posted by *omari79*
> 
> is there a point in getting a 270x over a none-x 270 if i am going to overclock it anyway?


The 270X will overclock better in most cases. Manufacturers tend to put better VRM assemblies on the 270X, which don't get taxed as much under overvoltage conditions. The 270 is more optimized for low power, so while it may overclock to a similar degree, the overvoltage headroom is less.

That being said, I can't speak for all models; some manufacturers use the exact same PCB for the 270 and 270X.


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> Anyone else with a 270x play The Secret World? On ultra settings I get between 50-60 fps in Dx9, but turning on Dx11 will drop me all the way down to a staggering 3 fps. Yup, not 30, 3. I'm fairly certain there is nothing wrong with my drivers as Deus Ex on max settings in Dx11 got a constant 60 fps, and my Bioshock Infinite Dx11 Ultra benchmark had an avg of 80 fps.


It could be that the game doesn't actually support DX11.


----------



## End3R

Quote:


> Originally Posted by *Devildog83*
> 
> It could be that the game doesn't actually support DX11.


DX11 is one of the options in-game, so I can only imagine it is... I haven't noticed much of a difference other than trail effects when walking through water, so I guess I'll just keep it in DX9. I was more curious whether anyone else could confirm it's an issue with that game and 270Xs specifically.


----------



## omari79

Quote:


> Originally Posted by *Dasboogieman*
> 
> The 270X will overclock better in most cases. Manufacturers tend to put better VRM assemblies on the 270X, which don't get taxed as much under overvoltage conditions. The 270 is more optimized for low power, so while it may overclock to a similar degree, the overvoltage headroom is less.
> 
> That being said, I can't speak for all models; some manufacturers use the exact same PCB for the 270 and 270X.


Didn't know that. Cheers and +rep


----------



## buttface420

Just ordered a Sapphire OC 280X! Using it with an E5450 Xeon; how badly will my CPU bottleneck the card?


----------



## HBizzle

Couple of questions for folks in this thread. As a reference I have a Powercolor 280X OC edition.

Does anyone know what kind of average FPS a 280X should get in CryEngine?

I play Mechwarrior Online and am getting anywhere from 120 down to 40 FPS depending on what is going on in the game. When particles start showing up it slows down noticeably at times, not always but at times. I have V-sync off, and it went up to as high as 120.

Also, at times when I am using Google Chrome I will get lines that show up on the webpage while I am scrolling. They move with the web page as I scroll, so they are not something physically on the screen.

Also, does anyone know what the correct voltage, MHz, etc. should be? I can run a GPU program to pull this and see if anything is off.

Thanks guys.


----------



## mikemykeMB

Quote:


> Originally Posted by *buttface420*
> 
> Just ordered a Sapphire OC 280X! Using it with an E5450 Xeon; how badly will my CPU bottleneck the card?


You'll bottleneck for sure, especially if you OC it... might wanna upgrade that elderly CPU/socket/motherboard combo to get the most out of the upcoming delivery.


----------



## JaredLaskey82

One block installed. Can't wait for the rest of the parts to arrive for my other card and the CPU.

Though I'm just running one card at the moment and missing my Crossfire.


----------



## Recr3ational

Quote:


> Originally Posted by *HBizzle*
> 
> Couple of questions for folks in this thread. As a reference I have a Powercolor 280X OC edition.
> 
> Does anyone know what type of avg FPS a 280X should get on Cryengine?
> 
> I play Mechwarrior Online and am getting anywhere from 120 to 40 FPS depending on what is going on in the game. When particles start showing up it slows down noticeably at times, not always but at times. Have v-sync off and it went up to as high as 120.
> 
> Also at times if I am using google chrome I will get these lines in chrome that show up on the webpage when I am scrolling. They move with the web page as I scroll, so they are not something on the screen.
> 
> Also does anyone know what the correct voltage, mhz, etc... should be? I can run a gpu prog to pull this and see what it produces if anything is off.
> 
> Thanks guys.


I get the lines sometimes too;
I think it's something to do with PowerPlay. Also, there are no "correct" voltages and clocks; you'll have to find that out yourself, as cards vary.

With CryEngine, when I tried Crysis 3 I was getting over 60 fps with triple screens and Crossfire. Again, our rigs are different, so you'll have to mess around with clocks, voltages, etc.

The way I test my cards is to set the voltage to something I'm happy with (temps etc.), then raise the clocks until it crashes.

Though mine is under water, so I don't have to worry about the temps as much.


----------



## Destrto

Put me in as a new owner of one of these bad boys! Gigabyte R9 270X 4GB Windforce.


Running 1100/1400 currently.


----------



## Devildog83

Quote:


> Originally Posted by *Destrto*
> 
> Put me in as a new owner of one of these bad boys! Gigabyte R9 270X 4GB Windforce.
> 
> 
> Running 1100/1400 currently.


OP has been updated - could we see what this will do in Valley and maybe 3DMark11?


----------



## Destrto

Quote:


> Originally Posted by *Devildog83*
> 
> OP has been updated - could we see what this will do in Valley and maybe 3DMark11?


I'll get started on downloading them and post results.


----------



## neurotix

Quote:


> Originally Posted by *Destrto*
> 
> I'll get started on downloading them and post results.


Yeah, you need to rerun Valley on the "ExtremeHD" preset. No way a 270X is going to get more than 40 fps.


----------



## Destrto

Quote:


> Originally Posted by *neurotix*
> 
> Yeah, you need to rerun Valley on the "ExtremeHD" preset. No way a 270X is going to get more than 40 fps.


----------



## DarthBaggins

Quote:


> Originally Posted by *neurotix*
> 
> Yeah, you need to rerun Valley on the "ExtremeHD" preset. No way a 270X is going to get more than 40 fps.


A single card should push 40.


----------



## Majentrix

The Gigabyte 270 has 6+6 connectors, so not all of them come with only one plug.


----------



## cyph3rz

Well, here's my benchmark results. My card is the Gigabyte R9 280X REV1.0.


----------



## neurotix

Quote:


> Originally Posted by *cyph3rz*
> 
> Well, here's my benchmark results. My card is the Gigabyte R9 280X REV1.0.


Yeah, you're not running Extreme HD either, but anyway your card should be pushing close to 60 fps at 1600x900, easy. If your card isn't overclocked, that might be why, but you should be able to do 55+ fps easily even at 1920x1080 when overclocked.


----------



## JaredLaskey82

Thought I would throw my 280X cards into the mix.

Won't be able to update for a while, as I'm building my loop. Should be back up and running in two weeks after the parts arrive.


----------



## creationsh

Here is mine, at 50 fps with the Diamond 280X. I don't think it can go any faster without blowing up; 60 fps may be asking too much. Could we be talking about the same benchmark (Heaven)?


----------



## End3R

Quote:


> Originally Posted by *neurotix*
> 
> Yeah, you need to rerun Valley on the "ExtremeHD" preset. No way a 270X is going to get more than 40 fps.


On Extreme HD with AA at 8x you would be right; keep in mind that in CCC I have Edge-Detect and Morphological Filtering on, and Texture Filtering Quality set to High Quality. Setting AA to 2x drastically improved performance and honestly didn't look any worse. This was using the "ExtremeHD" preset; it just says Custom in both because my monitor is 1680x1050 and can't run 1920x1080. My card is stock at 1050/1400, and my CPU is also stock at 3.5 GHz (rig in my sig).


----------



## Arkanon

Quote:


> Originally Posted by *neurotix*
> 
> Yeah, you need to rerun Valley on the "ExtremeHD" preset. No way a 270X is going to get more than 40 fps.


Really?


----------



## cyph3rz

And here are the results with my card slightly overclocked. I can't push my card to 1215 MHz GPU clock and 1620 MHz memory clock according to Techpowerup.


----------



## repo_man

Is anyone making a full coverage block for the 270Xs yet?


----------



## neurotix

A 270X can definitely do more than 40 fps, just to clarify, but that's at very high clocks that won't be stable for most cards. Most of these cards will end up right around the 1200 MHz mark, and that should get you above 35 fps on Extreme HD. To pass 40 fps you need: 1) a fast Intel system, and 2) 1250 MHz or more on the core clock.

I've done more than 40 fps myself, and that was with an FX-8350. I believe my card was at 1300 MHz for that run, so it's possible. It's just unrealistic to use those same clocks for gaming (any other game or benchmark would crash immediately at 1300 MHz).
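The clock-scaling argument above can be sketched numerically. This assumes fps scales linearly with core clock, which is a simplification (memory bandwidth and the CPU cap it at the extremes); the ~35 fps at ~1200 MHz baseline is taken from the post, and everything else is extrapolation.

```python
# Naive linear fps-vs-core-clock estimate for Valley Extreme HD on a
# 270X, anchored to the ~35 fps @ ~1200 MHz figure quoted above.

def estimate_fps(base_fps: float, base_mhz: float, target_mhz: float) -> float:
    """Linearly scale fps with core clock (a rough first-order model)."""
    return base_fps * target_mhz / base_mhz

baseline_fps = 35.0  # reported at roughly 1200 MHz

for clock in (1200, 1250, 1300):
    print(f"{clock} MHz -> ~{estimate_fps(baseline_fps, 1200, clock):.1f} fps")
```

Notably, the linear model only predicts about 37.9 fps at 1300 MHz, short of the 40+ reported; that shortfall is consistent with the point that the platform (a fast Intel system) contributes beyond the core clock alone.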

End3R, as per the rules of the Valley thread, you're supposed to use 8x AA. Your other score doesn't matter.

Also, for benching AMD cards in Valley, you want to have your settings looking like this:



These tweaks are allowed for Valley, but they're also allowed for any 3D bench on HWBOT (3dmark etc) for AMD cards.

Make your settings look like mine and run Valley again and you should see an increase in score.

Additionally, these settings are good for gaming. I can't tell the difference between tessellation on or off, I hate Vsync (I want as many frames as humanly possible), I don't need anisotropic filtering, and so on. You should notice a good speedup in most games with these settings, and they still allow AA if you want it (I usually just use FXAA because it's fast, looks decent, and has next to no performance hit). These settings will especially help midrange cards like the 270X, 7870, and 7870 XT, particularly in newer titles.


----------



## Arkanon

My run was done with everything in CCC at stock settings, tessellation on and blablabla, you get the point.
Valley isn't that CPU dependent, although I do imagine overclocked high-end Intel/AMD CPUs score a tiny bit higher.
The card was clocked at 1360/1500 (270X, duh), which for the record is absolutely stable at those speeds in any game as well. The only downside is that you have to push a fair amount of voltage; I need to run 1.412 V through the card for it to be fully stable at those clocks, btw.


----------



## mikemykeMB

Quote:


> Originally Posted by *Arkanon*
> 
> My run was done with everything in CCC at stock settings, tessellation on and blablabla, you get the point.
> Valley isn't that CPU dependent, although I do imagine overclocked high-end Intel/AMD CPUs score a tiny bit higher.
> The card was clocked at 1360/1500 (270X, duh), which for the record is absolutely stable at those speeds in any game as well. The only downside is that you have to push a fair amount of voltage; I need to run 1.412 V through the card for it to be fully stable at those clocks, btw.


That's a high core OC for it... what are your temps?


----------



## Roboyto

Quote:


> Originally Posted by *repo_man*
> 
> Is anyone making a full coverage block for the 270Xs yet?


Your best bet is to check Coolingconfigurator.com to see what EK has to offer. When I purchased my DC2 7870 that was the only block they had at the time.


----------



## mikemykeMB

Quote:


> Originally Posted by *Roboyto*
> 
> Your best bet is to check Coolingconfigurator.com to see what EK has to offer. When I purchased my DC2 7870 that was the only block they had at the time.


Well Rob, aren't ya gonna upsell the questioner? Go for the 280X = FC block. LOL


----------



## Roboyto

Quote:


> Originally Posted by *mikemykeMB*
> 
> Well Rob, aren't ya gonna up sale the questione'er? Go for the 280x = FC block. LOL


The 280X would be the better choice for block/backplate availability from numerous manufacturers. Not to mention the graphical horsepower, and that extra GB of VRAM is really going to be put to use with new games that are coming out. I read somewhere that Titanfall can gobble up 3.6GB with texture settings maxed out; not sure what resolution that was, however.


----------



## mikemykeMB

Quote:


> Originally Posted by *Roboyto*
> 
> 280X would be the better choice for block/backplate availability from numerous manufacturers. Not to mention graphical horsepower and that extra GB of VRAM is really going to be put to use with new games that are coming out. I read somewhere that Titanfall can gobble up 3.6GB with their texture settings maxed out; not sure what resolution that was however.


Yup, got sold on the Toxic, waiting for block---patience


----------



## BulletSponge

Can anyone recommend a BIOS to flash to a Diamond R9-280X that will unlock voltage? My XFX DD R9-280X will go up to 1.275 V, whereas the Diamond 280X is locked at 1.2 V. I've seen a customer review on Newegg stating that an HIS BIOS would unlock it, but I'd prefer some OCN input on the matter.


----------



## repo_man

Quote:


> Originally Posted by *Roboyto*
> 
> Your best bet is to check Coolingconfigurator.com to see what EK has to offer. When I purchased my DC2 7870 that was the only block they had at the time.


No full block listed. Meh. Uni block it will probably be then. Thanks for the link!


----------



## mikemykeMB

Quote:


> Originally Posted by *repo_man*
> 
> No full block listed. Meh. Uni block it will probably be then. Thanks for the link!


Just as an FYI: I was contemplating a Koolance GPU-220, but instead put an XSPC Rasa on an XFX 270X... does right well; also added one of the OE fans on top.


----------



## repo_man

Quote:


> Originally Posted by *mikemykeMB*
> 
> Just as an FYI: I was contemplating a Koolance GPU-220, but instead put an XSPC Rasa on an XFX 270X... does right well; also added one of the OE fans on top.


Thanks man! I'm thinking I'll probably end up going with a supremacy block to match my CPU block. You need some ram sinks on the memory!


----------



## mikemykeMB

Quote:


> Originally Posted by *repo_man*
> 
> Thanks man! I'm thinking I'll probably end up going with a supremacy block to match my CPU block. You need some ram sinks on the memory!


Yeah... nah. Have some lying around; just too lazy to cut them out and find some adhesive. Seems to be running top notch, and the fan helps some.


----------



## repo_man

Quote:


> Originally Posted by *mikemykeMB*
> 
> Yeah... nah. Have some lying around; just too lazy to cut them out and find some adhesive. Seems to be running top notch, and the fan helps some.


At least you're honest.







I've got some thermal two sided tape if you'd like. Hint hint, nudge nudge. hahahah


----------



## mikemykeMB

Quote:


> Originally Posted by *repo_man*
> 
> At least you're honest.
> 
> 
> 
> 
> 
> 
> 
> I've got some thermal two sided tape if you'd like. Hint hint, nudge nudge. hahahah


Man... if those QDs were installed, I wouldn't be having to drain the loop. That's the lazy liquid-draining blues.


----------



## repo_man

LOL, I hear that! Draining the loop is NEVER fun.


----------



## mikemykeMB

Quote:


> Originally Posted by *repo_man*
> 
> LOL, I hear that! Draining the loop is NEVER fun.


Ohhh... draining is fast and easy; it's the FILL part, with hiccups and bubbles to get rid of. I use a bicycle pump to burp it, but can never get them all.


----------



## repo_man

Quote:


> Originally Posted by *mikemykeMB*
> 
> Ohhh... draining is fast and easy; it's the FILL part, with hiccups and bubbles to get rid of. I use a bicycle pump to burp it, but can never get them all.


I have a reservoir pump top, so bleeding it of bubbles is the easier part for me, lol.


----------



## mikemykeMB

Quote:


> Originally Posted by *repo_man*
> 
> I have a reservoir pump top, so bleeding it of bubbles is the easier part for me, lol.


Same here: D5 / 270 Photon tube. It's just that my loop is weird... I think filling the upper 360 is the obstacle; after that it fills alright.


----------



## Arkanon

Quote:


> Originally Posted by *mikemykeMB*
> 
> That's a high core OC for it... what are your temps?


With colder outdoor temperatures the card maxed out at 76°C. Now that the temperatures outside are picking up it hovers around 80-85°C, which is still fairly good given the voltage pushed through the card.


----------



## Kokumotsu

Yeah, I just got a second Sapphire Dual-X 280X OC.
Will post some screenshots and update my performance tab (coming from a 7870 Hawk).


----------



## creationsh

Quote:


> Originally Posted by *repo_man*
> 
> I have a reservoir pump top, so bleeding it of bubbles is the easier part for me, lol.


Only for people who have all the time in the world. (Looks at your avatar).... "Oh."


----------



## Devildog83

Quote:


> Originally Posted by *Kokumotsu*
> 
> Yeah, I just got a second Sapphire Dual-X 280X OC.
> Will post some screenshots and update my performance tab (coming from a 7870 Hawk).


Looking forward to seeing it, or them.


----------



## Nfsdude0125

Hey guys, can you add me? Have pics of my R9 270X in my profile







Thanks.


----------



## Devildog83

Quote:


> Originally Posted by *Nfsdude0125*
> 
> Hey guys, can you add me? Have pics of my R9 270X in my profile
> 
> 
> 
> 
> 
> 
> 
> Thanks.


I will add you, but I see no pics at all in your profile. Could you take at least a pic with your phone and upload it? Anything. Are you running at stock clocks?


----------



## IVWolfeVI

MSI R9 270x 4gb Stock Clocks

amd.gif 23k .gif file


----------



## Devildog83

Quote:


> Originally Posted by *IVWolfeVI*
> 
> MSI R9 270x 4gb Stock Clocks
> 
> amd.gif 23k .gif file


Nice - do you have a pic of the card?


----------



## JaredLaskey82

Hi there guys.

I now am the proud owner of two EK FC R9 280X DCU2 water blocks for my cards.
My question is whether the 'EK FC Terminal Dual Serial' bridge is compatible with them. I can't find it on the EK site at all.

Just want to be sure before making my order online of my loop parts.


----------



## madcratebuilder

Quote:


> Originally Posted by *JaredLaskey82*
> 
> Hi there guys.
> 
> I now am the proud owner of two EK FC R9 280X DCU2 water blocks for my cards.
> My question is if the 'EK FC Terminal Dual Serial' bridge is compatible with them. I cant find it on the EK site at all.
> 
> Just want to be sure before making my order online of my loop parts.


I'm interested in this also. I should have my second EK block for my ASUS 280X in a few weeks. The GPUs are going to be in slot 1 and slot 4 (CHVF-Z). Looking for a clean, simple serial connection.


----------



## Devildog83

Quote:


> Originally Posted by *madcratebuilder*
> 
> I'm interested in this also. I should have my second EK block for my asus 280x in a few weeks. GPU's are going to be in slot 1 and slot 4 (chvf-z). Looking for a clean, simple serial connection.


Plus the 1st and 3rd PCI-E slots are 16x when CrossFired on a CHVF-Z.


----------



## madcratebuilder

Quote:


> Originally Posted by *Devildog83*
> 
> Plus the 1st and 3rd PCI-E slots are 16X when X-fired on a CHVFZ.


CHVF-Z manual, page 1-21: slot 3 is x8; slot 1 is x16; slot 5 is x16/x8.

I've only used slots 1 and 5. The second card is still on air; it would be an easy swap to see what I get.

I've used CrossFire on two motherboards with three different series of cards, and this 280X setup is the best performer I've had, that is, with the 14.4 drivers.


----------



## Devildog83

Quote:


> Originally Posted by *madcratebuilder*
> 
> CHVF-Z manual, page 1-21. Slot 3 x8. Slot 1 x16, slot 5 x16/x8.
> 
> I've only used slot 1 and 5. Second card is still on air, it would be a easy swap to see what i get.
> 
> I've used xfire on 2 MB's with 3 different series of cards, this 280x setup is the best performer I've had, that is with the 14.4 drivers.


I said PCI-E slots, not all slots. There are only 4 PCI-E slots, the 1st and 3rd are 16x. I don't count the 1x slots.


----------



## madcratebuilder

Quote:


> Originally Posted by *Devildog83*
> 
> I said PCI-E slots, not all slots. There are only 4 PCI-E slots, the 1st and 3rd are 16x. I don't count the 1x slots.


I see what you are saying. Technically all 6 slots are PCIe 2.0; the second and fourth are just x1.

ASUS CHVF-Z specs are:

Slot 1: PCIe 2.0 x16
Slot 2: PCIe 2.0 x1
Slot 3: PCIe 2.0 x8
Slot 4: PCIe 2.0 x1
Slot 5: PCIe 2.0 x16/x8
Slot 6: PCIe 2.0 x4


----------



## Ashura

Hey guys, I own a sapphire R9 280x dual x OC edition.

http://www.techpowerup.com/gpuz/cmu22/

Should I update the BIOS to 015.042 ?
http://www.techpowerup.com/vgabios/index.php?manufacturer=Sapphire&model=R9+280X

Is it recommended to update the BIOS? Will it help with OCing?

Also, CPU-Z shows this card as a 7970. (I know my card is basically the same as a 7970; just letting you guys know in case it tells you something.)


----------



## Kokumotsu

Guys, what's the stock voltage for the Sapphire Dual-X 280X OC (00363BF4L) version? Mine is giving 1018, but I got it used and I get driver crashes in games unless I add voltage.

Also, since it's used: I just stress-tested and my VRM temps are high at stock. What can I do to lower the temps? I have a fan pointing at my GPU.


----------



## imran27

Hi guys, I'm thinking of buying a 270X. Can you recommend which card is most overclockable and is guaranteed to have Hynix memory?


----------



## Farih

Quote:


> Originally Posted by *imran27*
> 
> Hi guys, I'm thinking of buying a 270X. Can you recommend which card is most overclockable and is guaranteed to have Hynix memory?


The Sapphire 270X Toxic always comes with Hynix, I think, and they overclock OK too.


----------



## imran27

What about the Vapor-X and Dual-X? If they are guaranteed to be Hynix then I might get the Dual-X 4GB version; ultra quality plus a lot of AA uses quite a lot of VRAM...


----------



## Farih

Quote:


> Originally Posted by *imran27*
> 
> What about Vapor-X and Dual-X...if they are guaranteed to be Hynix then I might get the Dual-X 4GB version, ultra qual+too much AA uses quite a lot of VRAM...


No idea with those cards, sorry. Just remember 2GB of VRAM is enough at 1080p for 95% of games.


----------



## imran27

I read on a forum that Sapphire has changed to Elpida in their recent batches of Toxic cards. It specifically mentioned the 280X; no idea if it's really true, or whether it applies to all Toxic cards. I hope it's not.

Otherwise, the Sapphire Toxic has always been my no. 1 choice... what is the max OC achieved on the Toxic 270X (in this thread at least, on the stock Tri-X cooler)?

Thanks Farih, thanks a lot...


----------



## dabby91

Just out of interest, is there any way to lower the idle fan speed? I tried turning it down in Afterburner and CCC, but it seems to stay at a minimum of 20%.

The DC2 cooler is currently the noisiest thing in my case.


----------



## Kokumotsu

Quote:


> Originally Posted by *Kokumotsu*
> 
> Guys, what's the stock voltage for the Sapphire Dual-X 280X OC (00363BF4L) version? Mine is giving 1018, but I got it used and I get driver crashes in games unless I add voltage.
> 
> Also, since it's used: I just stress-tested and my VRM temps are high at stock. What can I do to lower the temps? I have a fan pointing at my GPU.


That feeling when my post is SKIPPED =(


----------



## Ashura

Quote:


> Originally Posted by *Kokumotsu*
> 
> That feeling when my post is SKIPPED =(










me too









The stock voltage is 1.225 V for my Dual-X @ 1020/1500 (TriXX).


----------



## Kokumotsu

Quote:


> Originally Posted by *Ashura*
> 
> 
> 
> 
> 
> 
> 
> 
> me too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The stock voltage is 1.225 for my dual X @1020 /1500(Trixx)


Is this the new one (with the grey in the middle of the card) or the old?
Also, how are your VRM temps?
I just updated the BIOS to 15.042. It helped slightly, but the VRMs are still high:
VRM1 being 66-70
VRM2 being the same


----------



## nvidiageek

Hi guys, I built an all-AMD rig a few months ago: FX 8320 @ 3.7 GHz, 16GB 1833 MHz G.Skill RipjawsX RAM, ASUS M5A97 R2.0 and XFX R9 280X @ 1.1 GHz with a Corsair TX750 PSU.

I've run the Unigine Heaven 4.0 benchmark and it's giving me a score of 795 with 31.2 fps. That feels low compared to the benchmarks I've seen for other R9 280X cards. I also get 25-40 fps on High, 4x MSAA @ 1080p (DX11) in BF4; Mantle gives me 55-75 fps, no Vsync in either case. And here's the funny thing: on Medium settings in Watch_Dogs, I get 15-20 fps, and while driving it goes down to 10.

I'm on 14.6 and Win 8.1 64-bit. I've always felt like I wasn't getting the performance I expected from this rig. I had one 8 GB stick of RAM and bought another to make it dual-channel. And now I've read that the FX 8320 isn't a good CPU. I had the choice of buying an Intel Core i5 4670, but being an AMD guy for 10 years, I stuck with AMD.









So, has my GPU gone mental? Or is it the CPU? Or is everything fine?


----------



## IVWolfeVI

Quote:


> Originally Posted by *nvidiageek*
> 
> Hi guys, I had built an all AMD rig few months ago. FX 8320 @ 3.7 GHz, 16GB 1833 GHz GSkill RipjawsX RAM, ASUS M5A97 R2.0 and XFX R9 280X @ 1.1 GHz with Corsair TX750 PSU.
> 
> I've run the Unigine Heaven 4.0 benchmark and it's giving me 795 score with 31.2 fps. I feel this is low compared to the benchmarks I've seen for other R9 280X cards. Also I get fps of 25-40 on High, 4x MSAA @ 1080p (DX11) in BF4, Mantle gives me 55-75 fps, no Vsync on both. And here's the funny thing, on Medium settings in Watch_Dogs, I get 15-20 fps and while driving it goes down to 10.
> 
> I'm on 14.6 and Win 8.1 64-bit. I've always felt like I wasn't getting the performance I expected from this rig. I had one stick of 8 GB RAM, bought another to make it dual-channel. And now, I've read that FX 8320 isn't a good CPU. I had the choice of buying Intel Core i5 4670, but being an AMD guy for 10 years, I stuck to it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So, has my GPU gone mental? Or is it the CPU? Or is everything fine?


That seems really low, considering I get 50 to 60 in BF4 on Ultra at 1080p with a 270X 4GB, an i5 4440, and 12 GB of RAM.


----------



## nvidiageek

Quote:


> Originally Posted by *IVWolfeVI*
> 
> That seems really low, considering I get 50 to 60 in BF4 on Ultra at 1080p with a 270X 4GB, an i5 4440, and 12 GB of RAM.


DX11 or Mantle?


----------



## ramsclub

Guess I will be joining. I bought an MSI R9 280 on Sunday for $200.00; Newegg had it 20% off with a $30 mail-in rebate.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nvidiageek*
> 
> DX11 or Mantle?


What drivers did you install?


----------



## nvidiageek

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> What drivers did you install?


14.6. I always use DDU whenever updating the drivers.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nvidiageek*
> 
> Hi guys, I had built an all AMD rig few months ago. FX 8320 @ 3.7 GHz, 16GB 1833 GHz GSkill RipjawsX RAM, ASUS M5A97 R2.0 and XFX R9 280X @ 1.1 GHz with Corsair TX750 PSU.
> 
> I've run the Unigine Heaven 4.0 benchmark and it's giving me 795 score with 31.2 fps. I feel this is low compared to the benchmarks I've seen for other R9 280X cards. Also I get fps of 25-40 on High, 4x MSAA @ 1080p (DX11) in BF4, Mantle gives me 55-75 fps, no Vsync on both. And here's the funny thing, on Medium settings in Watch_Dogs, I get 15-20 fps and while driving it goes down to 10.
> 
> I'm on 14.6 and Win 8.1 64-bit. I've always felt like I wasn't getting the performance I expected from this rig. I had one stick of 8 GB RAM, bought another to make it dual-channel. And now, I've read that FX 8320 isn't a good CPU. I had the choice of buying Intel Core i5 4670, but being an AMD guy for 10 years, I stuck to it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So, has my GPU gone mental? Or is it the CPU? Or is everything fine?


I am downloading Heaven now to compare, as I have pretty much the same setup. Overclock that 8320, man; you will see a big difference. Stock for stock, an 8350 vs. a 3570K is very close, so your chip is not bad; you just need to know how to get it there. Put your rig in your sig, and also: what cooler do you have for the CPU?


----------



## nvidiageek

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I am downloading heaven now to compare. as I have pretty much the same setup.. Overclock that 8320 man.. you will see a big difference. and stock for stock and 8350 vs an 3750k is very close, so your chip is not bad just need to know how to get it there.. Put your rig in the sig and also what cooler do you have for the CPU


Yeah, that's what I thought. I'm currently using the stock cooler; I was going for the Noctua NH-D14. I just wanted to make sure my GPU or CPU isn't borked.

Please do share your results.


----------



## F3ERS 2 ASH3S

For everyone else wanting the Beta 14.6
http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx?cmpid=social24760556
Quote:


> Originally Posted by *nvidiageek*
> 
> Yeah, that's what I thought. I'm currently using stock cooler, I was going for Noctua DH-14(?) I just wanted to make sure my GPU or CPU isn't borked.
> 
> Please do share your results.


If you go for the Noctua you should be able to hit 4.6-4.8 GHz.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nvidiageek*
> 
> Yeah, that's what I thought. I'm currently using stock cooler, I was going for Noctua DH-14(?) I just wanted to make sure my GPU or CPU isn't borked.
> 
> Please do share your results.


OK, I got an 869 score, but my CPU is @ 5.1 GHz; that was on Extreme at 1080p full screen.


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Ok I got a 869 score but my CPU is @ 5.1Ghz that was on extreme at 1080p full screen


That's with an 8350 @ 5GHz with a 280X? I can't imagine bumping the resolution to 1920x1080 would drop your score that much, because you should be scoring higher than mine. (I'm at stock on both my card and CPU, and running these at the ExtremeHD preset; it just says custom because my monitor is 1680x1050. The higher score is from dropping the AA to 2x.)


Spoiler: Warning: Spoiler!


----------



## nvidiageek

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Ok I got a 869 score but my CPU is @ 5.1Ghz that was on extreme at 1080p full screen


Hmm... so there's no problem with my setup? But the reviews of other R9 280X cards show a score of ~1150. What was your fps?
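For a rough sense of what that ~1150 review score implies, Heaven 4.0's overall score tracks average fps almost linearly; the 795-point / 31.2 fps run quoted above gives a ratio of about 25.5 points per fps. A quick sketch of the arithmetic (the ratio is inferred from that single run in this thread, not an official Unigine formula):

```python
# Estimate average fps from a Unigine Heaven 4.0 score, using the
# score-to-fps ratio implied by the 795 pts @ 31.2 fps run quoted above.
HEAVEN_PTS_PER_FPS = 795 / 31.2  # ~25.5; inferred from this thread, not official

def estimate_fps(score: float) -> float:
    """Rough average-fps estimate for a given Heaven 4.0 score."""
    return score / HEAVEN_PTS_PER_FPS

# The ~1150 review score mentioned above works out to roughly 45 fps average;
# the 869-point run quoted earlier works out to roughly 34 fps.
print(f"{estimate_fps(1150):.1f} fps")
print(f"{estimate_fps(869):.1f} fps")
```

By this rough yardstick, the gap between the 795-point run and the ~1150 review figure is about 14 fps, which is consistent with a CPU bottleneck rather than a broken card.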


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *End3R*
> 
> That's with an 8350 @5ghz with a 280x? I can't imagine bumping the resolution to 1920x1080 would drop your score that much, because you should be scoring higher than mine (I'm at stock on both my card and cpu, and running these at ExtremeHD (just says ultra because my monitor is 1680x1050, the higher score is dropping the AA to 2x)
> 
> 
> Spoiler: Warning: Spoiler!


At that res I score about 100 points higher than you


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> That's with an 8350 @5ghz with a 280x? I can't imagine bumping the resolution to 1920x1080 would drop your score that much, because you should be scoring higher than mine (I'm at stock on both my card and cpu, and running these at ExtremeHD (just says ultra because my monitor is 1680x1050, the higher score is dropping the AA to 2x)
> 
> 
> Spoiler: Warning: Spoiler!


That's Valley, try Heaven.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> That's Valley, try Heaven.


Valley is newer btw.
Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> At that res I score about 100 points higher than you


Ah ok, still I'd expect a higher jump with a better cpu and gpu.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *End3R*
> 
> Valley is newer btw.
> Ah ok, still I'd expect a higher jump with a better cpu and gpu.


Newer, yes; the scores are different though.


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Newer yes scores are different though.


True true, I thought he was running Valley; I'll re-download Heaven to update it.

Not sure if something has changed, but the scores I'm getting are lower than I remember. Oh well. Weird that it's picking my OS up as Windows NT...


Spoiler: Warning: Spoiler!















I also noticed I had all my supersampling etc boosted in CCC when I ran my valley benches, fixing that:

Again this is ExtremeHD preset, it just shows as custom because my monitor can only do 1680x1050. My card is at a stock 1050/1400.


Spoiler: Warning: Spoiler!


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *End3R*
> 
> That's with an 8350 @5ghz with a 280x? I can't imagine bumping the resolution to 1920x1080 would drop your score that much, because you should be scoring higher than mine (I'm at stock on both my card and cpu, and running these at ExtremeHD (just says custom because my monitor is 1680x1050, the higher score is dropping the AA to 2x)
> 
> 
> Spoiler: Warning: Spoiler!


This is the difference that you were talking about


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> This is the difference that you were talking about


huh?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *End3R*
> 
> huh?


The first score was the Valley score that you said I should have done better on... because you didn't realize it was Heaven I benched.


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> True true, I thought he was running valley, I'll re-download heaven to update it
> 
> So not sure if something has changed but the scores I'm getting are lower than I remember, but oh well. Weird that it's picking my OS up as Windows NT...
> 
> 
> Spoiler: Warning: Spoiler!


So, it is the FX-8320. Should've gone with Intel, but I'd rather buy a cooler now and wait for the next series of CPUs and GPUs.


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> the 1st score was the valley score that you said I should have gotten a better score on.... cause you didn't realize it was heaven I benched


Ah ok







Quote:


> Originally Posted by *nvidiageek*
> 
> So, it is FX-8320.


So what is?


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> So what is?


The reason for the low score and bad performance in games.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> Reason for low score and bad performance in games.


Hmm, I haven't had bad performance in any games (unless PhysX is involved)... and the score is only low in Heaven for some reason.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nvidiageek*
> 
> The reason for low score and bad performance in games.


You just need to clock that chip, is all. On the stock cooler you should be able to clock it to 4.2 GHz like nothing.


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> Hmm I haven't had any bad performance in any games (unless physx is concerned).... And the score is only low in heaven for some reason.


Have you tried Watch_Dogs? Or BF4? What fps do you get in BF4 (DX11)@ Ultra?


----------



## nvidiageek

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> You just need to clock that chip is all.. On the stock cooler you should be able to clock it to 4.2 like nothing


Mate, it already reaches 67°C while playing BF4.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nvidiageek*
> 
> Mate, it already reaches 67°C while playing BF4.


That's fine; the max safe temp is 70°C.

Use Mantle with an AMD GPU and you will get steadier frames.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> Have you tried Watch_Dogs? Or BF4? What fps do you get in BF4 (DX11)@ Ultra?


I'm waiting for Watch Dogs to go on sale a month from now due to the terrible launch. And I don't play BF4; I do, however, play other graphics-intensive games. Keep in mind most of these benches were run with supersampling, edge-detect, and morphological filtering turned *ON* in CCC, and all were in DX11 where it was an option. Messing with settings since running the benches in these charts (I ran them a while ago), I've figured out the right settings to keep Assassin's Creed near max with an avg of 50 fps.


Spoiler: Warning: Spoiler!


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> I'm waiting for Watch Dogs to be on sale a month from now due to the terrible launch. And I don't play BF4, I do however play other graphic intensive games. Keep in mind most these benches were run with my supersampling, edge-detect, and morphological filtering turned *ON* in ccc. And were all in Dx11 if it was an option. Messing with settings since running the benches in these charts (since I ran them awhile ago) I've figured out the correct settings to keep Assassins creed on near max with an avg of 50 fps.
> 
> 
> Spoiler: Warning: Spoiler!


Your Thief fps is pretty much what I get, and also ACIV. Shouldn't my card be giving a better performance?


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> Your Thief fps is pretty much what I get, and also ACIV. Shouldn't my card be giving a better performance?


I'd assume so but what are your speeds at? I haven't oc'd mine at all, I'm at 1050/1400.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nvidiageek*
> 
> Your Thief fps is pretty much what I get, and also ACIV. Shouldn't my card be giving a better performance?


Are you playing with an SSD or HDD?

Also, from what I see there, he has modified the settings. I am assuming you are just running them vanilla, correct?


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Also from what I see there, he has modified the settings, I am assuming you are just running them vanilla correct?


Here are my settings for AC IV that get me an avg of 50 fps. Nothing was modified in Thief other than the supersampling/edge-detect/morphological filtering, but those trade performance for quality.










Spoiler: Warning: Spoiler!


----------



## nvidiageek

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Are you playing with an SSD or HDD?
> 
> Also from what I see there, he has modified the settings, I am assuming you are just running them vanilla correct?


I use an HDD, and my clocks are at 1100/1622. And yeah, I haven't made any changes in CCC; I use the in-game settings themselves.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nvidiageek*
> 
> I use HDD, and my clocks are at 1100/1622. And yeah, I haven't done any changes in CCC. I use the settings in-game itself.


A lot of my stuff is on the SSD; I only put games that I don't play much on the HDD. That could also explain the difference in frames: when I added an SSD I saw a 5-10 FPS jump.


----------



## Bumtsiki

Can somebody tell me about the VTX3D brand? I'm choosing a new card (sold my MSI 270) and I have two options: a brand-new VTX3D 280, or a used MSI 280X that was coin-mined on, with warranty. If VTX3D is reasonable and the performance gap to the 280X isn't big, I would like to go with the brand-new card.

Also, the first thing would be a little OC on the 280 to bump it up to 280X level, or near it.


----------



## Recr3ational

Quote:


> Originally Posted by *Bumtsiki*
> 
> Can somebody tell me about VTX3D brand, i´m choosing a new card (sold my msi 270) and i have two options. Brand new VTX3D 280 card or coined msi 280x with warranty (used). If this VTX3D is resonable and the performance between 280x isnt big i would like to go with brand new card.
> 
> Also the first thing is little OC to 280 to bump it up to 280x or near to that.


Get the 280X for sure. I have a coin-mined 280X in my rig; it's perfectly fine. You'd get the warranty too, so you might as well get the better card.


----------



## nvidiageek

So, I've overclocked my CPU to 4 GHz (stock cooler) and it's pretty stable. I've run the Unigine Valley benchmark and this is the result:


Spoiler: Warning: Spoiler!







This was the GPU usage:


Spoiler: Warning: Spoiler!







I've also installed 13.12 drivers, as it was giving me more fps in BF4 than 14.6 beta.

So is everything fine? I feel that the GPU usage is erratic.


----------



## DarthBaggins

Have you tried the 14.2 drivers? It also looks good, though really you should have had a higher frame rate, IMO.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> So, I've overclocked my CPU to 4 GHz (stock cooler) and it's pretty stable. I've run Unigine Valley benchmark and this is the result:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> This was the GPU usage:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I've also installed 13.12 drivers, as it was giving me more fps in BF4 than 14.6 beta.
> 
> So is everything fine? I feel that the GPU usage is erratic.


Wow I didn't think anyone actually still played Warface...

In all seriousness though,
Your bench looks okay to me; mine was slightly better, but it's also at a lower resolution. Still the Extreme HD preset though (as I explained previously, it just says custom because my monitor can't do 1920x1080), so I'd assume your score would be higher than mine if you were at 1680x1050.


Spoiler: Warning: Spoiler!















Let me run it again with gpu-z open to see what my usage looks like....

Edit: Here ya go. The part outlined in green is when I was running the bench, and it was pretty much at 98% the whole time.


Spoiler: Warning: Spoiler!


----------



## DarthBaggins

I'll run it on mine in xfire (7870 with r9 270x) to see.


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> Wow I didn't think anyone actually still played Warface...
> 
> In all seriousness though,
> Your bench looks okay to me, mine was slightly better but it's also at a lower resolution. Still at Extreme HD preset though (as i explained previously it just says custom because my monitor cant do the 1920x1080) so I'd assume your score would be higher than mine if you were at 1680x1050.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let me run it again with gpu-z open to see what my usage looks like....
> 
> Edit: Here ya go, the part outlined in green is when I was running then bench, and it was pretty much at 98% the whole time.
> 
> 
> Spoiler: Warning: Spoiler!


I was on a F2P train for a few days. Tried as many F2P games as possible.







I rarely play it now, it's mostly BF4 and Thief.

So, my GPU usage isn't up to the mark. Is there any possibility that my card might be bad? I cleaned my PC yesterday and ensured all the wires are firmly attached.

I've seen a guy with the same proccy as mine and a 7970 scoring ~2450 with 57.3 fps in the Valley Scores thread. That really bugged me.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> I was on a F2P train for a few days. Tried as many F2P games as possible.
> 
> 
> 
> 
> 
> 
> 
> I rarely play it now, it's mostly BF4 and Thief.
> 
> So, my GPU usage isn't upto the mark. Is there any possibility that my card might be bad? I cleaned my PC yesterday, and ensured all the wires are firmly attached.
> 
> I've seen a guy with same proccy as mine with 7970 scoring ~2450 with 57.3 fps in the Valley Scores thread. That really bugged me.


It's possible, do you have another card you can swap out to test? Then again your card is more powerful than mine so it may not need to be at 98% the whole time.


----------



## Kokumotsu

His 7970 might be overclocked.
My 280X gets 1850-60 and 45 fps stock.

And this is a little off-topic, but I'm selling my HD 7870 Hawk. I don't know how much it's worth (I assume around $150),
but I can't list it in the seller threads because I can never get rep. XD
PM if interested (US 48 only).


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> It's possible, do you have another card you can swap out to test? Then again your card is more powerful than mine so it may not need to be at 98% the whole time.


I don't have any other card. Is there any way to check whether my card is borked or not?

Also, my card's usage goes to zero the moment there's a black screen; like when the scenes change in Heaven, the usage becomes zero. I'm assuming that's an aggressive power-saving feature.


----------



## creationsh

Quote:


> Originally Posted by *nvidiageek*
> 
> I was on a F2P train for a few days. Tried as many F2P games as possible.
> 
> 
> 
> 
> 
> 
> 
> I rarely play it now, it's mostly BF4 and Thief.
> 
> So, my GPU usage isn't upto the mark. Is there any possibility that my card might be bad? I cleaned my PC yesterday, and ensured all the wires are firmly attached.
> 
> I've seen a guy with same proccy as mine with 7970 scoring ~2450 with 57.3 fps in the Valley Scores thread. That really bugged me.


If he's running it in ExtremeHD @ 1080p, then it would puzzle me too. My score is 2083 and 49.8 fps @ 1200MHz/1750.
Edit: Running a 2500K @ 4.5GHz


----------



## nvidiageek

Quote:


> Originally Posted by *creationsh*
> 
> If he's running it in ExtremeHD @ 1080p, then it would puzzle me too. My score is 2083 and 49.8 fps @ 1200MHz/1750.


Yes, one should post the score only if it's Extreme HD @ 1080p; that's the thread rule, it's here on Overclock.net. But I'm puzzled why my GPU isn't even close to 40 fps. What's your CPU, mate?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *nvidiageek*
> 
> Yes, one should post the score only if it's Extreme HD @ 1080p; that's the thread rule, it's here on Overclock.net. But I'm puzzled why my GPU isn't even close to 40 fps. What's your CPU, mate?


How did you remove nvidia drivers? Or is this a fresh os install?


----------



## creationsh

Quote:


> Originally Posted by *nvidiageek*
> 
> Yes, one should post the score only if it's Extreme HD @ 1080p; that's the thread rule, it's here on Overclock.net. But I'm puzzled why my GPU isn't even close to 40 fps. What's your CPU, mate?


I run a 2500K @ 4.5GHz. I ran the test again at stock GPU settings and got 42.6 fps, score 1780. I'm guessing if I ran @ 3.3GHz, I would probably see results similar to yours.


----------



## nvidiageek

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> How did you remove nvidia drivers? Or is this a fresh os install?


I had an HD5850 before I bought the R9 280X, and I always use DDU when installing drivers.


----------



## rdr09

Quote:


> Originally Posted by *nvidiageek*
> 
> Yes, one should post the score only if it's Extreme HD @ 1080p; that's the thread rule, it's here on Overclock.net. But I'm puzzled why my GPU isn't even close to 40 fps. What's your CPU, mate?


it might be an unstable oc. try running Valley at stock. you should get around 39 using the same settings.

if you need to reinstall the driver i suggest these steps . . .

1. Use 13.12 to UNINSTALL the existing driver. Use Express and answer yes to delete all folders associated with the driver. Reboot.

2. Run 13.12 again and this time - INSTALL it. Again, use Express. Reboot.

edit: Let the process finish. It may take a while to get to the desktop. Be patient and just let it finish.

BTW, you can run another bench and compare with others. I recommend 3DMark11.


----------



## GroovyMotion

Greetings, I currently own a ZOTAC GeForce GT 640 2GB and am looking into getting an MSI R9 270X GAMING 2G; it seems to have pretty good reviews. Does anyone have feedback on this particular one?
Also, since I have an NVIDIA card, is it best to reinstall Win7 so that there are no remnants of NVIDIA when I install the AMD card?


----------



## nvidiageek

Wow! I just flashed a new version of the BIOS through AI Suite and went back to stock clocks (just the GPU); now I'm getting 8 fps in Valley. ***?


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> Wow! I just flashed a new version of the BIOS through AI Suite and went back to stock clocks (just the GPU); now I'm getting 8 fps in Valley. ***?


Did you save a backup of your old vBIOS? Something may have gotten messed up; I'd suggest going back. Also, did you already clean/update your drivers?


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> Did you save a backup of your old vBIOS? Something may have gotten messed up; I'd suggest going back. Also, did you already clean/update your drivers?


Reinstalled the graphics driver, now it's back to normal. Phew! Currently running Valley at stock clocks.

EDIT:
At stock clock (GPU):

Valley:


Spoiler: Warning: Spoiler!







GPU Usage during benchmarking:


Spoiler: Warning: Spoiler!







At 1.1 GHz:

Valley:


Spoiler: Warning: Spoiler!







GPU Usage:


Spoiler: Warning: Spoiler!







So OCing does give me some benefit, but the usage bothers me. I don't know why it's not near 100%.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> So OCing does give me some benefit, but the usage bothers me. I don't know why it's not near 100%.


Your voltage looks low, try boosting it a bit.


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> Your voltage looks low, try boosting it a bit.


It's at 1250 in Afterburner. My GPU isn't bad right? No need to worry about RMA and things like that?


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> It's at 1250 in Afterburner. My GPU isn't bad right? No need to worry about RMA and things like that?


Well, according to your GPU-Z, your card is getting 11.97v and 0.846v; my weaker card gets 12.13v and 1.13v (12V and VDDC).


----------



## creationsh

Quote:


> Originally Posted by *End3R*
> 
> Well, according to your GPU-Z, your card is getting 11.97v and 0.846v; my weaker card gets 12.13v and 1.13v (12V and VDDC).


The reading you saw was from idle.


----------



## End3R

Quote:


> Originally Posted by *creationsh*
> 
> The reading you saw was from idle.


Even at idle my 12v doesn't drop below 12.13v, but you're right the vddc might be ok, not sure what it was at during the bench. My VDDC at idle is .873V.


----------



## creationsh

Here is a screenshot from one of my benchmarks. As you can see, mine does hit 11.97v under load.
http://i61.tinypic.com/2cdbfut.png


----------



## End3R

Quote:


> Originally Posted by *creationsh*
> 
> Here is a screenshot from one of my benchmarks. As you can see, mine does hit 11.97v under load.
> http://i61.tinypic.com/2cdbfut.png


Cool, guess my 270x just pulls a bit more there for some reason. But you're also at a nice 1.2 on your VDDC, so I guess the question is: nvidiageek, what is your VDDC at during the bench?


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> Cool guess my 270x just pulls a bit more there for some reason, but you're also at a nice 1.2 on your VDDC, so I guess the question is nvidiageek, what is your VDDC at during the bench?


During benchmarking:


Spoiler: Warning: Spoiler!


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> During benchmarking:
> 
> 
> Spoiler: Warning: Spoiler!


I dunno, I'm just guessing here, but your current looks erratic; maybe it's a power supply issue. Anyone else got any ideas? Whether it's the card or the PSU, it seems to be a power issue.


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> I dunno, I'm just guessing here, but your current looks erratic; maybe it's a power supply issue. Anyone else got any ideas? Whether it's the card or the PSU, it seems to be a power issue.


For the 8-pin connector on my GPU, I've used a two-6-pin-to-one-8-pin adapter, as my PSU has no direct 8-pin connector except for the CPU.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> For the 8-pin connector on my GPU, I've used a two-6-pin-to-one-8-pin adapter, as my PSU has no direct 8-pin connector except for the CPU.


I think we found the culprit.


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> I think we found the culprit.


But there's one 6-pin with another dangling 2-pin thingy; will it suffice?


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> But there's one, 6-pin with another dangling 2-pin thingy, will it suffice?


Yea


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> Yea


Will get back after connecting.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> Will get back after connecting.


good luck


----------



## nvidiageek

No change. :< It actually reduced the fps to 26 in Valley.


Spoiler: Warning: Spoiler!


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> No change. :< It actually reduced the fps to 26 in Valley.
> 
> 
> Spoiler: Warning: Spoiler!


Not sure then, someone else may wanna jump in


----------



## Nfsdude0125

Weird that you couldn't see the images on my profile, but here it is: 270X Dual-X 4GB.


----------



## Devildog83

Quote:


> Originally Posted by *GroovyMotion*
> 
> Greetings, I currently own a ZOTAC GeForce GT 640 2GB and am looking into getting an MSI R9 270X GAMING 2G; it seems to have pretty good reviews. Does anyone have feedback on this particular one?
> Also, since I have an NVIDIA card, is it best to reinstall Win7 so that there are no remnants of NVIDIA when I install the AMD card?


The Gaming should be good; MSI's Hawk and Gaming cards have a good rep. You should see a serious performance boost coming from a 640.


----------



## Devildog83

I am on vacation and sick to boot but if anyone needs to be added or updated please speak up or PM me because I am having trouble concentrating enough to go through all of the posts and catch up. Sorry.


----------



## nvidiageek

I've got a 3DMark 11 score of 3132. That's terribly low, when an R9 280X is capable of getting 10k. I think something is very wrong.

My 3DMark 11 score:


Spoiler: Warning: Spoiler!







Review score of a R9 280X card:


Spoiler: Warning: Spoiler!







However, my VRM temps, GPU load, and VDDC current while running this benchmark are fine. All of these except VRM temps are erratic while running games and Unigine benchmarks.

Is the card broken?


----------



## Devildog83

Quote:


> Originally Posted by *nvidiageek*
> 
> I've got a 3DMark 11 score of 3132. That's terribly low, when R9 280X is capable of getting 10k. I think something is very wrong.
> 
> My 3DMark 11 score:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Review score of a R9 280X card:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> However, my VRM temps, GPU load, and VDDC current while running this benchmark are fine. All of these except VRM temps are erratic while running games and Unigine benchmarks.
> 
> Is the card broken?


The 1st one is 3DMark11 Extreme and the second is Performance. Totally different, that's why.


----------



## nvidiageek

Quote:


> Originally Posted by *Devildog83*
> 
> The 1st one is 3DMark11 Extreme and the second is Performance. Totally different, that's why.


I thought 'X' was a fail grade or something.









This was mine in performance setting.


Spoiler: Warning: Spoiler!







Low Physics score, maybe because of the CPU, so can I rule out that my GPU is broken?


----------



## neurotix

Quote:


> Originally Posted by *nvidiageek*
> 
> I've got a 3DMark 11 score of 3132. That's terribly low, when R9 280X is capable of getting 10k. I think something is very wrong.
> 
> My 3DMark 11 score:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Review score of a R9 280X card:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> However, my VRM temps, GPU load, and VDDC current while running this benchmark are fine. All of these except VRM temps are erratic while running games and Unigine benchmarks.
> 
> Is the card broken?


Uh, your 3132 score is on the "Extreme" preset. The review score is on the "Performance" preset. That's why your score is low.







Rerun the benchmark on "performance" and you'll get a better score.

Also, your Valley scores are pathetically low for a R9 280X. Either something is seriously wrong with your PC, something is seriously wrong with your Windows install, you have a bad card, or some combination of all those.


----------



## nvidiageek

Quote:


> Originally Posted by *neurotix*
> 
> Uh, your 3132 score is on the "Extreme" preset. The review score is on the "performance" preset. That's why your score is low
> 
> 
> 
> 
> 
> 
> 
> Rerun the benchmark on "performance" and you'll get a better score.
> 
> Also, your Valley scores are pathetically low for a R9 280X. Either something is seriously wrong with your PC, something is seriously wrong with your Windows install, you have a bad card, or some combination of all those.


Look at my Performance-setting score above your post. I don't mind the low CPU score, but I want to know whether my GPU has gone wrong.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> Look at my Performance-setting score above your post. I don't mind the low CPU score, but I want to know whether my GPU has gone wrong.


I'm downloading the demo for 3DMark 11 right now; these were my benches for just "3DMark", which I *think* is more recent. Physics always screws over ATI









Spoiler: Warning: Spoiler!


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> I'm downloading the demo for 3dmark11 right now, these were my benches for just "3dmark" which i "think" is more recent. Physics always screw over ATI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


I just want to say thanks to you and all the others for getting involved in my problem. I appreciate it a lot.









EDIT: Just ran the Metro: LL benchmark and I got 18 fps, compared to 50 fps in most reviews. The CPU and GPU combo isn't working very well, I'm guessing.


----------



## End3R

Quote:


> Originally Posted by *nvidiageek*
> 
> I just want to say thanks for you and all the others for the involvement in my problem. Appreciate it a lot.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Just ran Metro: LL benchmark and I got 18 fps compared to 50 fps in most reviews. CPU and GPU combo isn't working very well I'm guessing.


I dunno, 3DMark 11 doesn't seem to like my card as much as 3DMark, but my score is lower than yours, so it seems to be getting the right scores. Physics just kills ATIs









I'm at stock 1050/1400


Spoiler: Warning: Spoiler!


----------



## nvidiageek

Quote:


> Originally Posted by *End3R*
> 
> I dunno, 3DMark 11 doesn't seem to like my card as much as 3DMark, but my score is lower than yours, so it seems to be getting the right scores. Physics just kills ATIs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm at stock 1050/1400
> 
> 
> Spoiler: Warning: Spoiler!


Maybe because of your GPU, but your game benchmarks are great for your card. I'm sure the low Physics score is because of the CPU.

My GPU and CPU combo isn't working well. I'm getting a pathetic 18 fps in the Metro: LL benchmark, when others with an R9 280X are getting close to 50.

EDIT: So I finally found the culprit: it was a setting in the BIOS. I followed the AMD Bulldozer OCing thread (however, my board doesn't have DIGI+ settings), set the AI Tweaker to Manual, and bam, an avg fps of 43, up from 18, in Metro: LL. However, my CPU wasn't downclocking on its own; it was always running at full speed.

I just need to solve this last puzzle of downclocking the CPU while overclocked, and my PC is back to its glory. The GPU isn't borked, nor the PSU, and the CPU is just fine.

So I only need help with downclocking the CPU when it's OC'd, which isn't happening currently.


----------



## rdr09

Quote:


> Originally Posted by *nvidiageek*
> 
> Maybe because of your GPU, but your game benchmarks are great for your card. I'm sure the low Physics score is because of the CPU.
> 
> My GPU and CPU combo isn't working well. I'm getting a pathetic 18 fps in the Metro: LL benchmark, when others with an R9 280X are getting close to 50.
> 
> EDIT: So I finally found the culprit: it was a setting in the BIOS. I followed the AMD Bulldozer OCing thread (however, my board doesn't have DIGI+ settings), set the AI Tweaker to Manual, and bam, an avg fps of 43, up from 18, in Metro: LL. However, my CPU wasn't downclocking on its own; it was always running at full speed.
> 
> I just need to solve this last puzzle of downclocking the CPU while overclocked, and my PC is back to its glory. The GPU isn't borked, nor the PSU, and the CPU is just fine.
> 
> So I only need help with *downclocking the CPU when it's OC'd*, which isn't happening currently.


have you tried enabling Cool'n'Quiet in the BIOS? that, and set Power Options in Windows to Balanced.

Setting these might keep the cpu from pushing the gpu in some games.


----------



## nvidiageek

Quote:


> Originally Posted by *rdr09*
> 
> have you tried enabling Cool & Quiet in BIOS? that and set Power options in Windows to Balance.
> 
> Setting these might keep the cpu from pushing the gpu in some games.


Yes, the power is set to Balanced in Windows. But after I updated the BIOS to 2501, there are no options for Cool'n'Quiet, C1E, or SVM. I'm assuming they'll always be turned on.


----------



## rdr09

Quote:


> Originally Posted by *nvidiageek*
> 
> Yes, the power is set to Balanced in Windows. But after I updated the BIOS to 2501, there are no options for Cool'n'Quiet, C1E, or SVM. I'm assuming they'll always be turned on.


wth? that's messed up. you checked under cpu configuration?


----------



## nvidiageek

Quote:


> Originally Posted by *rdr09*
> 
> wth? that's messed up. you checked under cpu configurations?


Yes, I did check in the CPU config - only those three settings.


----------



## rdr09

Quote:


> Originally Posted by *nvidiageek*
> 
> Yes, I did check it in CPU config, only those three settings.


Can't help you, unless you flash it back to the old BIOS, if that was not giving you issues . . . that's prolly the only way, if you really want the feature. Normally, OC'ers want the CPU at full bore; I never use it on either of my AMD and Intel rigs.


----------



## nvidiageek

Quote:


> Originally Posted by *rdr09*
> 
> Can't help you, unless you flash it back to the old BIOS, if that was not giving you issues . . . that's prolly the only way, if you really want the feature. Normally, OC'ers want the CPU at full bore; I never use it on either of my AMD and Intel rigs.


Okay then, I will flash back to the old BIOS.

EDIT: As soon as I set the CPU & NB voltage to manual, the Cool'n'Quiet, C1E, and SVM options vanish from the CPU config. It's happening with all versions of the BIOS.


----------



## GroovyMotion

Quote:


> Originally Posted by *Devildog83*
> 
> The Gaming should be good, MSI's Hawk and Gaming cards have good rep. You should see a serious performance boost from a 640.


Thanks! Yeah that 640 was a temporary solution but with the new rig it doesn't cut it anymore!


----------



## Farih

Traded my 270X Toxic for a 280X Toxic.

It can only overclock to 1180 MHz with 1.256V.
Did I just get a lemon?


----------



## Devildog83

Quote:


> Originally Posted by *Farih*
> 
> Traded my 270X Toxic for a 280X Toxic.
> 
> It can only overclock to 1180 MHz with 1.256V.
> Did I just get a lemon?


You should be able to bump the core up to at least 1.3v and get a lot higher. The memory goes much higher than the 270X, right?


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> I dunno, 3DMark 11 doesn't seem to like my card as much as 3DMark, but my score is lower than yours, so it seems to be getting the right scores. Physics just kills ATIs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm at stock 1050/1400
> 
> 
> Spoiler: Warning: Spoiler!


The Physics score is not your GPU, it's the CPU; Futuremark benches are not geared toward AMD CPUs, so your Physics and Combined scores will be low. If you're just worried about the card's performance, only pay attention to the Graphics score, or use another bench like Heaven or Valley.

You will notice very high Graphics here, but the Physics and Combined will not compete with an Intel on this bench -


----------



## JaredLaskey82

Hey there guys.
As you can see, I got bored over the weekend and decided to freshen up my case before my cooling parts arrive for my two 280X cards and CPU.


----------



## AmazingSchlong

So I've seen people do a mod with the Titan/780 cooler where they remove the green layer on the GeForce GTX logo so that the LED is white, and I was wondering if it's possible to do with the Sapphire Toxic coolers? I want to do an orange-and-black build with my 270x and think it would look really nice with the Sapphire logo shining white, especially alongside a G710+.


----------



## Roboyto

Quote:


> Originally Posted by *AmazingSchlong*
> 
> So I've seen people do a mod with the Titan/780 cooler where they remove the green layer on the Geforce GTX logo so that the led is white, and was wondering if it's possible to do with the Sapphire Toxic coolers? I want to do an orange and black build with my 270x and think it would look really nice with the Sapphire logo shining white, especially along side a G710+.


It looks like the card lights up a yellow/orange color. There doesn't appear to be a filter giving it this color that you could remove. If it's like my Vapor-X 4890, then your best bet would be to change the LED to the color of your choice.


----------



## Farih

Quote:


> Originally Posted by *Devildog83*
> 
> You should be able to bump the core up to at least 1.3v and get a lot higher. The memory goes much higher than the 270x right?


When I set 1.3V in TriXX or MSI AB, it only goes to 1.256V.
I had the memory at 1700 MHz; it was 3DMark stable but crashed in every game instantly.... also with 1.6V on the memory.

Haven't changed the BIOS yet, but it already feels like a lemon.


----------



## Roboyto

Quote:


> Originally Posted by *nvidiageek*
> 
> Yes, I did check it in CPU config, only those three settings.


I had the same issue with my i7 with it not downclocking.

*Power Options > Change Plan Settings > Advanced Power Settings > Processor Power Management > Minimum Processor State*



Not sure if this is the answer, but it's worth checking.
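For anyone who prefers the command line, that same setting can also be inspected and changed with Windows' built-in `powercfg` tool from an elevated command prompt. This is only a sketch, not part of anyone's posted fix; the `5` below is an example percentage for "Minimum processor state", not a recommended value.

```shell
:: List the processor power settings of the currently active plan.
:: The "Minimum processor state" entry (alias PROCTHROTTLEMIN) is the
:: value that controls how far the CPU is allowed to downclock.
powercfg /query SCHEME_CURRENT SUB_PROCESSOR

:: Set the minimum processor state to 5% on AC power (example value),
:: then re-apply the active scheme so the change takes effect.
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 5
powercfg /setactive SCHEME_CURRENT
```

A minimum state of 100% pins the CPU at full clocks even when idle, which is exactly the "not downclocking" symptom described above.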


----------



## creationsh

Quote:


> Originally Posted by *nvidiageek*
> 
> It's at 1250 in Afterburner. My GPU isn't bad right? No need to worry about RMA and things like that?


I think your BIOS settings for the motherboard are jacked up. Your GPU was fine from the start; however, looking at your graphs, it does look like your CPU is affecting your overall performance. It is kind of tough to make a careful assessment without looking at your CPU and GPU performance/temperature/voltage/BIOS settings all together. I do believe that your computer can perform better. I have a feeling that a setting is configured to keep the temperature below a certain degree. Maybe I'm wrong? There needs to be a reason why your graphics performance is spiking. If you are still using the stock cooler for the CPU, then it needs to go immediately. Get a decent CPU cooler and start over (default everything first).

Edit: Is Cool'n'Quiet enabled!?


----------



## nvidiageek

Quote:


> Originally Posted by *Roboyto*
> 
> I had the same issue with my i7 with it not downclocking.
> 
> *Power Options > Change Plan Settings > Advanced Power Settings > Processor Power Management > Minimum Processor State*
> 
> 
> 
> 
> Not sure if this is the answer, but it's worth checking.


I've been a fool all this time: the Power Option was on Power Saver mode. My ignorance fooled me; I thought I had selected Balanced mode, but that wasn't the case. Now everything is as it needs to be: the CPU's stable @ 4 GHz and downclocks properly. Here's the Valley benchmark:


Spoiler: Warning: Spoiler!







Thank you all for getting involved.


----------



## creationsh

Quote:


> Originally Posted by *nvidiageek*
> 
> Thank you all for getting involved.










Hooray!


----------



## AmazingSchlong

Coolio, I think I'd rather do that anyway. Might post pictures if/when it's done.


----------



## mick2d2

Hello

I'm not yet an owner, but am thinking of swapping my 2 x HD6850 for 3 x R9 280.

I like gaming at 5760x1080 and have read that although the third card doesn't necessarily add that many more frames, the triple Xfire can really help eliminate microstutter. I also believe it's the way to go when gaming at higher resolutions.

The 280 is the cheapest way into 3-way Xfire, at about 190€ per card.

I'd appreciate any comments on this (preferably from real experience, rather than hearsay 8).

Thanks!


----------



## Recr3ational

Can you tell me what cpu you have? I can't see as I'm on the phone.


----------



## mick2d2

I have a Phenom II x6 clocked @ 3,8GHz. I also have another rig with an i5 4670K


----------



## Recr3ational

Quote:


> Originally Posted by *mick2d2*
> 
> I have a Phenom II x6 clocked @ 3,8GHz. I also have another rig with an i5 4670K


Right, well, to be honest with you, I think it's better to just get a better card and get two of them.

Get, like, 2 x 280X, or even 290s.

You have to worry about bottlenecks, power draw, etc.

I play Eyefinity just like you, and I rarely get microstuttering. It would be advisable, and could possibly save you a lot of hassle in the future, just to crossfire a better card.

That's just my opinion.

Edit: The reason I know it's going to create hassle is that even having two cards creates problems, especially when you have multiple monitors.

If you were playing at 4K, it would make sense to go trifire. For 1080p, I don't think it's worth it.


----------



## Roboyto

Quote:


> Originally Posted by *Recr3ational*
> 
> Right, well, to be honest with you, I think it's better to just get a better card and get two of them.
> 
> Get, like, 2 x 280X, or even 290s.
> 
> You have to worry about bottlenecks, power draw, etc.
> 
> I play Eyefinity just like you, and I rarely get microstuttering. It would be advisable, and could possibly save you a lot of hassle in the future, just to crossfire a better card.
> 
> That's just my opinion.
> 
> Edit: The reason I know it's going to create hassle is that even having two cards creates problems, especially when you have multiple monitors.
> 
> If you were playing at 4K, it would make sense to go trifire. For 1080p, I don't think it's worth it.


Yeah I agree, more cards = more problems.

280X is the better choice for air cooling.

If you have the time, patience, and resources to properly power and cool some 290s, then they will shred 5760x1080. Hell, I run Eyefinity off of a single 290 and it's GLORIOUS. Tomb Raider, max everything, and I average low-to-mid 40 FPS. I could just dial down AA a little and frames would come up accordingly.

Just be aware that they make a lot of heat and a pair will be tough to keep properly cooled on air.

Thinking about 290s? Read here first: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


----------



## Recr3ational

Quote:


> Originally Posted by *Roboyto*
> 
> Yeah I agree, more cards = more problems.
> 
> 280X is the better choice for air cooling.
> 
> If you have the time, patience, and resources to properly power and cool some 290s, then they will shred 5760x1080. Hell, I run Eyefinity off of a single 290 and it's GLORIOUS. Tomb Raider, max everything, and I average low-to-mid 40 FPS. I could just dial down AA a little and frames would come up accordingly.
> 
> Just be aware that they make a lot of heat and a pair will be tough to keep properly cooled on air.
> 
> Thinking about 290s? Read here first: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


Yeah man. You could always watercool the 280Xs or something? I have dual 280Xs under water and they own games at 1080p.


----------



## mick2d2

Thanks for the input. I've been gaming on two HD6850s for a while, and before that on two 460s, and have not had too many issues. In Eyefinity with games like Crysis 3, I'm down to about 25fps even if I dial things down, which is why I was thinking of changing. So it's 1 x 290, 2 x 280X, or 3 x 280. I must say I'm more than a bit curious to try 3-way Xfire. It's surprising how little information/benchmarks you find on this subject. You hear a lot of people comparing SLI to normal Xfire, but very little on triple. You also hear a lot of unfavourable comparisons between SLI and Xfire, usually in favour of SLI. I actually had more headaches with the 460s than I've had with the 6850s, although I think the problem was with the motherboard more than the gfx cards.


----------



## Roboyto

Quote:


> Originally Posted by *mick2d2*
> 
> Thanks for the input. I've been gaming on two HD6850s for a while and before that on two 460s and have not had too many issues. On Eyeinfinity with games like Crysis 3 I'm down to about 25fps even if I dial things down, which is why I was thinking of changing. So it's 1 x 290, 2 x 280X or 3 x 280. I must say I'm more than a bit curious to try 3-way Xfire. It's surprising how little information/benchmarks you find on this subject. You hear a lot of people comparing SLI to normal Xfire, but very little on triple. You also hear a lot of unfavourable comparisons between SLI and Xfire, usually in favour of SLI. I actually had more headaches with the 460s than I've had with the 6850s, although I think the problem was with the motherboard, more than the gfx cards.


Xfire or SLI always has its downsides especially if a title doesn't support or isn't optimized for multiple cards.

I have 2 systems with 290s in them and can say for the price you can get them for it is undoubtedly best bang for your buck. If you want to take a chance on a used 290(X) they are fairly plentiful at the moment.


----------



## diggiddi

Guys, as of now, which is the best 280X for overclocking? Noise is not a factor. BTW, what happened to the Matrix Platinums, are they still in production?


----------



## imran27

As far as AMD cards are concerned, I've always believed that the *Sapphire Toxic* cards are the best overclockers. No idea about the Matrix Platinums, but I gather ASUS tends to use Elpida memory on most of their cards, while Sapphire's Tri-X and Toxic come with Hynix memory guaranteed. For overclocking, never go with any card that is suspected to have Elpida memory, because Elpida's chips don't overclock well. Even if they can reach the same overclocked speeds as Hynix's, they will provide less performance; after a particular frequency, the Elpida chips show negative scaling (a decrease in performance as you increase frequency). So, in short, my vote goes to the Toxic.


----------



## diggiddi

Really? I thought the ASUS came with Hynix.


----------






## Darkchild

Quote:


> Originally Posted by *imran27*
> 
> As far as AMD cards are concerned, I've always believed that the *Sapphire Toxic* cards are the best overclockers. No idea about the Matrix Platinums, but I gather ASUS tends to use Elpida memory on most of their cards, while Sapphire's Tri-X and Toxic come with Hynix memory guaranteed. For overclocking, never go with any card that is suspected to have Elpida memory, because Elpida's chips don't overclock well. Even if they can reach the same overclocked speeds as Hynix's, they will provide less performance; after a particular frequency, the Elpida chips show negative scaling (a decrease in performance as you increase frequency). So, in short, my vote goes to the Toxic.


I recently bought another ASUS HD7870 from Amazon and it in fact has Elpida memory.
My other ASUS and my PowerColor (both 7870s) have Hynix.
From what I have found, the vendors use what's available: if Hynix and Samsung memory is short, then they use Elpida.
The ASUS 7870 I have with the Elpida won't touch the mem clocks that my others will.
They will run at 1400/1450 depending on core clocks. My original ASUS will do 1250/1450 easy, while the PowerColor will only do 1175/1400, solo or CFX (mem clocks, that is; my system doesn't like over 1175 core in CFX), both Hynix.
The Elpida ASUS 7870 will do 1205/1300 solo and 1175/1300 in CFX.
So the Elpida mem won't go over 1300. It's all luck really, because I've seen Elpida mem that supposedly has hit 1400+.
I'm actually glad I had a chance to test this personally, because Elpida has the bad rap, and now I see IN MY TESTING it is true.
Also, I noticed with my Hynix cards, both separate and in crossfire, that after 1375 memory the performance went down. It may not affect you 270X guys with faster mem, but pay attention.


----------



## Devildog83

Quote:


> Originally Posted by *Darkchild*
> 
> i recently bought another asus hd7870 from amazon and it in fact has elpida memory.
> my other asus and powercolor (both 7870s) have hynix.
> From what i have found the vendors use whats available. if hynix and samsung memory is short then they use elpida.
> The asus 7870 i have with the elpida wont touch the mem clocks that my others will.
> They will run at 1400/1450 depending on core clocks. my original asus will do 1250/1450 easy while the powercolor will only do 1175/1400 Solo or cfx (mem clocks that is my system dont like over 1175 core in cfx ) both hynix.
> The elpida asus 7870 will do 1205/1300 solo and in cfx 1175/1300.
> So the elpida mem wont go over 1300. Its all luck really cause ive seen elpida mem that supposedly has hit 1400+.
> Im actually glad i had a chance to test this personally cause elpida has the bad rap and now i see IN MY TESTING it is true.
> also i noticed with my hynix cards both separate and in crossfire that after 1375 memory the performance went down. may not effect you 270x guys with faster mem but pay attention.


My 7870 Devil with Elpida mem will do 1250/1475 easy
My 270x Devil with Elpida mem will only do 1235 or so on the core but will do 1590 on the memory. I think it has to do more with the controller than anything else.

You get far more boost from core clocks than memory anyway.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> My 7870 Devil with Elpida mem will do 1250/1475 easy
> My 270x Devil with Elpida mem will only do 1235 or so on the core but will do 1590 on the memory. I think it has to do more with the controller than anything else.
> 
> You get far more boost from core clocks than memory anyway.


Core is king.


----------



## Arkanon

AFAIK Elpida rated the memory at 1500 MHz, but the 270(X) only runs it at 1400 MHz. Plenty of people around are getting their memory clocks up to 1500 MHz without taking a performance hit; once above that, performance goes down the drain.


----------



## Rainmaker91

In the last few months I have been building up a small guide for those who wish to use closed-loop coolers on their GPUs. I have managed to gather the best-known pieces there, along with a few less well-known ones, but what I really need now is people's experiences with them. So I encourage anyone who wishes to do so to stop by my thread and post their experiences. If you happen to know of solutions that have not been mentioned in the thread, I would be happy to include them as well; just post in the thread and I will add them.

I am aware that not everyone is a big fan of AIO coolers instead of an open loop, but there are people who are, and I made this guide for them. I do hope you will take the time to stop by, and I'm happy to take any constructive criticism and apply it to the thread as well. The guide is for all the users, after all, and I want to offer the best possible help I can to those on the hunt for something other than regular air coolers.

Once again, check it out here, and thank you for your time.


----------



## Devildog83

Quote:


> Originally Posted by *Arkanon*
> 
> afik elpida had the memory rated at 1500MHz but only uses 1400MHz on the 270(x). Plenty of people around getting their memory clocks up at 1500MHz without taking a performance hit, once above that they suffer from performance going down the drain.


Not all 270Xs are created equal. As I said, I have had mine at 1590, but yes, there is a plateau past 1500. I wouldn't say I ever saw a drop in performance, but after 1500 it stopped gaining much. When you overclock, you should max out the core first, and then push the memory until you stop gaining. Remember that the baselines we use to reach these conclusions are mostly synthetic benchmarks, not real-world applications. Some get higher overclocks on Hynix, but I seriously doubt the actual performance is much, if any, better.
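That core-first, then memory approach can be sketched as a simple sweep. `is_stable` below is a hypothetical stand-in for whatever stability test you actually run (a benchmark loop, an artifact check); nothing here talks to real hardware:

```python
def find_max_clocks(is_stable, core=1000, mem=1400, step=25):
    """Push the core clock first, then the memory clock, stepping up
    until the (hypothetical) stability test fails at the next notch."""
    while is_stable(core + step, mem):
        core += step
    while is_stable(core, mem + step):
        mem += step
    return core, mem

# Fake stability boundary purely for illustration: stable up to 1150/1500.
print(find_max_clocks(lambda c, m: c <= 1150 and m <= 1500))  # (1150, 1500)
```

In practice each `is_stable` call is minutes of benchmarking, which is exactly why maxing the core (the bigger win) before the memory saves time.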


----------



## imran27

There are good Elpida chips that reach good overclocks, but there is still a performance drop past some frequency. This basically happens because memory overclocking on a GPU is not as "manual" as system memory overclocking: you have no control over the latency of the VRAM the way you do with DRAM. So when we increase memory speed, the timings/latencies of the VRAM are automatically loosened for stability. The problem is that Elpidas normally need higher (looser) timings for stability at a given speed, and a lower voltage, compared to Hynix.

If the BIOS is tuned for Elpida, then that may not be the case...
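A toy model of why auto-loosened timings can produce negative scaling (all numbers invented for illustration): if the controller bumps a timing from 20 to 24 cycles somewhere past 1500 MHz, the absolute access latency in nanoseconds goes up even though the clock went up:

```python
def access_latency_ns(freq_mhz, timing_cycles):
    # cycles / MHz = microseconds, so multiply by 1000 for nanoseconds
    return timing_cycles / freq_mhz * 1000

def auto_timing(freq_mhz):
    # hypothetical controller behaviour: looser timings above 1500 MHz
    return 20 if freq_mhz <= 1500 else 24

for f in (1400, 1500, 1550):
    print(f, "MHz ->", round(access_latency_ns(f, auto_timing(f)), 2), "ns")
# 1400 MHz -> 14.29 ns, 1500 MHz -> 13.33 ns, 1550 MHz -> 15.48 ns
```

So a higher memory clock can still mean slower memory accesses once the timing step kicks in, which matches the "performance drops past some frequency" behaviour described above.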


----------



## Kuhl

I'm planning on upgrading my AMD 6850 to an r9 280(not the x). Is there anything I should be concerned about?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Kuhl*
> 
> I'm planning on upgrading my AMD 6850 to an r9 280(not the x). Is there anything I should be concerned about?


Tacos, they are known to be delicious. What brand are you thinking of going with?

Also, expect much improved framerates, if performance is what you're worried about.


----------



## Kuhl

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Tacos, they are known to be delicious. What brand are you thinking of going with?
> 
> Also, much better improved frames, if you know you worry about better performance


For brands, I'm going with whoever has the cheapest price at checkout. But seriously, I've been considering the XFX R9 280, as it has been at $200 for the past few weeks. I have a slight grudge against MSI after a BIOS/RMA issue with a mobo, but I've been considering them as well.


----------



## End3R

From what I've noticed Sapphire has the best reputation, but when it comes down to it, it's luck. Every company has some chance of shipping you a bad card.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Kuhl*
> 
> For brands I'm going with whoever has the cheapest price at checkout, but seriously i've been considering the xfx r9 280 as it has been at $200 for the past few weeks. I have a slight grudge against MSI after a BIOS issue/RMA issue on a mobo but I've been considering them as well.


I am not sure about the non-X 280s from XFX; however, I have two of the 280Xs, one voltage locked and the other not. As far as the heatsink and fan go, it does keep the card cool, but a lot of heat is dumped into the case, so keep that in mind if you get one.


----------



## Kuhl

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I am not sure on the non X of the 280s from XFX, however I have 2 of the 280x one ios voltage locked the other is not. As far as the heatsink and fan, it does keep it cool but a lot of heat is dumped into the case so keep that in mind if you get one.


The current cooler on my Sapphire 6850 seems to dump heat into the case too, so I'm not all that concerned, and the voltage locking isn't of much concern either, as I don't plan on overclocking heavily.


----------



## link1393

Hi guys, I just bought a Gigabyte R9 280X Rev. 1 from an OCN member, and I want to know what the difference is between Rev. 1 and Rev. 2.

And which 280Xs are the Tahiti XTL?

Is that all the cards from the beginning of the 280X, or not?

Thanks

- Link1393


----------



## Rainmaker91

Quote:


> Originally Posted by *link1393*
> 
> Hi guys, I just bought a Gigabyte R9 280X Rev. 1 from a OCN member and I want to what is the diffenrence between the Rev. 1 and the Rev. 2.
> 
> And wich 280X are the Tahiti XTL ?
> 
> This is all the card from the beginning of the 280X or not ?
> 
> Thanks
> 
> - Link1393


If I remember correctly, it's something about the BIOS and the fan speed. I know that Rev. 2 is more aggressive with its fan speed, and thus cooler, but far noisier. There was another significant change that I can't remember. You will either have to google it or wait for someone with a better answer on that one.

Edit: it might be something about VRM temps, but don't quote me on that.


----------



## link1393

OK, if it's only the cooler, that's good, because I don't have the stock cooler anyway.


----------



## Rainmaker91

Quote:


> Originally Posted by *link1393*
> 
> Ok, if it's only the cooler it's good because I don't have the cooler


Yeah, I think it was, but then again it might be more. I know the earlier Gigabyte 7970 had problems with VRM temps, so it might be that they added a heatsink with the 280X Rev. 2, or it might be something more significant. Let me google it and come back to you on that one.

Edit: apparently Rev. 1 is just the reused 7970 OC stock and Rev. 2 is the new run. They are identical, though, except the sticker isn't there on Rev. 2. Check this thread. There might still be more, but I can't find anything as of yet.


----------



## agrims

Rev. 2 has a VRM heatsink; that's what the metal bar running from top to bottom is. The PCB is black vs. blue. And it runs much faster: 1000-1100 vs. 850-1000...

BTW, I would like to be added to the list of owners, as my budget rig just got a little more budget; I found an open-box Gigabyte 280X Rev. 2 for $185.99... I was saving for a 290, but I could not pass up a smoking hot deal!

System specs are in my sig rig. I now play every game at 1080p ultra at 60+ FPS. The best part is that it is quieter than the 7850 it replaced, and that only had one fan... But it does release a lot of heat into the case. Thank goodness my case is a breather!


----------



## aman27deep

I have the xfx DD 270x and I never got any of the free game codes. What should I do?


----------



## DarthBaggins

Quote:


> Originally Posted by *aman27deep*
> 
> I have the xfx DD 270x and I never got any of the free game codes. What should I do?


Who did you purchase the GPU through? The store you purchased the card from is responsible for the codes. You can also contact AMD and furnish a receipt; they might help you out, but I'm not 100% sure on that.


----------



## Ashura

Here's Mine,
My R9 280x Dual-X @1100/1550 @stock volts


----------



## Obsyd

Hi guys!

Is there a way to modify my UEFI bios fan profile for my 280x? VBE7 only supports legacy bios.

Thank you for your answers!


----------



## imran27

MSI Hawk vs. Sapphire Toxic: which one is better/preferable?

MSI's military-class components and their separate heatsinks for the GPU and mem/VRM are appealing, but the Toxic is just more full of awesomeness...

Does the MSI Hawk also come with Hynix guaranteed? The Toxic does, and that matters to me.


----------



## J4ckV4lentine

Hi Everyone,

I want to know who else ends up with this problem: "Display driver amdkmdap stopped responding and has successfully recovered." I have a Gigabyte AMD Radeon R9 270X OC, and when I'm just sitting in Windows browsing the net, the display driver stops responding: the screen freezes, then goes black, and then everything comes back to life. I have tried everything in the book to fix the issue (except placing the card in another computer, though my computer is brand new), but the problem still persists. I think the error mostly happens when I browse in Chrome. I am at the point where I want to go NVIDIA, as there is no cure.

Any thoughts anyone?

Best Regards,
Lloyd


----------



## imran27

Quote:


> Originally Posted by *J4ckV4lentine*
> 
> Hi Everyone,
> 
> I want to know who ends up with this problem: "Display driver amdkmdap stopped responding and has successfully recovered." ? I have a Gigabyte AMD Radeon R9 270x OC and when I just sit in windows browsing the net the display drivers stops responding its like the screen freezes and then black and then everything comes back to life, I have tried everything in the book to fix the issue (except for placing the card in another computer, yet my computer is brand new) but the problem still persist, I think its when I browse in Chrome when the error mostly happens, I am at the point where I want to go Nvidia as the is no cure.
> 
> Any thoughts anyone?
> 
> Best Regards,
> Lloyd


Try disabling hardware acceleration for internet video. I think you need to do it in CCC as well as in Chrome... Disable hardware acceleration in both CCC and Chrome; it *may* help...


----------



## End3R

Quote:


> Originally Posted by *J4ckV4lentine*
> 
> Hi Everyone,
> 
> I want to know who ends up with this problem: "Display driver amdkmdap stopped responding and has successfully recovered." ? I have a Gigabyte AMD Radeon R9 270x OC and when I just sit in windows browsing the net the display drivers stops responding its like the screen freezes and then black and then everything comes back to life, I have tried everything in the book to fix the issue (except for placing the card in another computer, yet my computer is brand new) but the problem still persist, I think its when I browse in Chrome when the error mostly happens, I am at the point where I want to go Nvidia as the is no cure.
> 
> Any thoughts anyone?
> 
> Best Regards,
> Lloyd


I had the same issue; the card I had ended up being bad and I had to RMA it. Apparently it's a common issue with R9s: just google "r9 amdkmdap" or "r9 flicker" and you'll see tons of reports of the same thing.


----------



## J4ckV4lentine

Hi @End3R

I am glad I am not the only one with this issue. I have sent my card back to the distributor for RMA; at this point they haven't picked up any issues yet. I did tell them over the phone that I think Google Chrome causes the issue, but I still have to hear back from them. I think I am going to go with NVIDIA from now on; I'm really tired of these problems. The only difficult thing is to convince AMD or Gigabyte about this issue.

Best Regards


----------



## BruceB

Quote:


> Originally Posted by *J4ckV4lentine*
> 
> Hi @End3R
> 
> I am glad I am not the only one with this issue, I have send my card back to the distributor for RMA, at this point in time they didn't pick up any issues yet, I did tell them over the phone I think Google Chrome cause the issue but I still have to hear back from them, I think I am going to go with Nvidia from now on I'm really tired of these problems. The only difficult thing is to convince AMD Radeon or Gigabyte about this issue.
> 
> Best Regards


I thought this problem only happened when you push your OC too far? Have you tried underclocking to see if the problem is resolved?


----------



## Recr3ational

I used to get the same stuttering problem with my old crossfired 7950s; it stopped when I added two more monitors. Having triple monitors caused my cards to constantly be in 3D mode.

So maybe it's something to do with voltages or clocks in 2D mode? I know you can change them via Afterburner.

This is a complete guess, btw. But that's what happened to me.


----------



## Unknownm

Quote:


> Originally Posted by *agrims*
> 
> Rev. 2 has a VRM heatsink. That is what the metal bar is running from top to bottom. And the PCB is black vs. blue. And it runs much faster, 1000-1100 vs 850-1000...


Quote:


> Originally Posted by *link1393*
> 
> Hi guys, I just bought a Gigabyte R9 280X Rev. 1 from a OCN member and I want to what is the diffenrence between the Rev. 1 and the Rev. 2.
> 
> And wich 280X are the Tahiti XTL ?
> 
> This is all the card from the beginning of the 280X or not ?
> 
> Thanks
> 
> - Link1393


Rev. 1 and Rev. 2 of the Gigabyte version have the same base/boost clocks. The AMD 280 (non-X) runs lower clocks, while the Gigabyte 280X version runs much higher (as most 280Xs do).

The Rev. 2 BIOS allows 1.3 V on the card while Rev. 1 only allows 1.256 V. That Gigabyte card he bought from me was flashed to the Rev. 2 BIOS, and it works, but it was unstable for me so I switched back to the Rev. 1 BIOS. You may have to cool the VRMs on the side if you decide to watercool!


----------



## agrims

Quote:


> Originally Posted by *Unknownm*
> 
> Rev 1/2 gigabyte version have the same base/boost clocks. The AMD 280 (non-x) runs lower clock while 280x gigabyte version much higher (and most 280x do)
> 
> Rev 2 BIOS allows 1.3v on the card while Rev 1 only allow 1.256v. That gigabyte card he bought from me was flash to Rev 2 BIOS and it works but it was unstable to me so I switched back to Rev 1 bios. Maybe you have to cool the VRM on the side if you decide to watercool!


On Gigabyte's website, Rev. 1 runs at 850/1000 whereas Rev. 2 runs 1000/1100. I trust the manufacturer's website.


----------



## Unknownm

Quote:


> Originally Posted by *agrims*
> 
> On gigabytes website rev 1 runs at 850/1000 where rev 2 runs 1000/1100. I trust the manufacturers website.


http://www.gigabyte.com/products/product-page.aspx?pid=4793#ov

Rev 1 Base / Boost clock: 1000 / 1100 MHz
Rev 2 Base / Boost clock: 1000 / 1100 MHz


----------



## agrims

Quote:


> Originally Posted by *Unknownm*
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=4793#ov
> 
> Rev 1 Base / Boost clock：1000 / 1100 MHz
> Rev 2 Base / Boost clock：1000 / 1100 MHz


Brother, we can do this all day long:

http://www.gigabyte.com/products/product-page.aspx?pid=4913#sp

Rev 1 Base Boost: 850/1000
Rev 2 Base Boost: 850/1000

What we need to find out is whether or not it is the OC version that is being talked about.

BTW: we were both 100% right. Both are 280Xs, and both come in Rev. 1 and Rev. 2.


----------



## LocoDiceGR

Asus Radeon R9 280X 3GB DirectCU II <== If I buy this card, mostly for Star Citizen/Arma 3/DayZ SA/Battlefield, and then all the other next-generation/new 2014-15 games... can I run them on ultra at 1080p?

In combination with a 4670K (maybe) and 16 GB (for sure) of RAM.


----------



## agrims

Yes, but don't expect amazing FPS. They are extremely strong cards, but games have caught up. On BF4 ultra with 4x MSAA, I am getting roughly 55-60 FPS on a multiplayer map. You will get more as the CPU can handle more, but for 1080p that is about where I top out on ultra. Who can complain, though; it is 2x+ as fast as a PS4!


----------



## LocoDiceGR

What about the artifact problems reported on all varieties of the 280X?
Any fixes? Any news?


----------



## Ashura

Quote:


> Originally Posted by *BALANTAKOS*
> 
> what about artifacts problems reported on all variety of 280x ??
> any fixes? any news?


My 280X gave very bad artifacts when I tried to OC it. I was on 14.4, and many people on various forums had reported that reverting back to 13.12 fixed their problem.
I'm running 13.12 now: R9 280X @ 1100/1550, stock voltages, no artifacts whatsoever.
(At least not yet.)


----------



## link1393

Quote:


> Originally Posted by *Unknownm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *agrims*
> 
> Rev. 2 has a VRM heatsink. That is what the metal bar is running from top to bottom. And the PCB is black vs. blue. And it runs much faster, 1000-1100 vs 850-1000...
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *link1393*
> 
> Hi guys, I just bought a Gigabyte R9 280X Rev. 1 from a OCN member and I want to what is the diffenrence between the Rev. 1 and the Rev. 2.
> 
> And wich 280X are the Tahiti XTL ?
> 
> This is all the card from the beginning of the 280X or not ?
> 
> Thanks
> 
> - Link1393
> 
> 
> Rev 1/2 gigabyte version have the same base/boost clocks. The AMD 280 (non-x) runs lower clock while 280x gigabyte version much higher (and most 280x do)
> 
> Rev 2 BIOS allows 1.3v on the card while Rev 1 only allow 1.256v. That gigabyte card he bought from me was flash to Rev 2 BIOS and it works but it was unstable to me so I switched back to Rev 1 bios. Maybe you have to cool the VRM on the side if you decide to watercool!

Thanks for your answers. Which driver do you use? And do you recommend that I put some heatsinks on the RAM?

EDIT: Both BIOSes on the card are voltage limited to 1.256 V.


----------



## agrims

Quote:


> Originally Posted by *link1393*
> 
> Thanks for your answers, which driver do you use ? and do you recommend to me to put some heat sink on the RAM ?
> 
> EDIT : The 2 BIOS on the card are voltage limited to 1.256v


I use 14.6 Beta, after a clean folder-by-folder cleaning of the C: drive and the registry. No issues, and it works wonders. I haven't had any anomalies with artifacting either, but then again, I have not OC'd the card past stock, as I have not had a need to!

The RAM doesn't need any heatsinks on Gigabyte cards, as they put a humongous plate over the RAM chips! My card at stock, 1100/1500, runs at 53°C under load in games. Not bad at all, and quiet as a mouse.

Do you have the OC version or the plain 280X?


----------



## link1393

I have the OC; I bought it from Unknownm.
But when I do a load test I get some artifacts after 5-6 minutes, and if I stop it before the artifacts appear, they show up in the same range of time on the next run.

This is the same on both BIOSes.

BTW, my card is watercooled via the red mod.


----------



## agrims

Try doing a clean driver install.

Step 1: Uninstall AMD Catalyst.
Step 2: Turn off auto-update.
Step 3: Restart.
Step 4: Go to the C: drive and show hidden files.
Step 5: Find any folder named AMD or ATI. They will be in Program Files, the x86 Program Files, Common Files, user folders, everywhere. Look in any folder that says "common". There should be about 10 different locations that have an ATI or AMD folder.
Step 6: Go into the system registry and delete anything that says AMD or ATI.
Step 7: Restart.
Step 8: Download 14.6 Beta.
Step 9: Enjoy.

I never use DDU, proprietary uninstallers, or anything else. Good ole fashioned dirty hands is the best method for both AMD and NVIDIA...

If that fails to solve the problem, a clean install of Windows may fix it. If not, you have a bad card...
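Step 5 above (hunting down leftover AMD/ATI folders) can be partly automated. Here is a read-only Python sketch that just lists candidates so you can review them by hand before deleting anything; the default folder list is an assumption, adjust it for your own system:

```python
import os
import re

# Whole-word match so e.g. "Application Data" doesn't trigger on the "ati" inside it.
PATTERN = re.compile(r"\b(amd|ati)\b", re.IGNORECASE)

def find_leftovers(roots):
    """Return paths directly under each root whose name mentions AMD or ATI.
    Read-only: nothing is deleted, you review the list yourself."""
    hits = []
    for root in roots:
        if not os.path.isdir(root):
            continue  # skip roots that don't exist on this machine
        for name in sorted(os.listdir(root)):
            if PATTERN.search(name):
                hits.append(os.path.join(root, name))
    return hits

# Typical Windows locations from the steps above (hypothetical defaults):
roots = [r"C:\Program Files", r"C:\Program Files (x86)",
         r"C:\Program Files\Common Files", r"C:\ProgramData"]
for path in find_leftovers(roots):
    print(path)
```

It deliberately does not touch the registry; registry keys are better reviewed and removed by hand, as in Step 6.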


----------



## link1393

Do you have another solution, before formatting? ...I don't want to format...

TOO MANY GAMES AND TOO MUCH SOFTWARE


----------



## agrims

Quote:


> Originally Posted by *link1393*
> 
> Did you have another solution before the format....I don't want to format...
> 
> TOO MANY GAMES AND SOFTWARES


Yeah, try the clean uninstall and reinstall option in my previous post. That won't reformat anything and should wipe AMD off the computer.


----------



## link1393

The old school method has worked perfectly.

Now I need to try to undervolt it, because my AIO cooled like crap on my 7870 LE.


----------



## agrims

Quote:


> Originally Posted by *link1393*
> 
> The old school method have work perfectly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I need to try to downvolt it because my AIO cooled like the crap on my 7870 LE.


I'm glad that has worked for you! It is like sex panther... Works 80% of the time... Every time!


----------



## link1393

Here is some pics and my validation : http://www.techpowerup.com/gpuz/gmg7v/


----------



## Roboyto

Quote:


> Originally Posted by *link1393*
> 
> The old school method have work perfectly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I need to try to downvolt it because my AIO cooled like the crap on my 7870 LE.


Just curious as to what AIO you're running to get poor results on (essentially) a 7950? I have no issues with an Antec 620 cooling a mildly OC'd R9 290; this is with a weak, but silent, SilenX Effizio 120mm fan.

Quote:


> Originally Posted by *agrims*
> 
> I'm glad that has worked for you! It is like sex panther... Works 80% of the time... Every time!










great movie reference


----------



## link1393

My AIO is an Antec 920, and it has lost performance over time. Now its cooling performance is only slightly better than the air cooler on my 7870 LE.


----------



## Unknownm

Quote:


> Originally Posted by *agrims*
> 
> Brother, we can do this all day long:
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=4913#sp
> 
> Rev 1 Base Boost: 850/1000
> Rev 2 Base Boost: 850/1000
> 
> What we need to find out is whether or not there is the OC version or not that is being talked about.
> 
> BTW: We were both 100% right. Both 280X, both come in rev 1 and rev 2.


We can do this all day. The fact is, the link I posted was for his version of the card, the one I sold him, which is the overclocked version. I was not talking about the "normal"-clocked 280X. lol
Quote:


> Originally Posted by *link1393*
> 
> Thanks for your answers, which driver do you use ? and do you recommend to me to put some heat sink on the RAM ?
> 
> EDIT : The 2 BIOS on the card are voltage limited to 1.256v


Quote:


> Originally Posted by *link1393*
> 
> I have the OC. I buy it from Unknownm
> But when I do a load test I got some artefacts afer 5-6 minutes and if I stop it before the artefacts appear in the same range of time
> 
> This is the same on the 2 bios.
> 
> btw my card is water cooled by the red mod.


Quote:


> Originally Posted by *agrims*
> 
> Try doing a clean driver install.
> 
> Step 1: Uninstall amd catalyst.
> Step 2: turn off auto update
> Step 3: restart
> Step 4: go to C drive, show hidden files
> Step 5: find any file named AMD or ATI. They will be in program files, x86 files, common files, user files, everywhere. Look in any file that states "common". There should be 10 different locations that will have an ATI or AMD folder.
> Step 6: go into system registry, delete anything that states AMD or ATI.
> Step 7. Restart
> Step 8: download 14.6 Beta.
> Step 9: enjoy.
> 
> I never use DDU, proprietary uninstallers, or anything else. Good ole fashioned dirty hands is the best method for both amd and nvidia...
> 
> If that fails to solve the problem, clean install of windows may fix it. If not, you have a bad card...


Quote:


> Originally Posted by *link1393*
> 
> Did you have another solution before the format....I don't want to format...
> 
> TOO MANY GAMES AND SOFTWARES


Quote:


> Originally Posted by *agrims*
> 
> Yeah try the clean uninstall and install option in my previous post. That won't reformat anything and should wipe amd off the computer.


Quote:


> Originally Posted by *link1393*
> 
> The old school method have work perfectly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I need to try to downvolt it because my AIO cooled like the crap on my 7870 LE.


Quote:


> Originally Posted by *agrims*
> 
> I'm glad that has worked for you! It is like sex panther... Works 80% of the time... Every time!


Quote:


> Originally Posted by *link1393*
> 
> Here is some pics and my validation : http://www.techpowerup.com/gpuz/gmg7v/


Glad it worked out! That card is amazing for the price (I know you got it cheap), but even when I paid $299 way back, it was a steal!

Also, Link, here is where I got the BIOS with 1.3 V => http://www.overclock.net/t/1450456/gigabyte-r9-280x-v2-voltage-lock-bypass-through-bios-mod

Try it out and see how it works; you may have to cool the VRMs more! Good luck.


----------



## alani4837

Hello again guys, I bought an EK-FC7970 waterblock for the XFX R9 280X DD Black and the PCB isn't the same. I can't install it because of the 2R2/R22 and L327016Z coils...


----------



## link1393

Quote:


> Originally Posted by *Unknownm*
> 
> 
> 
> Spoiler: Warning: QUOTES !
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *agrims*
> 
> Brother, we can do this all day long:
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=4913#sp
> 
> Rev 1 Base Boost: 850/1000
> Rev 2 Base Boost: 850/1000
> 
> What we need to find out is whether or not there is the OC version or not that is being talked about.
> 
> BTW: We were both 100% right. Both 280X, both come in rev 1 and rev 2.
> 
> 
> 
> we can do this all day. The fact is the link I posted was his version of the card I sold him which is the overclocked version. I was not talking about "normal" clocked 280x. lol
> Quote:
> 
> 
> 
> Originally Posted by *link1393*
> 
> Thanks for your answers, which driver do you use ? and do you recommend to me to put some heat sink on the RAM ?
> 
> EDIT : The 2 BIOS on the card are voltage limited to 1.256v
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *link1393*
> 
> I have the OC. I buy it from Unknownm
> But when I do a load test I got some artefacts afer 5-6 minutes and if I stop it before the artefacts appear in the same range of time
> 
> This is the same on the 2 bios.
> 
> btw my card is water cooled by the red mod.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *agrims*
> 
> Try doing a clean driver install.
> 
> Step 1: Uninstall amd catalyst.
> Step 2: turn off auto update
> Step 3: restart
> Step 4: go to C drive, show hidden files
> Step 5: find any file named AMD or ATI. They will be in program files, x86 files, common files, user files, everywhere. Look in any file that states "common". There should be 10 different locations that will have an ATI or AMD folder.
> Step 6: go into system registry, delete anything that states AMD or ATI.
> Step 7. Restart
> Step 8: download 14.6 Beta.
> Step 9: enjoy.
> 
> I never use DDU, proprietary uninstallers, or anything else. Good ole fashioned dirty hands is the best method for both amd and nvidia...
> 
> If that fails to solve the problem, clean install of windows may fix it. If not, you have a bad card...
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *link1393*
> 
> Did you have another solution before the format....I don't want to format...
> 
> TOO MANY GAMES AND SOFTWARES
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *agrims*
> 
> Yeah try the clean uninstall and install option in my previous post. That won't reformat anything and should wipe amd off the computer.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *link1393*
> 
> The old school method have work perfectly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I need to try to downvolt it because my AIO cooled like the crap on my 7870 LE.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *agrims*
> 
> I'm glad that has worked for you! It is like sex panther... Works 80% of the time... Every time!
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *link1393*
> 
> Here is some pics and my validation : http://www.techpowerup.com/gpuz/gmg7v/
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Glad it worked out! That card is amazing for the price (I know you got it cheap) but when I paid 299 way back it was a steal!
> 
> Also Link! here is where I got the BIOS with 1.3v => http://www.overclock.net/t/1450456/gigabyte-r9-280x-v2-voltage-lock-bypass-through-bios-mod
> 
> Try it out and see how it works you may have to cool the vrms more! good luck

Yes, it was very cheap for this card. I will try it, but later, because as you read above, I have some cooling problems at the moment.

BUT, if I have the time, my father and I will build a custom desk for my first watercooling loop this summer. At that point it will be good to OC this beast.


----------



## alani4837

anyone???


----------



## rdr09

Quote:


> Originally Posted by *alani4837*
> 
> anyone???


Is this about the block that won't fit? If you tried and failed, you have no choice but to either return the block, or sell the 280X and buy a cheap reference 7970.

Did you use the Cooling Configurator found on the EK website? Normally members ask for help before watercooling, and we suggest posting pictures of the PCB.


----------



## DarthBaggins

Yup, you need to check the Cooling Configurator to ensure the PCBs are identical.


----------



## alani4837

It's not!!!


----------



## DarthBaggins

Then it definitely won't work obviously, so return the block or re-sell it, simple as that.


----------



## rdr09

Quote:


> Originally Posted by *alani4837*
> 
> its not!!!


there is an adage that goes like this . . .

you can use the same tissue to wipe your glasses and blow your nose . . . but make sure you do the sequence properly.

hope you can return your block.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *alani4837*
> 
> its not!!!


What block did you buy anyway?


----------



## smoke2

Guys, I need help choosing between a 280 and a 280X.
I can buy a Sapphire 280 (850 MHz, boost 940 MHz) with the Dual-X cooler for 155 bucks.
Or I can buy a Sapphire 280X (950 MHz, boost 1000 MHz) with the Tri-X cooler for 215 bucks.
I'm not planning to overclock them.

Is one of the cooling solutions noticeably quieter than the other?
Which one would you choose?


----------



## alani4837

It's the EK-FC7970 CSQ nickel acetal + backplate... if anyone needs it.


----------



## Recr3ational

Quote:


> Originally Posted by *alani4837*
> 
> Its the EK-FC7970 csq nickel acetal +backplate...if anyone need it


Where are you located?


----------



## neurotix

Quote:


> Originally Posted by *smoke2*
> 
> Guys, I need help to choose right 280 or 280x.
> I can buy Sapphire 280 (850(Boost:940 MHz)) with Dual-X cooler for 155,- bucks.
> Or I can buy Sapphire 280x (950(Boost:1000 MHz)) with Tri-x cooler for 215,- bucks.
> I'm not planning to overclock them.
> 
> Is one of the cooling solution noticeably quieter than the other?
> Which one would you choose?


Get the 280X with Tri-X cooler if you don't plan to overclock. It has more shaders and runs at a faster stock speed. The performance difference will be noticeable. And on top of that, it has the Tri-X cooler which should perform a lot better than the Dual-X.


----------



## alani4837

Quote:


> Originally Posted by *Recr3ational*
> 
> Where are you located?


Greece... yours?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *alani4837*
> 
> Its the EK-FC7970 csq nickel acetal +backplate...if anyone need it


http://www.coolingconfigurator.com/step1_complist#DB_inline?height=260&width=530&inline_id=comp_table


----------



## SeanOMatic

Just got an XFX R9 270X DD for $180.


----------



## BruceB

I've just realised I haven't actually joined this Club yet!








Now I'm gonna stop lurking and start joining:




That's my Powercolor 280X OC TurboDuo @ Stock (1030/1500)


----------



## GuestVeea

I'm sorta new to AMD in general after being with Intel and nVidia for so long. I've been having some performance and BSOD issues, so I was wondering: do I need both AMD's all-in-one driver from the motherboard AND Catalyst Control Center for my desktop GPU (a 280X)? I don't want conflicting software that will cause even more problems. Thanks in advance.


----------



## Devildog83

*BruceB* has been added, nice card. Love the back plate.

*GuestVeea* Do not install the all-in-one from the motherboard disc. Get all of the drivers straight from the AMD website. If you already have and are having issues, do a full driver sweep and reinstall CCC. The old 13.12 seems to work for most, but 14.6 is looking good too.


----------



## link1393

Quote:


> Originally Posted by *Devildog83*
> 
> *BruceB* has been added, nice card. Love the back plate.
> 
> *GuestVeea* Do not install the all-in-one from the motherboard disc. Get all of the drivers straight from the AMD website. If you have and are having issues do a full driver sweep and reinstall CCC. The old 13.12 seems to work for most but the 14.6 is looking good too.


Did you miss me?









http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/6260#post_22405812


----------



## JaredLaskey82

Well, it's all finished.
All the cooling is installed, and it's running cool and quiet. I even had time to carbon-fibre-vinyl my case.
Let me know what you think.


----------



## diggiddi

Clean rig


----------



## JaredLaskey82

Quote:


> Originally Posted by *diggiddi*
> 
> Clean rig


Thanx man.


----------



## BruceB

Quote:


> Originally Posted by *Devildog83*
> 
> *BruceB* has been added, nice card. Love the back plate.


Thanks








The Powercolor backplate looks awesome, I just wish they'd printed the lettering the other way up.


----------



## Recr3ational

Quote:


> Originally Posted by *BruceB*
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> The powercolor's backplate Looks awesome, I just wish they'd printed the lettering on the other way up


I don't know why Powercolor did that lol.
I didn't like it, so I had to make one myself.


----------



## BruceB

Quote:


> Originally Posted by *Recr3ational*
> 
> I don't know why powercolor did that lol.
> I didnt like it so i had to make one myself.


I'd like to see that, have you got pics?


----------



## Recr3ational

Here they are, sorry bout the quality


----------



## BruceB

Quote:


> Originally Posted by *Recr3ational*
> 
> Here they are, sorry bout the quality
> 
> 
> Spoiler: Warning: Spoiler!


Sweet! I just read the build log (good work on that, btw). It looks great; maybe you should become an artisan and sell them?


----------



## End3R

Quote:


> Originally Posted by *Recr3ational*


What is covering your MB? I've never seen anything like that.

Edit: nvm, I see now it's the MB itself. Weird though; I've never seen anyone use something like that... isn't it a bit excessive, especially for a water-cooled system?








http://www.newegg.com/Product/Product.aspx?gclid=CjkKEQjw5e-cBRDysazatpTm5b8BEiQAWxTHh9Sbf1HJu5cxBcPUJsYK_dCqHlKd9Fv6SNrt8TgdQmPw_wcB&Item=N82E16813131976&nm_mc=KNC-GoogleAdwords&cm_mmc=KNC-GoogleAdwords-_-pla-_-Intel+Motherboards-_-N82E16813131976&ef_id=U03CigAAAWakAxce:20140614134013:s


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> What is covering your MB? I've never seen anything like that
> 
> Edit: nvm I see now it's the mb itself, weird though I've never seen anyone use something like that... isn't it a bit excessive, especially for a water cooled system?
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/Product.aspx?gclid=CjkKEQjw5e-cBRDysazatpTm5b8BEiQAWxTHh9Sbf1HJu5cxBcPUJsYK_dCqHlKd9Fv6SNrt8TgdQmPw_wcB&Item=N82E16813131976&nm_mc=KNC-GoogleAdwords&cm_mmc=KNC-GoogleAdwords-_-pla-_-Intel+Motherboards-_-N82E16813131976&ef_id=U03CigAAAWakAxce:20140614134013:s


Wait, you are actually going to ask if the armor on a Saberkitty is excessive? This is OCN right?


----------



## End3R

Quote:


> Originally Posted by *Devildog83*
> 
> Wait, you are actually going to ask if the armor on a Saberkitty is excessive? This is OCN right?


lol true


----------



## Devildog83

Quote:


> Originally Posted by *End3R*
> 
> lol true










, the Armor if for looks as much as cooling.


----------



## Recr3ational

Excessive?
This is me holding back on spending loads of money lol.







I've added a few more things to my rig, just haven't taken a picture.

The sabertooth is the sexiest board ever.
Quote:


> Originally Posted by *BruceB*
> 
> Sweet! I just read the build log (good work on that, btw). It looks great; maybe you should become an artisan and sell them?


Sorry, just read your post.
It's not worth selling tbh haha. It cost me like £5 to make. It's pretty simple if you have the tools.


----------



## Ashura

Final OC @ stock voltage.

Trixx would reset my clocks at startup; something to do with CCC, I believe.
Afterburner won't let me adjust the voltage, no matter what.

Any other utility/software you guys would recommend?


----------



## NameUnknown

How would dual 270s or a 280X compare to my 5970?

My 5970 is on its way out, and Microcenter has two open-box 270s for $147 each.


----------



## mikemykeMB

Quote:


> Originally Posted by *Recr3ational*
> 
> I don't know why powercolor did that lol.
> I didnt like it so i had to make one myself.


Quote:


> Originally Posted by *BruceB*
> 
> Sweet! I just read the build log (good work on that, btw). It looks great; maybe you should become an artisan and sell them?


Making a backplate is easy and improves the appearance. Check it out... I used the box logos, added them to the acrylic cutout, and painted it. Voila!


----------



## Recr3ational

Quote:


> Originally Posted by *Ashura*
> 
> Final OC @ stock voltage.
> 
> Trixx would reset my clocks at startup; something to do with CCC, I believe.
> Afterburner won't let me adjust the voltage, no matter what.
> 
> Any other utility/software you guys would recommend?


You could try HIS iTurbo?
Are your cards voltage locked?
Quote:


> Originally Posted by *mikemykeMB*
> 
> Making a backplate is easy and gives better appearance, check it...used the box logos and added it the acrylic cut out and painted. Viola
> 
> '


But what if you need to send the cards back for RMA? :/


----------



## mikemykeMB

Quote:


> Originally Posted by *Recr3ational*
> 
> You could try out HIS iTurbo?
> Are your cards voltaged locked?
> But but, what if you need to send the cards back for Rma :/


It comes in a box within a box (I still have it), and the backplate is only attached with double-sided 3M tape, so it comes off easily. But as far as RMA goes... if it quits because of improper use, then it's my fault.


----------



## electrocabeza

Sapphire R9 270X Toxic at 1300/1600 MHz (GPU/VRAM), 1.278 V; no problems with Catzilla 1440p.

HWBOT 1st place with 270X in this benchmark


----------



## link1393

Hey guys, I'm looking for the best solution to cool the VRMs on my 280X. Which do you think is best:

Enzotech MST-88
or
Enzotech MOS-C1

The MST-88 will need a little bit of modding but nothing dramatic.

EDIT: *electrocabeza* nice bench


----------



## Farih

Quote:


> Originally Posted by *electrocabeza*
> 
> Sapphire R9 270X Toxic 1300/1600MHz (GPU/VRam) 1.278V didnt had problems with Catzilla 1440P
> 
> HWBOT 1st place with 270X in this benchmark


Get the voltage up to 1.362V, your Toxic can take it! (I wouldn't run it 24/7 though.)
Mine only went up to 1362 MHz at that voltage, but I believe your card could go over 1400 MHz.
For 1300 MHz I already needed 1.3 V; you only need 1.278 V...

I had a Toxic lemon...

Now I've got another Toxic lemon, a 280X:
@ 1.256 V, just 1180 MHz
@ 1.318 V, just 1210 MHz

I'll push the voltage higher later, but it will probably never cross 1300 MHz.


----------



## NameUnknown

Well, I just got back from Microcenter; I bought one of their last 280Xs. Unfortunately it's a Diamond; all of their MSI and XFX cards were out of stock, and I can barely browse the web without artifacting.


----------



## electrocabeza

Quote:


> Originally Posted by *Farih*
> 
> Get the voltage up to 1.362V, your Toxic can take it! (I wouldn't run it 24/7 though.)
> Mine only went up to 1362 MHz at that voltage, but I believe your card could go over 1400 MHz.
> For 1300 MHz I already needed 1.3 V; you only need 1.278 V...
> 
> I had a Toxic lemon...
> 
> Now I've got another Toxic lemon, a 280X:
> @ 1.256 V, just 1180 MHz
> @ 1.318 V, just 1210 MHz
> 
> I'll push the voltage higher later, but it will probably never cross 1300 MHz.


For 3DM11 it did 1360/1600 MHz with 1.308 V, but it doesn't reach 1370 MHz even with 1.325 V, so 1360 is the max I got...


----------



## NameUnknown

Swapped out my 5970 for the 280X; the computer seems to be behaving much better and I don't even have the new drivers installed yet. I got the 280X so I can go SLI down the road when I need more power. The 5970 was also hot to the touch; it felt hotter than it was reporting.

Also, for reference, here is the 280X next to my 5970: basically the same size, just a bit wider given the heat pipes.


----------



## Recr3ational

Quote:


> Originally Posted by *NameUnknown*
> 
> Swapped out my 5970 for the 280X, the computer seems to be behaving much better and I dont even have the new drivers installed yet. Got the 280X so I can go SLI down the road when I need more power. The 5970 was also hot to the touch, felt hotter than it was reporting in that it was.
> 
> Also For a reference here is the 280X next to my 5970, basically the same size just a bit wider given the heat pipes.


IIRC, the 5970 was a beast; I think I had it running over 100C and still working perfectly fine. It's probably still working.


----------



## JaredLaskey82

Well first benchmark for my new rig. I am happy so far.


----------



## link1393

Quote:


> Originally Posted by *link1393*
> 
> Hey guys I'm looking for a best solution to cool the VRM of my 280X what did you think is the best :
> 
> Enzotech MST-88
> or
> Enzotech MOS-C1
> 
> The MST-88 will need a little bit of modding but nothing dramatic.


Anyone?


----------



## PyroTechNiK

Is this worth the price?

http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150678


----------



## NameUnknown

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> Swapped out my 5970 for the 280X, the computer seems to be behaving much better and I dont even have the new drivers installed yet. Got the 280X so I can go SLI down the road when I need more power. The 5970 was also hot to the touch, felt hotter than it was reporting in that it was.
> 
> Also For a reference here is the 280X next to my 5970, basically the same size just a bit wider given the heat pipes.
> 
> 
> 
> 
> 
> Irrc, the 5790 was a beast, I think I had it running over 100c and still working perfectly fine. It's probably still working.

Mine was artifacting with no load beyond Windows and web browsing, at 48-50C with the fan at 100%. I took the backplate off hoping I could pop the shroud off, but I couldn't without releasing the heatsink. Since I didn't have any spare thermal paste I didn't do it; I may later, since I need to get some to redo my heatsink anyway.

As for power, it was a true powerhouse since it was dual 5850s that could (iirc) be unlocked to 5870s.


----------



## NameUnknown

Quote:


> Originally Posted by *PyroTechNiK*
> 
> Is this worth the price?
> 
> http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150678


I sure hope so as mine was the same price. Is that price CAD or USD?

EDIT: meant to add this to my post above, sorry for double.


----------



## JaredLaskey82

Quote:


> Originally Posted by *PyroTechNiK*
> 
> Is this worth the price?
> 
> http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150678


The MSI, Sapphire, or ASUS cards are more stable and have a few more options when it comes to water cooling, if you want to go down that route in the future.


----------



## Kuhl

So today I got a used ASUS R9 280X for $260 that I can't install because I'm missing an 8-pin cable for my modular power supply. I'm pretty upset right now; I was excited to start using it. Hopefully OCZ gets back to me and I can use this thing. I really don't want to buy a new PSU, but I will (and I won't be getting a stupid PSU that doesn't have universal cables).


----------



## mikemykeMB

Quote:


> Originally Posted by *Kuhl*
> 
> So today I got a used Asus R9 280x for $260 that I can't install because I'm missing an 8pin for my modular power supply. I'm pretty upset right now I was pretty excited to start using it but hopefully OCZ gets back to me and I can use this thing. I really don't want to buy a new PSU but I will. (and won't be getting a stupid psu that doesn't have universal cables.)


That's a bummer on a modular PSU without it.


----------



## Devildog83

Quote:


> Originally Posted by *NameUnknown*
> 
> Swapped out my 5970 for the 280X, the computer seems to be behaving much better and I dont even have the new drivers installed yet. Got the 280X so I can go SLI down the road when I need more power. The 5970 was also hot to the touch, felt hotter than it was reporting in that it was.
> 
> Also For a reference here is the 280X next to my 5970, basically the same size just a bit wider given the heat pipes.


That's a nice card and a good plan, but it will be tough to go SLI with a 280X.


----------



## NameUnknown

Whoops, meant crossfire









EDIT: Anyone know if the Diamond cards are any good to OC or not?


----------



## link1393

Hi guys, it's me (again).
VRM question here.

I know the main VRMs are under the heatsink, but I want to know if the card has any other VRMs.


Thanks


----------



## Lisjak

Hey guys, so today something weird happened. I started up my PC and one of the fans started making a loud grinding noise. At first I thought a moped was driving by my house. I checked both fans by stopping them, and it turns out it was the unusual-looking one. I tried varying the fan speed, but it didn't work. Then about 10 minutes later it started getting quieter and quieter, and now it's completely silent again.









Did this happen to anyone before?


----------



## BruceB

Quote:


> Originally Posted by *Lisjak*
> 
> Hey guys, so today something weird happened. I started up my PC and one of the fans started making a loud grinding noise. At first I thought a moped was driving by my house. I checked both fans by stopping them, and it turns out it was the unusual-looking one. I tried varying the fan speed, but it didn't work. Then about 10 minutes later it started getting quieter and quieter, and now it's completely silent again.
> 
> Did this happen to anyone before?


Maybe it just needs some oil? One of my case fans started making a grinding noise, so I took it apart and put a drop of oil in it (don't go crazy!) and after that it was back to normal. You could give it a go?


----------



## Lisjak

That would probably work, although I'd rather not take a GPU fan apart. It's silent now, so I'll leave it as is for the time being. It's a good plan B, however.


----------



## BruceB

Quote:


> Originally Posted by *Lisjak*
> 
> That would probably work all though I'd rather not take a gpu fan apart. It's silent now so I will leave it as is for the time being. It is a good plan B however


I can understand that. TBH, if it starts again and your card's still in warranty, just RMA it (unless you've got a golden OC'er or something).


----------



## Lisjak

Yeah, that was the plan at first, but then I'd be computer-less in the meantime because I don't have a spare GPU.

And no, it does not OC that great, unfortunately.


----------



## Crowe98

Guess I'll be joining the club then...











Better get to know you guys


----------



## captaindyson

Hello Everyone,
I'm joining the club with my Sapphire Dual x R9 280X oc and opening with a question.

I wasn't sure where to post this so I have done so here and in its own thread, I'm sorry if that is not acceptable, but my query is this:

I have recently acquired a Sapphire Dual-X R9 280X and am looking to water cool it. Now I am aware that this is a custom PCB design and that no full cover blocks have been made for it, but I also know that in general R9 280X's are just re-branded 7970 cards, so I was wondering if any of you knew if there is a 7970 card that has the same custom PCB design but did have a block made for it?

For example this covers a number of 7970 cards including sapphire ones:
http://www.coolingconfigurator.com/waterblock/3831109856697

Alternatively I found this thread which shows modding a cooler to fit a sapphire 7950 card, does it have the same custom PCB? could I do the same (or similar) mod?
http://www.overclock.net/t/1448003/watercooling-a-sapphire-hd-7970-ghz-edition-3gb-vapor-x

Thank you in advance for any responses.
Kind Regards.


----------



## bios_R_us

Hey guys,

Just joining the club after my 7870xt has been replaced with the 280x DualX (have been a Sapphire user for quite some time now, ever since the 9600xt). Here's a screenie of what I'm running on it, quite pleased with the card so far, it's faster, cooler and even shorter than the 7870xt 

Cheers!


----------



## Devildog83

*Captaindyson and bios-R-us have been added to the club* Welcome !!!

Please post pics of your cards if you could, and Captaindyson, whatever clocks you are running.


----------



## Devildog83

Looking forward to adding you *Crowe98*


----------



## captaindyson

Quote:


> Originally Posted by *Devildog83*
> 
> *Captaindyson and bios-R-us have been added to the club* Welcome !!!
> 
> Please post pics of your cards if you could and Captaindyson what ever clocks you are running.


I'm afraid my system is not even close to being built yet, I am just amassing parts at the mo, hence my question above, but I will most certainly post details once it is built (hopefully within the next 2 weeks.)


----------



## Devildog83

Quote:


> Originally Posted by *captaindyson*
> 
> Hello Everyone,
> I'm joining the club with my Sapphire Dual x R9 280X oc and opening with a question.
> 
> I wasn't sure where to post this so I have done so here and in its own thread, I'm sorry if that is not acceptable, but my query is this:
> 
> I have recently acquired a Sapphire Dual-X R9 280X and am looking to water cool it. Now I am aware that this is a custom PCB design and that no full cover blocks have been made for it, but I also know that in general R9 280X's are just re-branded 7970 cards, so I was wondering if any of you knew if there is a 7970 card that has the same custom PCB design but did have a block made for it?
> 
> For example this covers a number of 7970 cards including sapphire ones:
> http://www.coolingconfigurator.com/waterblock/3831109856697
> 
> Alternatively I found this thread which shows modding a cooler to fit a sapphire 7950 card, does it have the same custom PCB? could I do the same (or similar) mod?
> http://www.overclock.net/t/1448003/watercooling-a-sapphire-hd-7970-ghz-edition-3gb-vapor-x
> 
> Thankyou in advance for any responses.
> Kind Regards.


I don't know, but the one for the 7970 Dual-X might be the closest. I would try contacting EK to see if it will fit.


----------



## NameUnknown

I'll formally join











Sitting at stock clocks until I find out if Diamond OCs well and how the temps go.


----------



## creationsh

Quote:


> Originally Posted by *NameUnknown*
> 
> I'll formally join
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sitting at stock clocks until I find out if Diamond OCs well and how the temps go.


Diamonds OC very well. VRM (core & memory) temperatures seem to be the limiting factor. Maybe the 3rd-generation Diamond 280X isn't just a new red badass sticker. Don't bother upgrading to another air cooler; the stock cooler is very good.


----------



## POETICTRAGEDY

will join


----------



## NameUnknown

Quote:


> Originally Posted by *creationsh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> I'll formally join
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sitting at stock clocks until I find out if Diamond OCs well and how the temps go.
> 
> 
> 
> Diamond OC' very well. Vrm (core & memory) temperature seems to be the limiting factor. Maybe the 3rd generation diamond 280x isn't just the new red badass sticker. Don't bother with upgrading to another air cooler, the stock cooler is very good.

Glad to hear it. I haven't seen much Diamond hardware on OCN, so I wasn't sure, but at the same time I had to get a new card.

One other question. I see no reason this would be a problem since the chipsets are identical, but does mixing brands affect Xfire in any way?


----------



## DiceAir

Guys, please explain to me how my friend's Gigabyte Windforce OC 7970, running clocks of 1000MHz core and 1375MHz memory, can be faster than my Club3D R9-280X running clocks of 1100MHz core and 1500MHz memory?


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> Guys please explain to me how come my friends gigabyte windforce oc 7970 running clocks of 1000MHz core and 1375MHz memory can be faster than my club3d r9-280x running clocks of 1100MHz core and 1500 memory?


??? It shouldn't be.
Maybe your card is getting throttled due to heat or something?
Try running _FurMark_; it gives you real-time core & memory speeds and the throttling percentage (watch the temps!).
If your temps get too high the card will enter "panic mode", where it will downclock to 501MHz and run the fans at 100%.
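To make the "check for throttling" advice above concrete, here's a minimal sketch of what you'd look for in a clock/temperature log, the kind you can export from MSI Afterburner or GPU-Z. The sample data and the 95% margin are made-up illustrations, not output from any real tool:

```python
# Minimal throttle check over logged (core_mhz, temp_c) samples.
# The log values below are made up for illustration.

SET_CLOCK_MHZ = 1100    # the clock you set in Afterburner/Trixx
THROTTLE_MARGIN = 0.95  # flag anything below 95% of the target clock

def throttled_samples(log):
    """Return the (mhz, temp) samples where the card appears throttled."""
    return [(mhz, temp) for mhz, temp in log if mhz < SET_CLOCK_MHZ * THROTTLE_MARGIN]

log = [(1100, 72), (1100, 78), (501, 94), (1100, 75)]  # 501 MHz = "panic mode" clock
print(throttled_samples(log))  # [(501, 94)]
```

If that list comes back empty across a full benchmark run, the card really is holding its clocks and the problem lies elsewhere.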


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> ??? It shouldn't be.
> Maybe your card it getting throttled due to heat or something?
> Try running _FurMark_, it gives you realtime core & memory speed and throttling percentage (watch the temps!).
> If your temps get to high the card will enter "Panic mode" where it will down clock to 501MHz and run the fans at 100%.


not getting throttled. I checked MSI afterburner and my card is running full clocks the whole time.


----------



## Devildog83

Quote:


> Originally Posted by *DiceAir*
> 
> not getting throttled. I checked MSI afterburner and my card is running full clocks the whole time.


Are you running the same CPU and memory?


----------



## DiceAir

Quote:


> Originally Posted by *Devildog83*
> 
> are you running the same CPU and memory?


We tested it in the same machine. i7-2600k @ 4.4GHz. So really only thing changed was the card.


----------



## Devildog83

*ATTENTION* - Everyone's welcome in this club but as it says in the OP, please post what kind of card, (make,model) and the clocks you are running with a pic if possible. This way I don't have to continually edit or only post half of the info.

THANKS,
DD


----------



## MTDEW

Quote:


> Originally Posted by *DiceAir*
> 
> Guys please explain to me how come my friends gigabyte windforce oc 7970 running clocks of 1000MHz core and 1375MHz memory can be faster than my club3d r9-280x running clocks of 1100MHz core and 1500 memory?


Quote:


> Originally Posted by *DiceAir*
> 
> We tested it in the same machine. i7-2600k @ 4.4GHz. So really only thing changed was the card.


Your memory could simply be generating and correcting errors at those clocks.
Try lowering your memory to his 1375 MHz clock and see if your performance / benchmark scores improve.

If they do, then you know that's it, and you'll have to test to find your memory clock's sweet spot.
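The sweet-spot hunt amounts to: step the memory clock, benchmark each step, and keep the clock that actually scores best, since error correction can make higher clocks slower. A rough sketch with purely simulated numbers (`simulated_benchmark` is a stand-in for a real Valley/3DMark run, and the 1500 MHz knee is invented; in practice you'd set the clock in Afterburner/Trixx and rerun the benchmark by hand):

```python
# Step through memory clocks and return the one with the best (simulated) score.

def simulated_benchmark(mem_clock_mhz: int) -> float:
    """Stand-in for a real benchmark run. In this toy model, score scales
    with clock up to 1500 MHz, where error correction starts eating the gains."""
    if mem_clock_mhz <= 1500:
        return float(mem_clock_mhz)
    return 1500.0 - (mem_clock_mhz - 1500) * 2.0

def find_sweet_spot(start_mhz: int, stop_mhz: int, step_mhz: int) -> int:
    """Return the clock in [start, stop] (inclusive) with the highest score."""
    best_clock, best_score = start_mhz, float("-inf")
    for clock in range(start_mhz, stop_mhz + 1, step_mhz):
        score = simulated_benchmark(clock)
        if score > best_score:
            best_clock, best_score = clock, score
    return best_clock

print(find_sweet_spot(1375, 1650, 25))  # 1500 in this simulation
```

The point of the toy model: past the stable clock, the score curve turns over, so "highest clock" and "highest score" stop being the same thing.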


----------



## DiceAir

Quote:


> Originally Posted by *MTDEW*
> 
> Your memory could simply be getting and correcting errors at those clocks.
> Try lowering your memory to his 1375mhz clocks and see if your performance / benchmark scores improve.
> 
> If they do then you know that is it, and you'll have to test to find your memory clocks sweet-spot.


Will try tonight. I'll also admit it wasn't on the same map in Black Ops 2, the only game we've tested so far, because he plays a lot of Black Ops 2.


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> not getting throttled. I checked MSI afterburner and my card is running full clocks the whole time.


What whole time? The GPU will throttle itself when it gets hot, and it won't get hot if you're playing, for example, _Angry Birds_. Run FurMark to check that the card's cooling works properly.








Quote:


> Originally Posted by *DiceAir*
> 
> Will try tonight. Will also admit. It wasn't on the same map in Black ops 2. The only game we tested for now cause he plays a lot of black ops 2.


If you've got Steam, I'd recommend downloading the trial version of 3DMark (it's free!) and using its benchmarks to test your cards; that way you know both cards are getting the same test.


----------



## MTDEW

Quote:


> Originally Posted by *DiceAir*
> 
> Will try tonight. Will also admit. It wasn't on the same map in Black ops 2. The only game we tested for now cause he plays a lot of black ops 2.


Oh, I wouldn't get too excited then, unless you run something like the Valley benchmark and your scores look way off from his.
He could get slightly higher scores in benchmarks with his 2600K vs. your 2500K if he is running the 2600K with HT on.

And looking at your sig, I see Crossfire; if you're testing with just one card and x-fire disabled, your single card will still be running at x8 instead of x16, since the second PCIe slot is still populated by your other card.

But neither of the above should affect performance in benchmarks designed to stress the GPU, like the Unigine Valley benchmark, by more than 1% or 2%.

And I'm with *BruceB*: be SURE your card isn't throttling under load.
Use MSI AB to monitor it in real time while benching/testing (which you already seem to know how to do).
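On the x8 vs x16 point, the peak numbers are easy to work out: PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding (8 data bits per 10 bits transferred), so usable bandwidth scales directly with lane count. A quick sketch of the arithmetic:

```python
# Peak one-direction bandwidth of a PCIe 2.0 link, per lane count.

def pcie2_bandwidth_gbps(lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s for a PCIe 2.0 link."""
    gt_per_s = 5.0             # raw transfer rate per lane (GT/s)
    encoding_efficiency = 0.8  # 8b/10b: 8 data bits per 10 bits on the wire
    bits_per_byte = 8
    return lanes * gt_per_s * encoding_efficiency / bits_per_byte

print(pcie2_bandwidth_gbps(16))  # 8.0 GB/s
print(pcie2_bandwidth_gbps(8))   # 4.0 GB/s
```

So x8 halves the peak transfer rate, but as noted above, GPU-bound benchmarks rarely saturate even the x8 link, which is why the measured difference is only a percent or two.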


----------



## DiceAir

Quote:


> Originally Posted by *MTDEW*
> 
> Oh, I wouldn't get too excited then, unless you run something like the Valley benchmark and your scores look way off to his.
> He could get slightly higher scores in benchmarks with his 2600k vs your 2500k if he is running the 2600k with HT on.
> 
> And looking at your sig, i see crossfire , which if you're testing with just one card with x-fire disabled, your single card will still be running at x8 instead of x16 since the second PCI-e slot is still populated by your other card.
> 
> But neither of the above should really effect performance much in benchmarks designed to stress the GPU like the Unigine Valley benchmark by more than 1% or 2%.


Forgot to change my specs. I now have a 3770K at stock that I borrowed from work. Just waiting for the 4790K to upgrade to a Crossfire motherboard and get Crossfire running again.


----------



## DiceAir

Quote:


> Originally Posted by *MTDEW*
> 
> Oh, I wouldn't get too excited then, unless you run something like the Valley benchmark and your scores look way off to his.
> He could get slightly higher scores in benchmarks with his 2600k vs your 2500k if he is running the 2600k with HT on.
> 
> And looking at your sig, i see crossfire , which if you're testing with just one card with x-fire disabled, your single card will still be running at x8 instead of x16 since the second PCI-e slot is still populated by your other card.
> 
> But neither of the above should really effect performance much in benchmarks designed to stress the GPU like the Unigine Valley benchmark by more than 1% or 2%.
> 
> And i'm with *BruceB*, be SURE your card isn't throttling under load.
> Use MSI AB to monitor it real time while benching/testing. (which you seem to already know how)


Yes, I can play any game and run any benchmark, and according to MSI Afterburner my GPU is not getting throttled. As I said, we swapped cards in his machine, so everything stayed the same except the graphics cards.


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> Yes I can play any game, do any benchmark and according to MSI afterburner my GPU is not getting throttled. As I said we changed cards in his machine so everything sort of stayed the same excepts for graphics cards.


What are your benchmark scores with those cards? Could you post the result links?
That could help us find the problem!


----------



## Recr3ational

Quote:


> Originally Posted by *DiceAir*
> 
> not getting throttled. I checked MSI afterburner and my card is running full clocks the whole time.


Isn't the 280x basically a 7970?


----------



## anubis1127

Quote:


> Originally Posted by *Recr3ational*
> 
> Isn't the 280x basically a 7970?


Yep, it is a 7970.


----------



## Recr3ational

Quote:


> Originally Posted by *anubis1127*
> 
> Yep, it is a 7970.


Then treat both cards as 7970s; some cards are just better than others.


----------



## agrims

Also remember that GDDR5 error-corrects its pants off before crapping out, and it sounds like that's what's happening. I came from a 7850 to the 280X, and my first 7850 would run ridiculous memory clocks but posted lower scores because of the corrections.


----------



## DiceAir

Quote:


> Originally Posted by *Recr3ational*
> 
> Then treat both of the cards as 7970s, some cards are better than others.


I know they're both 7970s, and that's why I find it strange that a lower-clocked card is faster than mine.


----------



## Crowe98

*COME ON ALREADY*


----------



## Kuhl

Quote:


> Originally Posted by *Crowe98*
> 
> *COME ON ALREADY*


That's kind of how I feel. I had to order a new 8-pin connector to replace one I misplaced, and I've had my 280X in hand since Sunday. It's just eating away at me.


----------



## Roaches

Got a second 270X Devil incoming... First time CrossFiring; hopefully it's just as good as my SLI setup.


----------



## Crowe98

Quote:


> Originally Posted by *Kuhl*
> 
> That's kind of how I feel. I had to order a new 8-pin connector to replace one I misplaced, and I've had my 280X in hand since Sunday. It's just eating away at me.


It's killing me, man, but I want to sell my sig rig and put the 280X in a fresh system...


----------



## creationsh

Quote:


> Originally Posted by *NameUnknown*
> 
> Glad to hear it. I haven't seen much Diamond hardware on OCN so I wasn't sure but at the same time I had to get a new card.
> 
> One other question. I see no reason this would be a problem since the chipsets are identical, but does mixing brands affect Xfire in any way?


Unless you wanna go over the 600+ pages, I'll try to sum it up for you.................................................................................................No.

Edit:
What I said does not apply to overclocking in crossfire. I look forward to seeing your results.


----------



## NameUnknown

Quote:


> Originally Posted by *creationsh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> Glad to hear it. I haven't seen much Diamond hardware on OCN so I wasn't sure but at the same time I had to get a new card.
> 
> One other question. I see no reason this would be a problem since the chipsets are identical, but does mixing brands affect Xfire in any way?
> 
> 
> 
> Unless you wanna go over the 600+ pages, I'll try to sum it up for you.................................................................................................No.
> 
> Edit:
> What I said does not apply to overclocking in crossfire. I look forward to seeing your results.

Didn't think it would, but thought I would check with someone first. When I get my second card I'll post some results, though it may not be for another month unless I can catch one cheap. I have a house payment, car payment, a 2-night trip, and the 4th of July coming up. All in all, that will run me around 1250 on top of my regular bills.


----------



## rdr09

Quote:


> Originally Posted by *Crowe98*
> 
> It's killing me, man, but I want to sell my sig rig and put the 280X in a fresh system...


Your sig rig can handle a Kingpin. Even two.


----------



## NameUnknown

One last question: which brands can you buy used and still RMA without having to go through the seller?


----------



## BruceB

Quote:


> Originally Posted by *NameUnknown*
> 
> One last question: which brands can you buy used and still RMA without having to go through the seller?


I don't think there are any brands where you _have to_ go through the seller, are there? I go through the seller just because I don't like to wait for my new card


----------



## Dasboogieman

Quote:


> Originally Posted by *BruceB*
> 
> I don't think there are any brands where you _have to_ go through the seller, are there? I go through the seller just because I don't like to wait for my new card


Actually, I did experience this once. I used to own a Palit GTX 570 Twin Fan, which I bought in Penang, Malaysia. To RMA the card, Palit requires you to fill out a form from their website; you then send the form to your original retailer, and they send it back to Palit. I thought about RMAing the card due to the shoddy thermal paste application and the warranty-void sticker, but having to jump through so many hoops really put me off.


----------



## NameUnknown

Quote:


> Originally Posted by *BruceB*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> One last question: which brands can you buy used and still RMA without having to go through the seller?
> 
> 
> 
> I don't think there are any brands where you _have to_ go through the seller, are there? I go through the seller just because I don't like to wait for my new card

I mean if I buy a used card. Since I wouldn't be the original buyer, I would have to go through the seller (the person I bought from) to get the warranty honored. I know some manufacturers go by serial number and some by owner. The thing is, I don't know which are which.


----------



## BruceB

Quote:


> Originally Posted by *NameUnknown*
> 
> I mean if I buy a used card. Since I wouldn't be the original buyer, I would have to go through the seller (the person I bought from) to get the warranty honored. I know some manufacturers go by serial number and some by owner. The thing is, I don't know which are which.


I see, I thought you meant seller as in the shop. If the original owner paid cash then you can take it back to the shop (assuming you have a receipt); if they paid by card then you could try the shop, but they may want to see ID.

Doing it through the manufacturer would be similar: if the original owner paid cash then they can't prove you're not the original owner, but if they paid by card then their name will be on the receipt you have to send in with your RMA, and it'll be obvious that you're not the original owner.

I hope you understand what I'm trying to say here








Quote:


> Originally Posted by *Dasboogieman*
> 
> Actually, I did experience this once. I used to own a Palit GTX 570 Twin Fan, which I bought in Penang, Malaysia. To RMA the card, Palit requires you to fill out a form from their website; you then send the form to your original retailer, and they send it back to Palit. I thought about RMAing the card due to the shoddy thermal paste application and the warranty-void sticker, but having to jump through so many hoops really put me off.


^^That's madness! It actually puts me off buying Palit...


----------



## NameUnknown

Quote:


> Originally Posted by *BruceB*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> I mean if I buy a used card. Since I wouldn't be the original buyer, I would have to go through the seller (the person I bought from) to get the warranty honored. I know some manufacturers go by serial number and some by owner. The thing is, I don't know which are which.
> 
> 
> 
> I see, I thought you meant seller as in the shop. If the original owner paid cash then you can take it back to the shop (assuming you have a receipt); if they paid by card then you could try the shop, but they may want to see ID.
> 
> Doing it through the manufacturer would be similar: if the original owner paid cash then they can't prove you're not the original owner, but if they paid by card then their name will be on the receipt you have to send in with your RMA, and it'll be obvious that you're not the original owner.
> 
> I hope you understand what I'm trying to say here
> 
> 
> 
> 
> 
> 
> 

I understand that. I sold two Sapphire HD4890s on here a long time ago, and IIRC one died and we had to RMA it. I helped the buyer out and we were able to do it. But that's what I'm trying to avoid: having to deal with the seller after it's bought. I know some brands don't require proof of purchase, as the warranty is based on the serial number of the component, not the owner. Since I will likely try to get another 280X cheap off eBay, I would prefer a brand that goes by serial number for warranty, as it would be a hassle otherwise should the card go south.

EDIT: How many lanes does a 280X need before it starts to saturate the PCIe bus and suffer in performance?


----------



## BruceB

Quote:


> Originally Posted by *NameUnknown*
> 
> I understand that. I sold two Sapphire HD4890s on here a long time ago, and IIRC one died and we had to RMA it. I helped the buyer out and we were able to do it. But that's what I'm trying to avoid: having to deal with the seller after it's bought. I know some brands don't require proof of purchase, as the warranty is based on the serial number of the component, not the owner. Since I will likely try to get another 280X cheap off eBay, I would prefer a brand that goes by serial number for warranty, as it would be a hassle otherwise should the card go south.
> 
> EDIT: How many lanes does a 280X need before it starts to saturate the PCIe bus and suffer in performance?


Right. I've only had to RMA 2 GFX cards, but I can say that PowerColor and Inno3D both want to see a copy of the receipt.

I think I would check whether they paid cash and bought it at a shop/chain I could get to; I prefer to handle problems like RMAs in person.

All 280Xs have the same number of lanes. A 280X cannot saturate a PCI-E 2.0 x16 bus.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Got a second 270X Devil incoming... First time CrossFiring; hopefully it's just as good as my SLI setup.


If it's anything like my Devils in X-Fire, it should rock. I get great scaling with mine; they beat most 290s with the same CPU as mine.


----------



## Roaches

Yeah, I saw the bench charts before making the purchase. I'm hoping to get good frame times and smoothness once I get it installed... The build quality of these cards is over the top. My server rig wouldn't look complete without a second Devil.

Should arrive by Monday next week.


----------



## NameUnknown

Quote:


> Originally Posted by *BruceB*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> I understand that. I sold two Sapphire HD4890s on here a long time ago, and IIRC one died and we had to RMA it. I helped the buyer out and we were able to do it. But that's what I'm trying to avoid: having to deal with the seller after it's bought. I know some brands don't require proof of purchase, as the warranty is based on the serial number of the component, not the owner. Since I will likely try to get another 280X cheap off eBay, I would prefer a brand that goes by serial number for warranty, as it would be a hassle otherwise should the card go south.
> 
> EDIT: How many lanes does a 280X need before it starts to saturate the PCIe bus and suffer in performance?
> 
> 
> 
> Right. I've only had to RMA 2 GFX cards, but I can say that PowerColor and Inno3D both want to see a copy of the receipt.
> 
> 
> 
> 
> 
> 
> 
> I think I would check if they paid cash and bought it at a shop/chain I could get to, I prefer to handle problems like RMAs in person.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All 280Xs have the same number of lanes. A 280X cannot saturate a PCI-E 2.0 x16 bus.

But can it saturate an x8 bus? Surely it would saturate an x4, no?


----------



## BruceB

Quote:


> Originally Posted by *NameUnknown*
> 
> But can it saturate an x8 bus? Surely it would saturate an x4, no?


I'm not sure, I think most XFire setups run x8/x8, so I expect that it won't _totally_ saturate an x8. I have no idea if it could saturate an x4 though

[EDIT]
That's a good question actually, I'd like to know the answer too


----------



## Recr3ational

Quote:


> Originally Posted by *BruceB*
> 
> I'm not sure, I think most XFire setups run x8/x8, so I expect that it won't _totally_ saturate an x8. I have no idea if it could saturate an x4 though
> 
> [EDIT]
> That's a good question actually, I'd like to know the answer too


I've seen somewhere that running cards at x16/x16 vs x8/x8 gives hardly any performance increase.
Probably minimal. I haven't got any idea about x4 though.


----------



## Crowe98

Quote:


> Originally Posted by *rdr09*
> 
> your sig rig can handle a Kingpin. even 2.


I know it can, but I don't like how the look of my rig turned out xD


----------



## Dasboogieman

Quote:


> Originally Posted by *BruceB*
> 
> I hope you understand what I'm trying to say here
> 
> 
> 
> 
> 
> 
> 
> 
> ^^That's madness! It actually puts me off buying Palit...


That wasn't the worst part. I told Palit that the original retailer I bought it from had gone bankrupt (I lied to see what their reaction would be), and they still wanted me to do the form thing, except sending it to one of their regional "partners", which meant I had to do the RMA through some no-name PC stores... provided those PC stores agreed, which they usually didn't, since it's not one of theirs.


----------



## BruceB

Quote:


> Originally Posted by *NameUnknown*
> 
> But can it saturate an x8 bus? Surely it would saturate an x4, no?


Quote:


> Originally Posted by *Recr3ational*
> 
> I've seen somewhere that running cards at x16/x16 vs x8/x8 gives hardly any performance increase.
> Probably minimal. I haven't got any idea about x4 though.


I found quite a lot of stuff on the internet about this; it's a question that lots of people are interested in, apparently.
I've tried to summarise it here as best I can, but it's quite a large subject, so if I haven't explained something clearly you can always respond or send me a PM about it!









The general consensus is that a PCI-E 2.0 x4 connection will very slightly bottleneck a 7970; see this image from _logicalincrements_ (we're looking at the right-hand side because the test on the left doesn't have enough throughput to saturate any of the buses):


When you read this table, it's important to know this:
Standards from highest bandwidth to lowest bandwidth:
PCI-E 3.0 x16
PCI-E 2.0 x16 = PCI-E 3.0 x8
PCI-E 1.1 x16 = PCI-E 2.0 x8 = PCI-E 3.0 x4
PCI-E 1.1 x8 = PCI-E 2.0 x4
PCI-E 1.1 x4
(when I say "=" I mean _has the same bandwidth as_ not _is_)

The image shows that a 7970 gets slightly (1 FPS, ~2%) lower FPS on a PCI-E 2.0 x4 bus than on a 2.0 x8 or 2.0 x16 bus. In other words, a 2.0 x4 link falls only marginally short of what a 7970 (AKA 280X) needs, so a 2.0 x8 link has bandwidth to spare. As PCI-E 2.0 x8 = PCI-E 1.1 x16 (bandwidth-wise), that means that even a 1.1 x16 connection has more than enough bandwidth for a 7970, and a 1.1 x8 (= 2.0 x4) connection will only slightly hinder a single 7970!








Now that I've researched this, I wonder why they even bothered with the PCI-E 3.0 standard?

*tl;dr
A PCI-E 2.0 x4 bus will only bottleneck a 7970 (AKA 280X) by ~2%. There is no substantial difference between PCI-E 2.0 x8 and PCI-E 2.0 x16 for a 7970 (AKA 280X).*
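For anyone who wants to sanity-check the ordering above: the per-lane rates are public spec figures (PCIe 1.1 ≈ 0.25 GB/s and 2.0 ≈ 0.5 GB/s per lane after 8b/10b encoding; 3.0 ≈ 0.985 GB/s after 128b/130b), so the equivalences reduce to simple multiplication. A quick sketch:

```python
# Approximate per-lane effective bandwidth in GB/s (after encoding overhead):
# PCIe 1.1: 2.5 GT/s with 8b/10b -> 0.25 GB/s, 2.0: 5 GT/s -> 0.5 GB/s,
# 3.0: 8 GT/s with 128b/130b -> ~0.985 GB/s.
PER_LANE = {"1.1": 0.25, "2.0": 0.5, "3.0": 0.985}

def bandwidth(gen, lanes):
    """Effective one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE[gen] * lanes

# The equivalences from the list above (3.0 lines are only approximate,
# since 128b/130b encoding is slightly more efficient than a clean 2x):
assert bandwidth("2.0", 8) == bandwidth("1.1", 16)   # both 4.0 GB/s
assert bandwidth("1.1", 8) == bandwidth("2.0", 4)    # both 2.0 GB/s

for gen, lanes in [("1.1", 8), ("2.0", 4), ("2.0", 8), ("2.0", 16), ("3.0", 16)]:
    print(f"PCIe {gen} x{lanes}: {bandwidth(gen, lanes):.2f} GB/s")
```

Running it prints the ladder from the list, with PCIe 2.0 x16 landing at the 8 GB/s figure used later in the thread.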


----------



## imran27

Quote:


> Originally Posted by *BruceB*
> 
> Now I've researched this I wonder why they even bothered with PCI-E 3.0 standards?


LOL

They even have PCIe 4.0 in the works; it will soon be finalized, and we'll probably see it on an AMD platform by next year. Development follows the same pattern: bandwidth has doubled again. Was it required? Why?


----------



## BruceB

Quote:


> Originally Posted by *imran27*
> 
> LOL
> 
> They even have PCIe 4.0 in the works; it will soon be finalized, and we'll probably see it on an AMD platform by next year. Development follows the same pattern: bandwidth has doubled again. Was it required? Why?


I didn't know PCIe 4.0 was already being planned; who is the driving force behind it? It's clearly not the consumer.

It seems to me to be not much more than a gimmick at the moment. I mean, these new buses _do_ have larger bandwidth, but you couldn't possibly use it.


----------



## Dasboogieman

Quote:


> Originally Posted by *BruceB*
> 
> I found quite a lot of stuff on the internet about this; it's a question that lots of people are interested in, apparently.
> I've tried to summarise it here as best I can, but it's quite a large subject, so if I haven't explained something clearly you can always respond or send me a PM about it!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The general consensus is that a PCI-E 2.0 x4 connection will very slightly bottleneck a 7970; see this image from _logicalincrements_ (we're looking at the right-hand side because the test on the left doesn't have enough throughput to saturate any of the buses):
> 
> 
> When you read this table, it's important to know this:
> Standards from highest bandwidth to lowest bandwidth:
> PCI-E 3.0 x16
> PCI-E 2.0 x16 = PCI-E 3.0 x8
> PCI-E 1.1 x16 = PCI-E 2.0 x8 = PCI-E 3.0 x4
> PCI-E 1.1 x8 = PCI-E 2.0 x4
> PCI-E 1.1 x4
> (when I say "=" I mean _has the same bandwidth as_ not _is_)
> 
> The image shows that a 7970 gets slightly (1 FPS, ~2%) lower FPS on a PCI-E 2.0 x4 bus than on a 2.0 x8 or 2.0 x16 bus. In other words, a 2.0 x4 link falls only marginally short of what a 7970 (AKA 280X) needs, so a 2.0 x8 link has bandwidth to spare. As PCI-E 2.0 x8 = PCI-E 1.1 x16 (bandwidth-wise), that means that even a 1.1 x16 connection has more than enough bandwidth for a 7970, and a 1.1 x8 (= 2.0 x4) connection will only slightly hinder a single 7970!
> 
> 
> 
> 
> 
> 
> 
> 
> Now I've researched this I wonder why they even bothered with PCI-E 3.0 standards?
> 
> *tl;dr
> A PCI-E 2.0 x4 bus will only bottleneck a 7970 (AKA 280X) by ~2%. There is no substantial difference between PCI-E 2.0 x8 and PCI-E 2.0 x16 for a 7970 (AKA 280X).*


Because of PCIe 3.0, XDMA is possible. CrossFire XDMA scaling is really bad with PCIe 2.0 x8. Plus, PCIe 3.0 also future-proofs against new SSDs.


----------



## BruceB

Quote:


> Originally Posted by *Dasboogieman*
> 
> Because of PCIe 3.0, XDMA is possible. CrossFire XDMA scaling is really bad with PCIe 2.0 x8. Plus, PCIe 3.0 also future-proofs against new SSDs.


IIRC I've only seen SSDs use the x1 slots (500MB/s) so far.

On a 2.0 x8 bus in xFire you should (theoretically) see a 2% bottleneck without XDMA; I could imagine that with XDMA soaking up even more bandwidth, a 2.0 x16 bus would probably get better results in xFire.

According to AMD, XDMA replaces the CrossFire bridge because the bridge didn't have the bandwidth to keep up with 4K resolutions. The bridge had a bandwidth of 900MB/s; a PCIe 2.0 x16 bus has a bandwidth of 8GB/s.

A single 7970 uses ~2040MB/s of bandwidth, so PCIe 3.0 could make sense when you're running 3 or more GPUs in xFire (so it's _not_ totally useless!) but a single GPU or 2-way xFire can be done on PCIe 2.0 x16 with bandwidth to spare.
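The headroom claim can be checked by plugging in the figures from this post (bridge = 900 MB/s, PCIe 2.0 x16 = 8 GB/s, ~2040 MB/s per 7970; note the per-card number is the rough estimate quoted above, not an official spec). A quick sketch:

```python
BRIDGE_BW_MB = 900    # old CrossFire bridge bandwidth, MB/s (figure quoted above)
PCIE2_X16_MB = 8000   # PCIe 2.0 x16 bandwidth, MB/s, one direction
GPU_BW_MB = 2040      # rough per-7970 usage quoted above, MB/s

# How many 7970-class streams fit in each pipe before it saturates?
bridge_cards = BRIDGE_BW_MB / GPU_BW_MB   # < 1: the bridge can't even carry one card's worth
pcie_cards = PCIE2_X16_MB / GPU_BW_MB     # ~3.9: 2-way xFire fits with room to spare
print(bridge_cards, pcie_cards)
```

Which matches the post: 2-way CrossFire is comfortable on 2.0 x16, and 3+ GPUs is where PCIe 3.0 starts to earn its keep.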


----------



## imran27

Quote:


> Originally Posted by *BruceB*
> 
> I didn't know PCIe 4.0 was already being planned; who is the driving force behind it? It's clearly not the consumer.
> 
> 
> 
> 
> 
> 
> 
> 
> It seems to me to be not much more than a gimmick at the moment. I mean, these new buses _do_ have larger bandwidth, but you couldn't possibly use it.


PCIe 4.0 will first come to AMD's server and HEDT platforms. I don't know exactly when, but it's expected sometime near the release of their next flagship GPU, since they need a CPU capable of driving them; they can't just keep recommending Intel CPUs for their flagship GPUs.









So the driving force is... AMD. What?? How?? Even a 295X2 is not bottlenecked by PCIe 2.0!
The reason is HSA & hUMA. AMD wants HSA & hUMA to work across the board for dGPUs, not just the iGPU, and there they might need that much bandwidth or probably even more.

So my guess is that HSA and hUMA are the driving forces behind PCIe 4.0.


----------



## BruceB

Quote:


> Originally Posted by *imran27*
> 
> PCIe 4.0 will first come to AMD's server and HEDT platforms. I don't know exactly when, but it's expected sometime near the release of their next flagship GPU, since they need a CPU capable of driving them; they can't just keep recommending Intel CPUs for their flagship GPUs.
> 
> 
> 
> 
> 
> 
> 
> 
> So the driving force is... AMD. What?? How?? Even a 295X2 is not bottlenecked by PCIe 2.0!
> The reason is HSA & hUMA. AMD wants HSA & hUMA to work across the board for dGPUs, not just the iGPU, and there they might need that much bandwidth or probably even more.
> So my guess is that HSA and hUMA are the driving forces behind PCIe 4.0.


That makes sense; memory can eat up a whole lot of bandwidth. Although HSA and hUMA are of no use to me and my AM3+ MB, it would be nice to see these things in future AMD CPUs/GPUs.


----------



## Dasboogieman

Quote:


> Originally Posted by *BruceB*
> 
> IIRC I've only seen SSDs use the x1 slots (500MB/s) so far.
> 
> On a 2.0 x8 bus in xFire you should (theoretically) see a 2% bottleneck without XDMA; I could imagine that with XDMA soaking up even more bandwidth, a 2.0 x16 bus would probably get better results in xFire.
> 
> According to AMD, XDMA replaces the CrossFire bridge because the bridge didn't have the bandwidth to keep up with 4K resolutions. The bridge had a bandwidth of 900MB/s; a PCIe 2.0 x16 bus has a bandwidth of 8GB/s.
> 
> A single 7970 uses ~2040MB/s of bandwidth, so PCIe 3.0 could make sense when you're running 3 or more GPUs in xFire (so it's _not_ totally useless!) but a single GPU or 2-way xFire can be done on PCIe 2.0 x16 with bandwidth to spare.


Phew!
Well, I won't be going Tri Fire or 4k anytime soon so I guess I can rest happy that there is only a 2% bottleneck

Cheers


----------



## BruceB

Quote:


> Originally Posted by *Dasboogieman*
> 
> Phew!
> Well, I won't be going Tri Fire or 4k anytime soon so I guess I can rest happy that there is only a 2% bottleneck
> 
> Cheers


Glad I could help

You could easily remove the bottleneck altogether by OC'ing your MB's north bridge by 2% or more (because this also OC's the PCIe buses)

Right, I've _really_ got to stop going on OCN and get on with revision now


----------



## imran27

Yes, he's right: a *2%* OC on the CPU/NB will nullify the bottleneck. So if you're on 2200 MHz, you need a minimum OC of 44 MHz to nullify your *2%* bottleneck
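The arithmetic behind that 44 MHz figure, as a quick sketch (it assumes bus bandwidth scales linearly with the CPU/NB clock, as the posts above do):

```python
def oc_needed(base_mhz, bottleneck):
    """Minimum clock bump needed to win back a given fractional bottleneck,
    assuming bus bandwidth scales linearly with the CPU/NB clock."""
    return base_mhz * bottleneck

# 2% bottleneck on a 2200 MHz CPU/NB -> 44 MHz, matching the figure above
print(oc_needed(2200, 0.02))
```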









I guess that hUMA won't apply to 3D stuff, meaning it won't be used for rendering and the like in games, but it can be very useful for assisting the CPU with other compute tasks needed to drive the game. Complex AI calculations, for example, could be done on the GPU (note I said done on, not offloaded to: the GPU would directly access system memory). That I think is really cool. This way users could pair a very humble CPU with a beast GPU and still not suffer a bottleneck, not even *2%*


----------



## NameUnknown

Now I just need to decide if I'm happy with 2 R9 280Xs in x16 slots, or if I want 3 of them in an x16, x16, and x4. My board has 2 x16 slots, 1 x8 slot, and 1 x4 slot. Unfortunately the x8 slot is right below the second x16, so I can't use it unless I go to WCing, which with 3 cards would be prohibitively expensive. The whole point of all this power is so I can sell my monitors and go to 3 1440p monitors down the road.

Of course, if the company I am contracting with right now doesn't hire me, it's all just dreams anyway


----------



## BruceB

Quote:


> Originally Posted by *NameUnknown*
> 
> Now I just need to decide if I'm happy with 2 R9 280Xs in x16 slots, or if I want 3 of them in an x16, x16, and x4. My board has 2 x16 slots, 1 x8 slot, and 1 x4 slot. Unfortunately the x8 slot is right below the second x16, so I can't use it unless I go to WCing, which with 3 cards would be prohibitively expensive. The whole point of all this power is so I can sell my monitors and go to 3 1440p monitors down the road.
> Of course, if the company I am contracting with right now doesn't hire me, it's all just dreams anyway


If you're using xFire bridges, PCIe 2.0 x4 shouldn't hinder you too much (not at all with some northbridge OC'ing, see my previous post). If you're using this new-fangled _XDMA_ (i.e. no xFire bridges) then you'll want to keep it at x8 at least, ideally x16

Good luck with the job!

[EDIT]
Check your MB manual; some MBs dynamically change the bandwidth depending on where cards are inserted, so you may get x16, x8, x8.


----------



## Tobe404

I've been tossing up whether to add a second 280X for CrossFire on my x16/x4 PCIe 2.0 mobo...
It really depends on the game, I think...
Some games can see as much as a 30% drop in performance on x16/x4 compared to x16/x16.
So we could assume that x8/x8 'could' have a 15% bottleneck.


----------



## BruceB

Quote:


> Originally Posted by *Tobe404*
> 
> I've been tossing up whether to add a second 280X for CrossFire on my x16/x4 PCIe 2.0 mobo...
> It really depends on the game, I think...
> Some games can see as much as a 30% drop in performance on x16/x4 compared to x16/x16.
> So we could assume that x8/x8 'could' have a 15% bottleneck.


The xFire scaling depends on the game (because it's driver-based), but the bandwidth of the PCIe buses will always be the same (because it's hardware-based).
You shouldn't see a difference between x8/x8 and x16/x16 if you're using xFire bridges.

According to this:

The difference between x8/x8 and x16/x4 is quite noticeable. I'm not totally sure how PCIe buses work together, but I'd _assume_ that when running x16/x4 the faster bus would 'downclock' to match the slower bus, so effectively x16/x4 = x4/x4. I'll see if I can find some proof and post back.


----------



## Tobe404

Guess I could always put the 280x in the 4x slot and see how it runs... But I think I'll wait and see if you find something first because it would save a lot of stuffing around. Haha.


----------



## NameUnknown

Quote:


> Originally Posted by *BruceB*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> Now I just need to decide if I'm happy with 2 R9 280Xs in x16 slots, or if I want 3 of them in an x16, x16, and x4. My board has 2 x16 slots, 1 x8 slot, and 1 x4 slot. Unfortunately the x8 slot is right below the second x16, so I can't use it unless I go to WCing, which with 3 cards would be prohibitively expensive. The whole point of all this power is so I can sell my monitors and go to 3 1440p monitors down the road.
> Of course, if the company I am contracting with right now doesn't hire me, it's all just dreams anyway
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you're using xFire bridges, PCIe 2.0 x4 shouldn't hinder you too much (not at all with some northbridge OC'ing, see my previous post). If you're using this new-fangled _XDMA_ (i.e. no xFire bridges) then you'll want to keep it at x8 at least, ideally x16
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck with the Job!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [EDIT]
> Check your MB manual; some MBs dynamically change the bandwidth depending on where cards are inserted, so you may get x16, x8, x8.

I'll dig up that manual tonight while I'm inventorying all my crap that is going to be funding everything I am doing to my computer. But from the sound of it, the best way to go would be TriFire with bridges. I suspect a hardware link will provide slightly better performance than a software-based link like XDMA: less overhead on the CPU, less bandwidth consumption on your lanes. All around better, IMO, to do bridges.


----------



## BruceB

Quote:


> Originally Posted by *Tobe404*
> 
> Guess I could always put the 280x in the 4x slot and see how it runs... But I think I'll wait and see if you find something first because it would save a lot of stuffing around. Haha.


Quote:


> Originally Posted by *NameUnknown*
> 
> I'll dig up that manual tonight while I'm inventorying all my crap that is going to be funding everything I am doing to my computer. But from the sound of it, the best way to go would be TriFire with bridges. I suspect a hardware link will provide slightly better performance than a software-based link like XDMA: less overhead on the CPU, less bandwidth consumption on your lanes. All around better, IMO, to do bridges.


Actually, XDMA is a hardware solution embedded in the GFX card; it has lower overhead and larger bandwidth than the bridges. If you're going to be using a large resolution (4K or above) then you could benefit from XDMA (assuming you have the PCIe bandwidth to spare)


----------



## NameUnknown

Trifire on my board is x16, x16, x8.

Will an HX1000 run all three of them?


----------



## smoke2

I'm torn between two cards:
the Sapphire 280 Dual-X and the more powerful Sapphire 280X Vapor-X Tri-X.

According to this site, the Vapor-X Tri-X is 6dB quieter:
http://ht4u.net/reviews/2014/sapphire_radeon_r9_280_dual-x_im_test/index12.php

How big is the difference in loudness between these two cards?
I have a Fractal R4 case.


----------



## Dasboogieman

Quote:


> Originally Posted by *smoke2*
> 
> I'm torn between two cards:
> the Sapphire 280 Dual-X and the more powerful Sapphire 280X Vapor-X Tri-X.
> 
> According to this site, the Vapor-X Tri-X is 6dB quieter:
> http://ht4u.net/reviews/2014/sapphire_radeon_r9_280_dual-x_im_test/index12.php
> 
> How big is the difference in loudness between these two cards?
> I have a Fractal R4 case.


6 dBA is a lot; the decibel scale is logarithmic.
Anyhoo, the Vapor-X should have a significant theoretical advantage in VRAM and VRM cooling by virtue of the vapor chamber. This should translate to slightly more overvolting headroom.
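To put a number on "a lot": because decibels are logarithmic, a 6 dB gap is roughly a 4x difference in sound power, and (using the common ~10 dB-per-perceived-doubling rule of thumb) about 1.5x in how loud it sounds. A quick sketch:

```python
def power_ratio(db):
    """Sound-power ratio corresponding to a dB difference."""
    return 10 ** (db / 10)

def perceived_ratio(db):
    """Rough perceived-loudness ratio (~10 dB per perceived doubling)."""
    return 2 ** (db / 10)

print(power_ratio(6))      # ~3.98x the sound power
print(perceived_ratio(6))  # ~1.5x as loud, roughly
```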


----------



## anubis1127

Anybody else in this thread have MSI Gaming R9 270s? I cannot get Afterburner to unlock voltage control for the life of me. Sapphire Trixx adjusts voltage just fine, but I prefer using AB.


----------



## miraldo

Hello guys.

I need your help.

I'm buying an R9 280X and I can't decide which manufacturer to choose.

I heard that the ASUS 280X has some problems.

Which 280X do you recommend? I love a quiet and cool GPU.


----------



## anubis1127

I like MSI for AMD cards. Top-notch RMA and service department.


----------



## miraldo

What about Gigabyte? Right now I own a Gigabyte 7870 and I like it.

I just don't know what to choose, I'm a little confused. I can't find any reviews comparing 280Xs from different manufacturers :S


----------



## anubis1127

Quote:


> Originally Posted by *miraldo*
> 
> What about Gigabyte? Right now I own a Gigabyte 7870 and I like it.
> 
> I just don't know what to choose, I'm a little confused. I can't find any reviews comparing 280Xs from different manufacturers :S


I believe the Gigabyte R9 280X has "locked" voltage; if you don't care about that, it should be fine. The Windforce cooler seemed to do an OK job on the old 7950 I had from them.


----------



## BruceB

Quote:


> Originally Posted by *miraldo*
> 
> Hello guys.
> I need your help.
> I'm buying an R9 280X and I can't decide which manufacturer to choose.
> I heard that the ASUS 280X has some problems.
> Which 280X do you recommend? I love a quiet and cool GPU.


Many reviewers say that the PowerColor 280X is the quietest (which is part of the reason I got one), but it's not the best for OC'ing. I can also say their customer service is top notch.


----------



## Recr3ational

I have PowerColor too. Both of mine are awesome: not voltage-locked, and they overclock fine. I can't vouch for the cooler as I never used it.


----------



## MoraisGT

I have a VTX3D, and when it was on air it was super quiet; the fans hardly spun above the default 20%.

Of course, you could set a manual fan profile so that the temperatures are a bit better.

Mine overclocks like crazy: it does 1920MHz on the memory without touching the voltage, and without messing with the core.
It does 1260/1800MHz 24/7, super stable


----------



## anubis1127

How well a card OCs has little to do with the AIB, and more to do with the silicon lottery. I bet we could find users from all the different AIB partners with good OCing cards.


----------



## MoraisGT

Quote:


> Originally Posted by *anubis1127*
> 
> How well a card OCs has little to do with the AIB, and more to do with the silicon lottery. I bet we could find users from all the different AIB partners with good OCing cards.


Exactly, it's just like CPUs.
However, brands like Gigabyte don't help by locking the voltage on their cards.


----------



## austinmrs

Hi guys!

I got a MSI R9 270x Hawk Edition.

I tried to OC it a little bit with MSI Afterburner, and it runs the Valley benchmark with no problem.

But then after playing some CS:GO, my PC just crashes: the image crashes, not the sound. Every time I have my GPU OC'd, CS:GO crashes, even though it runs Valley with no problems. What can I do?


----------



## anubis1127

Quote:


> Originally Posted by *austinmrs*
> 
> Hi guys!
> 
> I got a MSI R9 270x Hawk Edition.
> 
> I tried to OC a little bit with MSI Afterburner, and it runs the Valley benchmark with no problem.
> 
> But then after playing some CS:GO, my PC just crashes. The image crashes, not the sound. Every time I have my GPU OC'd, CS:GO crashes, although it runs Valley with no problems. What can I do?


Moar volts???


----------



## imran27

The Toxic IMO is the best for overclocking as well as silence: custom PCB, Tri-X cooler with vapor chamber, and many more features that many of the guys here know well...

One step down, I like the Tri-X...


----------



## austinmrs

Quote:


> Originally Posted by *anubis1127*
> 
> Moar volts???


I don't want to mess with voltage. I just want to use the +20 power on MSI Afterburner.

The stock clocks are 1150 core and 1400 memory.

I have tried:

Power Target 120%
GPU clock 1206 MHz
Memory clock 1550 MHz

It passes Valley without any problems, but then playing CS:GO, after some time it crashes, even though my temps are good.


----------



## agrims

Quote:


> Originally Posted by *MoraisGT*
> 
> Exactly, it's just like CPUs.
> However, brands like Gigabyte don't help by locking the voltage on their cards.


I own a Gigglebite and you can unlock the voltage, if you'd like to. There are a few BIOS mods out there that allow up to 1.3V...


----------



## anubis1127

Quote:


> Originally Posted by *austinmrs*
> 
> I don't want to mess with voltage. I just want to use the +20 power on MSI Afterburner.
> 
> The stock clocks are 1150 core and 1400 memory.
> 
> I have tried:
> 
> Power Target 120%
> GPU clock 1206 MHz
> Memory clock 1550 MHz
> 
> It passes Valley without any problems, but then playing CS:GO, after some time it crashes, even though my temps are good.


Which type of vram does your card have? Both of my MSI r9 270s have Elpida, if yours do too, that may be a bit high on the memory clock.

And valley isn't a very good stability test, at all.


----------



## austinmrs

Quote:


> Originally Posted by *anubis1127*
> 
> Which type of vram does your card have? Both of my MSI r9 270s have Elpida, if yours do too, that may be a bit high on the memory clock.
> 
> And valley isn't a very good stability test, at all.


How can I see what kind of VRAM it has?

So what would you recommend I run to test the stability of my OC?

Also, what values do you recommend starting with?


----------



## anubis1127

Quote:


> Originally Posted by *austinmrs*
> 
> How can I see what kind of VRAM it has?
> 
> So what would you recommend I run to test the stability of my OC?
> 
> Also, what values do you recommend starting with?


The latest version of GPU-Z should tell you:



I generally run real DX 11 game benchmarks / games. BF4, Crysis 3, Far Cry 3, Tomb Raider, are all pretty good tests for stability. Even Borderlands 2 is pretty sensitive to OCs, and not a bad test.

Try the stock core clock and 1500MHz on the memory, and see if you still crash. If not, gradually increase the core clock by 10-20MHz until you do crash.

You can also do a registry hack to get more than 20% Power Limit, that may help with stability too.
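The step-up procedure above can be sketched as a tiny search loop. This is only an illustration: the `is_stable` callback is a hypothetical stand-in for whatever manual test you actually run at each step (set the clock in Afterburner, then play a real game for a while):

```python
# Sketch of the incremental overclock search described above.
# is_stable() is a stand-in for a real manual test: apply the clock
# in Afterburner, then play an actual game (not just Valley).

def find_max_stable_clock(stock_mhz, step_mhz, is_stable):
    """Raise the core clock in fixed steps until the stress test
    fails, then return the last clock that still passed."""
    clock = stock_mhz
    while is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Example: pretend this particular chip is unstable above 1206 MHz.
best = find_max_stable_clock(
    stock_mhz=1150,
    step_mhz=10,               # 10-20 MHz steps, as suggested above
    is_stable=lambda mhz: mhz <= 1206,
)
print(best)  # 1200
```

In practice the callback is you playing BF4 for half an hour, which is why the whole search takes days rather than seconds.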


----------



## austinmrs

Yes, it's Elpida.


----------



## anubis1127

Right on, your mem clocks may be a bit too aggressive. I'll try OCing mine before I get ready to do some gaming. I generally just run them at stock, because I'm lame like that.


----------



## austinmrs

Quote:


> Originally Posted by *anubis1127*
> 
> Right on, your mem clocks may be a bit too aggressive. I'll try OCing mine before I get ready to do some gaming. I generally just run them at stock, because I'm lame like that.


Why don't you OC yours? The Hawk version has pretty decent cooling, right?

So to start: 1500 on memory and +20% on power with the stock core clock, right?


----------



## anubis1127

Quote:


> Originally Posted by *austinmrs*
> 
> Why don't you OC yours? The Hawk version has pretty decent cooling, right?
> 
> So to start: 1500 on memory and +20% on power with the stock core clock, right?


I have two of the MSI Gaming R9 270s, these cards:



They run cool and quiet; I honestly just don't game very much outside of the occasional hour or two of Diablo 3, which doesn't need an OC. This weekend will be the first time I try playing some FPS titles in a while (Titanfall, Payday 2, maybe some BF4), so I will try OCing for some more performance.

I would start there. It's up to you though; it's not an exact science.


----------



## austinmrs

But it would still be worth it to OC my 270X Hawk a bit, right? If it stays cool.


----------



## Arkanon

Quote:


> Originally Posted by *austinmrs*
> 
> Hi guys!
> 
> I got a MSI R9 270x Hawk Edition.
> 
> I tried to OC a little bit with MSI Afterburner, and it runs the Valley benchmark with no problem.
> 
> But then after playing some CS:GO, my PC just crashes. The image crashes, not the sound. Every time I have my GPU OC'd, CS:GO crashes, although it runs Valley with no problems. What can I do?


For actual gaming you might have to reduce the clocks a bit. 3DMark/Valley stable doesn't mean it's actually game stable; game-stable clocks are usually a bit lower than benchmark-stable clocks. Hope it helps. That said, there's no real risk in upping the voltage of that card to 1.3V or even higher.


----------



## anubis1127

Quote:


> Originally Posted by *austinmrs*
> 
> But it would still be worth it to OC my 270X Hawk a bit, right? If it stays cool.


Oh, it is definitely worth it to OC your card; that card was built to be OC'd. If I had a single card I would definitely be OCing to the limit. It's just that with two in crossfire I don't feel the need as much at my current resolution (1920x1200).


----------



## austinmrs

Ok, running mine at stock clock, 1500 memory, and +20% on power. Let's see if it crashes.


----------



## austinmrs

So should I just find my max memory OC with the stock core clock? I'm at 1500 now, so I'm increasing 10 by 10 to find my max memory OC. After I find it, push the core clock up 10 by 10 to find its max?


----------



## anubis1127

Quote:


> Originally Posted by *austinmrs*
> 
> So should I just find my max memory OC with the stock core clock? I'm at 1500 now, so I'm increasing 10 by 10 to find my max memory OC. After I find it, push the core clock up 10 by 10 to find its max?


Either way is fine. I typically try to find my max core OC first. I was just telling you about the memory clocks because it seemed too high for Elpida chips.

Maybe it's not, though; once my card is done folding this current WU, I'm going to try maxing out the vram clock and see what my Elpida chips can do too.

[edit] Just finished up, now for some OCing.


----------



## austinmrs

Quote:


> Originally Posted by *anubis1127*
> 
> Either way is fine. I typically try to find my max core OC first. I was just telling you about the memory clocks because it seemed too high for Elpida chips.
> 
> Maybe it's not, though; once my card is done folding this current WU, I'm going to try maxing out the vram clock and see what my Elpida chips can do too.
> 
> [edit] Just finished up, now for some OCing.


So what program do you recommend to stress the GPU? I don't want to play a game every time I OC a bit more.


----------



## anubis1127

Quote:


> Originally Posted by *austinmrs*
> 
> So what program do you recommend to stress the GPU? I don't want to play a game every time I OC a bit more.


Games are the best test, but Heaven is slightly harder on OCs than Valley, it seems; maybe I'm crazy. In 3DMark I can usually run clocks way higher than what is actually stable for gaming, so I wouldn't recommend that, unless you take what is stable in 3DMark and then subtract like 50MHz.


----------



## austinmrs

Quote:


> Originally Posted by *anubis1127*
> 
> Games are the best test, but Heaven is slightly harder on OCs than Valley it seems, maybe I'm crazy. 3DMark I can usually run clocks way higher than what is actually stable for gaming, so I wouldn't recommend that, unless you take what is stable in 3dmark, and then subtract like 50mhz.


Downloading Heaven then. Will try 1200/1500


----------



## anubis1127

Quote:


> Originally Posted by *austinmrs*
> 
> Downloading Heaven then. Will try 1200/1500


So far my R9 270s with Elpida seem stable at 1500MHz on the vram; I'll try bumping it up.


----------



## miraldo

The Sapphire Toxic looks like the best buy.

Now I'm looking at the 290. Is it worth the extra money?


----------



## anubis1127

Quote:


> Originally Posted by *miraldo*
> 
> The Sapphire Toxic looks like the best buy.
> 
> Now I'm looking at the 290. Is it worth the extra money?


Yes. Well, depending on how much more it is. But they are definitely a good bit faster.


----------



## Exxlir

Just bought a Sapphire R9 280X Toxic and it's brilliant.


----------



## BruceB

Quote:


> Originally Posted by *miraldo*
> 
> The Sapphire Toxic looks like the best buy.
> 
> Now I'm looking at the 290. Is it worth the extra money?


The Toxic is very powerful, but it's so loud it'll probably make your ears bleed.


----------



## Dasboogieman

Quote:


> Originally Posted by *miraldo*
> 
> The Sapphire Toxic looks like the best buy.
> 
> Now I'm looking at the 290. Is it worth the extra money?


The 290 is built more for high resolution gaming or heavy usage of MSAA or SSAA at lower resolutions. If you can already get your games to a decent level of image quality, at a steady 60FPS on 1080 or lower then the 290 is not really warranted.

That being said, they're cheap as chips at the moment after the mining boom, so you can probably pick one up for less than a 280X if you look hard enough.


----------



## Devildog83

Question: I just tried to run 3DMark11 with my CPU bumped up to 5.0 GHz, but for some reason my GPUs are running at 50% max. I can't figure out why. On the last run I did, they ran at almost 100%. Any thoughts on why this would happen?

Edit: I don't know what happened, but I ran Valley and it ran at near 100%, then tried 3DMark11 again and it ran at 100%, so it fixed itself I guess.


----------



## miraldo

Tnx for the help.

Need some help:

I'm currently using a Corsair CX600 v2 (40A on the 12V rail). Do I need to upgrade my PSU if I buy a 290 Tri-X?

My PC spec:

Gigabyte Z77M-D3H
Intel i5 2500K 3.3GHZ
HDD Samsung Spinpoint F3 1TB
SSD Crucial M500 120GB
Crucial BX 8GB
Corsair 600 CX V2


----------



## Dasboogieman

Quote:


> Originally Posted by *miraldo*
> 
> Tnx for the help.
> 
> Need some help:
> 
> I'm currently using a Corsair CX600 v2 (40A on the 12V rail). Do I need to upgrade my PSU if I buy a 290 Tri-X?
> 
> My PC spec:
> 
> Gigabyte Z77M-D3H
> Intel i5 2500K 3.3GHZ
> HDD Samsung Spinpoint F3 1TB
> SSD Crucial M500 120GB
> Crucial BX 8GB
> Corsair 600 CX V2


You will get away with it, but I strongly recommend an upgrade. Your PSU is known to be very noisy at heavy loads: http://www.xbitlabs.com/articles/cases/display/chieftec-coolermaster-corsair-zalman_13.html#sect0

A Sapphire 290 Tri-X pulls about 175W under a light game load, 225W under a heavy game load, and up to 300W when overvolted. Your CPU alone can use up to 95W at stock.
The following calculations assume your PSU's capacitors are pristine.
So your headroom on the 12V rail alone is about 480 - (50 + 175) = 255W, or 46.9% of the 12V rail capacity used, in the best-case scenario.
It's about 480 - (75 + 225) = 180W of headroom, or 62.5% of the 12V rail capacity used, under a heavier load (a Metro 2033 or Far Cry 3 style load); this loading scenario is likely your limit before your PSU fans really ramp up.
If you don't upgrade the PSU, you basically won't have any overvoltage headroom for your CPU or GPU.
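For anyone who wants to redo this arithmetic with their own parts, here's a minimal sketch. The wattage figures are the rough per-component estimates quoted above, not measurements:

```python
# Reproduce the 12 V rail headroom estimates above.
# Wattages are the rough per-component figures quoted in the post.

RAIL_W = 40 * 12  # CX600 v2: 40 A on the 12 V rail = 480 W

def headroom(cpu_w, gpu_w, rail_w=RAIL_W):
    """Return (spare watts, fraction of the rail in use)."""
    load = cpu_w + gpu_w
    return rail_w - load, load / rail_w

light = headroom(cpu_w=50, gpu_w=175)  # light gaming load
heavy = headroom(cpu_w=75, gpu_w=225)  # Metro 2033 / Far Cry 3 load

print(light)  # (255, 0.46875)  -> ~47% of the rail used
print(heavy)  # (180, 0.625)    -> 62.5% used
```

Swap in your own CPU/GPU estimates and rail amperage to see how much margin you have before overvolting eats it.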


----------



## Neocoolzero

Add me in, new member









After literally years of waiting to replace my old GTS 250, I finally bought a new GPU, and went red team this time. Here she is:











Now waiting for an EVGA SuperNOVA 750W G2 so I can finally use it, lol.


----------



## Devildog83

Quote:


> Originally Posted by *Neocoolzero*
> 
> Add me in, new member
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After literally years of waiting to replace my old GTS 250, I finally bought a new GPU, and went red team this time. Here she is:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now waiting for an EVGA SuperNOVA 750W G2 so I can finally use it, lol.


You Have been added. Nice Card! When you are up and running post your clocks. Thanks.


----------



## anubis1127

Quote:


> Originally Posted by *Dasboogieman*
> 
> Your CPU alone at stock can use up to 95W at stock.


TDP is not actual usage; it's just a number given to cooler manufacturers so they know how much heat they need to dissipate. A stock i5 2500K isn't going to use anywhere near 95W, more like 65-70W.


----------



## Neocoolzero

Quote:


> Originally Posted by *Devildog83*
> 
> You Have been added. Nice Card! When you are up and running post your clocks. Thanks.


Will do, ty


----------



## miraldo

Quote:


> Originally Posted by *Dasboogieman*
> 
> You will get away with it, but I strongly recommend an upgrade. Your PSU is known to be very noisy at heavy loads: http://www.xbitlabs.com/articles/cases/display/chieftec-coolermaster-corsair-zalman_13.html#sect0
> 
> A Sapphire 290 Tri-X pulls about 175W under a light game load, 225W under a heavy game load, and up to 300W when overvolted. Your CPU alone can use up to 95W at stock.
> The following calculations assume your PSU's capacitors are pristine.
> So your headroom on the 12V rail alone is about 480 - (50 + 175) = 255W, or 46.9% of the 12V rail capacity used, in the best-case scenario.
> It's about 480 - (75 + 225) = 180W of headroom, or 62.5% of the 12V rail capacity used, under a heavier load (a Metro 2033 or Far Cry 3 style load); this loading scenario is likely your limit before your PSU fans really ramp up.
> If you don't upgrade the PSU, you basically won't have any overvoltage headroom for your CPU or GPU.


I hope the 290 will run without any problems.

I just bought a new CX600 two months ago.









My two top picks for the 290 are the MSI and the Sapphire Tri-X. I think they're both great.


----------



## Dasboogieman

Quote:


> Originally Posted by *miraldo*
> 
> I hope the 290 will run without any problems.
> 
> I just bought a new CX600 two months ago.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My two top picks for the 290 are the MSI and the Sapphire Tri-X. I think they're both great.


Oh, you will be fine, don't get me wrong, as long as everything's at stock. Your biggest problem won't be whether there is enough wattage but how loud the PSU will be under load, since that is a known weak point.

Are you intending to watercool? If you are, then get the Tri-X. If you aren't ever going to watercool, then get the MSI Gaming edition.

That's because there are actually two models of the Gaming edition. The early models were pure reference boards with the Twin Frozr cooler, but the latest model (I can't say when they brought it out, but it's the model I have) has a larger 6+1+1 VRM assembly (instead of the stock 5+1+1). This means that the current crop of waterblocks won't fit, but the card will have much cooler VRMs under air cooling compared to the Tri-X.


----------



## miraldo

Quote:


> Originally Posted by *Dasboogieman*
> 
> Oh, you will be fine, don't get me wrong, as long as everything's at stock. Your biggest problem won't be whether there is enough wattage but how loud the PSU will be under load, since that is a known weak point.
> 
> Are you intending to watercool? If you are, then get the Tri-X. If you aren't ever going to watercool, then get the MSI Gaming edition.
> 
> That's because there are actually two models of the Gaming edition. The early models were pure reference boards with the Twin Frozr cooler, but the latest model (I can't say when they brought it out, but it's the model I have) has a larger 6+1+1 VRM assembly (instead of the stock 5+1+1). This means that the current crop of waterblocks won't fit, but the card will have much cooler VRMs under air cooling compared to the Tri-X.


Tnx









No, I don't think I will ever install watercooling.

So which MSI model is better, the one with two fans or the one with three?


----------



## Dasboogieman

Quote:


> Originally Posted by *miraldo*
> 
> Tnx
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No, I don't think I will ever install watercooling.
> 
> So which MSI model is better, the one with two fans or the one with three?


The 3-slot one (the Lightning) is better in terms of cooling but costs as much as a 780 Ti. I wouldn't get it unless it's really cheap.


----------



## miraldo

Quote:


> Originally Posted by *Dasboogieman*
> 
> The 3-slot one (the Lightning) is better in terms of cooling but costs as much as a 780 Ti. I wouldn't get it unless it's really cheap.


I can get a 3-month-old 290 MSI Gaming for 280 EUR, or a 3-month-old 290 Sapphire Tri-X for 300 EUR.

Which is the better choice? I like a silent/cool GPU, and of course OC performance too.


----------



## Dasboogieman

Quote:


> Originally Posted by *miraldo*
> 
> I can get a 3-month-old 290 MSI Gaming for 280 EUR, or a 3-month-old 290 Sapphire Tri-X for 300 EUR.
> 
> Which is the better choice? I like a silent/cool GPU, and of course OC performance too.


This is a hard choice.
1. The Tri-X is much cooler on the core and completely silent up to 50% fan speed. The VRMs are hotter, though, so when you overvolt, the fan speed has to be higher, thus louder.
2. The Gaming edition (assuming you get the late-model gold-choke one) is completely silent at 40% fan speed and gets loud fairly gently after that. The core isn't as cool as the Tri-X (about 10-15 degrees worse at the same fan %), but VRM 1 is much cooler (about 10 degrees better). It's a bit more balanced than the Tri-X if you want to overvolt, but isn't as silent. Basically, you won't know if you got the early or late model Gaming edition unless you see it in person or request photos.

I'd probably go with the Tri-X given the choice, because it is really silent at stock and with mild overclocks. It really only loses to the Gaming edition when you apply more than +100mV.


----------



## Crowe98

Add me to deh club!








*MSI R9 280x Gaming - Running 1020 core and 1500 mem at time of post*






*Could anyone recommend some stable overclocks for this card?*


----------



## BruceB

Quote:


> Originally Posted by *Crowe98*
> 
> Add me to deh club!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *MSI R9 280x Gaming - Running 1020 core and 1500 mem at time of post*
> 
> 
> 
> 
> 
> 
> 
> 
> *Could anyone recommend some stable overclocks for this card?*


Welcome to the club!
I'm afraid every card is different; not all MSI 280Xs will OC the same. It's best to try it yourself (in small, incremental steps to avoid damaging anything) and see where your particular card becomes unstable. Good luck and have fun!









[EDIT]
And tell us how far you get!


----------



## Recr3ational

What's up guys,
Major heat issues with my VRMs!
On both of my cards. Mine are underwater, so I have no idea why they're overheating.
Advice is appreciated.

Cheers.


----------



## anubis1127

Quote:


> Originally Posted by *Recr3ational*
> 
> What's up guys,
> Major heat issues with my VRMs!
> On both of my cards. Mine are underwater, so I have no idea why they're overheating.
> Advice is appreciated.
> 
> Cheers.


That is rather odd; with EK blocks the VRM temps should be pretty low.

Did you just install them? Could the thermal pads be the wrong thickness and not making proper contact with the blocks? I'm not sure what else would cause that, unless the blocks aren't getting proper liquid flow through them, or a giant air bubble is trapped inside.


----------



## miraldo

Quote:


> Originally Posted by *Dasboogieman*
> 
> This is a hard choice.
> 1. The Tri-X is much cooler on the core and completely silent up to 50% fan speed. The VRMs are hotter, though, so when you overvolt, the fan speed has to be higher, thus louder.
> 2. The Gaming edition (assuming you get the late-model gold-choke one) is completely silent at 40% fan speed and gets loud fairly gently after that. The core isn't as cool as the Tri-X (about 10-15 degrees worse at the same fan %), but VRM 1 is much cooler (about 10 degrees better). It's a bit more balanced than the Tri-X if you want to overvolt, but isn't as silent. Basically, you won't know if you got the early or late model Gaming edition unless you see it in person or request photos.
> 
> I'd probably go with the Tri-X given the choice, because it is really silent at stock and with mild overclocks. It really only loses to the Gaming edition when you apply more than +100mV.


Which program/game do you recommend most for the best GPU stress/stability test?

I only know of: Heaven 4.0, Furmark, Crysis 1 or 3, Metro Last Light?


----------



## Dasboogieman

Quote:


> Originally Posted by *miraldo*
> 
> Which program/game do you recommend most for the best GPU stress/stability test?
> 
> I only know of: Heaven 4.0, Furmark, Crysis 1 or 3, Metro Last Light?


Yeah, all of those are good, especially Metro on loop.
If you want to test memory stability, get the 14.6 beta and use MemtestCL. It's really sensitive and can weed out subtle memory instabilities.


----------



## anubis1127

I wouldn't use Furmark; no point really, other than drawing a lot of power and heating up your GPUs.


----------



## miraldo

I'm buying a 3-month-old 290 and I'm afraid there might be something wrong with the GPU.

The owner says he used the 290 for mining for one month and then gave up.

What do you think, guys? Is it wise to buy a 3-month-old 290?


----------



## miraldo

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yeah all those are good. Especially metro on loop.
> If you want to test memory stability, get the 14.6 beta and use MemtestCL. Its really sensitive and can weed out subtle memory instabilities.


Metro Last Light on loop? I don't understand, do you mean on Ultra settings?


----------



## anubis1127

Quote:


> Originally Posted by *miraldo*
> 
> Metro Last Light on loop? I don't understand, do you mean on Ultra settings?


You can loop the benchmark utility.


----------



## NameUnknown

I'm sad now; I got a quote for new tires, 1k out the door. There go more GPUs for a while.


----------



## Recr3ational

Quote:


> Originally Posted by *anubis1127*
> 
> That is rather odd, with EK blocks the vrm temps should be pretty low.
> 
> Did you just install them? Could the thermal pads be the wrong thickness and not making proper contact with the blocks? I'm not sure what else would cause that, unless the blocks aren't getting proper liquid flowing through them/giant air bubble trapped inside.


I'm thinking air, as it was perfectly fine before.


----------



## agrims

Quote:


> Originally Posted by *miraldo*
> 
> Which program/game do you recommend the most, so I can do best stress stability GPU test?
> 
> I know just: Heaven 4.0, Furmark 3D, Crysis 1 or 3, Metro Last Light?


Quote:


> Originally Posted by *Dasboogieman*
> 
> Yeah all those are good. Especially metro on loop.
> If you want to test memory stability, get the 14.6 beta and use MemtestCL. Its really sensitive and can weed out subtle memory instabilities.


Do not use Furmark!! It has a good chance of screwing cards up; I think I can find some info on it. These new cards shouldn't be running Furmark, as it places too much stress on the components. However, Valley, 3DMark and the like are OK to use as they are basically souped-up game graphics...


----------



## BruceB

Quote:


> Originally Posted by *agrims*
> 
> Do not use Furmark!! It has a good chance of screwing cards up; I think I can find some info on it. These new cards shouldn't be running Furmark, as it places too much stress on the components. However, Valley, 3DMark and the like are OK to use as they are basically souped-up game graphics...


^^No.









If your card's cooling and your PSU are capable, your card will be fine. Just keep an eye on the temps; 85°C is the max for a 7970 / 280X.









If your card goes over 85°C during a _Furmark_ run then there's something wrong with it and you need to RMA it. End of.


----------



## anubis1127

Until the VRMs go pop, boom, and your card dies.


----------



## BruceB

Quote:


> Originally Posted by *anubis1127*
> 
> Until the vrms go pop, boom, and your card dies.


*If your card's cooling* and your PSU *are capable, your card will be fine*. Just keep an eye on the temps; 85°C is the max for a 7970 / 280X.









Max temp for VRMs is 100°C; if they get that hot then there's something wrong with your cooling.

They're talking about stress testing a GPU, and Furmark is king when it comes to finding out if your GPU's cooler and your PSU can hack it.


----------



## Kuhl

Got my 280x installed finally. Can I join the club now? No big overclocks until I overhaul my entire rig.


----------



## BruceB

Quote:


> Originally Posted by *Kuhl*
> 
> 
> 
> 
> 
> 
> 
> 
> Got my 280x installed finally. Can I join the club now? No big overclocks until I overhaul my entire rig.


Which card have you got? Can we haz a rig-shot plz?


----------



## austinmrs

I got an MSI R9 270X Hawk.

It isn't a big overclocker, since at 1550 memory I crash in CS:GO.

Is overvolting it worth it?


----------



## Kuhl

Quote:


> Originally Posted by *BruceB*
> 
> Which card have you got? Can we haz a rig-shot plz?


I have an ASUS R9 280X DC2. My rig isn't pretty as I had to rewire it a bit. I'll take a picture later today though.


----------



## Takla

Quote:


> Originally Posted by *BruceB*
> 
> Which card have you got? Can we haz a rig-shot plz?




check the red box


----------



## anubis1127

Quote:


> Originally Posted by *Takla*
> 
> check the red box


I looked earlier, but all they had were films..


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> I looked earlier, but all they had were films..










Nope, no GPUs there.

I agree; Furmark is only good for stress testing, and from what I have heard it has killed some components, so use it at your own risk. I for one do not see the need for torture tests, but that's just me.


----------



## Dasboogieman

Quote:


> Originally Posted by *anubis1127*
> 
> I wouldn't use furmark, no point really, other than drawing a lot of power, and heating up your GPUs.


Quote:


> Originally Posted by *agrims*
> 
> Do not use furmark!! It has a good possibility of screwing cards up.. I think I can find some info on it. These new cards shouldn't be running furmark, as it places too much stress on the chipsets. However valley, 3d mark and the like are ok to use as they are basically supped up game graphics...


Furmark is more of a quick and dirty way of testing the cooling system. It should never be run for more than 3 minutes. I often use it, say, when I've upgraded the thermal paste or the VRM cooling and need something to validate the improvement. Instead of waiting for Metro to run for 5 minutes (for temperatures to level off at their true state) I just run Furmark for 1-2 minutes. Though I agree, it definitely isn't a stability test, nor is it safe to run for long periods.


----------



## miraldo

Omg, I f***ed up good.

I just bought an R9 290 and then realized it only has DVI-D for display output. But my Samsung BX 2235 display has VGA :S

Is there any option to connect VGA to DVI-D/DVI-I?


----------



## anubis1127

Quote:


> Originally Posted by *miraldo*
> 
> Omg, I f***ed up good.
> 
> I just bought an R9 290 and then realized it only has DVI-D for display output. But my Samsung BX 2235 display has VGA :S
> 
> Is there any option to connect VGA to DVI-D/DVI-I?


Check here:

http://www.kvmswitchtech.com/dvi-d-to-vga-converter-c11647.htm

It may be cheaper to get a new monitor though.


----------



## agrims

Quote:


> Originally Posted by *miraldo*
> 
> Omg, I f***ed up good.
> 
> I just bought an R9 290 and then realized it only has DVI-D for display output. But my Samsung BX 2235 display has VGA :S
> 
> Is there any option to connect VGA to DVI-D/DVI-I?


http://www.amazon.com/dp/B0002OF2EI?psc=1

30 bucks. Or do yourself a favor and buy a new monitor.


----------



## anubis1127

Quote:


> Originally Posted by *agrims*
> 
> http://www.amazon.com/dp/B0002OF2EI?psc=1
> 
> 30 bucks. Or do yourself a favor and buy a monitor for yourself..


Um, no. That will not work for him. A) It's not even the correct plug. B) His card doesn't have DVI-I. If it did, he could just pick up a $2 DVI-I to VGA adapter. (Actually the card would probably come with one in the box if that were the case.)


----------



## agrims

Quote:


> Originally Posted by *anubis1127*
> 
> Um, no. That will not work for him. A) its not even the correct plug. B) his card doesn't have dvi-i. If it did he could just pick up a $2 dvi-i to vga adapter. (Actually the card would probably come with one in the box if that were the case)


My apologies. Totally overlooked DVI-A.... In that case, I would look at option #2. Decent 1080p monitors can be had for less...


----------



## miraldo

Quote:


> Originally Posted by *agrims*
> 
> My apologies. Totally overlooked DVI-A.... In that case, I would look for option #2. Decent monitors can be had at 1080p for less...


But I don't want a new monitor. I'm happy with the Samsung BX2335. I've only had it two years. It is great, and it has 2ms.

So if I don't buy a new monitor, is the R9 290 useless for me?









If I'd known that the 290 has only DVI-D, I would have bought a 280X with DVI-I.


----------



## anubis1127

Quote:


> Originally Posted by *miraldo*
> 
> But I don't want a new monitor. I'm happy with the Samsung BX2335. I've only had it two years. It is great, and it has 2ms.
> 
> So if I don't buy a new monitor, is the R9 290 useless for me?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I'd known that the 290 has only DVI-D, I would have bought a 280X with DVI-I.


Found a cheaper cable for you:

http://www.startech.com/m/AV/Converters/Video/DVI-D-to-VGA-Active-Adapter-Converter-Cable-1920x1200~DVI2VGAE

That one is only $45. I don't know how good it is.

You need an active DVI-D to VGA converter if you want to keep that monitor.

Although I don't know why you would; VGA is pretty terrible for image quality.


----------



## anubis1127

Wait a minute..

This monitor?
http://www.samsung.com/us/computer/monitors/LS23B3UVMV/ZA-specs

http://www.newegg.com/Product/Product.aspx?Item=N82E16824001419

It lists 1 DVI input.

Just use that.


----------



## Kuhl

What drivers work best for you guys? I'm currently using 14.6 or whatever the supposed Watch_Dogs fix was. My 280X has run fine today besides some errors in BF3, which I can overlook because that game is meh anyway.


----------



## anubis1127

I'm using 14.6 beta for folding and gaming without issue thus far.


----------



## BruceB

Quote:


> Originally Posted by *anubis1127*
> 
> I'm using 14.6 beta for folding and gaming without issue thus far.


Do you play BF4 with the 14.6 beta? I find I get random crashes with it (I'm using a Powercolor 280X).


----------



## Recr3ational

Right, air bubbles are screwing up my rig.
When I use one GPU, it doesn't overheat.


----------



## anubis1127

Quote:


> Originally Posted by *BruceB*
> 
> Do you play BF4 with the 14.6 beta? I find I get random crashes with it (I'm using a Powercolor 280X).


I'll try tonight, I was using my 780 last time I played that game.


----------



## miraldo

Quote:


> Originally Posted by *anubis1127*
> 
> Wait a minute..
> 
> This monitor?
> http://www.samsung.com/us/computer/monitors/LS23B3UVMV/ZA-specs
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16824001419
> 
> It lists 1 DVI input.
> 
> Just use that.


Yes, this is my monitor.

Are you saying that this monitor has a DVI input?

But where? I cannot see it.

How do I connect the GPU and monitor? I'm confused :S


----------



## anubis1127

Quote:


> Originally Posted by *miraldo*
> 
> Yes, this is my monitor.
> 
> Are you saying that this monitor has a DVI input?
> 
> But where? I cannot see it.
> 
> How do I connect the GPU and monitor? I'm confused :S


According to the Samsung and Newegg spec sheets, yes.

Google Image search turned up this:


----------



## miraldo

Quote:


> Originally Posted by *anubis1127*
> 
> According to the Samsung, and newegg specs sheet, yes.
> 
> Google Image search turned up this:


This is my monitor. I have a DVI-I input.

But I need DVI-D for the R9 290, right?

Do I just need to buy a DVI-I to DVI-D cable?


----------



## anubis1127

Quote:


> Originally Posted by *miraldo*
> 
> This is my monitor. I have dvi-i input.
> 
> But I ned dvi-d for r9 290 right?
> 
> Do I need just to buy a DVI-I to DVI-D cable?


I think you should be fine. DVI-I supports both Digital and Analog, so your input should accept the digital signal from the DVI-D output of the R9 290.

Somebody correct me if I'm wrong.


----------



## BruceB

Quote:


> Originally Posted by *anubis1127*
> 
> I'll try tonight, I was using my 780 last time I played that game.


That would be cool. I'm using Mantle too, if that makes a difference(?).


----------



## miraldo

Quote:


> Originally Posted by *anubis1127*
> 
> I think you should be fine. DVI-I supports both Digital and Analog, so your input should accept the digital signal from the DVI-D output of the R9 290.
> 
> Somebody correct me if I'm wrong.


Thank you anubis1127!!! You made my day. Now I can finally enjoy playing games on MAX.

I just bought a cable with DVI-D connectors on both sides and voilà!

However, this is the best forum I have ever seen. Friendly people who help immediately.


----------



## anubis1127

You're welcome.


----------



## PCSarge

Quote:


> Originally Posted by *anubis1127*
> 
> You're welcome.


You're... EVERYWHERE.....


----------



## anubis1127

Haha, I just don't unsubscribe from threads much. I try to help when I can.

Other times I try to be a smart "donkey" when I can.


----------



## muhd86

How do we check a used card? I plan to get 4 R9 280X Twin Frozr Gaming GPUs.

How do I check if each one is good to go? Physically they look like new, but still.

Is running FurMark enough, or 5 loops of Metro 2033 per GPU?

Plus, how high can these cards be overclocked?

Sent from my C6802 using Tapatalk


----------



## buttface420

Quote:


> Originally Posted by *BruceB*
> 
> Do you Play BF4 on with the 14.6 beta? I find I get random crashes with it (I'm using a Powercolor 280X).


I'm using 14.6 beta on an R9 280X with Mantle... no problems here.

BF4 has random crashes anyway.


----------



## buttface420

Quote:


> Originally Posted by *miraldo*
> 
> Omg. I f***ed up good.
> 
> I just bought an R9 290 and then realized it only has DVI-D for display output. But my display, a Samsung BX 2235, has VGA :S
> 
> Is there any option to connect VGA to DVI-D/DVI-I?


Dude, just get an HDMI cord and hook it up to a flatscreen TV, that's what I do.

My monitor is a 55-inch plasma; makes gaming so much fun in the living room with a 360 controller and wireless mouse/keyboard combo!


----------



## anubis1127

Quote:


> Originally Posted by *buttface420*
> 
> dude just get a hdmi cord and hook it up to a flatscreen tv thats what i do
> 
> my monitor is a 55 inch plasma,makes gaming so much fun in living room with 360 controller and wireless mouse/keyboard combo!


Mmmm....motion blur.


----------



## BruceB

Quote:


> Originally Posted by *buttface420*
> 
> im using 14.6 beta on a r9 280x with mantle...no problems here.
> bf4 has random crashes anyways


I've heard from other people that BF4 crashes on them, but for me it's only crashed since I updated to the 14.6 beta. When it's not crashing I get smoother performance, so I don't really want to downgrade again...

Quote:


> Originally Posted by *anubis1127*
> 
> Mmmm....motion blur.


lol beat me to it!


----------



## buttface420

Quote:


> Originally Posted by *anubis1127*
> 
> Mmmm....motion blur.


Not so bad on a Panasonic plasma, I don't really notice it at all.


----------



## creationsh

Quote:


> Originally Posted by *BruceB*
> 
> I've heard from other People that BF4 crashes on them but for me it's only crashed since I updated to 14.6 beta. When it's not crashing I get smoother Performance so I don't really want to downgrade again...
> 
> 
> 
> 
> 
> 
> 
> 
> lol beat me to it!


No crashes here once I found the right settings; BF4 has run beautifully since. There are still artifact issues with shadows (BF4, Thief, Metro; the new Metro patch fixed its shadow issue), but other than that it's perfect. It's hard to know what to blame when things don't go as intended: it could be the game, the driver, the hardware, the voltage, and the list goes on.


----------



## Devildog83

Quote:


> Originally Posted by *creationsh*
> 
> No crash here, once I found the right setting. BF4 ran beautifully since. There is still artifact issues with shadows (bf4, thief, metro (the new metro patch fixed shadow issue)), other than that it is perfect. Kind of hard to know what to blame when things don't go as it was intended. Could be the game, driver, hardware, voltage, and the list goes on.


I just bought Metro: Last Light for about $7 off Steam; I will play it and let you know how it all works with 14.6 and X-Fire.


----------



## muhd86

I am about to install R9 280X MSI Twin Frozr GPUs in my SR-2. I just wanted to know which AMD drivers are most stable; the latest 14.6 ones on guru3d.com?

Do BF4 and Mantle play smoothly, with no issues? Because before, on Nvidia, I never had any issues with it.


----------



## anubis1127

Quote:


> Originally Posted by *muhd86*
> 
> i am about to install r280x msi twin frozer 3 gpus n my sr2 , just wanted to know which amd drivers are more stable the 14.6 latest ones on guru3d.com ,
> 
> does bf4 and mantle play smoothly with no issues , coz before on nvidia i never had no issues with it


14.6 seems fine for me for both gaming and folding.

BF4 with Mantle still isn't a smooth experience on GCN 1.0 cards, AFAIK. At least not on my R9 270.


----------



## DiceAir

Quote:


> Originally Posted by *anubis1127*
> 
> 14.6 seems fine for me for both gaming and folding.
> 
> BF4 and Mantle still isn't a smooth experience on GCN 1.0 cards AFAIK. At least not on my R9 270.


I think it's more of a BF4 issue. I hope DICE can fix it soon; although I'm running an R9 280X, I still want more people to experience Mantle.


----------



## rdr09

Quote:


> Originally Posted by *DiceAir*
> 
> I think it's more of a BF4 issue. I hope Dice can fix this soon although I'm running a r9 280x I still want more people to experience mantle


I agree. Once you experience Mantle in BF4 working as it should, there is no going back.


----------



## anubis1127

Quote:


> Originally Posted by *DiceAir*
> 
> I think it's more of a BF4 issue. I hope Dice can fix this soon although I'm running a r9 280x I still want more people to experience mantle


Oh, I don't doubt that. I'm looking forward to trying some more titles as they come out. I've been waiting for Thief to go on sale to pick it up and give it a go.


----------



## ad hoc

Has anyone here run FurMark on their 270 or 270X?


----------



## buttface420

Quote:


> Originally Posted by *rdr09*
> 
> i agree. once you experience mantle in BF4 working as it should - there is no going back.


This. Mantle with 14.6 on my Sapphire Dual-X R9 280X: there is no going back.

Although, they say Mantle works better on PCs that are lacking; if you have top-notch equipment there is no real difference from DirectX 11. In my case my CPU is an E5450, which bottlenecks my 280X, so using DX11 I was getting around 30-50 FPS on 64-player maps. I downloaded 14.6 and switched to Mantle, and all of a sudden I'm getting lows of 50-70 and highs of up to 120, with an average of 80-90! This is with everything maxed out at 1080p.

So for me, Mantle was a huge improvement.


----------



## Roaches

Just got home to a huge brown box at my front door.

Gonna let the card cool down, since it was pretty warm when I removed it from the antistatic bag. The UPS man must've left the box out in the open heat hours ago.


----------



## Crowe98

Quote:


> Originally Posted by *Roaches*
> 
> Just got home and huge brown a box at my front door
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna let the card cool down since it was pretty warm when I removed it from the Antistatic bag. UPS man must've left it hours ago exposed the box in the open heat.


UPS man.. Haha. Over in Australia we call them the 'Posty'.


----------



## Roaches

Quote:


> Originally Posted by *Crowe98*
> 
> UPS man.. Haha. Over in Australia we call them the 'Posty'.


Well, to be honest, I'd prefer UPS over USPS in my area; the postmen are some of the worst in my city, and the post office has horrendous service. They're so lazy that they'll leave a notice slip in my mailbox instead of knocking on my front door when I'm home, even when the packages are small enough to fit in the mailbox.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Just got home and huge brown a box at my front door
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna let the card cool down since it was pretty warm when I removed it from the Antistatic bag. UPS man must've left it hours ago exposed the box in the open heat.


Most excellent dude!!!!


----------



## Crowe98

So. How are we?

My MSI R9 280X Gaming has been artifacting a fair bit lately. In fact, as I am typing this, the page is getting weird glitches and patterns across the screen.

Upon running the 3DMark demo (Fire Strike), the screen was completely black for the first test; for the second, I could barely see what was going on. All kinds of colours filled up the screen, jittering around and artifacting completely.

I'm not sure what to do, guys.

*3DMark error message*

Unexpected error running tests.
Workload Single init returned error message: DXGI call IDXGISwapChain::SetFullscreenState failed [-2005270494]:

The requested functionality is not supported by the device or the driver.

DXGI_ERROR_NOT_CURRENTLY_AVAILABLE


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> Most excellent dude!!!!


Yep, right now in the process of testing the new arrival.

Gonna have some CrossFire fun after stress testing a bit.




Quote:


> Originally Posted by *Crowe98*
> 
> So. How are we?
> 
> My MSI R9 280x Gaming has been artifacting a fair bit lately. In fact as I am typing this this page is getting weird glitches and patterns across the screen.
> 
> Upon running 3DMark Demo, (Firestrike) the screen was completely black for the first test, then for the second, I could barely see what was going on. There was all kinds of colours filling up the screen, jittering around and artifacting completely.
> 
> I'm not sure what to do guys.
> 
> *3DMark error message*
> 
> Unexpected error running tests.
> Workload Single init returned error message: DXGI call IDXGISwapChain::SetFullscreenState failed [-2005270494]:
> 
> The requested functionality is not supported by the device or the driver.
> 
> DXGI_ERROR_NOT_CURRENTLY_AVAILABLE


I'd try downclocking the core and memory a bit to find the cause of the instability... though it sounds like a VRAM issue on your end.


----------



## NameUnknown

What are some sub $20 games that can push the 280X that are still fun and have a story?


----------



## Roaches

I just started testing it on Minecraft and I'm getting unusually high temps, around the 80s Celsius, with the OptiFine + Sonic's Unbelievable Shaders mod in the first few minutes of playing and flying around the scenery in creative mode. I get the feeling it's either bad cooler contact pressure (the backplate screws holding the cooler) or badly applied TIM from the OEM. Gonna have to pull it apart and tighten it up.

Normally my 680s and my first Devil card float around the low 60s Celsius under intense load in this shader mod.

EDIT: decided to remove the cooler and replace the TIM with Gelid GC-Extreme.


----------



## Roaches

Update: it seems the previous owner applied CLU to the die, but it failed to conduct heat away properly... I just cleaned the die with a swab dipped in acetone and noticed the liquid metal on the cooler base itself. Quite an interesting and dangerous attempt, since the CLU would probably have killed the card if I hadn't removed the cooler sooner. Even worse if it had contacted an SMD capacitor or other PCB components.


----------



## tongerks

I'm using an R9 280X Dual-X; is there any known issue with the fan making noise? It's like it's hitting the fan frame.


----------



## Crowe98

Quote:


> Originally Posted by *Roaches*
> 
> I just started testing it on Minecraft and I'm getting unusually high temps around 80s Celsius with the optifine + Sonic unbelievable shaders mod on the first few minutes of playing and flying around the scenery in creative mode. I got the feeling its either bad cooler contact pressure (backplate screws holding the cooler) or badly applied TIM from the OEM. Gonna have to pull it out and tighten it up.
> 
> Normally my 680s and first devil card usually floats around low 60s Celsius under intense load in this shader mod.


Quote:


> Originally Posted by *Roaches*
> 
> Update, It seems the previous owner applied CLU to the die but failed to conduct heat away properly...I just cleaned the die out with swab dipped in acetone. and noticed the liquid metal on the cooler base itself. Quite interesting and dangerous attemp since CLU would probably kill the card if I didn't remove the cooler sooner. Even worse if it contacted an SMD capacitor and other PCB devices.


Lucky you did it now and not later, man. Good that you got it sorted.

As per my previous post, I rolled back my driver to 14.4 (I was using the 14.6 beta driver) and restarted; all is good.

I'm now starting to overclock my card, and I'm recording my results in an Excel graph. I will post that later when I'm finished, if anyone is interested.


----------



## Roaches

Quote:


> Originally Posted by *Crowe98*
> 
> Lucky you did it now and not later man, good that you got it sorted.
> 
> As per my previous post, I rolled back my driver to 14.4 (was using the beta 14.6 driver) and restarted, all is good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm now starting to overclock my card, and I'm indexing my results in an excel graph. I will post that later when I'm finished if anyone is interested.


I'm glad it wasn't hardware related... I'm still on the 14.4 WHQL drivers.

Also:

Color me surprised! Gelid GC-Extreme dropped temps to a 54°C average during a 30-minute burn-in test in Minecraft with OptiFine + Sonic's Shaders mod, all settings cranked to the max. That's a huge drop, from around a 62 ±1°C average down to a 54°C average, at loads of 80-90 percent the whole time!

Dunno if it's worth voiding the warranty to replace the stock TIM on my other Devil 270X, though it feels really tempting.


----------



## Crowe98

Quote:


> Originally Posted by *Roaches*
> 
> I'm glad It wasn't hardware related...I'm still on WHQL 14.4 drivers.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also
> 
> Color me surprised! Gelid GC-Extreme drop temps to 54 degree Celsius average during the 30 minute burn in test in Minecraft with Optifine + Sonic's Shaders mod with all settings cranked to the max. Thats a huge drop from around 62 +-1 degree average to 54 Celsius average on average loads of 80-90 percent the whole time!
> 
> 
> 
> 
> 
> 
> 
> Dunno if its worth voiding the warranty to replace the stock TIM on my other Devil 270x though it feels real tempting though


At the moment, during overclocking, the highest I'm hitting is 70°C, at just under a 1.2GHz core in the Fire Strike benchmark. Do you think replacing the TIM on my card will make any improvement?


----------



## Roaches

Quote:


> Originally Posted by *Crowe98*
> 
> At the moment during overclocking the highest I'm hitting is 70C right now on just under 1.2Ghz core on the Firestrike benchmark. Do you think replacing the TIM on my card will make an improvements?


That's not bad, depending on your clocks... usually anything greater than 80°C is unacceptable, IMO.


----------



## creationsh

Quote:


> Originally Posted by *Crowe98*
> 
> UPS man.. Haha. Over in Australia we call them the 'Posty'.


In America, at this day and age, I call them Santa Claus.
Quote:


> Originally Posted by *NameUnknown*
> 
> What are some sub $20 games that can push the 280X that are still fun and have a story?


Other than Crysis 3? Nope, but close to being a push are: Tomb Raider, Metro: Last Light, BioShock Infinite, The Witcher 2.
Quote:


> Originally Posted by *Crowe98*
> 
> At the moment during overclocking the highest I'm hitting is 70C right now on just under 1.2Ghz core on the Firestrike benchmark. Do you think replacing the TIM on my card will make any improvements?


Hell yeah. Make sure you don't use too much; just a ¼ of a pea in the center will do you just fine.
Quote:


> Originally Posted by *Roaches*
> 
> Thats not bad depending on your clocks...Usually anything greater than 80 degrees is unacceptable IMO.


^ VRM temperature causes me the most worry: 110°C+ without warning.


----------



## Roaches

Quote:


> ^ Vrm temperate seems to cause me the most worries. 110'C+ without a warning.


Yeah, I really wish MSI AB were updated to keep track of VRM temps. I hate popping up GPU-Z during a test when I know MSI AB could be better on the sensor side.

Just finished installing the second GPU, though I won't be able to do any real testing until late tomorrow due to work... I'm really happy with how quickly I troubleshot the potential thermal problems before they got any worse... Can't wait to see how CFX goes against my main system's SLI setup in terms of player experience.












I'll try to get better pics next time, since the lighting is dim in the living room. My bad choice of exposure time on the camera, due to haste.

Oh, and for reference, these cards are roughly 11.5 inches long; I wasn't expecting that, since Newegg says 10.5 inches on their product page.


----------



## Crowe98

Righto fellas. Here is my finished graph with the results of my overclocking.



I really wanted to get to 1.2GHz, but it just wouldn't go. If I tried, I would get severe artifacting (I still don't know why), and if I downclocked back to 1175MHz I would still get artifacting until I restarted the PC and ran the test again.

Even now, I'm still getting artifacts.

I've tried downclocking the memory to 1400MHz; still the same. I have heard some people say that the MSI Gaming R9 280X lacks cooling on the VRMs, but I'm not too sure.
Quote:


> Originally Posted by *TechPowerup*
> We also see very high VRM temperatures, well above 115°C. This suggests that the cooler does not cool the voltage regulation circuitry properly despite having a baseplate that should do so. Its lack of proper VRM cooling caused the card to throttle in many of our non-gaming stress tests, though gaming performance seems unaffected.


Source

*EDIT:* I think I may have found a fix for the artifacting. A thread on Tom's Hardware regarding the same artifacting issue says that when the GPU is idle, it lowers the memory clock down to around 150MHz and the core to 300MHz. Someone on that thread says: *I had similar problem with my XFX R9 280X. This seems to be common problem due to low memory clock when idle. I managed to temporarily fix the problem by forcing the card to have constant 1500Mhz memory clock using Catalyst.* I'll give it a try and come back.


----------



## Roaches

Disable ULPS, if that helps... I disabled mine shortly after installing my CrossFire setup.

Start up Regedit and search for EnableUlps, then set its value to 0 (0 = ULPS disabled) for each GPU entry.
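For reference, the same registry change can be captured in a .reg file. This is only a sketch: the class-key index (0000, 0001, ...) varies per machine, and a CrossFire system has one such entry per GPU.

```
Windows Registry Editor Version 5.00

; Illustrative sketch only -- the 0000 index below varies per system;
; check every 00xx subkey that belongs to an AMD display adapter.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Export the key as a backup before importing anything like this.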


----------



## Crowe98

Quote:


> Originally Posted by *Roaches*
> 
> Disable ULPS if that helps....I just disabled mine shortly after installing my crossfire setup...
> 
> Startup Regedit, search for DisableULPS. And set the value to 0.


I looked around and found that if you create a profile in CCC and enable OverDrive, then edit the Profile.xml and change the want_0 and want_1 values to the want_2 values, you can set it so the memory and core speeds do not drop when you're not running 3D applications.

I did that, and now when I'm on the desktop it's running at a 1020MHz core, and in games 1175MHz. Now I'm going to see if I can get to 1.2GHz.
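The Profile.xml edit described above can be scripted. The XML fragment below is hypothetical (the real Profile.xml lives in the user's AMD profile folder, and its exact layout varies by Catalyst version), but it shows the idea: copy the Want_2 (3D) value over the Want_0/Want_1 (idle/intermediate) states so the clocks never drop.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment mirroring the Want_0/Want_1/Want_2 properties the
# post describes; the real file's structure may differ.
profile = """
<Feature name="CoreClockTarget_0">
  <Property name="Want_0" value="30000" />
  <Property name="Want_1" value="50100" />
  <Property name="Want_2" value="117500" />
</Feature>
"""

root = ET.fromstring(profile)
pinned = root.find(".//Property[@name='Want_2']").get("value")

# Pin the idle (Want_0) and intermediate (Want_1) states to the 3D clock.
for name in ("Want_0", "Want_1"):
    root.find(f".//Property[@name='{name}']").set("value", pinned)

print(ET.tostring(root, encoding="unicode"))
```

In practice the rewritten XML would be saved back over Profile.xml, with a backup of the original kept first.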


----------



## Crowe98

Still. Getting. *ARTIFACTS*

EDIT: Couldn't find ULPS in the registry; isn't there a setting in CCC?

EDIT2: Just got this error whilst playing BF4 at 1175/1500.


----------



## miraldo

Hello.

Is a temperature of 77°C normal for a Sapphire Tri-X R9 290 under full load (Battlefield 4)?

Fan control is set to AUTO and goes to 45% at full load.

At idle I get around 38°C and the fan is at 20%.


----------



## NameUnknown

I get 35°C at idle with the fan at 20% (fan is on auto). My upstairs is probably around 74-76°F.


----------



## Roaches

Quote:


> Originally Posted by *Crowe98*
> 
> Still. Getting. *ARTIFACTS
> 
> 
> 
> 
> 
> 
> 
> *
> 
> EDIT: Couldn't find ULPS in the registry, isn't there a setting in CCC?
> 
> EDIT2: Just got this error whilst playing BF4 on 1175/1500.


I'd try doing a manual clean install of the drivers... If all else fails, it might be time to RMA.

http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers


----------



## miraldo

Quote:


> Originally Posted by *NameUnknown*
> 
> I get 35C on idle with 20% fan (fan is on auto). My upstairs is probably around 74-76F.


That's a 4-5°C difference, hmm.

Do you have a case with good airflow?

----------



## tsm106

Quote:


> Originally Posted by *Roaches*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Crowe98*
> 
> Still. Getting. *ARTIFACTS
> 
> 
> 
> 
> 
> 
> 
> *
> 
> EDIT: Couldn't find ULPS in the registry, isn't there a setting in CCC?
> 
> EDIT2: Just got this error whilst playing BF4 on 1175/1500.
> 
> 
> 
> 
> 
> I'd try doing a manual clean install of the drivers....If all else fails, It might be the time to RMA
> 
> http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers

Don't take offense, but it is still amazing that some people still refer to that thread as a guide. It is no different from hunting and pecking for AMD references and hitting delete. If that is the goal, everyone is better off using AMD's own uninstaller app or any of the available driver wipers, etc.


----------



## Roaches

Quote:


> Originally Posted by *tsm106*
> 
> Don't take offense, but it is still amazing that some still refer to that thread as a guide. It is no different than hunt and pecking for amd references and hitting delete. If that is the goal, all are better off using AMD's own uninstaller app or any of the available driver wipers, etc.


No offense taken.

Manual guides exist because not all software and driver removal apps delete every leftover registry entry and file. It's also a good measure to check that nothing got left behind when going the automatic uninstall route. Methods for removing something differ from person to person, as long as it floats their boat.

Also, welcome back. It's been a long time since your last post.


----------



## BruceB

Quote:


> Originally Posted by *miraldo*
> 
> Hello.
> 
> Is it temperature 77 °C normal for R9 290 Sapphire TRI-X in full load(battlefield 4)?
> 
> Fan control is put on AUTO and go to 45% in full load.
> 
> In idle I get aroun 38 °C and fan is on 20%.


That's normal; my Powercolor is the same. AFAIK 77°C is the target temp for 280Xs.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Just got home and huge brown a box at my front door
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna let the card cool down since it was pretty warm when I removed it from the Antistatic bag. UPS man must've left it hours ago exposed the box in the open heat.


Happy new (used) GPU day! Hah.

Those are some pretty large and beastly-looking R9 270Xs. I have a couple of MSI Gaming R9 270s, and they look like little baby GPUs in comparison.

Have you gotten to do any testing on them in CFX?

I'm wondering how much more performance you'll see with a few extra hundred MHz.


----------



## creationsh

Quote:


> Originally Posted by *Crowe98*
> 
> Still. Getting. *ARTIFACTS
> 
> 
> 
> 
> 
> 
> 
> *


Lets talk about voltage.


----------



## Roaches

Quote:


> Originally Posted by *anubis1127*
> 
> Happy new (used) GPU day! Hah.
> 
> Those are some pretty large n beastly looking R9 270Xs, I have a couple MSI Gaming R9 270s, and they look like little baby GPUs in comparison.
> 
> Have you gotten to do any testing on them in CFX??
> 
> I'm wondering how much higher performance you'll see with a few extra hundred Mhz.


Thanks! I spent yesterday night testing and troubleshooting the new arrival before testing in CrossFire, as I said a few posts back, a page or two ago. I'll test them once I get out of work today.

They're really huge for a midrange card. Unfortunately, getting the best thermal performance would mean replacing the stock TIM. Still, the cooler and build quality of the card really shine for the cost and the slight aftermarket effort.

IIRC Devildog said the cards are voltage locked, though I'll try to squeeze some clocks out of them once I get home for more testing.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Thanks! I've spent yesterday night testing and troubleshooting the new arrival before testing in crossfire, as I've said a few post back a page or two. I've yet to test them today once I get out of work today.
> 
> 
> 
> 
> 
> 
> 
> 
> They're really huge for a midrange card. Unfortunately getting the best thermal performance would mean replacing the stock TIM. The cooler and build quality of the card really shines well for the cost and slight aftermarket effort.
> 
> IIRC Devildog said the cards are voltage locked. Though I'll try to squeeze some clocks out of them once I get home for more testing.


I think my MSI Gaming R9 270s are voltage locked too, at least on the stock BIOS; I'm not sure if there are any modded vBIOSes for them. For my cards I have to use Trixx to bump up the voltage; for whatever reason AB will not unlock voltage control for the life of me. In Trixx I am able to bump them up to 1.22V or so, IIRC.

Unfortunately my "spare" PC (where the R9 270s are) is having other issues right now: BSOD 0x116, screen goes black, then a hard reboot. Odd stuff. That is on an X58 mobo, which is about 5 years old, so it may be time to go pick up a Z97 board and a cheap Pentium G3258 for some old-school Pentium fun. I have some troubleshooting to do, though, before just scrapping it.


----------



## NameUnknown

Quote:


> Originally Posted by *miraldo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> I get 35C on idle with 20% fan (fan is on auto). My upstairs is probably around 74-76F.
> 
> 
> 
> This is 4-5C difrence, hmm
> 
> Do you have a case with a good airflow?
Click to expand...

Yes, a HAF 932 with the side off ATM while I work on it; otherwise 4 fans on the side, 3 on top, 1 in front, and 1 in back. That said, as the day has progressed and my house has warmed, I am now idling at 37°C. I just cranked the fan to 100% to see where it stabilizes on a warm day.


----------



## Roaches

Quote:


> Originally Posted by *anubis1127*
> 
> I think my MSI Gaming r9 270s are voltage locked too, at least on the stock BIOS, I'm not sure if there are any modded vBIOS for them. For my cards I have to use Trixx to bump up the voltage for whatever reason AB will not unlock voltage control for the life of me. In Trixx I am able to bump them up to 1.22V or so IIRC.
> 
> Unfortunately my "spare" PC (where the R9 270s are) is having other issues right now, BSOD 0x116 / screen goes to black then hard reboot, odd stuff. That is on a x58 mobo, which is like 5 years old, so it may be time to go pick up a z97 board, and a cheap Pentium G3258 for some old school Pentium fun. I have some troubleshooting to do though before just scraping it.


I recently bought a used MSI 270 from the OCN marketplace yesterday; hopefully the issue isn't the same as yours once I test it when it arrives... MSI has a habit of pre-overclocking their TF-model cards out of the box, so I would try dropping the clocks to AMD reference speeds and voltages if instability shows up under load at the factory OC. If a BIOS issue is the potential cause, flashing to the reference-card BIOS might help iron out the black-screen issues.

Kinda funny, since I replaced an EVGA 660 that started artifacting back in late December on my former X58 system, which is now in my sister's possession. It turned out the card was faulty, not the board or PSU. I would try testing the cards on a separate board to isolate the root cause on your end, if you have a spare available.

Good luck.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> I recently bought a used MSI 270 from the OCN marketplace yesterday, though hopefully the issue isn't the same as you once I test it out when it arrives...Assuming MSI have a habit of pre-overclocking their TF model cards out of the box...I would try to drop the clocks to AMD reference card speeds and voltage if instabilities are experienced on factory OC during loads. If its a potential BIOS issue being the cause. Flashing it to reference card BIOS might help Iron out the black screen issues.
> 
> Kinda funny since I've replaced my EVGA 660 that went artifact a few months during late December on the former X58 system which now in my sister's possession. Which turns out the card was faulty and not the board and PSU. I would try to test the cards in a separate system board to isolate the root cause on your end if you have a spare available.
> 
> Good luck.


Oh no, the cards are fine. They function flawlessly in my X79 system, which is where they normally are.

The problem on the X58 one initially occurred Sunday night, when I had a GTX 780 in it. While troubleshooting, I put the 780 in the X79 rig, and it has been fine since Monday.

Out of curiosity I stuck the 270s in the X58 box and the same BSOD/black-screen issue persisted, so I think my 780 is fine and something in that rig is just hosed. I'm going to start troubleshooting the X58 rig tonight; I should have enough spare parts to narrow it down.

I'd be interested to hear whether the MSI 270 you get has locked voltage in AB when it arrives.

Thanks.


----------



## NameUnknown

So it's been a bit over an hour and my card has dropped to, and is holding steady at, 36°C. Not a big drop, but with my house set to 76°F while I'm out, and the upstairs *always* noticeably hotter, I'll take a drop in card temps as ambients rise.


----------



## NameUnknown

What would be the best tool to use to OC my Diamond R9 280X? Will Afterburner work or do I want to use AMD Overdrive or something else?


----------



## Roaches

Quote:


> Originally Posted by *NameUnknown*
> 
> What would be the best tool to use to OC my Diamond R9 280X? Will Afterburner work or do I want to use AMD Overdrive or something else?


IMO Afterburner is the best vendor-neutral GPU monitoring app. If you're looking for better temp monitoring and control, I'd suggest trying it out and setting yourself a custom fan curve to get a feel for it. It works just like the OC apps bundled by the various vendors.
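For anyone curious what a custom fan curve actually computes: it is just a piecewise-linear map from GPU temperature to fan duty. A minimal sketch of the idea (the curve points below are invented, not Afterburner defaults):

```python
# Illustrative piecewise-linear fan curve, similar in spirit to what
# Afterburner's custom-curve editor does. The points are made up;
# tune them to your own card and case airflow.
CURVE = [(30, 20), (50, 35), (65, 55), (75, 75), (85, 100)]  # (temp C, fan %)

def fan_speed(temp_c: float) -> float:
    """Linearly interpolate the fan duty between the two nearest curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(70))  # 65.0: halfway between the 65C and 75C points
```

The flat segments below the first point and above the last mirror how curve editors clamp the ends of the curve.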


----------



## anubis1127

+1 to Afterburner.


----------



## miraldo

Quote:


> Originally Posted by *NameUnknown*
> 
> Yes, HAF932 with the side off atm while I work on it; otherwise 4 fans on the side, 3 on top, 1 front, and 1 back. That said, as the day has progressed and my house warmed, I am now idling at 37C. I just cranked the fan to 100% to see where it stabilizes at 100% on a warm day.


Ok. I will put the fans on my NZXT HADES on max and then report my temperatures here.


----------



## creationsh

Quote:


> Originally Posted by *NameUnknown*
> 
> What would be the best tool to use to OC my Diamond R9 280X? Will Afterburner work or do I want to use AMD Overdrive or something else?


I tried many GPU utilities and ended up with Trixx. I would have stuck with the HIS Turbo utility, but I wasn't able to push my VDDC higher than 1.25V. At least with Trixx I was able to take it up to 1.3V. Afterburner (both the new version and the old) will not work.


----------



## Crowe98

Quote:


> Originally Posted by *BruceB*
> 
> That's normal, my Powercolor is the same. AFAIK 77°C is the target temp for 280x's.


I'm idling right now at 29C


----------



## Roaches

Just finished installing 3DMark; still in the process of downloading Star Swarm, Metro LL, Hitman, and maybe some other games that have a built-in benchmark.


----------



## Crowe98

I'm about to do a complete teardown of my rig... here we go.









EDIT: Pictures


Terrible photo, looked heaps better on the camera









This one's a goody

Another terrible shot

The workstation

A box of Mentats always doubles as a screw container









Dustay

I liked this photo.

I hate these fans with a passion. They're some $9 Cooler Master fans, and when I adjust their RPM with my fan controller, anything above the lowest setting sounds like a turbine. They glow bright, but they just suck. Plus they're dusty. If you've never seen the NZXT Tempest 410 Elite, it has a pretty cool front fan mounting system: the fans screw into a caddy with a contact pad that connects to another pad when it's clipped into the case to transmit power to the fan. Pretty cool.


----------



## Roaches

Here are my stock results. Right now I'm kinda having trouble getting OC results, as I can't clock my memory any higher than stock without running into a black-screen system lockup (damn you, Elpida; second new card).

I just finished my FS run at 1200MHz core and am gonna run FS Extreme in a moment after this post.

http://www.3dmark.com/fs/2352748

http://www.3dmark.com/fs/2352709

EDIT: here are the FS-E scores at 1200MHz... Gonna try to push it to 1250 on my next test.

http://www.3dmark.com/fs/2352824

http://www.3dmark.com/fs/2352846


----------



## Crowe98

Quote:


> Originally Posted by *Roaches*
> 
> Here are my stock results. Right now I'm kinda having trouble getting OC results, as I can't clock my memory any higher than stock without running into a black-screen system lockup (damn you, Elpida; second new card).
>
> I just finished my FS run at 1200MHz core and am gonna run FS Extreme in a moment after this post.
> 
> http://www.3dmark.com/fs/2352748
> 
> http://www.3dmark.com/fs/2352709


Damn, didn't know 270x's had that much oomph. Nice!


----------



## Roaches

Quote:


> Originally Posted by *Crowe98*
> 
> Damn, didn't know 270x's had that much oomph. Nice!


Actually, my second card has somewhat lower ASIC quality compared to my primary card. I can push the core fine, but shamefully I get a black screen if I go beyond the stock 1400MHz on memory under the slightest 3D load.


----------



## anubis1127

Nice @Roaches! I'm a bit jelly of that top card, too bad you didn't have two like that.

I got my x58 box torn down and put on the test bench, but ran out of steam shortly afterward. It's going to be tomorrow before I can do some testing, hopefully get it stable again, and try some OCing on these 270s.


----------



## Crowe98

After I clean up my rig and put it all back together, I'm going to do a clean wipe of the drivers the way you said before, and run some tests. This card should be able to easily run a 1200MHz core, and even if it couldn't, it shouldn't be artifacting.

However, if it does keep artifacting, how do I apply for an RMA? I've never done one before and don't know where to start.

Thanks!


----------



## anubis1127

Quote:


> Originally Posted by *Crowe98*
> 
> After I clean up my rig and put it all back together, I'm going to do a clean wipe of the drivers the way you said before, and run some tests. This card should be able to easily run a 1200MHz core, and even if it couldn't, it shouldn't be artifacting.
>
> However, if it does keep artifacting, how do I apply for an RMA? I've never done one before and don't know where to start.
> 
> Thanks!


With MSI?? It's super easy, just go to their support page, and request an online RMA number. Then you follow the instructions on where to send it, make sure you put the RMA number on the shipping label, and that's about it.


----------



## Crowe98

Quote:


> Originally Posted by *anubis1127*
> 
> With MSI?? It's super easy, just go to their support page, and request an online RMA number. Then you follow the instructions on where to send it, make sure you put the RMA number on the shipping label, and that's about it.


I can see what you're talking about on the USA page, but on the Asia/Pacific MSI webpage it's not there. Hm.


----------



## Roaches

Quote:


> Originally Posted by *anubis1127*
> 
> Nice @Roaches
> ! I'm a bit jelly of that top card, too bad you didn't have two like that.
> 
> I got my x58 box tore down, put on the test bench, but ran out of steam shortly afterward. Going to be tomorrow before I can do some testing, hopefully get it stable again, and try some OCing on these 270s.


LOL, these cards are nice, but the lack of unlocked voltage really makes them look like they're in chastity belts, despite the high-end feel they have and the overkill MOSFETs and chokes on the PCB.

Also, I got slightly lower scores compared to the previous run. I'm guessing it's a sign that it needs more voltage?

http://www.3dmark.com/fs/2352887

http://www.3dmark.com/fs/2352905

I'm tempted to go higher on the core, but I don't think it will get any better without BIOS and soft volt mods from here on out....

Good to hear you've taken them out. If it happens to be stable on your other board, it could likely be the board or chip tripping out.


----------



## anubis1127

Quote:


> Originally Posted by *Crowe98*
> 
> I can see what you're talking about on the USA page, but on the Asia/Pacific MSI webpage it's not there. Hm.


Oh, sorry, not sure then. I'm using the mobile OCN site right now, so I couldn't see your location.
Quote:


> Originally Posted by *Roaches*
> 
> LOL, these cards are nice, but the lack of unlocked voltage really makes them look like they're in chastity belts, despite the high-end feel they have and the overkill MOSFETs and chokes on the PCB.
>
> Also, I got slightly lower scores compared to the previous run. I'm guessing it's a sign that it needs more voltage?
>
> http://www.3dmark.com/fs/2352887
>
> http://www.3dmark.com/fs/2352905
>
> I'm tempted to go higher on the core, but I don't think it will get any better without BIOS and soft volt mods from here on out....
> 
> Good to hear you've taken them out. If it happens to be stable on your other board, it could likely be the board or chip tripping out.


That does kind of stink about them being voltage locked.

I have a spare PSU, RAM, SSD, and CPU I can try, so if all those are replaced and the problem persists, I will assume it's the board. I think the first thing I will do is a clean install of Windows, then go from there.


----------



## Roaches

Quote:


> Originally Posted by *anubis1127*
> 
> Oh, sorry, not sure then. I'm using the mobile ocn site right now so I couldn't see your location.
> That does kind of stink about them being voltage locked.
> 
> I have a spare PSU, RAM, SSD, and CPU I can try, so if all those are replaced and the problem persists I will assume its the board. I think the first thing I will do is a clean install of Windows, then go from there.


Sounds good to me; hopefully you iron out the root of the issue. Looking forward to some OC results from your MSI 270s.

I'm gonna try to do some realistic gaming loads via Sniper Elite V2 and some other titles, at stock and then OCed, to assess my CFX experience compared to my main SLI setup.


----------



## BruceB

Quote:


> Originally Posted by *Crowe98*
> 
> I'm idling right now at 29C


I meant under load, I should have made that clearer!








What are your load temps?


----------



## Crowe98

Quote:


> Originally Posted by *BruceB*
> 
> I meant under load, I should have made that clearer!
> 
> 
> 
> 
> 
> 
> 
> 
> What are your load temps?


At 1020 core I get around 65C


----------



## Crowe98

Quote:


> Originally Posted by *Roaches*
> 
> Here are my stock results. Right now I'm kinda having trouble getting OC results, as I can't clock my memory any higher than stock without running into a black-screen system lockup (damn you, Elpida; second new card).
>
> I just finished my FS run at 1200MHz core and am gonna run FS Extreme in a moment after this post.
>
> http://www.3dmark.com/fs/2352748
>
> http://www.3dmark.com/fs/2352709
>
> EDIT: here are the FS-E scores at 1200MHz... Gonna try to push it to 1250 on my next test.
>
> http://www.3dmark.com/fs/2352824
>
> http://www.3dmark.com/fs/2352846


Those 1200MHz scores are good as, man! At the moment, at 1175MHz on my single 280X, I got 7830 points... I've got a bit of catching up to do.


----------



## miraldo

I put all 5 of my case fans on max and still get 78C under load and 39C at idle.

I think the temperature should be lower. I have this case: http://www.newegg.com/Product/Product.aspx?Item=N82E16811146064

I wonder why the auto fan speed on the 290 doesn't go higher than 45% under full load. If I set the fan speed manually to 50%, I get 70C under load.

I'm concerned about the idle temperature. 39C is too high, I think.

Which driver is best for the 290? I'm currently using the 14.6 BETA driver.


----------



## miraldo

After 5 minutes of watching a YouTube video, my GPU temperature increases to 48C.


----------



## Crowe98

Quote:


> Originally Posted by *miraldo*
> 
> I put all 5 of my case fans on max and still get 78C under load and 39C at idle.
>
> I think the temperature should be lower. I have this case: http://www.newegg.com/Product/Product.aspx?Item=N82E16811146064
>
> I wonder why the auto fan speed on the 290 doesn't go higher than 45% under full load. If I set the fan speed manually to 50%, I get 70C under load.
>
> I'm concerned about the idle temperature. 39C is too high, I think.
>
> Which driver is best for the 290? I'm currently using the 14.6 BETA driver.


For a 290 that's normal; they're known to run hot, especially if it's a reference design card.

*EDIT:* Try running 14.4 and see if anything changes.


----------



## miraldo

Quote:


> Originally Posted by *Crowe98*
> 
> For a 290 that's normal; they're known to run hot, especially if it's a reference design card.
>
> *EDIT:* Try running 14.4 and see if anything changes.


I have the Sapphire 290 Tri-X.


----------



## Crowe98

Quote:


> Originally Posted by *miraldo*
> 
> I have the Sapphire 290 Tri-X.


Oh.

Maybe replace the TIM?


----------



## Crowe98

Just completely wiped the drivers and re-installed 14.4. Ran some tests starting from 1100MHz core with +20% power, working my way up. Got to 1150MHz fine. Upped to 1200MHz and tested: artifacts everywhere, just like last time. Dropped it back down to 1180MHz to see; still getting artifacts. Had to restart and go back to 1020 to get rid of the errors.

Looks like an RMA to me. On the Australian MSI website you can RMA MSI products through MSI distributors; I emailed one just before, so fingers crossed they can help me.


----------



## miraldo

I just installed BlueScreenView.

And obviously I have atikmdag.sys.

Can I fix this problem somehow?


----------



## anubis1127

Quote:


> Originally Posted by *miraldo*
> 
> I just installed BlueScreenView.
>
> And obviously I have atikmdag.sys.
>
> Can I fix this problem somehow?


I was getting BSODs due to that on 14.4 with my 270s. I ran DDU, cleaned all the driver files, then installed 14.6, and that seemed to fix it for me.


----------



## tsm106

Quote:


> Originally Posted by *miraldo*
> 
> After 5 minutes of watching a YouTube video, my GPU temperature increases to 48C.


YouTube and just about all other web-related apps/browsers can or will use GPU acceleration. If you don't want them to use GPU accel, you'll have to disable it. If you also use Afterburner, there will be extra steps there too.


----------



## Roaches

Just ran Star Swarm a few minutes ago and noticed only my primary GPU goes under load, according to the MSI AB logs.... Does anyone know if it supports Crossfire yet?

Ran both DirectX and Mantle modes, and the secondary GPU remains idle even though CFX is enabled in Catalyst.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Just ran Star Swarm a few minutes ago and noticed only my primary GPU goes under load, according to the MSI AB logs.... Does anyone know if it supports Crossfire yet?
>
> Ran both DirectX and Mantle modes, and the secondary GPU remains idle even though CFX is enabled in Catalyst.


It should. Were you running in full screen?


----------



## Roaches

Yes, the benchmark ran in fullscreen... I can give single-card results for the moment, which I'll be posting in a few minutes.

Anywho, I'm very impressed how Mantle can manage playable frame consistency in the heaviest scenes of this benchmark, while DirectX mode remains unplayable throughout most of the stress run on the Extreme preset, and it gets even worse in the heavier scenes of the space battle. This is from a *single* card under 99% load, according to the MSI AB logs.


----------



## anubis1127

Hrmm, odd. I haven't tried that benchmark on my two 270s; maybe I will tonight if my x58 system is stable. I have it stress-testing right now on the test bench with a new PSU; hopefully that's all it was. So far it seems good, but I've only been stress testing for 20 minutes or so at this point, hah.


----------



## Roaches

Quote:


> Originally Posted by *anubis1127*
> 
> Hrmm, odd. I haven't tried that benchmark on my two 270s; maybe I will tonight if my x58 system is stable. I have it stress-testing right now on the test bench with a new PSU; hopefully that's all it was. So far it seems good, but I've only been stress testing for 20 minutes or so at this point, hah.


Here are my results, both single-card runs via Mantle and DirectX, Extreme preset + Follow camera flight.

Code:

===========================================================
Oxide Games
Star Swarm Stress Test - ©2013
C:\Users\Chimah\Documents\Star Swarm\Output_14_06_27_1433.txt
Version 1.10
06/27/2014 14:33
===========================================================

== Hardware Configuration =================================
GPU:            AMD Radeon R9 200 Series
CPU:            GenuineIntel
                       Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz
Physical Cores:                 4
Logical Cores:                  4
Physical Memory:                8463740928
Allocatable Memory:             8796092891136
===========================================================

== Configuration ==========================================
API:                            Mantle
Scenario:                       ScenarioFollow.csv
User Input:                     Disabled
Resolution:                     1920x1080
Fullscreen:                     True
GameCore Update:                16.6 ms
Bloom Quality:                  High
PointLight Quality:             High
ToneCurve Quality:              High
Glare Overdraw:                 16
Shading Samples:                64
Shade Quality:                  Mid
Deferred Contexts (D3D11):              Disabled
Small Batch Optimized (Mantle):         Enabled
Temporal AA Duration:           16
Temporal AA Time Slice:         2
Detailed Frame Info:            Off
===========================================================

== Results ================================================
Test Duration:                  360 Seconds
Total Frames:                   14838

Average FPS:                    41.21
Average Unit Count:             4340
Maximum Unit Count:             5508
Average Batches/MS:             865.92
Maximum Batches/MS:             4527.92
Average Batch Count:            21713
Maximum Batch Count:            140069
===========================================================

Code:

===========================================================
Oxide Games
Star Swarm Stress Test - ©2013
C:\Users\Chimah\Documents\Star Swarm\Output_14_06_27_1506.txt
Version 1.10
06/27/2014 15:06
===========================================================

== Hardware Configuration =================================
GPU:            AMD Radeon R9 200 Series
CPU:            GenuineIntel
                       Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz
Physical Cores:                 4
Logical Cores:                  4
Physical Memory:                8463740928
Allocatable Memory:             8796092891136
===========================================================

== Configuration ==========================================
API:                            DirectX
Scenario:                       ScenarioFollow.csv
User Input:                     Disabled
Resolution:                     1920x1080
Fullscreen:                     True
GameCore Update:                16.6 ms
Bloom Quality:                  High
PointLight Quality:             High
ToneCurve Quality:              High
Glare Overdraw:                 16
Shading Samples:                64
Shade Quality:                  Mid
Deferred Contexts (D3D11):              Disabled
Small Batch Optimized (Mantle):         Enabled
Temporal AA Duration:           16
Temporal AA Time Slice:         2
Detailed Frame Info:            Off
===========================================================

== Results ================================================
Test Duration:                  360 Seconds
Total Frames:                   9797

Average FPS:                    27.21
Average Unit Count:             4013
Maximum Unit Count:             5483
Average Batches/MS:             493.35
Maximum Batches/MS:             1080.25
Average Batch Count:            19603
Maximum Batch Count:            108708
===========================================================

The bench fails to run in Crossfire mode, even in fullscreen...
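Side note for anyone reading those logs: the Average FPS lines are just Total Frames divided by Test Duration, so the two runs are easy to sanity-check (the results agree with the logged 41.21 and 27.21 to within rounding):

```python
# Sanity-check the Average FPS figures in the Star Swarm logs above:
# Average FPS = Total Frames / Test Duration.
def avg_fps(total_frames, duration_s):
    """Mean frame rate over the whole run."""
    return total_frames / duration_s

mantle_fps = avg_fps(14838, 360)  # Mantle run: ~41.2 FPS
dx11_fps = avg_fps(9797, 360)     # DirectX run: ~27.2 FPS
```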


----------



## Crowe98

Is it normal for an R9 280X to be at 99-100% load running an instance of Battlefield 4/Hardline on ultra?


----------



## Roaches

Tossing in some extras... Crossfire results from in-game benchmarks.









Metro Last Light


This game is a bit flaky; microstuttering is noticeable, but really not so different from my SLI 680s that I tested a year back. I've had a worse SLI experience in this game because of the random frame skips when running through levels, and the benchmark used to show worse frame variance; I should give it a second test on my SLI setup to see if that has improved as well.

TombRaider Ultimate preset FXAA

Hitman Ultra 4x MSAA

Hitman Ultra No MSAA

Quote:


> Originally Posted by *Crowe98*
> 
> Is it normal for a R9 280x to be under 99-100% load running an instance of Battlefield 4/Hardline on ultra?


It should be normal; if your GPU is firing on all cylinders, it means there's no bottleneck holding you back.


----------



## anubis1127

Quote:


> Originally Posted by *Crowe98*
> 
> Is it normal for a R9 280x to be under 99-100% load running an instance of Battlefield 4/Hardline on ultra?


Depends on your resolution really.


----------



## Crowe98

Quote:


> Originally Posted by *Roaches*
> 
> Tossing in some extras...Crossfire results from ingame benchmarks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Metro Last Light
> 
> 
> This game is a bit flaky; microstuttering is noticeable, but really not so different from my SLI 680s that I tested a year back. I've had a worse SLI experience in this game because of the random frame skips when running through levels, and the benchmark used to show worse frame variance; I should give it a second test on my SLI setup to see if that has improved as well.
> 
> TombRaider Ultimate preset FXAA
> 
> Hitman Ultra 4x MSAA
> 
> Hitman Ultra No MSAA
> 
> It should be normal; if your GPU is firing on all cylinders, it means there's no bottleneck holding you back.


Thanks, was just wondering,

Bloody good scores btw.


----------



## Roaches

Heh, thanks. I need to get a new tube of Gelid GC-Extreme; my primary GPU is way hotter than my second GPU. The stock OEM TIM must be garbage; it's really holding the cooler back a lot when it can do better.

Gonna search the web for more benches to try out.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Heh, thanks. I need to get a new tube of Gelid GC-Extreme; my primary GPU is way hotter than my second GPU. The stock OEM TIM must be garbage; it's really holding the cooler back a lot when it can do better.
> 
> Gonna search the web for more benches to try out.


Here you go:


----------



## Roaches

Found a tube of MX-2 after digging around; I guess it should be a temporary solution to the heat problems.

The GC-Extreme tube is exhausted of all TIM... Just ordered several tubes a minute ago.


----------



## miraldo

Well, I tried with Display Driver Uninstaller, and I still get atikmdag.sys.









I got this error before with the 7870. Now with the R9 290 it is the same.

atikmdag.sys+128f6f

If anyone knows a solution, I will be very grateful. A fresh, clean installation of Windows isn't helping... I even formatted the HDD and SSD.


----------



## Roaches

I've taken some photos, and yeah, PowerColor did a horrible job on the TIM; too much applied was the cause of the heat issues...







Took some MOSFET pics for those interested in the VRMs...




http://www.irf.com/product-info/datasheets/data/irf8304mpbf.pdf

http://www.irf.com/product-info/datasheets/data/irf8327spbf.pdf


----------



## mikemykeMB

Wow, that's always been a proven sloppy apply job, esp. when manufacturing mass-produced cards: slap a stick of TIM on and move forward to the next station.


----------



## anubis1127

That seems about right for an AIB partner TIM application. Heh.





GTX 780 Lightning ^^





7970 Lightning ^^


----------



## Majentrix

Why so many heatpipes if the die doesn't even touch half of them?


----------



## Roaches

@Anubis It doesn't look like a healthy amount of TIM; that's too much to me. I just finished applying a 3mm dot of MX-2 on my Devil, and she's running like new, with temps back at 29-30C idle vs the 40-44C idle before.









Also, I missed out on the voltage controller in my last post, so here it is for the PCB enthusiasts as a reference.



http://www.irf.com/product-info/datasheets/data/pb-chl8225.pdf


----------



## anubis1127

I meant that seems about right in that they put way too much on, lol.


----------



## Roaches

Quote:


> Originally Posted by *anubis1127*
> 
> I meant that seems about right in that they put way too much on, lol.


Sorry, I misinterpreted you.

Anywho, a quick burn-in: so much better temps than before. Around 10 degrees' difference from the OEM paste, though GC-Extreme is 10 degrees cooler than MX-2.... I feel like tossing this tube away now.

Oh well, might as well wait until mid next week when my new tubes arrive.


----------



## Roaches

Some Unigine benches... Price/performance-wise, these Devils put my 680 SLI to shame.


----------



## tsm106

Quote:


> Originally Posted by *anubis1127*
> 
> That seems about right for an AIB partner TIM application. Heh.
> 
> 
> 
> 
> 7970 Lightning ^^


Looks like the paste was applied with a caulking gun lol.


----------



## anubis1127

Quote:


> Originally Posted by *tsm106*
> 
> Looks like the paste was applied with a caulking gun lol.


LOL, yeah, that one was particularly bad; temps were super high out of the box. I can't remember exactly what, but I want to say they almost immediately went up to 85-90C under load.


----------



## Roaches

Any CFX results from your MSI 270s yet?


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Any CFX results from your MSI 270s yet?


No, sigh. I was just trying to get CCC installed, but it keeps failing; the only thing that will install is the display driver, which is fine for folding, but I can't enable CFX.

Just spent the last 15 minutes uninstalling with DDU and rebooting multiple times. Tomorrow they are getting moved back into my gaming rig, so hopefully that won't be an issue on that system.


----------



## Roaches

Still on your X58 board? Have you tried manually deleting the files and registry entries? Sometimes automatic removal methods don't completely remove everything.

I've always done it this way in the event I run into driver-side issues on both camps.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Still on your X58 board? Have you tried manually deleting the files and registry entries? Sometimes automatic removal methods don't completely remove everything.
>
> I've always done it this way in the event I run into driver-side issues on both camps.


Yeah, still on the x58 system. I didn't try that. It's getting late over here, and I don't feel like messing with it anymore, haha. I'm sure it will work fine in my main rig; I'll just swap the GPUs around in the morning.


----------



## darkelixa

In the market for a new GPU since my GTX 770 just blue-screens every time I install it in my case.

I have been looking at the R9 280X; do they have a lot of heating issues / PC shutdown problems, and what would be a good brand to buy?

I have been looking at the Sapphire R9 280 Toxic/Vapor-X so far.

Pic of my new setup today


----------



## miraldo

Please, does anyone know the solution?

I tried to fix it with Display Driver Uninstaller, and I still get atikmdag.sys.

I got this error before with the 7870. Now with the R9 290 it is the same.

Error Name:

atikmdag.sys+128f6f

If anyone knows a solution, I will be very grateful. A fresh, clean installation of Windows isn't helping... I even formatted the HDD and SSD.


----------



## mick2d2

Hello
Question for Anubis1127
Are you happy with your 2 x 270/270X in Xfire? Any problems gaming? I have 2 x 6850 which I am thinking of changing for 2 x 270s or 270Xs. Thanks for any insight.


----------



## anubis1127

Quote:


> Originally Posted by *mick2d2*
> 
> Hello
> Question for Anubis1127
> Are you happy with your 2 x 270/270X in Xfire? Any problems gaming? I have 2 x 6850 which I am thinking of changing for 2 x 270s or 270Xs. Thanks for any insight.


Hi, yeah, I like them for gaming. I have been pleasantly surprised by them. For 1080p, 1200p I think they are a fine solution. Any resolution higher than that and you would probably want cards with more vram. Check out some of the results Roaches has posted for proof, but they do pretty darn well IMO.


----------



## mick2d2

Quote:


> Originally Posted by *anubis1127*
> 
> Hi, yeah, I like them for gaming. I have been pleasantly surprised by them. For 1080p, 1200p I think they are a fine solution. Any resolution higher than that and you would probably want cards with more vram. Check out some of the results Roaches has posted for proof, but they do pretty darn well IMO.


Thanks for the info. Are you using 270s or 270Xs? I do a lot of my gaming at 5040 x 1050.


----------



## anubis1127

Quote:


> Originally Posted by *mick2d2*
> 
> Thanks for the info. Are you using 270s or 270Xs? I do a lot of my gaming at 5040 x 1050.


I'm using two MSI R9 270s:







I'll be running some benches later this afternoon to compare my results with the ones @Roaches posted.


----------



## mick2d2

Cheers!
That's one of the cards I was looking at, but for not much more on Amazon.es I could get the MSI 270X or the Sapphire Dual-X R9 280 3GB. It'd be good to see some triple-screen benchmarks.


----------



## anubis1127

Right on. For not much more, the X version would definitely give you a slight advantage over the non-X; the extra power delivery from the two 6-pins definitely allows for higher OCs. I can basically OC my 270s to stock 270X speeds.

The 280X may be a better option for you given you are using eyefinity.

I do have 3 1920x1200 monitors, but only have room for two on my desk, haha, so I've never tried Eyefinity.


----------



## mick2d2

Quote:


> Originally Posted by *anubis1127*
> 
> Right on, for not much more, then the X version would definitely give you a slight advantage over the non-X. The extra power delivery of the two 6-pins definitely allows for higher OCs. I can basically OC my 270s to stock 270X speeds.
> 
> The 280X may be a better option for you given you are using eyefinity.
> 
> I do have 3 1920x1200p monitors, but only have room for two on my desk, haha, so I've never tried eyefinity.


The 280X costs quite a bit more, though: the Sapphire 280 costs 190€ on Amazon.es, whereas the 280X (one of the cheapest) is around 230€. Another reason I'm tempted by the 280 (apart from the extra memory) is that I would love to try triple Xfire at some point, and the 280 is the entry level for that. I'm addicted to triple-screen gaming, as it makes a massive difference to FOV in racing games and FPShooters.


----------



## anubis1127

Quote:


> Originally Posted by *mick2d2*
> 
> The 280X costs quite a bit more though, The Sapphire 280 costs 190€ on Amazon.es whereas the 280X (one of the cheapest) is around 230€. Another reason I'm tempted by the 280 (apart from the extra memory) is that I would love to try triple Xfire at some point and the 280 is the entry level for that. I'm addicted to triple screen gaming as it makes a massive difference to FOV on racing games and FPShooters.


Ah, ok, I read that wrong; 280 then. Either way, yes, like you mention, the extra vram and the potential for 3-way or even quad CFX are there with the 280.

I can probably do some rearranging and temporarily rig up my 3rd panel to test some Eyefinity on these R9 270s. I've kept the 3rd panel around for when I finally build a sim rig; I want to use these 3 panels for that, and then upgrade my main gaming rig's screens to one or two 4K panels.


----------



## mick2d2

That would be great if you could. It's surprising how few Xfire and triple-Xfire benchmarks you see at triple-screen resolutions. You do need a wide desk!

But it's a cheap option these days to do multiple screens. It's not that long ago that even having one decent CRT monitor cost a fortune.
For any type of simulator, driving or otherwise, it's a great option. I play quite a lot of mods like Prophesy of Pendor (Mount and Blade), and when you switch back to one screen after being used to three, it's a real game changer; it's like suddenly wearing blinkers!


----------



## Crowe98

anubis, you've probably heard about my issues with my MSI R9 280X from all my constant blabbering about it.

Do you know of any way to stop my card from artifacting when it gets over 1180MHz on the core? Am I doing something wrong?

When I overclock using MSI AB, I turn the power limit to +20%, then start bumping the core clock up in 25MHz increments.

I then test the card using the Firestrike demo and run through those tests.

When I reach 1175MHz, some tests will skip or black-screen; when I hit 1200MHz it's just a rainbow all over the screen for all the tests.

If you know of anything that could help, that would be great.

I've emailed two retail companies regarding RMAs, because over here in Australia the way to RMA an MSI product is to get it done through an MSI distributor.

Thanks.
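For what it's worth, the stepping routine described above boils down to a simple loop: raise the clock, test, and keep the last clock that passed. This is just an illustration of that back-off logic; `is_stable` is a stand-in for an actual benchmark run (e.g. a Firestrike pass), not a real OC tool.

```python
# Hedged sketch of incremental OC stepping: raise the core clock in
# 25 MHz steps and keep the last clock that passes the stability test.
# `is_stable` stands in for a real benchmark run (e.g. Firestrike).
def find_max_stable(start_mhz, limit_mhz, is_stable, step=25):
    best = None
    clock = start_mhz
    while clock <= limit_mhz:
        if not is_stable(clock):
            break            # first artifacting clock: stop and back off
        best = clock
        clock += step
    return best

# A card that artifacts at 1200 MHz, like the one described above:
# find_max_stable(1100, 1250, lambda c: c < 1200) -> 1175
```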


----------



## anubis1127

Quote:


> Originally Posted by *Crowe98*
> 
> anubis, you've probably heard about my issues with my MSI R9 280x from all my constant blabbering about it.
> 
> Do you know of any way that could stop my card from artifacting when it gets over 1180Mhz on the core? Am I doing something wrong?
> 
> When I overclock using MSI AB I turn the power limit to +20%, then start bumping the Core clock up by 25Mhz increments.
> 
> I then test the card using Firestrike Demo, and run through those.
> 
> When I reach the 1175Mhz clock speed, some tests will skip or blackscreen, then when I hit 1200Mhz its just a rainbow all over the screen for all the tests.
> 
> If you know of anything that could help that would be great.
> 
> I've emailed two retail companies regarding RMA's because over here in Australia the way to RMA a MSI product is to get it done through a MSI distributor.
> 
> Thanks.


The only thing I can think to try would be adding more voltage. If that isn't an option, then you may have to dial it back a bit.

Not all GPUs are created equal; not every 280X hits 1200MHz+. I had a 7970 (basically a 280X) that didn't really like doing anything over 1100MHz, and I kept it at 1080MHz for stability.


----------



## Crowe98

Quote:


> Originally Posted by *anubis1127*
> 
> The only thing I can think to try would be adding more voltage. If that isn't an option, then you may have to dial it back a bit.
> 
> Not all GPUs are created equal, not every 280X hits 1200mhz+. I had a 7970 (basically 280X) that didn't really like doing anything over 1100Mhz, and I kept it at 1080Mhz for stability.


Fair enough; I'm going to update the BIOS, which apparently 'fixes' VRAM and core overheating... hmm.


----------



## BruceB

Quote:


> Originally Posted by *Crowe98*
> 
> Fair enough, I'm going to update the BIOS to apparently 'fix' VRAM and core overheating... hmm.


Have you tried making a custom fan curve with MSI Afterburner? It might solve your problem without flashing your card.


----------



## Crowe98

Quote:


> Originally Posted by *BruceB*
> 
> Have you tried making custom fan curve with MSI AfterBurner? Might solve your problem without flashing your card


I have... no luck.









Also, how much should I overclock the memory? This is my first time doing it. I've gone from 1500MHz to 1600 (at 1525 and 1550 I was getting lower FS points than at 1500MHz, but at 1600 I'm getting higher; I don't know why).


----------



## Recr3ational

Quote:


> Originally Posted by *Crowe98*
> 
> I have... no luck.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, how much should I overclock memory? This is the first time doing it. I've gone from 1500Mhz to 1600 (At 1525 and 1550 I was getting lower FS points than at 1500Mhz, but at 1600 I'm getting higher - idk)


I don't think memory does that much, tbh. Mine's at 1250/1600; raising the memory doesn't really do much compared to the core.
Core is king.


----------



## Crowe98

Quote:


> Originally Posted by *Recr3ational*
> 
> I don't think memory does that much tbh. Mines at 1250/1600, raising the memory doesn't really do that much compare to core.
> Core is king.


You're running 280x CF? Nice man. What vendor?


----------



## BruceB

Quote:


> Originally Posted by *Crowe98*
> 
> I have... no luck.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, how much should I overclock memory? This is the first time doing it. I've gone from 1500Mhz to 1600 (At 1525 and 1550 I was getting lower FS points than at 1500Mhz, but at 1600 I'm getting higher - idk)


Bad times.








Try returning your memory to stock speeds and see if you can push the core further before you get artifacts.
Reduced performance at higher clock rates is usually an indication of the memory being at/over its limit.
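The bump-bench-back-off routine described in this thread amounts to a simple stepping search. A rough sketch in Python, where `is_stable` is a stand-in for an actual benchmark run like Firestrike, and the 1175MHz cutoff is just a hypothetical mirroring the card discussed above:

```python
def max_stable_clock(start_mhz, limit_mhz, step_mhz, is_stable):
    """Step the clock up until instability and return the last stable value."""
    best = start_mhz
    clock = start_mhz + step_mhz
    while clock <= limit_mhz:
        if not is_stable(clock):
            break
        best = clock
        clock += step_mhz
    return best

# Hypothetical pass/fail results: assume artifacts appear past 1175MHz on the core.
print(max_stable_clock(1020, 1300, 25, lambda mhz: mhz <= 1175))  # 1170
```

In practice, each `is_stable` check is a manual Afterburner bump plus a benchmark pass, which is why people step in 25MHz increments rather than bisecting.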


----------



## Recr3ational

Quote:


> Originally Posted by *Crowe98*
> 
> You're running 280x CF? Nice man. What vendor?


Running PowerColor Turbo Duos.
Getting really hot VRMs for some reason, even though they're under water. It's either poor contact or the cards just run hot...


----------



## Neocoolzero

Got my new PSU 2 days ago, so I'm finally able to enjoy my 280X at last. Love it.









Haven't dabbled in OCing it yet, so it's just at stock values for now.


----------



## lluukkman

Are there any 280 users? Why can't I find it on the first page? I'm a HIS IceQ R9 280 user. Anyone with me?


----------



## Kuhl

Quote:


> Originally Posted by *lluukkman*
> 
> is there any 280 users? why I can't find it in first page? I'm HIS ICEQ R9 280 user. anyone with me?


Most people opt for the 280X; they're rebrands of the 7950 (280) and 7970 (280X), and the used prices for the 280X are a steal.


----------



## Devildog83

Quote:


> Originally Posted by *lluukkman*
> 
> is there any 280 users? why I can't find it in first page? I'm HIS ICEQ R9 280 user. anyone with me?


If you wish to be added let me know.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Some Unigine benches...These Devils price/performance wise puts my 680 SLI to shame.


Rig looks sweet.
I am right there with you.


----------



## Roaches

Neato







My devils were at stock at the time I did that Valley run.









My 3570K was at 4.4GHz during that run; I dunno why Unigine Valley and Heaven say 3.40GHz.

What about your clocks?


----------



## Crowe98

Hm. Okay so just did some testing, and came out with some interesting results.

Stock clocks for MSI R9 280x Gaming are *1020/1500*

I had previously got massive artifacts using *1200/1500*

So I set the memory to 1400MHz and the core clock to 1175MHz. No artifacts.

Intrigued, I kept going. 1200 Core and 1300 Memory. No artifacts. Hmm.

Bumped the core up to 1225; it crashed the second test in Firestrike.

Back down to 1200, bumped the memory back up to 1400; I got a black screen flash for a millisecond, but other than that, no artifacts.

Back up to *1200/1500* and tested. No artifacts.

Saved this as a profile and restarted. Now I'm here.

I don't understand.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Neato
> 
> 
> 
> 
> 
> 
> 
> My devils were at stock at the time I did that Valley run.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 3570K was at 4.4ghz during that run, I dunno why Unigine Valley and Heaven says 3.40ghz.
> 
> What about your clocks?


That's at stock, nice.
I think they were at 1235/1450, with 4.8GHz on the 8350, which is easy to reach; those are my everyday clocks.

Newegg sent me a Z97 mobo to review, so I will be trying the Intel side too. I'm getting a G3258 to start, because I've heard they can overclock like crazy and I want to learn to overclock Intel chips.


----------



## Roaches

Nice! I can't seem to go anywhere over 1400 on the memory due to one of my Devils having Elpida memory... I can go up to 1250MHz on the cores, but it loses slight performance due to the additional voltage needed to compensate. I'll do a 1200MHz core run on Valley tonight and report back.

Looking forward to your Z97 and Unlocked Pentium chip review.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Nice! I can't seem to go anywhere over 1400 on the memory due to one of my Devils having Elpida memory...I can go up to 1250 Mhz on the cores but loses slight peformance due to need of additional voltage to compensate. I'll do a 1200mhz core tonight on Valley and report back.
> 
> Looking forward to your Z97 and Unlocked Pentium chip review.


Weird, my 270x can get to almost 1600 on the memory but the core is limited to about 1235.


----------



## anubis1127

Well, I tried to join the Unigine fun, but it seems both my CFX bridges are shot. With one I get white flashing everywhere, with the other blue flashing.

I'm also picking up a Z97 board and a Pentium G3258 come Monday, specifically the Gigabyte Z97X Gaming 7 board. It looks like it comes with some type of bridge, but I am betting it's an SLI one, knowing my luck.



[edit] Just checked a review site; it is indeed an SLI bridge, ofc.


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> Weird, my 270x can get to almost 1600 on the memory but the core is limited to about 1235.


I get black screens + system lockups when trying to reach 1500MHz.

It crashes in 3DMark.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> I get Black screens + system lockup when trying to reach 1500 mhz
> 
> Crashes on 3Dmark


Hmm, is it the same using just 1 card?

When I tested my cards by themselves, I got about 1275/1450 from the 7870 and 1235/1585 with the 270x, and both have Elpida, so I don't think that's what's holding you back.


----------



## Roaches

It happens in Crossfire; I have not yet pushed the memory on a single card, since my primary card has Hynix. I'm willing to give it another shot, or isolate them both and push them independently to find the root cause.

@Anubis, sorry to hear your CFX bridge is shot... I might have a spare available if you want me to ship you one, though I mostly have SLI bridges in my possession.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> In Crossfire, it happens, I have not yet pushed the memory on a single card...since my Primary card has Hynix. I'm willing to give it another shot. Or Isolate them both to push them independently to find the root cause.
> 
> @Anubis, sorry to hear your CFX bridge is shot...I might have a spare available if you want me to ship you one. though mostly have SLI bridges in my possession.


It's annoying because I have about 10 SLI bridges sitting in my computer room. I do have a 3rd CFX bridge, but it isn't long enough.

Thanks for the offer. It looks like Microcenter has a couple in stock; I'll probably just pick one up on Monday when I get the Pentium/Z97.

I guess now I can test each card individually, which doesn't sound as fun.


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> Well I tried to join the Unigine fun, but it seems both my CFX bridges are shot. The one I get white flashing everywhere, the other blue flashing.
> 
> I'm also picking up a Z97 board, and Pentium G3258 come Monday. Specifically the Gigabyte Z97X Gaming 7 board. It looks like it comes with some type of bridge, but I am betting its a SLI one knowing my luck.
> 
> 
> 
> 
> [edit] Just checked a review site, it is indeed a SLI bridge, ofc.


Nice. The board Newegg is sending me doesn't do SLI, only X-Fire. Yours does both, and that's probably an SLI bridge, because the new AMD cards don't need one.

Yep it is.

Here -







http://us.estore.asus.com/index.php?l=product_detail&p=4812


----------



## Roaches

Quote:


> Originally Posted by *anubis1127*
> 
> Its annoying because I have about 10 SLI bridges sitting in my computer room. I do have a 3rd CFX bridge, but it isn't long enough.
> 
> Thanks for the offer, it looks like microcenter has a couple in stock, I'll probably just pick one up on Monday when I pick up the Pentium/z97.
> 
> I guess now I can test each card individually, which doesn't sound as fun.


Well, alright then. I do have one that came with my X79E-WS board; it happens to be the long 4-slot spacing instead of the standard 3.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Well alright then, I do have one that came with my X79E-WS board. Happens to be long 4 slot spacing instead of the standard 3.


Nice. I have the regular X79 WS board, but I got it open-box from Microcenter, so I didn't get a lot of the goodies.

Quote:


> Originally Posted by *Devildog83*
> 
> Nice, I don't know but the board NewEgg is sending me doesn't do SLI, only X-Fire. Yours does both and probably is an SLI bridge because the new AMD cards don't need one.
> 
> Yep it is.
> 
> Here -
> 
> 
> 
> 
> 
> 
> 
> http://us.estore.asus.com/index.php?l=product_detail&p=4812


Hrmm, that is a nice looking CFX bridge. I may order it just to have a black one.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> In Crossfire, it happens, I have not yet pushed the memory on a single card...since my Primary card has Hynix. I'm willing to give it another shot. Or Isolate them both to push them independently to find the root cause.
> 
> @Anubis, sorry to hear your CFX bridge is shot...I might have a spare available if you want me to ship you one. though mostly have SLI bridges in my possession.


Elpida -


----------



## Roaches

You probably got the good apples of the pick. I'll have to test both of mine independently tomorrow to see how far I can push each of them.


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> Rig looks sweet.
> I am right there with you.


Holy****

I just broke my previous record! 1200 Mhz now scoring over 100 FPS!!











Quite an oomph in performance!


----------



## Dasboogieman

Quote:


> Originally Posted by *Devildog83*
> 
> That's at stock, nice.
> I think they were at 1235/1450 and 4.8Ghz on the 8350 which is easy to reach, it's my everyday clocks.
> 
> New egg sent me a Z97 mobo to review so I will be trying the Intel side too. I am getting a G3258 to start because I heard they can overclock like crazy and I want to learn to overclock Intel chips.


Yeah, that's a good chip; when OCed to 4.6GHz it's like exactly half an i5-4690K (including the cache size, lol) for one-third of the price. You'll do well with it as a gaming CPU.

Intel CPU OC has gotten really easy since Sandy Bridge; at the basic level, you basically just juggle vcore and multipliers.

I reckon overclocking was more fun back in the Core 2 Duo, Bloomfield, and Lynnfield era with base-clock modification. The distinction between good and bad motherboards, and between good and bad RAM, made more of a difference to your overclocking capability.
Nowadays, you're more at the mercy of the chip lottery.


----------



## austinmrs

Quote:


> Originally Posted by *anubis1127*
> 
> Well I tried to join the Unigine fun, but it seems both my CFX bridges are shot. The one I get white flashing everywhere, the other blue flashing.
> 
> I'm also picking up a Z97 board, and Pentium G3258 come Monday. Specifically the Gigabyte Z97X Gaming 7 board. It looks like it comes with some type of bridge, but I am betting its a SLI one knowing my luck.
> 
> 
> 
> 
> [edit] Just checked a review site, it is indeed a SLI bridge, ofc.


Why not a VII Hero? I find it so beautiful!


----------



## BruceB

Quote:


> Originally Posted by *Crowe98*
> 
> Hm. Okay so just did some testing, and came out with some interesting results.
> Stock clocks for MSI R9 280x Gaming are *1020/1500*
> I had previously got massive artifacts using *1200/1500*
> So I set the memory to 1400Mhz and the core clock to 1175Mhz. No artifacts.
> Intrigued, I kept going. 1200 Core and 1300 Memory. No artifacts. Hmm.
> Bumped the core up to 1225, crashed the second test on Firestrike.
> Back down to 1200, bumped the memory back up to 1400, I got a screen black flash for a millisecond, but other than that no artifacts.
> Back up to *1200/1500* and tested. No artifacts.
> Saved this as a profile and restarted. Now I'm here.
> I don't understand.


That's strange...








I haven't experienced anything like that myself, maybe someone else can shed some light on what's going on here?
Was there anything different this time round? Was it a colder day (lower temps, better stability)? Different software?


----------



## Devildog83

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yeah that's a good chip, when oced to 4.6ghz its like exactly half an i5-4690k (including the cache size lol) for one third of the price. You'll do well with it as a gaming CPU.
> 
> Intel CPU oc has gotten really easy since Sandybridge, at the basic level, you basically just juggle vcore and multipliers.
> 
> I reckon over clocking was more fun back in the Core2 duo, Bloomfield and Lynnfield era with the baseclock modification. The distinction between good and bad motherboards and the distinction between good and bad ram made more of a difference in your over clock capability.
> Nowadays, you're more at the mercy of the chip lottery.


AMD overclocking has been a blast, so I hope Intel is too. How did you get this chip to 4.6GHz when it hasn't been released? Do you have a review chip?


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> AMD overclocking has been a blast so I hope Intel is too. How did you get this chip to 4.6 Ghz when it hasn't been released, do you have a review chip?


That's one thing I miss about my old 8350. It was so fun to overclock.


----------



## anubis1127

Quote:


> Originally Posted by *austinmrs*
> 
> Why not a VII Hero? I find it so beautiful!


Yeah, that would be a nice board too, but it's not in stock at Microcenter.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Thats one thing i miss with my old 8350. It was so fun to overclock


That's the beauty of it, I get to have both. Enjoy the AMD rig while I test, play with and upgrade the Intel rig.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Holy****
> 
> I just broke my previous record! 1200 Mhz now scoring over 100 FPS!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quite Oomph in performance!


Try on ExtremeHD setting.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> That's the beauty of it, I get to have both. Enjoy the AMD rig while I test, play with and upgrade the Intel rig.


I do have both; my old 8350 is in my second rig. I just miss it being in my main rig, so I could just overclock when I was bored, lol...


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> I do have both. My old 8350 is in my second rig. I just miss it being in my main rig. So i can just overclock when i was bored lol...










I will have them both going pretty much next to each other, so I'll be able to compare them side by side. I love my man-cave!!!


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> Try on ExtremeHD setting.


Hmm, odd, somehow I missed the 8xAA... will do it again in a minute.

EDIT: Doesn't seem like much of a jump...


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> 
> 
> 
> 
> 
> 
> 
> I will have them both going pretty much next to each other so I will be able to compare side by side. I love my Man-Cave!!!




Man caves FTW! I need to make mine bigger; too many monitors.
I have my second rig in my bedroom so I can watch films, etc.


----------



## Roaches

That feel when there's not enough room for a 4th monitor.







Nice setup.


----------



## Recr3ational

Quote:


> Originally Posted by *Roaches*
> 
> The feel when not enough room for a 4th monitor
> 
> 
> 
> 
> 
> 
> 
> Nice setup.


I'd have to put the 4th on top of the triple; my gaming room has too many TVs and monitors in it, so I'll have to cut my desk in half... Thanks.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> 
> 
> Man caves ftw! I need to make mine bigger to many monitors
> I have my second rig in my bedroom, so I could watch films etc


Very nice!!! I would not know what to do with 4 monitors.

I actually built my own cave because we didn't have enough room in the house. It's not quite finished, but almost.




Spoiler: Warning: Spoiler!












1 more special one -


----------



## anubis1127

Nice man cave! I just took over our spare bedroom and set up an office/man cave. (I work from home 50% of the time.)

Also, I may be addicted to Pitcairn. Heh.


----------



## miraldo

Quote:


> Originally Posted by *miraldo*
> 
> Please, does anyone know the solution?
> 
> I tried to fix it with Display Driver Uninstaller, and I still get the atikmdag.sys error.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I got this error before with the 7870. Now with the R9 290 it is the same.
> 
> Error Name:
> 
> atikmdag.sys+128f6f
> 
> If anyone knows a solution I will be very grateful. A fresh, clean installation of Windows is not helping... I even formatted the HDD and SSD.


Well, somehow I found the solution!!!

All I needed was to open my web browser (Mozilla Firefox) and go to Firefox / Tools / Options / Advanced, then uncheck "Use hardware acceleration when available" and click Save.

This is how I solved my atikmdag.sys problem.

Also, you can first try this:

1. Download and install the latest ATI graphics driver.
2. Go to C:\Windows\System32\Drivers and rename atikmdag.sys to atikmdag.sys.old.
3. Go to the ATI directory (usually C:\ATI) and find the file atikmdag.sy_.
4. Copy the file to your Desktop directory.
5. Open cmd.exe by going to Start, typing cmd in the search box, and hitting Enter.
6. Change the directory to Desktop by typing chdir Desktop.
7. Type EXPAND.EXE atikmdag.sy_ atikmdag.sys, or expand -r atikmdag.sy_ atikmdag.sys.
8. When the expansion is complete, copy the new atikmdag.sys from your Desktop to C:\Windows\System32\Drivers.
9. Restart your computer and the problem should be resolved.


----------



## anubis1127

Cool, glad you got it sorted out. That's more of a workaround than a solution, but it works.


----------



## amateurbuilder

Just got my 2nd PC built; my first was an editing machine, and now I have a play machine.

XFX R9 270X 2GB GDDR5 @ 1050MHz



Sorry for the horrible pic of my build, taken with my phone......

CPU-Z, in case anyone is interested.


----------



## Crowe98

Alright, so. MSI RMA support responded to my email about my defective/heavily artifacting 280x. This was the reply.

*Nothing was changed, I just copied this straight from the email.*

_"Regarding your concern,you can update the driver from the below link:MSI *(This link was a direct DL to AMD's 14.4 Catalyst Driver)*,when you update the driver ,you must clean the former.if there has the same problem ,maybe there has something wrong with this card,you can contact with the seller for help."_

I really don't know how I feel about this.

For one, the guy can't speak (or type) English properly. I think he must have been based in Asia, as support for this region covers Asia/Australia.

Secondly, does this guy know in depth what is going on with my card? It seems like he has the vaguest possible knowledge of hardware troubleshooting.

Not happy.


----------



## anubis1127

Well that is bollocks. I would not be happy either. I've always had a great experience with their service over here.


----------



## Roaches

Not sure how support differs from the US, though I assume you live in Australia? Doesn't your country have laws protecting consumers against defective products, so they should be replaced without charge?

Sorry to hear about that on your end, though if your card is not artifacting at factory clocks, it just means your memory or core isn't the best of the silicon draw and won't go much higher than stock.

My Devils are an exact example of that: I can push the cores, but not the memory, without running into artifacts.


----------



## Dasboogieman

Quote:


> Originally Posted by *Devildog83*
> 
> AMD overclocking has been a blast so I hope Intel is too. How did you get this chip to 4.6 Ghz when it hasn't been released, do you have a review chip?


Nah I don't own it personally but going off the data that has already been published.
Quote:


> Originally Posted by *Crowe98*
> 
> Alright, so. MSI RMA support responded to my email about my defective/heavily artifacting 280x. This was the reply.
> 
> *Nothing was changed, I just copied this straight from the email.*
> 
> _"Regarding your concern,you can update the driver from the below link:MSI *(This link was a direct DL to AMD's 14.4 Catalyst Driver)*,when you update the driver ,you must clean the former.if there has the same problem ,maybe there has something wrong with this card,you can contact with the seller for help."_
> 
> I really don't know how I feel about this.
> 
> For one, the guy can't speak (or type) English properly. I think he must have been Asian, as the support ranges in this area are covering Asia/Australia.
> 
> Secondly, does this guy know in depth what is going on with my card? It seems like this guy has the vaguest possible knowledge of troubleshooting hardware.
> 
> Not happy.


If you bought the product from an Australian retailer (much, much easier to enforce accountability, but this also applies to overseas vendors selling into the Australian market), then you are entitled to a refund at the very least (maybe even a replacement) within 2 years of purchase if the product is not fit for purpose or not functioning as advertised. This cannot be waived by the retailer; if they screw with this, you just contact the Consumer Protection Ombudsman. They'll sort it out pretty quickly.

I think a heavily artifacting 280X at stock counts as not fit for purpose and is very easy to prove.


----------



## BruceB

Quote:


> Originally Posted by *Crowe98*
> 
> Alright, so. MSI RMA support responded to my email about my defective/heavily artifacting 280x. This was the reply. ]Nothing was changed, I just copied this straight from the email.
> 
> _"Regarding your concern,you can update the driver from the below link:MSI (This link was a direct DL to AMD's 14.4 Catalyst Driver),when you update the driver ,you must clean the former.if there has the same problem ,maybe there has something wrong with this card,you can *contact with the seller for help*."_
> 
> I really don't know how I feel about this.
> For one, the guy can't speak (or type) English properly. I think he must have been Asian, as the support ranges in this area are covering Asia/Australia.
> Secondly, does this guy know in depth what is going on with my card? It seems like this guy has the vaguest possible knowledge of troubleshooting hardware.
> Not happy.


That seems fairly non-committal :-/

Some manufacturers only do RMAs through their resellers; maybe that's what he means by that bit of the email.
Still, it doesn't exactly instill faith that he knows what he's talking about...


----------



## Crowe98

Quote:


> Originally Posted by *Dasboogieman*
> 
> Nah I don't own it personally but going off the data that has already been published.
> If you bought the product from an Australian retailer (much much easier to enforce accountability but also applies to overseas vendors selling in to the Australian Market) then you are entitled to a refund at the very least (maybe even a replacement) within 2 years of purchase if the product is not fit for purpose or not functioning as advertised. This cannot be waived by the retailer, if they screw with this you just contact the Consumer Protection Ombudsman. They'll sort it out pretty quick.
> 
> I think a heavily artifacting 280X at stock counts as not fit for purpose and is very easy to prove.


I totally agree; however, it's performing fine at stock clocks, and I bought it second-hand.

*EDIT:* I just shot an email to the Australian-based MSI Service Center; I guess I probably should have spoken with these people first. Oops.


----------



## BruceB

Quote:


> Originally Posted by *Crowe98*
> 
> I totally agree however, its performing fine at stock clocks and I bought it second hand.


If that's the case, they may not replace it. These things are only supposed to run at their rated speed; if it can't run faster than stock, that doesn't count as broken, at least not in the EU.


----------



## Crowe98

Quote:


> Originally Posted by *BruceB*
> 
> If that's the case they may not replace it. These things are only supposed to run at their rated speed, if it can't run faster than stock that dosen't count as broken, at least not in the EU.


Got an email back from the Australian MSI Service Center. They sent me the form to fill out to receive an RA number.

It's looking a little better


----------



## lluukkman

Quote:


> Originally Posted by *Devildog83*
> 
> If you wish to be added let me know.


here is my card. HIS ICEQ R9 280 (953/1250)


Spoiler: screenshot







I have something to ask. I reach temperatures above 80 degrees when I play Watch Dogs (ultra settings, using TheWorse mod). I have not overclocked the card. Is this normal? If I play games all day with the temperature above 80, will it damage my card?

Sorry for my English.


----------



## Hacker90

You should use MSI Afterburner and set a custom fan profile so that the temps won't go that high. It's probably because of the blower-type fan!


----------



## BruceB

Quote:


> Originally Posted by *Crowe98*
> 
> Got a email back from the Australia MSI Service Center. They sent me the form to fill out to receive a RA number.
> It's looking a little better


Sweet! Keep us posted on how it goes!


----------



## Hacker90

Btw, check out this Sniper Elite III video I just made, maxed out on my XFX R9 280X @ stock.


----------



## lluukkman

Quote:


> Originally Posted by *Hacker90*
> 
> you should use MSI after burner and set a custom fan profile so that the temps wont go that far. Its probably becuase of the blower type fan!


Yes, it's just a blower-type fan. I set the fan speed to 100% in Catalyst. Is it different if I use MSI Afterburner?


----------



## Crowe98

Quote:


> Originally Posted by *Hacker90*
> 
> btw check out this sniper elite III video I just made, MAXED out on my XFX R9 280x @ Stock


Ditto.


Spoiler: Only if you want to, not forcing you to.


----------



## Crowe98

*Update on the conversation with the guy from the Australian MSI Service Center.*

His words:
Quote:


> If the card will work fine under normal clock, MSI will not provide replacement. Because MSI can't guaranty that customer can 100% overclock successfully.


*But then, after a bit of softening up;*
Quote:


> I can provide a replacement to you as favour, but I can't guaranty if it would be good under overclock.


Booyah. 'Straya mate!


----------



## Hacker90

Quote:


> Originally Posted by *Crowe98*
> 
> Ditto.
> 
> 
> Spoiler: Only if you want to, not forcing you to.


Nice video, I left a like.

Btw, by "ditto" did you mean the same hardware, as in both the CPU and GPU? 'Cause Hardline wasn't that smooth on my PC!


----------



## miraldo

Spoiler: Only if you want to, not forcing you to.











Nice video









Are you one of us, by the way?


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> Nice man cave! I just took over our spare bedroom and setup an office/man cave. (I work from home 50% of the time).
> 
> Also, I may be addicted to Pitcairn. Heh.


I work from home most of the time; although I have an office/warehouse, I am hardly ever there.


----------



## Devildog83

*amateurbuilder and lluukkman* have been added - Welcome to the club!!!!


----------



## Devildog83

Quote:


> Originally Posted by *amateurbuilder*
> 
> Just got my 2nd PC built, my first was an editing machine and now I have a play machine
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> XFX R9 270X 2GB DDR5 1050MHz
> 
> 
> 
> Sorry for the horrible pic of my build, taken with my phone......
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> CPU-Z, in case anyone is interested.


Why only 4.1 on the 8350? I had mine stable at 4.8 on an H100i.


----------



## amateurbuilder

Quote:


> Originally Posted by *Devildog83*
> 
> Why only 4.1 on the 8350? I had mine stable at 4.8 on an H100i.


Just what it reported in CPU-Z; I haven't done a thing to tweak anything... I'm going to play around over the long weekend. I am not experienced, so I may just bump it to 4.5 or so for a little boost. I'm sure I will have questions in that thread, lol. I really can't wait to try a game. The system this is replacing was an Intel Core 2 Duo E8400 + BFG 8800 GTS OC2. To say the least, I think I will be pleased.


----------



## Devildog83

Quote:


> Originally Posted by *amateurbuilder*
> 
> Just what it reported in cpuz. I haven't done a thing to tweak anything.... I am going to play around on the long weekend, I am not experienced, may just bump to 4.5 or so for a little boost. I'm sure I will have questions in that thread lol. I really cant wait to try a game. The system this is replacing was an Intel core2duo 8400 + bfg 8800gts oc2. To say the least I think I will be pleased.


You should be pleasantly surprised. You should be able to get to 4.5 easily with a multi of 22.5 and around 1.4V. Make sure you keep the HT link at 2600 and the CPU/NB frequency around 2200 to start. If you bump the RAM above 2200, then raise the CPU/NB to match at least. No need for the CPU/NB to be above 2200 unless the RAM is.
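For reference, the arithmetic behind that multiplier suggestion, assuming the stock 200 MHz reference clock (the base clock value is my assumption, not stated in the post):

```python
# FX-8350 target clock = reference clock * multiplier.
# 200 MHz is the stock reference clock (assumed here); 22.5 is the
# multiplier suggested in the post above.
base_mhz = 200
multi = 22.5
target_mhz = base_mhz * multi
print(target_mhz)  # 4500.0 -> the suggested 4.5 GHz
```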


----------



## BruceB

Quote:


> Originally Posted by *amateurbuilder*
> 
> Just what it reported in cpuz. I haven't done a thing to tweak anything.... I am going to play around on the long weekend, I am not experienced, may just bump to 4.5 or so for a little boost. I'm sure I will have questions in that thread lol. I really cant wait to try a game. The system this is replacing was an Intel core2duo 8400 + bfg 8800gts oc2. To say the least I think I will be pleased.


Quote:


> Originally Posted by *Devildog83*
> 
> You should be pleasantly surprised. You should be able to get to 4.5 easy by a multi of 22.5 and around 1.4v. Make sure you keep the HT link at 2600 and the CPU/NB frequency around 2200 to start. If you bump the RAM above 2200 then pump the CPU/NB to match at least. No need for the CPU/NB to be above 2200 unless the RAM is.


On a side note: I see that you've both got H100s; do your MB temps get high with an H100 compared to a tower air cooler?


----------



## amateurbuilder

Quote:


> Originally Posted by *Devildog83*
> 
> You should be pleasantly surprised. You should be able to get to 4.5 easy by a multi of 22.5 and around 1.4v. Make sure you keep the HT link at 2600 and the CPU/NB frequency around 2200 to start. If you bump the RAM above 2200 then pump the CPU/NB to match at least. No need for the CPU/NB to be above 2200 unless the RAM is.


Most of this sounds Greek to me, I've got some learning to do. Thanks!


----------



## amateurbuilder

Quote:


> Originally Posted by *BruceB*
> 
> On a side note: I see that you'vew both got H100's, do your MB temps get high with an H100 compared to an air Tower cooler?


I've noticed a big drop in temps compared to air cooling. This is my first experience with liquid, I am very happy so far with the first impression.


----------



## Crowe98

Quote:


> Originally Posted by *Hacker90*
> 
> Nice video, I left a like.
> 
> Btw with ditto u meant Same hardware? as in both the CPU and GPU? cause Hardline wasnt that smooth on my PC!


Thanks for the nice words, I just meant that I had a video to share too


----------



## BruceB

Quote:


> Originally Posted by *amateurbuilder*
> 
> I've noticed a big drop in temps compared to air cooling. This is my first experience with liquid, I am very happy so far with the first impression.


Could you clarify that please? Are you talking about the CPU or the mainboard (AKA system) temps?


----------



## xutnubu

Well, think I might be joining the club.

After my 7870 died, I decided to go with a 280X.

I first picked up the ASUS DCII TOP, but then read the reviews and the whole deal about artifacting because of the memory being overvolted.

So I switched it for an MSI GAMING 3G, and then read the reviews about the fans failing and coil whine lol, but I decided to keep it and it has been working fine: no coil whine, fans still spinning. And I supposedly got the one with better VRM cooling.

Only thing that worries me a little bit is the temps: 74C max on Crysis 3 and 71C on BF4; my 7870 was definitely cooler in that regard. Though this is @ 38% fan speed.


----------



## amateurbuilder

Quote:


> Originally Posted by *BruceB*
> 
> Could you clarify that please? Are you talking about the CPU or the mainboard (AKA system) temps?


CPU sits at around 10-20C when not under much load. Going to take some logs while running Sniper III by this weekend to see how it fares. Overall though, I think it may also be my new case (500R): I've got two front intakes, a huge side intake, and rear/top exhaust (the H100i radiator is going as exhaust).


----------



## Devildog83

Quote:


> Originally Posted by *BruceB*
> 
> On a side note: I see that you'vew both got H100's, do your MB temps get high with an H100 compared to an air Tower cooler?


Actually I have a full loop now. Used to have an H100i. I use a fan on the back of the motherboard tray now; even though I have never had huge issues with NB/VRM heat, it's peace of mind.



I have also had one on the front which is uglier but more effective -


----------



## Devildog83

Quote:


> Originally Posted by *amateurbuilder*
> 
> CPU sits at around 10-20C when not under much load. Going to be taking some logs while running Sniper III by this weekend to see how it fares. Overall though I think it may also be my new case (500R), I've got the two front intake, huge side intake, rear/top exhaust (H100i radiator is going exhaust).


I am sure you are in, or have been to, the Vishera thread, and if you need any help I will be glad to assist as well. I have been overclocking mine for more than a year now.


----------



## Devildog83

Quote:


> Originally Posted by *xutnubu*
> 
> Well, think I might be joining the club.
> 
> After my 7870 died, I decided to go with a 280X.
> 
> I first picked up the ASUS DCII TOP, but then read the reviews and the whole deal about artifacting because of the memory being overvolted.
> 
> So I switched it for an MSI GAMING 3G, and then read the reviews for the fans failing and coil whine lol, but I decided to keep it and has been working fine, no coil whine, fans still spinning. And I supposedly got the one with better VRM cooling.
> 
> Only thing that worries me a little bit is the temps, 74C max on Crysis 3 and 71C on BF4, my 7870 was definitely cooler on that regard. Though this is @ 38% fan speed.


Set a profile for the fan to ramp up a bit, say to 50 or 60%, and I am sure the temps will come down into the 60's which is more than fine.


----------



## Crowe98

Quote:


> Originally Posted by *xutnubu*
> 
> Well, think I might be joining the club.
> 
> After my 7870 died, I decided to go with a 280X.
> 
> I first picked up the ASUS DCII TOP, but then read the reviews and the whole deal about artifacting because of the memory being overvolted.
> 
> So I switched it for an MSI GAMING 3G, and then read the reviews for the fans failing and coil whine lol, but I decided to keep it and has been working fine, no coil whine, fans still spinning. And I supposedly got the one with better VRM cooling.
> 
> Only thing that worries me a little bit is the temps, 74C max on Crysis 3 and 71C on BF4, my 7870 was definitely cooler on that regard. Though this is @ 38% fan speed.


Welcome.

I'm actually about to send my MSI R9 280x Gaming 3G down to Auburn to get replaced; the MSI guy down there was really nice and said he would replace it. I was getting huge artifacting if I pushed the card anything over 1200MHz.

As for your temperatures, I was getting 29-32C at idle and around 64-70C under 100% load. It is winter here in Australia though. If you don't mind voiding your warranty maybe you should try replacing the TIM. *Roaches* did and he was getting temperatures ~10C lower at full load.


----------



## Internet Swag

If you Crossfire a 270 and a 7850 do you still get Mantle and TrueAudio?


----------



## anubis1127

Quote:


> Originally Posted by *Internet Swag*
> 
> If you Crossfire a 270 and a 7850 do you still get Mantle and TrueAudio?


Mantle yes, but neither of those cards supports TrueAudio. If you CFX a 7850 and 270 you will lose a bit over 200 shaders on the 270, though.

(edit) I guess 256 shaders.
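The 256 figure lines up with the public stream-processor counts for the two Pitcairn cards; in a mismatched CrossFire pair the stronger card is effectively held to the weaker card's configuration, as the post above notes:

```python
# Stream-processor counts from the public specs of these Pitcairn GPUs.
r9_270_shaders = 1280
hd7850_shaders = 1024
# In mismatched CFX the 270 is limited to the 7850's shader count,
# so it effectively gives up the difference:
print(r9_270_shaders - hd7850_shaders)  # 256
```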


----------



## Crowe98

Just received my RA number today!


----------



## Internet Swag

Quote:


> Originally Posted by *anubis1127*
> 
> Mantle yes, but neither of those cards support TrueAudio. If you CFX a 7850 and 270 you will loose a bit over 200 shaders though on the 270.
> 
> (edit) I guess 256 shaders.


Will that have a big impact on my performance and will it increase micro-stuttering?


----------



## anubis1127

Quote:


> Originally Posted by *Internet Swag*
> 
> Will that have a big impact on my performance and will it increase micro-stuttering?


You will just be gimping the 270 slightly, but overall you would have better performance than the 7850. I'm guessing you have a 7850 now, and are looking to add a 270?? If that is the case you may want to check out the R7 265, you may be able to find one cheaper, and the R7 265 is a rebadge of a 7850.

I'm not sure about micro-stuttering, I'm using CFX, and I don't notice it much, but I don't really game much, and some people are more sensitive to that than others.


----------



## Internet Swag

Quote:


> Originally Posted by *anubis1127*
> 
> You will just be gimping the 270 slightly, but overall you would have better performance than the 7850. I'm guessing you have a 7850 now, and are looking to add a 270?? If that is the case you may want to check out the R7 265, you may be able to find one cheaper, and the R7 265 is a rebadge of a 7850.
> 
> I'm not sure about micro-stuttering, I'm using CFX, and I don't notice it much, but I don't really game much, and some people are more sensitive to that than others.


I can't rep you for some reason :s

Thanks







Yeah I have the 7850 atm, I wasn't even thinking about the R7 265, that's a great choice, but I will probably be getting the R9 270 second hand for about $90-110.

I'm getting worried now though, because my motherboard runs at PCI-E 2.1, 16x (1st slot) and 4x (2nd slot) during crossfire, so I'm not even sure if it's a good idea to add a 270. I don't know if less bandwidth will make crossfire less ideal.


----------



## Kuhl

Quote:


> Originally Posted by *xutnubu*
> 
> Well, think I might be joining the club.
> 
> After my 7870 died, I decided to go with a 280X.
> 
> I first picked up the ASUS DCII TOP, but then read the reviews and the whole deal about artifacting because of the memory being overvolted.
> 
> So I switched it for an MSI GAMING 3G, and then read the reviews for the fans failing and coil whine lol, but I decided to keep it and has been working fine, no coil whine, fans still spinning. And I supposedly got the one with better VRM cooling.
> 
> Only thing that worries me a little bit is the temps, 74C max on Crysis 3 and 71C on BF4, my 7870 was definitely cooler on that regard. Though this is @ 38% fan speed.


I think the Asus issue may just be blown out of proportion. I have an Asus DC2TOP and I haven't had any artifact issues at stock clocks.


----------



## xutnubu

Quote:


> Originally Posted by *Kuhl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Well, think I might be joining the club.
> 
> After my 7870 died, I decided to go with a 280X.
> 
> I first picked up the ASUS DCII TOP, but then read the reviews and the whole deal about artifacting because of the memory being overvolted.
> 
> So I switched it for an MSI GAMING 3G, and then read the reviews for the fans failing and coil whine lol, but I decided to keep it and has been working fine, no coil whine, fans still spinning. And I supposedly got the one with better VRM cooling.
> 
> Only thing that worries me a little bit is the temps, 74C max on Crysis 3 and 71C on BF4, my 7870 was definitely cooler on that regard. Though this is @ 38% fan speed.
> 
> 
> 
> 
> 
> 
> I think the Asus issue may just be blown out or proportion I have a Asus DC2TOP and I haven't had any artifact issues at stock clocks.

Well, you know what they say about product reviews: people with defective units speak loudly, while the lucky ones stay silent, enjoying their new card.

But I guess it got to me, just reading all the Newegg reviews and a thread on Asus website lol. Besides, I checked my model # and there was a high chance it was one of the faulty ones.

Quote:


> Originally Posted by *Crowe98*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Well, think I might be joining the club.
> 
> After my 7870 died, I decided to go with a 280X.
> 
> I first picked up the ASUS DCII TOP, but then read the reviews and the whole deal about artifacting because of the memory being overvolted.
> 
> So I switched it for an MSI GAMING 3G, and then read the reviews for the fans failing and coil whine lol, but I decided to keep it and has been working fine, no coil whine, fans still spinning. And I supposedly got the one with better VRM cooling.
> 
> Only thing that worries me a little bit is the temps, 74C max on Crysis 3 and 71C on BF4, my 7870 was definitely cooler on that regard. Though this is @ 38% fan speed.
> 
> 
> 
> 
> 
> 
> Welcome.
> 
> I'm actually about to send my MSI R9 280x Gaming 3G down to Auburn to get replaced; the MSI guy down there was really nice and said he would replace it. I was getting huge artifacting if i pushed the card anything over 1200Mhz.
> 
> As for your temperatures, I was getting 29-32C at idle and around 64-70C under 100% load. It is winter here in Australia though. If you don't mind voiding your warranty maybe you should try replacing the TIM. *Roaches* did and he was getting temperatures ~10C lower at full load.

I've heard this model is not that good for OC, most people get +100MHz over the stock 1050MHz.

I think I'll try with the custom fan curve first, I definitely don't want to void the warranty









I live in a tropical country, so it's always hot here.


----------



## Crowe98

Quote:


> Originally Posted by *xutnubu*
> 
> Well, you know what they say about product reviews, people with defective units speak loud, while the ones with luck stay silent enjoying their new card.
> 
> But I guess it got to me, just reading all the Newegg reviews and a thread on Asus website lol. Besides, I checked my model # and there was a high chance it was one of the faulty ones.
> I've heard this model is not that good for OC, most people get +100MHz over the stock 1050MHz.
> 
> I think I'll try with the custom fan curve first, I definitely don't want to void the warranty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I live in a tropical country, so is always hot here.


Where do you live?


----------



## BruceB

Quote:


> Originally Posted by *Internet Swag*
> 
> ...I'm getting worried now though cause my motherboard runs at PCI-E 2.1, 16x (1st slot) and 4x (2nd slot) during crossfire, so I'm not even sure if it's a good idea to add a 270. I don't know if less bandwidth will make crossfire less ideal.


I wrote a post about this here: PCIe Lane width and GPU performance
If you've got any further questions you can always PM me!


----------



## Internet Swag

Quote:


> Originally Posted by *BruceB*
> 
> I wrote a post about this here: PCIe Lane width and GPU performance
> If you've got any further questions you can always PM me!


Perfect, thanks! That really shows that it only made a 2% difference.

Now I wonder about - drivers, do you need special drivers to CF a 7850 with a 265/270 or should it be fine and give no less issues than a 270/270 or a 7850/7850?

Also my CPU might be a bottleneck? I have an i3 2120 3.3 GHz.


----------



## BruceB

Quote:


> Originally Posted by *Internet Swag*
> 
> Perfect, thanks! That really shows that it only made a 2% difference.
> Now I wonder about - drivers, do you need special drivers to CF a 7850 with a 265/270 or should it be fine and give no less issues than a 270/270 or a 7850/7850?
> Also my CPU might be a bottleneck? I have an i3 2120 3.3 GHz.


AFAIK xfire just needs the latest CCC from AMD to work








As for your i3: I think it would bottleneck a 270 xfire setup; your best bet would be one of the newer i5s (e.g. i5 4670). That said, I'm not an expert on balanced xfire setups, maybe someone else can give a second opinion?


----------



## anubis1127

Well and a crossfire bridge.









I'm also not sure on the CPU, I tend to think it will be fine, but I could be wrong.

I just picked up a z97 board and Pentium g3258 yesterday. Once I get it setup I can probably do some crossfire testing with the dual core vs my hex core.


----------



## BruceB

Quote:


> Originally Posted by *anubis1127*
> 
> Well and a crossfire bridge.










How could I forget that?! lol









Quote:


> Originally Posted by *anubis1127*
> 
> I'm also not sure on the CPU, I tend to think it will be fine, but I could be wrong.
> I just picked up a z97 board and Pentium g3258 yesterday. Once I get it setup I can probably do some crossfire testing with the dual core vs my hex core.


Sounds good, I'm looking forward to hearing the results!


----------



## Roaches

My MSI 270 from the OCN marketplace arrived yesterday at noon. I didn't take pics, though I will post them and some 3DMark results once I get home today to test it








Though it does appear to be in mint condition with little dust on the left fan intake shroud which I cleaned off.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> My MSI 270 from the OCN marketplace has arrived yesterday noon, I didn't took pics though will post them and some 3D mark results once I get home today to test it
> 
> 
> 
> 
> 
> 
> 
> 
> Though it does appear to be in mint condition with little dust on the left fan intake shroud which I cleaned off.


Sweet, can you let me know if you get voltage control working with AB?


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> Well and a crossfire bridge.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm also not sure on the CPU, I tend to think it will be fine, but I could be wrong.
> 
> I just picked up a z97 board and Pentium g3258 yesterday. Once I get it setup I can probably do some crossfire testing with the dual core vs my hex core.


I will be doing the same thing with a Z97/g3258 and my Devils if I can. I will have to pull the power supply from my main rig, as my wife's PSU will not handle X-Fire with these cards, and I will be using her rig to test the set-up since mine is watercooled.

I don't know, maybe it would be a good time for a flush.


----------



## anubis1127

Quote:


> Originally Posted by *Devildog83*
> 
> I will be doing the same thing with a Z97/g3258 and my devils if I can. I will have to pull my power supply from my main rig as my wifes PSU will not handle X-Fire with these cards and I will be using her rig to test the set-up since mine is watercooled.
> 
> I don't know, maybe it would be a good time for a flush.


Awesome, yeah, I too have to scrounge up a PSU; my x750 died on me last week, so I need to RMA that. I have my backup PSU (M12II 620W) in the rig that had the x750 in it, so now I will have to use one of my spare PSUs. I think I have an old OCZ Fatal1ty 550W that should be OK for testing the Pentium and my 270s, might be pushing it though.


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> Awesome, yeah, I too have to scrounge up a PSU, my x750 died on me last week, so I need to RMA that. I have my backup PSU (M12II 620w) in the rig that had the x750 in it, so now I will have to use one of my spare PSUs. I think I have an old OCZ Fatal1y 550w that should be OK for testing the Pentium and my 270s, might be pushing it though.


Maybe, but the g3258 is very easy on power and the 270s are not bad either. I maxed my system out with my Devils and my 8350 heavily overclocked, ran P95 and Valley at the same time, and only drew 600W from the wall, which means the actual power consumption was about 550W. The only real concern there would be the "old" part.

Edit: and maybe the amps on the 12V rail(s). Is it 80 Plus, and do you have at least 20A if it has multiple rails?
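The wall-to-DC conversion behind that "600w from the wall ≈ 550w actual" estimate can be sketched as follows; the 0.90 efficiency figure is my assumed ballpark for an 80 Plus unit at that load, not a number from the post:

```python
# Sketch of the power arithmetic above: AC draw measured at the wall
# times PSU efficiency gives the DC power actually delivered to the
# system. 0.90 is an assumed efficiency, not a measured value.
def dc_load(wall_watts, efficiency=0.90):
    return wall_watts * efficiency

print(dc_load(600))  # roughly 540 W, in line with the "about 550w" above
```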


----------



## anubis1127

Quote:


> Originally Posted by *Devildog83*
> 
> Maybe but the g3258 is very easy on power and the 270's are not bad either. I maxed my system out with my devil's and my 8350 heavily overclocked, ran P95 and Valley at the same time and only drew 600w from the wall which means the actual power consumption was about 550w. The only real concern there would be the "Old" part.
> 
> Edit : and maybe the amps on the 12v rail/rails. Is it 80 plus and do you have at least 20a if it has multiple rails?


It is 80 Plus rated; it would be equivalent to a Bronze PSU these days. It is a bit older, I think I got it back in '09 or '10, and it's been sitting idle for at least two years or so, other than to test watercooling loops, haha. I'll have to check the rails when I get home, I can't recall off the top of my head. It would just be a temporary solution anyway.


----------



## Internet Swag

Quote:


> Originally Posted by *BruceB*
> 
> AFAIK Xfire just Needs the latest CCC from AMD to work
> 
> 
> 
> 
> 
> 
> 
> 
> As for your i3; I think it would bottleneck an 270 xfire Setup, your best bet would be one of the newer i5's (eg i5 4670). That said, I'm not an expert on balanced xfire Setups, maybe someone else can give a second opinion?


Thanks! I have never Crossfire'd before so I am quite excited. Just need to see if I can get a decent compatible card







Quote:


> Originally Posted by *anubis1127*
> 
> Well and a crossfire bridge.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm also not sure on the CPU, I tend to think it will be fine, but I could be wrong.
> 
> I just picked up a z97 board and Pentium g3258 yesterday. Once I get it setup I can probably do some crossfire testing with the dual core vs my hex core.


That will be interesting to see







Hope you benchmark lots!


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> It is 80 plus rated, it would be equivalent to a bronze PSU these days. It is a bit older, I think I got it back in '09, or '10, and its been sitting idle for at least two years or so, other than to test watercooling loops, haha. I'll have to check the rails when I get home, I can't recall of the top of my head. It would just be a temporary solution anyway.


You should be OK if the PSU is still functioning properly. Do you have a kill-a-watt tester?


----------



## anubis1127

Quote:


> Originally Posted by *Devildog83*
> 
> You should be OK if the PSU is still functioning properly. Do you have a kill-a-watt tester?


Indeed I do. I could use that to test the use under load I suppose.


----------



## NameUnknown

How do the 280Xs perform with heavily modded Skyrim?


----------



## Internet Swag

While on the topic of PSU's.

I did use a PSU calculator, but I should make sure anyway.

I have an Antec HCG 620W (non-modular) and want to make sure it can handle the 7850 in CF with a 270(X), R7 265, 7850, or 7870.

MY PSU - http://www.newegg.com/Product/Product.aspx?Item=N82E16817371059


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> Indeed I do. I could use that to test the use under load I suppose.


Yep, if it's 500W or less at the wall you should be fine. I just read a great review of this chip from Guru3D and he had the sucker up to 4.8GHz. Seriously, a Pentium dual core at 4.8GHz?


----------



## Internet Swag

Quote:


> Originally Posted by *Internet Swag*
> 
> While on the topic of PSU's.
> 
> I did use a PSU calculator, but I should make sure any way.
> 
> I have an Antec HCG 620w (non modular) and want to make use of the combination of 7850 CF with 270(x), 7 265. 7850. 7870.
> 
> MY PSU - http://www.newegg.com/Product/Product.aspx?Item=N82E16817371059


I got my answer.

My PSU only has 2x 6+2 pin connectors, so unless I get an adapter I will only CF 7850/265


----------



## Ashura

Hey guys, Is 14.6 driver a good update?
I'm running 13.12. Time for an update?


----------



## c3p0c3p0

MSI R9 280X 3G Gaming (not HAWK)
Clocks: *cough* stock 1020/1500


----------



## xutnubu

Quote:


> Originally Posted by *Crowe98*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Well, you know what they say about product reviews, people with defective units speak loud, while the ones with luck stay silent enjoying their new card.
> 
> But I guess it got to me, just reading all the Newegg reviews and a thread on Asus website lol. Besides, I checked my model # and there was a high chance it was one of the faulty ones.
> I've heard this model is not that good for OC, most people get +100MHz over the stock 1050MHz.
> 
> I think I'll try with the custom fan curve first, I definitely don't want to void the warranty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I live in a tropical country, so is always hot here.
> 
> 
> 
> Where do you live?

Costa Rica.

I just started artifacting on Firefox. Huge black rectangles all over the window.

I hope it's just the browser and not the card


----------



## neurotix

Quote:


> Originally Posted by *NameUnknown*
> 
> how do the 280X's perform against heavily modded Skyrim?


I had a 7970 at 1200/1600mhz last year, and a single 1080p monitor, and it was more than sufficient to run Skyrim well and average 60 fps. I just used FXAA, high res texture packs, and all that stuff like better wood and water, and so on. As long as you aren't stacking 200 graphics mods to try and make it photorealistic, you should be fine with a 280X/7970.


----------



## Dasboogieman

Quote:


> Originally Posted by *neurotix*
> 
> I had a 7970 at 1200/1600mhz last year, and a single 1080p monitor, and it was more than sufficient to run Skyrim well and average 60 fps. I just used FXAA, high res texture packs, and all that stuff like better wood and water, and so on. As long as you aren't stacking 200 graphics mods to try and make it photorealistic, you should be fine with a 280X/7970.


I've also heard that Skyrim isn't able to address more than 3072MB of VRAM by design, so you won't be too bothered. Those texture mods are really heavy on the texture filtering resources, so even a 290 would be taxed.


----------



## Crowe98

Quote:


> Originally Posted by *xutnubu*
> 
> Costa Rica.
> 
> I just started artifacting on Firefox. Huge black rectangles all over the window.
> 
> I hope is just the browser and not the card


That would happen to me if I pushed the card over 1200MHz core, and I got huge artifacting in the Firestrike benchmark in 3DMark.

There are a few things you could try to help fix it though.

*Forcing stock 1020/1500 clocks at idle.*
To do this, you need to go to the Catalyst control center and enable overdrive. Then, create a new profile under the presets tab. Don't enable it yet.

Then, navigate to C:/Users/name/AppData/Local/ATI/ACE/profiles/ and open the previously created preset XML file with a text editor.

Then, hit Ctrl+F and search for "want_"; this finds the clocks requested for each kind of application. You will find entries want_0 and want_1. Match want_0 to the number after want_2 to ensure the card uses the same clocks for all applications.

You will need to do this 4 times for "CoreClockTarget...", "MemoryClockTarget...", "CoreVoltageTarget...", and "MemoryVoltageTarget...". Save and quit.

After you've done all that, go back to CCC and apply your preset and double check that the clocks are consistent when you're on the desktop and in 3D applications.

If that doesn't work, try underclocking it a little.
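The want_ matching step above could be sketched like this. The XML layout (a Feature element with want_0/1/2 Property entries) is my assumption for illustration, not copied from a real CCC profile dump, so check your own file's structure and back it up before editing:

```python
import re

# Rough sketch of the manual edit described above: copy the want_2
# value into want_0 and want_1 so desktop and 3D states request the
# same clocks. SAMPLE's element/attribute layout is assumed.
SAMPLE = '''<Feature name="CoreClockTarget_0">
  <Property name="want_0" value="30000" />
  <Property name="want_1" value="30000" />
  <Property name="want_2" value="105000" />
</Feature>'''

def match_wants(block: str) -> str:
    """Copy the want_2 value into want_0 and want_1 for one target block."""
    want2 = re.search(r'name="want_2"\s+value="(\d+)"', block).group(1)
    return re.sub(r'(name="want_[01]"\s+value=")\d+(")',
                  rf'\g<1>{want2}\g<2>', block)

print(match_wants(SAMPLE))
```

You would repeat the same substitution for each of the four targets (CoreClockTarget, MemoryClockTarget, CoreVoltageTarget, MemoryVoltageTarget) before saving.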


----------



## Roaches

Here are my MSI 270 stock bench scores @ 1080p.







Metro LL High Settings No SSAA


Metro LL Very High Settings No SSAA


Hitman Ultra FXAA


Hitman Ultra 4xMSAA


Tombraider Ultimate FXAA


Valley and Heaven



3DMark score; it seems to lag a lot in Firestrike :/

http://www.3dmark.com/3dm/3442901

I'll be doing overclocked tests tomorrow... but I don't have much confidence, since this card has only 4 phases and probably 1 phase for the memory on the PCB

I won't be able to get at the PCB VRM and phase controller until my tube of GC-Extreme arrives. Though the card runs relatively cool during all tests: typical full-stress temperatures ranged from 50-55 degrees Celsius on a custom fan curve, with the exception of Metro, which peaked at 60 during one test on Very High. Pretty exceptional card for the budget gamer.


----------



## Roaches

Quick OC results for tonight









1000 core stock mem Ultra FXAA


1000 core stock mem Ultra 4xMSAA


1025 core 1500 mem Ultra FXAA


1025 core 1500 mem Ultra 4xMSAA


This card is laying the wood, and stable at its max OC limit from Afterburner! No artifacts!!



1050 core 1500 mem Ultra FXAA


1050 core 1500 mem Ultra 4xMSAA


Metro LL 1050 core 1500 mem Very High


Metro LL 1050 core 1500 mem High


Sadly no Voltage unlock. Expected for a 4 phase card.

EDIT:

Added 3Dmark score http://www.3dmark.com/3dm/3443546


----------



## Crowe98

Spoiler: Pic Heavy



Quote:


> Originally Posted by *Crowe98*
> 
> That would happen to me if i pushed the card over 1200Mhz core, and huge artifacting on the Firestrike benchmark on 3DMark.
> 
> There is a few things you could try to help fix it though.
> 
> *1. Forcing stock 1020/1500 clocks at idle.*
> To do this, you need to go to the Catalyst control center and enable overdrive. Then, create a new profile under the presets tab. Don't enable it yet.
> Then, navigate to you C:/Users/name/Appdata/Local/ATI/ACE/profiles/then open the previously created preset xml file with a text editor.
> Then, hit control F and search "want_" this will search for the clocks wanted for what kind of application. You find answers want_0 and want_1. Match want_0 to the number after want_2 to ensure the card is using the same clocks for all applications.
> You will need to do this 4 times for "CoreClockTarget...", "MemoryClockTarget...", "CoreVoltageTarget...", and "MemoryVoltageTarget...". Save and quit.
> After you've done all that, go back to CCC and apply your preset and double check that the clocks are consistent when you're on the desktop and in 3D applications.
> 
> If that doesn't work, try underclocking it a little.


Quote:


> Originally Posted by *Roaches*
> 
> Heres my MSI 270 stock bench scores @1080p.
> 
> 
> 
> 
> 
> 
> 
> Metro LL High Settings No SSAA
> 
> 
> Metro LL Very High Settings No SSAA
> 
> 
> Hitman Ultra FXAA
> 
> 
> Hitman Ultra 4xMSAA
> 
> 
> Tombraider Ultimate FXAA
> 
> 
> Valley and Heaven
> 
> 
> 
> 3Dmark score, seems to lag alot in Firestrike :/
> 
> http://www.3dmark.com/3dm/3442901
> 
> I'll be doing overclocked test tomorrow....But I don't have much confidence since this card has only 4 phases and probably 1 phase for the memory on the PCB
> 
> I won't be able to take PCB VRM and Phase controller until my tube of GC-Extreme arrives. Though the card is relatively cool during all test since the typical full stress temperatures ranged from 50-55 degrees Celsius on a custom fan curve. With the exception of Metro which peaked up to 60 during one test on Very High. Pretty exceptional card for the budget gamer.






You might have to disable the official overclocking limits on that card! Push it further haha









Will recommend one to my brother









Went to the post office today to post my defective 280x down to Auburn. Hopefully I get a better model.

*OFF THREAD TOPIC*
Whilst I'm not in possession of my 280x anymore, I put my 6850 back in; its stock clocks are 775/1000, and on Sky Diver I got 8579 points.

At the moment, I'm sitting on 950/1300 @ 61-62C full load, and getting 9302 points on Sky Diver. Still room for overclocking; interrupted by Payday2.









I know this isn't supposed to go in this thread, but I thought you guys might be interested in my results.


----------



## xutnubu

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Crowe98*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Costa Rica.
> 
> I just started artifacting on Firefox. Huge black rectangles all over the window.
> 
> I hope it's just the browser and not the card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That would happen to me if I pushed the card over 1200MHz core, along with huge artifacting in the Firestrike benchmark on 3DMark.
> 
> There are a few things you could try to help fix it, though.
> 
> *Forcing stock 1020/1500 clocks at idle.*
> To do this, you need to go to the Catalyst control center and enable overdrive. Then, create a new profile under the presets tab. Don't enable it yet.
> 
> Then, navigate to C:/Users/yourname/AppData/Local/ATI/ACE/profiles/ and open the previously created preset xml file with a text editor.
> 
> Then, hit Ctrl+F and search for "want_"; these entries are the target clocks for each application state. You'll find want_0, want_1, and want_2. Set want_0 (and want_1) to the same number as want_2 to ensure the card uses the same clocks for all applications.
> 
> You will need to do this 4 times for "CoreClockTarget...", "MemoryClockTarget...", "CoreVoltageTarget...", and "MemoryVoltageTarget...". Save and quit.
> 
> After you've done all that, go back to CCC and apply your preset and double check that the clocks are consistent when you're on the desktop and in 3D applications.
> 
> If that doesn't work, try underclocking it a little.





Wouldn't that force my voltage to the max value all the time? If that's the case, I'd rather do that as a last resort.

This is something that has just happened one time. I've had problems with Firefox in the past, when using my 7870 I got little black squares artifacts all around.

If it happens again, I'd try disabling FF's HW acceleration, which is known to cause lots of trouble.

If that doesn't work, and my other browsers are unaffected, then it would be the perfect final excuse to switch to Chrome. FF has been uber trash for the last couple of months: sluggish and buggy. The only reason I can't let it go is because of how many years I've been using it.

Appreciate the advice, will try it if all else fails









Here are my pics so the TC can add me to the list:


----------



## c3p0c3p0

So who else has small screen flickers on the R9 280X and found a way to fix it? The card is stock, connected via the DVI port. They're barely visible, but all over the screen :/

Edit: Tried onboard graphics, same flickers, so it's the monitor rather than the card. TN panel can't handle the swag. Solved /c


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Quick OC results for tonight
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1000 core stock mem Ultra FXAA
> 
> 
> 1000 core stock mem Ultra 4xMSAA
> 
> 
> 1025 core 1500 mem Ultra FXAA
> 
> 
> 1025 core 1500 mem Ultra 4xMSAA
> 
> 
> This card is laying the wood, and stable at its max OC limit from Afterburner! No artifacts!!
> 
> 
> 
> 1050 core 1500 mem Ultra FXAA
> 
> 
> 1050 core 1500 mem Ultra 4xMSAA
> 
> 
> Metro LL 1050 core 1500 mem Very High
> 
> 
> Metro LL 1050 core 1500 mem High
> 
> 
> 
> 
> Sadly no Voltage unlock. Expected for a 4 phase card.
> 
> EDIT:
> 
> Added 3Dmark score http://www.3dmark.com/3dm/3443546


Nice results, my MSI Gaming 270s can also hit 1050 all day on stock voltage. I extended the official limits or whatever, and push them up to 1100MHz for Folding@home. They seem stable for gaming at that clock too, but mine also have Elpida, so I wasn't able to OC the memory much, if at all.

I was able to push my voltage up to 1.22V in Trixx if you want a slight bit extra; the stock VID on each of my cards is 1.18V. Not sure if it's a good idea for 24/7, so I just leave mine at stock voltage most of the time, save for a quick bench here or there.
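
For reference, the "extended the official limits" tweak is usually done by editing MSIAfterburner.cfg. The key names below are from memory, so verify them (and the exact EULA wording) against the comments in your own cfg before editing:

```ini
; MSIAfterburner.cfg, in the Afterburner install folder -- keys from memory, verify locally
[ATIADLHAL]
; The EULA string must be typed exactly as spelled out in the cfg's own comments
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
; 0 = disabled; 1 / 2 choose whether PowerPlay stays active (see the cfg comments)
UnofficialOverclockingMode = 1
```

Restart Afterburner after saving, and the slider ranges should extend past the stock limits.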

---------------------

MoAR Pitcairn!

Wanted a new CFX bridge, so I did the logical thing and bought a couple of Gigabyte 7870 GHz cards off the OCN Marketplace, heh. I got two new CFX bridges, but unfortunately they're both too short, so it looks like I'll be ordering that ASUS ROG CFX bridge after all.





Pretty high ASIC Quality %; not sure if that matters on Pitcairn, but if it were Kepler, I'd be stoked. I am pretty happy that this one has Hynix. I'm getting ready to put the 2nd one in after this post, so hopefully that one is Hynix as well.

The blue PCB with the metal bracket looks great with my X79 WS board too, so I think these will be my main GPUs now. The 270s will get relegated to a Folding@home box, sit in the corner, and crunch away.



I know they aren't really R9 270X, but they pretty much are, so I thought I'd share anyway.

----------

[edit]

2nd GPU in, this one has Samsung memory, and ASIC Quality % of 77.9





I'll take it, should be a decent match for the first card. Hopefully I'll be able to get that VRAM OC'd well; it's at 1200MHz (4800MHz effective) stock. I'd like to push 6000MHz effective if possible.


----------



## Roaches

Hmm so Trixx has voltage unlock? I'm using the latest MSI AB and it still won't let me control the volts







....I will give Trixx a try then







.

Pitcairn is Pitcairn, don't really care about the branding. It's still the same ASIC on board








So this means you won't be benching your MSI 270s CFX against my results?


----------



## Recr3ational

Quote:


> Originally Posted by *Roaches*
> 
> Hmm so Trixx has voltage unlock? I'm using the latest MSI AB and it still won't let me control the volts
> 
> 
> 
> 
> 
> 
> 
> ....I will give Trixx a try then
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Pitcairn is Pitcairn, don't really care about the branding, Its still the same ASIC on board
> 
> 
> 
> 
> 
> 
> 
> 
> So this means you won't be benching your MSI 270s CFX against my results?


Some are hardware locked, some are software locked. I was told the MSI ones are hardware locked; you might have to reflash the BIOS.

Try with Trixx first. I was going to buy the MSI and asked the same question; that's why I went with PowerColor in the end.

Edit: Didn't see you were talking about the 270X, not the 280X; my facts might be incorrect.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Hmm so Trixx has voltage unlock? I'm using the latest MSI AB and it still won't let me control the volts
> 
> 
> 
> 
> 
> 
> 
> ....I will give Trixx a try then
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Pitcairn is Pitcairn, don't really care about the branding, Its still the same ASIC on board
> 
> 
> 
> 
> 
> 
> 
> 
> So this means you won't be benching your MSI 270s CFX against my results?


Yep, Trixx does the trick, up to 1.22V at least.  Yeah, I am using 3.0.1 and no voltage control here either, tried a couple of the older betas with no success.

Oh no, I still plan on doing 4930k vs Pentium G3258 CFX benchmark results on the R9 270s, just need to get a proper working CFX bridge that's long enough.


----------



## Roaches

I've been using MSI AB for so long now, didn't know it was neutered all along. I'll definitely give Trixx a shot later today.

Alright, I will wait for them.


----------



## Devildog83

*c3p0c3p0 & xutnubu* have been added, love the MSI Gaming cards. How much are those going for these days?


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> Yep, Trixx does the trick, up to 1.22V at least.
> 
> 
> 
> 
> 
> 
> 
> Yeah, I am using 3.0.1 and no voltage control here either, tried a couple of the older betas with no success.
> 
> Oh no, I still plan on doing 4930k vs Pentium G3258 CFX benchmark results on the R9 270s, just need to get a proper working CFX bridge that's long enough.


My G3258 should be here tomorrow, so I will be benching against the 8350, FPS-wise in games only. Although I will be running Valley and whatnot, a comparison there would not be fair, at least I don't think. I will be sticking it in my fully watercooled system to see if I can get better than the 4.8 that Guru3D got with a 240 AIO.


----------



## NameUnknown

Quote:


> Originally Posted by *neurotix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> how do the 280X's perform against heavily modded Skyrim?
> 
> 
> 
> I had a 7970 at 1200/1600mhz last year, and a single 1080p monitor, and it was more than sufficient to run Skyrim well and average 60 fps. I just used FXAA, high res texture packs, and all that stuff like better wood and water, and so on. As long as you aren't stacking 200 graphics mods to try and make it photorealistic, you should be fine with a 280X/7970.

Thanks for the info. I know it's getting to be a bit older, but it's also one of those games that's customizable enough that I figured I should ask.


----------



## solar0987

Ok so I just switched sides and am regretting it. I cannot for the life of me get BF3 to load; it just locks up. Current system is a G3220 with a Z87 WiFi and 2GB of RAM. I used to have a 550 Ti; it ran OK on low with the same settings. All I did was totally uninstall all the NVIDIA stuff and run the AMD install. It tries to load up BF3 but just locks up with a sound repeating itself, and I cannot shut the game down at all, even with Ctrl+Alt+Delete; I have to manually reboot the system. Any help would be greatly appreciated. Like I said before, it played fine with an NVIDIA card; I've always had an NVIDIA card. With the AMD one, an R9 270X Gigabyte Windforce, it's not at all playable. Help me please. On a side note, it plays Skyrim fine at max quality with the high-res pack. I'm repairing BF3 now, so I will report back when it's done. Oh yeah, I installed the 14.4 Catalyst driver.


----------



## boot318

Quote:


> Originally Posted by *solar0987*
> 
> Ok so I just switched sides and am regretting it. I cannot for the life of me get BF3 to load; it just locks up. Current system is a G3220 with a Z87 WiFi and 2GB of RAM. I used to have a 550 Ti; it ran OK on low with the same settings. All I did was totally uninstall all the NVIDIA stuff and run the AMD install. It tries to load up BF3 but just locks up with a sound repeating itself, and I cannot shut the game down at all, even with Ctrl+Alt+Delete; I have to manually reboot the system. Any help would be greatly appreciated. Like I said before, it played fine with an NVIDIA card; I've always had an NVIDIA card. With the AMD one, an R9 270X Gigabyte Windforce, it's not at all playable. Help me please. On a side note, it plays Skyrim fine at max quality with the high-res pack. I'm repairing BF3 now, so I will report back when it's done. Oh yeah, I installed the 14.4 Catalyst driver.


I'm having that same problem with BF3 and ME3. Origin updated itself -- and BAM! -- I can't play my games. They lock up right after the start screen. For me, it's an Origin problem.

I'm going to opt into the Beta program and see if that fixes it.


----------



## Devildog83

Quote:


> Originally Posted by *boot318*
> 
> I'm having that same problem with BF3 and ME3. Origin updated itself --AND BAM!!!!! Can't play my games. They lock up right after the start screen. For me, it is Origin problem.
> 
> I'm going to opt in the Beta program and see if that fixes it.


I play BF3 and all other games just fine with my 7870/270X, but I haven't updated Origin in a long time, and now I definitely won't.


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> *c3p0c3p0 & xutnubu* have been added, love the MSI Gaming cards. How much are those going for these days?


Forgot to update me?

I have 2 Devils and an MSI 270 Gaming; the first page lists me as owning a single Devil


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Forgot to update me?
> 
> I have 2 Devils and a MSI 270 Gaming. First page lists me owning a single Devil


Sorry about that. Could you throw all of the clocks at me so I don't have to reread all of the posts to figure it out? All 3 would be good, unless you are running them all at the same speed, in which case just one will do.


----------



## roflcopter159

Quick question for all of you: would my R9 280 be able to crossfire with a R9 280X and/or 79x0 cards or would it only work with another 280?

EDIT: Also, would blocks for the 280X version of my card (HIS R9 280 IceQ X2 OC) be compatible with mine?


----------



## anubis1127

Quote:


> Originally Posted by *roflcopter159*
> 
> Quick question for all of you: would my R9 280 be able to crossfire with a R9 280X and/or 79x0 cards or would it only work with another 280?
> 
> EDIT: Also, would blocks for the 280X version of my card (HIS R9 280 IceQ X2 OC) be compatible with mine?


Yes, you can crossfire with R9 280X, 7950, or 7970. Technically even a 7870 LE should work. Basically anything Tahiti.

Not sure on the block, try checking the EK cooling configurator. http://www.coolingconfigurator.com/


----------



## roflcopter159

Quote:


> Originally Posted by *anubis1127*
> 
> Yes, you can crossfire with R9 280X, 7950, or 7970. Technically even a 7870 LE should work. Basically anything Tahiti.
> 
> Not sure on the block, try checking the EK cooling configurator. http://www.coolingconfigurator.com/


Ok, thanks. I already tried to check with the cooling configurator but it did not have information on my card. It does have the 280X in there but I'm not sure if those components would work on my card. Do you know of any other way to check?


----------



## anubis1127

Quote:


> Originally Posted by *roflcopter159*
> 
> Ok, thanks. I already tried to check with the cooling configurator but it did not have information on my card. It does have the 280X in there but I'm not sure if those components would work on my card. Do you know of any other way to check?


I would try looking at a review site to see the PCB by itself; most good reviews show the card naked. Or take all the cooling off yourself and compare it to reference cards. I don't know if those HIS IceQ cards are reference or custom PCB.


----------



## roflcopter159

Quote:


> Originally Posted by *anubis1127*
> 
> I would try looking at a review site to see the PCB by itself, most good reviews show the card naked. Or take all the cooling off yourself and compare it to reference cards. I don't know of those HIS IceQ cards are reference, or custom PCB.


Ok, thank you for your help!


----------



## Internet Swag

Hrmm.

Need advice here.

Crossfire 7850/R7 265 vs R9 280


----------



## anubis1127

R9 280 if the price is right.


----------



## roflcopter159

Rule that I try and follow (and would suggest to others) is to buy the most powerful single card you can before you go to xfire/sli


----------



## Internet Swag

Quote:


> Originally Posted by *anubis1127*
> 
> R9 280 if the price is right.


Quote:


> Originally Posted by *roflcopter159*
> 
> Rule that I try and follow (and would suggest to others) is to buy the most powerful single card you can before you go to xfire/sli


Aww, I've always wanted to Crossfire like all the fancy people







But deep down I know you are both right, I actually do many things that don't support CF anyways like play emulated games and stuff...

Do you guys think that going from a 7850 to a R9 280 is a big enough upgrade though? Should I just wait for next gen?


----------



## roflcopter159

Well, an R9 280 is basically a 7950. I would look up some benchmarks that compare the two (7850 vs 7950) so you can determine if it is a large enough upgrade. I personally would probably wait for the next generation, or save for a more powerful 280X/290/290X, if it were me. However, if your card isn't performing at the level you want it to, by all means, upgrade it now


----------



## rdr09

Quote:


> Originally Posted by *Internet Swag*
> 
> Aww, I've always wanted to Crossfire like all the fancy people
> 
> 
> 
> 
> 
> 
> 
> But deep down I know you are both right, I actually do many things that don't support CF anyways like play emulated games and stuff...
> 
> Do you guys think that going from a 7850 to a R9 280 is a big enough upgrade though? Should I just wait for next gen?


The 280 at only 1100 on the core can be as fast as a stock 280X, something that a 7850 cannot achieve. As for waiting for next gen, I doubt it's worth it. I owned both a 7970 and a 7950; both can max out the same games. Whatever one can't max, neither will the other.


----------



## Devildog83

Quote:


> Originally Posted by *Internet Swag*
> 
> Aww, I've always wanted to Crossfire like all the fancy people
> 
> 
> 
> 
> 
> 
> 
> But deep down I know you are both right, I actually do many things that don't support CF anyways like play emulated games and stuff...
> 
> Do you guys think that going from a 7850 to a R9 280 is a big enough upgrade though? Should I just wait for next gen?


Performance in benches and games that do support X-Fire would be better with the 7850/265, but if, as you say, you do a lot of stuff that doesn't support X-Fire, I would save up and get an R9 290. It's once again the best price-to-performance card there is, now that prices are back to where they were at release. $410 for a Sapphire Tri-X is a very good price considering the performance, and a 280X is only about $300 now, which is twice as much as the 265, but you could sell the 7850 to make up for some of that.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Performance in bench's and games that do support X-Fire would be better with the 7850/265 but if as you say you do a lot of stuff that doesn't support X-Fire I would save and get a R9 290. It's once again the best price to performance card there is now that prices are back to the way they were when they were released. $410 for a Sapphire Tri-X is a very good price considering the performance and a 280x is only about $300 now which is twice as much as the 265 but you could sell the 7850 to make up for some of that.


^i agree with Devil's points but I would look at the other components of your rig, especially the cpu.


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> ^i agree with Devil's points but I would look at the other components of your rig, especially the cpu.


Good point, I don't know much about Intel yet but that CPU does look limiting, especially for a X-Fire set-up. I could be wrong though, as I said I don't know much about Intel.

I will find out soon though. I have an ASRock Z97 board and a Pentium G3258 coming, and will be reviewing the board using my main rig. If you have heard of this new chip, you know it's going to be a blast trying to get 4.9GHz out of it.







Guru3D got 4.8 and Linus/Slick got 4.7.


----------



## roflcopter159

I think I would have to agree with these other guys. It probably wouldn't hurt to look at updating some of the other components of your rig in addition and perhaps even before your gpu. For right now, it is a good combination but a more powerful gpu/combination of gpus (depending on what you decide to do) would probably be bottlenecked a bit by your cpu.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Good point, I don't know much about Intel yet but that CPU does look limiting, especially for a X-Fire set-up. I could be wrong though, as I said I don't know much about Intel.
> 
> I will find out soon though. I have an ASRock Z97 board and a Pentium G3258 coming, and will be reviewing the board using my main rig. If you have heard of this new chip, you know it's going to be a blast trying to get 4.9GHz out of it.
> 
> 
> 
> 
> 
> 
> 
> Guru3D got 4.8 and Linus/Slick got 4.7.


I'll look forward to your review.


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> I'll look forward to your review.


It's for NewEgg but I will post it here too.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> It's for NewEgg but I will post it here too.


Thank you. BTW, I think pioneerscloud runs an i3 with crossfired 7900 cards without issues in most games, from what I've read.


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> Thanks you. BTW, i think pioneerscloud runs an i3 with crossfired 7900 cards without issues with most games i read.


I guess there's the answer then. I will be running my X-Fired Devils with the Pentium dual core.


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> It's for NewEgg but I will post it here too.


I take it you're one of those EggExpert reviewers? I've always passed them off as shills (no offense). Though I didn't know Newegg sends out samples for people to review....


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> I take it you're one of those EggExpert reviewers? I've always pass them off as shills. (no offense) Though I didn't know Newegg sends out samples for people to review....


I have reviewed some cool stuff. They sent me a Corsair M45 gaming mouse, the Vengeance 1400 gaming headset, a ton of routers including an AC1750, even a 500GB HDD, and now an ASRock Z97, and I get to keep all of the stuff; at least everything I have reviewed so far.


----------



## Roaches

Nice! I thought you'd have to return a review sample. It's nice knowing they let you keep it.

I guess you must be a long-time reviewer at Newegg to be granted EggExpert status?
I've bought countless hardware from Newegg but rarely leave reviews, since it's kinda pointless and a waste of time as they'll get buried by countless other reviews lol.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Nice! I'd thought you'd have to return a review sample. Its nice knowing they let you keep it.
> 
> I guess you must be a long time reviewer at Newegg to be granted EggExpert status?
> I've bought countless hardware from Newegg but rarely leave reviews since its kinda pointless and a waste of time as it'll get buried by countless other reviews lol.


I just did reviews for fun, and I bought most of my stuff from them; one day they just sent me an e-mail and asked if I would join the crew, so I did. I get tired of router reviews, but other than that it's been great.

I remember reading a review from some fool who said he was going to knock an Egg off because the power supply he bought wasn't modular. I had to say something, so I did, a review of sorts. I just said "people, please, if you don't buy a power supply that's modular, don't complain about it, just get a modular one"... and so on. The next day I got the e-mail.


----------



## Roaches

I did something similar for a PS3 accessory called Eagle Eye a year back... that was the last review I can remember doing, addressing the reviewer below me who had a similar issue running the mapping software.

http://www.newegg.com/Product/Product.aspx?Item=N82E16879177001&SortField=0&SummaryType=0&PageSize=10&SelectedRating=-1&VideoOnlyMark=False&IsFeedbackTab=true#scrollFullInfo

I'm the Anonymous review with the review title "The perfect KB/M emulator for PS3 FPS"

And yeah, that's the reason why I stopped posting reviews on Newegg: people use it to complain more than to review a product.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> I did something similar for a PS3 accessory called Eagle Eye a year back...that was my very last review I can remember of. And addressing the reviewer below me that had a similar issue about running the mapping software.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16879177001&SortField=0&SummaryType=0&PageSize=10&SelectedRating=-1&VideoOnlyMark=False&IsFeedbackTab=true#scrollFullInfo
> 
> I'm the Anonymous review with the review title "The perfect KB/M emulator for PS3 FPS"
> 
> And Yeah its the reason why I stop posting reviews on Newegg because people use it to complain more than review a product.


I hear ya, I am just in it for fun and for free stuff.


----------



## solar0987

Quote:


> Originally Posted by *Devildog83*
> 
> I play BF3 and all other games just fine with my 7870/270x but I haven't updated origin in a long time and now I won't for sure.


I repaired Battlefield 3 and everything is working again.
Will report back when I get some actual game time in.


----------



## anubis1127

7870 Results @ 1200Mhz core / 1400Mhz memory:

Metro LL Benchmark



Hitman: Absolution Benchmark Ultra 4X MSAA



Hitman: Absolution Benchmark Ultra No MSAA



Tombraider Benchmark Ultimate FXAA



Resident Evil 6 Benchmark



Sleeping Dogs Benchmark



Still working on getting the Pentium K stable over 4.0GHz; tomorrow I'll be testing the R9 270s in the Pentium dual core rig and the 4930k rig.


----------



## Recr3ational

Quote:


> Originally Posted by *roflcopter159*
> 
> Quick question for all of you: would my R9 280 be able to crossfire with a R9 280X and/or 79x0 cards or would it only work with another 280?
> 
> EDIT: Also, would blocks for the 280X version of my card (HIS R9 280 IceQ X2 OC) be compatible with mine?


Hey Rofl!
How are you doing, dude?
Are you buying a 280x?
As for blocks, some cards have them and some don't. Some cards have really big BIOS switches which cause the block to sit raised, so be careful


----------



## Crowe98

Are there any real differences between the MSI R9 280X Gaming 3G and the Gaming 3G LE?

Cheers


----------



## BruceB

Quote:


> Originally Posted by *Crowe98*
> 
> Is there any real differences between the MSI R9 280x Gaming 3G and the "" LE?
> Cheers


LE stands for "Lite Edition"

The LE has a lower core clock (860/1020MHz normal/boost) than the 3G (1000/1050MHz). Apart from that, they appear to be the same.









[EDIT]
Here are the links:
MSI R9 280X Gaming 3G
MSI R9 280X Gaming 3G LE


----------



## NameUnknown

Here is GPU-Z for my Diamond at stock. Apparently they just push the voltage up and do a nice crude OC at Diamond.


Spoiler: GPU-Z









Spoiler: GPU-Z & Afterburner


----------



## Crowe98

Quote:


> Originally Posted by *BruceB*
> 
> LE stands for "Lite Edition"
> 
> The LE has a lower core clock (860/1020MHz (normal/boost) ) than the 3G (1000/1050MHz). Apart from that they appear to be the same.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [EDIT]
> Here are the links:
> MSI R9 280X Gaming 3G
> MSI R9 280X Gaming 3G LE










I thought it meant Limited Edition! Haha


----------



## Roaches

Quote:


> Originally Posted by *anubis1127*
> 
> 7870 Results @ 1200Mhz core / 1400Mhz memory:
> 
> Metro LL Benchmark
> 
> 
> 
> Hitman: Absolution Benchmark Ultra 4X MSAA
> 
> 
> Hitman: Absolution Benchmark Ultra No MSAA
> 
> 
> Tombraider Benchmark Ultimate FXAA
> 
> 
> Resident Evil 6 Benchmark
> 
> 
> Sleeping Dogs Benchmark
> 
> 
> 
> Still working on getting the Pentium K stable over 4.0GHz; tomorrow I'll be testing the R9 270s in the Pentium dual core rig and the 4930k rig.


Single card or Crossfire? Looks really damn good


----------



## Dasboogieman

Quote:


> Originally Posted by *Crowe98*
> 
> 
> 
> 
> 
> 
> 
> 
> I thought it meant Limited Edition! Haha


LOL, the last time I ever bought a card with the words "LE" was an NVIDIA 6200 LE... it was awful. I never assumed LE meant Limited Edition ever again, and even if it did, it could be limited edition for good or bad reasons (e.g. extra-slow limited DDR2 edition).


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Single card or Crossfire? Looks really damn good


Crossfire.


----------



## roflcopter159

Quote:


> Originally Posted by *Recr3ational*
> 
> Hey Rofl!
> How you doing dude!
> Are you buying a 280x?
> With blocks some cards have blocks some doesn't. Some cards have really big bios switches which causes the block to raise. So be careful


Hey Rec
I'm doing alright, keeping busy this summer haha
I actually won the HIS R9 280 from one of the recent giveaways so that's what I was talking about. Just curious about how crossfire would work with the different cards and what blocks would work for this particular card.


----------



## DiceAir

So my brother will be getting my old 3770K and B75A-G45 Gaming motherboard. I want to know if the 270 (no X) will be good enough for 1080p gaming, running on medium to ultra graphics with no AA or 2xAA. My whole point is to prove to him that he can maintain much better FPS on PC than on the consoles, and that it will cost him less to do it. He currently has a PS3 but is looking into getting a PS4. He's mainly into racing games, and soon he will be playing Project Cars. The specs are as follows:

i7-3770k Stock cooling and no overclocking
b75a-G43 Gaming
Standard 8Gb 1600mhz ram (2x4GB)
Basic office case (Has enough room for the r9 270)
Zalman 600w PSU
1TB WD Blue HDD

My local PC shop has an amazing special on the R9 270, and it's so cheap I'm thinking of getting 2 for Crossfire, but for the sake of making things a little easier for him I think I will stick to 1 card.

So, should the R9 270 be good enough for the PC, or should I be looking at something more expensive like the 280? I'm just scared that the 2GB of VRAM will not be enough for future games. The 280 would double the price of the graphics card.


----------



## anubis1127

Quote:


> Originally Posted by *DiceAir*
> 
> So my brother will be getting my old 3770k and b75a-g45 gaming motherboard. i want to know if the 270 (no x) will be good enough to do 1080p gaming and running on medium to ultra graphics with noaa or 2xaa. My whole point is to prove him that he can maintain a much better FPS on pc than on the consoles and it will cost him less to do that. He currently have ps3 but looking into getting ps4. He's mainly into racing games and soon he will be playing Project Cars. The specs is as follows
> 
> i7-3770k Stock cooling and no overclocking
> b75a-G43 Gaming
> Standard 8Gb 1600mhz ram (2x4GB)
> Basic office case (Has enough room for the r9 270)
> Zalman 600w PSU
> 1TB WD Blue HDD
> 
> My local pc shop has this amazing special on the r9 270 and is so cheap i'm thinking of getting 2 for crossfire but I think for the sake of making thing a little bit easier for him I will stick to 1 card.
> 
> So should the r9 270 be good enough for the pc or should I be looking at something more expensive like the 280. i'm just scared that the 2GB ram will not be enough for future games. The 280 will double the price of the Graphics card.


I think the R9 270 is pretty good for 1080p gaming. The 2GB of VRAM is a bit low compared to other cards these days, but isn't a huge deal yet. I definitely don't think the 280 is worth double the price, so from a max bang-for-the-buck standpoint I would just stick with the 270 if it's pretty cheap.


----------



## DiceAir

Quote:


> Originally Posted by *anubis1127*
> 
> I think the R9 270 is pretty good for 1080p gaming. The 2gb vram is a bit low compared to other cards these days, but isn't a huge deal yet. I definitely don't think the 280 is worth double the price, so from a max bang for the buck standpoint I would just stick with the 270 if its pretty cheap.


I want the PC to last at least as long as the console, so about 5 years, as long as it can maintain the same graphics as the consoles but at 1080p and maybe 60fps (or, later on, 30fps). It's not that the 280X is expensive; it's just that the 270 is really cheap now. I'm really just worried that in the near future the 2GB of VRAM will hold it back and it won't be able to maintain that 60FPS.


----------



## Dasboogieman

Quote:


> Originally Posted by *DiceAir*
> 
> I want the pc to last at least as long as the console. So about 5 years as long as you can maintain the same graphics as consoles but 1080p and maybe 60fps or later on 30fps. It's not that the 280x is expensive it's just that the 270 is really cheap now. I'm really just worried that in the near future the 2GB ram will hold back and we wouldn't be able to maintain that 60FPS.


5 years is a very, very long time, especially when it comes to VRAM: the consoles all have 8GB standard even though they have Pitcairn levels of rendering power. It'd probably be best to wait, as Tonga is around the corner.


----------



## xutnubu

Can someone tell me what's your stock voltage on the 280X?

I don't think Afterburner is giving me the correct number, 1.109v seems rather low.

I thought these cards were supposed to ship with 1.2v.


----------



## Arkanon

Quote:


> Originally Posted by *Dasboogieman*
> 
> 5 years is a very, very long time, especially when it comes to VRAM: the consoles all have 8GB standard even though they have Pitcairn levels of rendering power. It'd probably be best to wait, as Tonga is around the corner.


Small misconception there: the PS4 uses a Pitcairn-based GPU, while the Xbox One uses a Bonaire-based GPU. Also, the first Tonga cards to be released will only be available with 2GB of VRAM; maybe 4GB later on, who knows.


----------



## BruceB

Quote:


> Originally Posted by *xutnubu*
> 
> Can someone tell me what's your stock voltage on the 280X?
> I don't think Afterburner is giving me the correct number, 1.109v seems rather low.
> I thought these cards were supposed to ship with 1.2v.


I'll check for you when I get home.

Did you check it under load? When they're in an idle state they downclock and undervolt themselves.


----------



## boot318

Quote:


> Originally Posted by *xutnubu*
> 
> Can someone tell me what's your stock voltage on the 280X?
> 
> I don't think Afterburner is giving me the correct number, 1.109v seems rather low.
> 
> I thought these cards were supposed to ship with 1.2v.


Mine is 1.144V. R9 280X - Sapphire Vapor-X.

Check the max voltage in GPU-Z after running a game. I think it will be higher than the Afterburner voltage.


----------



## Dasboogieman

Quote:


> Originally Posted by *Arkanon*
> 
> Small misconception there: the PS4 uses a Pitcairn-based GPU, while the Xbox One uses a Bonaire-based GPU. Also, the first Tonga cards to be released will only be available with 2GB of VRAM; maybe 4GB later on, who knows.


Oh yeah, true, I overestimated the Xbone.


----------



## BruceB

Quote:


> Originally Posted by *xutnubu*
> 
> Can someone tell me what's your stock voltage on the 280X?
> I don't think Afterburner is giving me the correct number, 1.109v seems rather low.
> I thought these cards were supposed to ship with 1.2v.


Just checked mine in MSI Afterburner and it says a max of 1.144V; GPU-Z says 1.164V. Both of which seem a bit low to me.

Google tells me that 1.2V is the max voltage for these cards; I guess that just means loads of OC headroom?


----------



## xutnubu

Quote:


> Originally Posted by *boot318*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Can someone tell me what's your stock voltage on the 280X?
> 
> I don't think Afterburner is giving me the correct number, 1.109v seems rather low.
> 
> I thought these cards were supposed to ship with 1.2v.
> 
> 
> 
> Mine is 1.144V. R9 280X - Sapphire Vapor-X.
> 
> Check the max voltage in GPU-Z after running a game. I think it will be higher than the Afterburner voltage.
Click to expand...

Actually, GPU-Z reports 1.2V, but I've read from Unwinder that you can't trust GPU-Z voltages on non-reference cards.

Quote:


> Originally Posted by *BruceB*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Can someone tell me what's your stock voltage on the 280X?
> I don't think Afterburner is giving me the correct number, 1.109v seems rather low.
> I thought these cards were supposed to ship with 1.2v.
> 
> 
> 
> Just checked mine in MSI Afterburner and it says a max of 1.144V; GPU-Z says 1.164V. Both of which seem a bit low to me.
> 
> Google tells me that 1.2V is the max voltage for these cards; I guess that just means loads of OC headroom?
Click to expand...

Thank you for checking, and yes, my load voltage is 1.109V in Afterburner.

I opened the BIOS with VBE7 and it's set to 1.2V.

So I guess the only way to know for sure is with a multimeter, but I don't know where the v-check points are.


----------



## mattjm

Hey guys, is the Asus R9 280 DCII TOP a good card? I'm thinking of buying it. My case would only have 3mm of space remaining after installation, though.


----------



## Roaches

Quote:


> Originally Posted by *anubis1127*
> 
> Crossfire.


Neat! They beat my Devils in Metro and Hitman. I'll try to stack up against your clocks once I get them back in CFX.

Semi-related: I got my hands on the Devil's big brother. Will post pics next week when it arrives.

Never thought of myself as a PowerColor Devil series fan, which I'm not afraid to admit. Really wish they made Nvidia cards too.


----------



## DiceAir

OK, so I have yet another issue. Everything was fine with my R9 280X. I'm running the 14.4 drivers and have been stable ever since they came out. No overclocks on my card: 1100MHz/1500MHz. Since yesterday, when I play some games it will just freeze with a "driver not responding" error, and sometimes on the desktop my screen will sort of jump a bit. Now I don't know how to test for stability.

I tried loading up the Valley benchmark, but that ran fine for about 1-2 hours. I have a feeling the memory on my card is faulty, but I even ran Valley at 2560x1440 with everything maxed and it was still fine. Is there any way of testing my video RAM for errors? I don't get freezes every single time, but it's close. Just now I played about 3-4 maps in BF4 without a freeze, but earlier it froze on the second map.

The funny thing is, when it freezes it's like a "display driver not responding" error, like something is unstable; like an overclock that failed, but I'm not overclocking. I don't even overclock my CPU. This is now my second card doing this. The first card seemed OK at first, the only issue being green lines on the desktop, but later it did more or less the same thing.

I'm running a QNIX monitor at 96Hz, and I also noticed that 60Hz on a normal 1080p monitor looks a lot smoother than 60Hz on my QNIX, so something is wrong for sure. Please tell me what I can do to test this before I try it on another PC.

Thanks
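(For reference: dedicated tools such as MemtestCL or OCCT's VRAM error check do this kind of testing on the GPU itself. The core idea is a pattern write/read-back check; the sketch below shows that logic on an ordinary host buffer purely to illustrate it — a real VRAM tester runs the same loop as a GPU kernel over card memory.)

```python
# Sketch of the pattern write/read-back check that VRAM testers use,
# run here on a plain host buffer. A real tester would allocate GPU
# memory and run this as an OpenCL/CUDA kernel instead.

def pattern_check(buf: bytearray, patterns=(0x00, 0xFF, 0xAA, 0x55)):
    """Write each pattern across the whole buffer, read it back,
    and return a list of (offset, expected, got) mismatches."""
    errors = []
    for p in patterns:
        for i in range(len(buf)):          # write phase
            buf[i] = p
        for i, b in enumerate(buf):        # read-back phase
            if b != p:
                errors.append((i, p, b))
    return errors

# A healthy buffer reports no mismatches:
print(pattern_check(bytearray(4096)))  # -> []
```

Faulty memory cells tend to show up as bits stuck at 0 or 1, which is why the patterns alternate all-zeros, all-ones, and the 0xAA/0x55 checkerboards.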


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> OK, so I have yet another issue. Everything was fine with my R9 280X. I'm running the 14.4 drivers and have been stable ever since they came out. No overclocks on my card: 1100MHz/1500MHz. Since yesterday, when I play some games it will just freeze with a "driver not responding" error, and sometimes on the desktop my screen will sort of jump a bit. Now I don't know how to test for stability.
> 
> I tried loading up the Valley benchmark, but that ran fine for about 1-2 hours. I have a feeling the memory on my card is faulty, but I even ran Valley at 2560x1440 with everything maxed and it was still fine. Is there any way of testing my video RAM for errors? I don't get freezes every single time, but it's close. Just now I played about 3-4 maps in BF4 without a freeze, but earlier it froze on the second map.
> 
> The funny thing is, when it freezes it's like a "display driver not responding" error, like something is unstable; like an overclock that failed, but I'm not overclocking. I don't even overclock my CPU. This is now my second card doing this. The first card seemed OK at first, the only issue being green lines on the desktop, but later it did more or less the same thing.
> 
> I'm running a QNIX monitor at 96Hz, and I also noticed that 60Hz on a normal 1080p monitor looks a lot smoother than 60Hz on my QNIX, so something is wrong for sure. Please tell me what I can do to test this before I try it on another PC.
> 
> 
> Thanks


1. Return your monitor to its stock speed.
2. Return your CPU to its stock speed.
3. Use GPU-Z to check the clocks, temps and voltages are what they should be.
4. Test it with a game you know has this problem (BF4 by the sounds of it)

When you've done that post your findings here and then we can help you further


----------



## BruceB

Quote:


> Originally Posted by *mattjm*
> 
> Hey guys, is the Asus R9 280 DCII TOP a good card? I'm thinking of buying it. My case would have only 3mm of space remaining after the installation though.


3mm of clearance in which direction?

The DCII TOP is a good card, though not the fastest. It also has a triple-slot cooler, so make sure you have the space! The DCII TOP has a custom PCB, which means that if you're watercooling you'll have to find a special 280X DCII TOP block.

I'm not sure how much the DCII TOP costs where you are, but here in Germany it makes more sense (price/performance-wise) to buy the Sapphire Toxic (the fastest 280X).


----------



## alani4837

Hello guys, can I crossfire an XFX R9 280X TD-BD with an MSI R9 280X 3G V277-053R, with only the MSI on a water loop? (The XFX doesn't have a waterblock.)


----------



## Roaches

You can crossfire as long as both cards run the same chip (Tahiti silicon). It doesn't matter if the cards are from different vendors; it will work in crossfire.


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> 1. Return your monitor to its stock speed.
> 2. Return your CPU to its stock speed.
> 3. Use GPU-Z to check the clocks, temps and voltages are what they should be.
> 4. Test it with a game you know has this problem (BF4 by the sounds of it)
> 
> When you've done that post your findings here and then we can help you further


Funny thing is, it also happens at 60Hz. But why do my mouse cursor and games feel so jerky and laggy at 60Hz? All frames are being processed the same, yet it doesn't feel as smooth as a normal 60Hz 1080p screen. Maybe there is something wrong with my monitor; maybe I should just ditch the QNIX panel and go back to my old 1080p panel. I'll force myself to get used to it.

Maybe when I upgrade my PC I can use the old one to build myself some sort of Steam box.


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> Funny thing is, it also happens at 60Hz. But why do my mouse cursor and games feel so jerky and laggy at 60Hz? All frames are being processed the same, yet it doesn't feel as smooth as a normal 60Hz 1080p screen. Maybe there is something wrong with my monitor; maybe I should just ditch the QNIX panel and go back to my old 1080p panel. I'll force myself to get used to it.
> Maybe when I upgrade my PC I can use the old one to build myself some sort of Steam box.


It sounds like your frame times aren't evenly spaced. I've heard of this becoming a problem when your monitor OC is too high.

Does the problem go away _fully_ on your old/other monitor?


----------



## ad hoc

Does anyone here crossfire 270 or 270x's? I'm looking for some input.


----------



## BruceB

Quote:


> Originally Posted by *ad hoc*
> 
> Does anyone here crossfire 270 or 270x's? I'm looking for some input.


AFAIK @Devildog83 and @Roaches run 270Xs in xfire. They can probably help you further


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> It sounds like your frame times aren't evenly spaced. I've heard of this becoming a problem when your monitor OC is too high.
> 
> Does the problem go away _fully_ on your old/other monitor?


Yes, it goes away fully on my old monitor. I had my monitor at 96Hz all the time; I only clocked it to 120Hz for about 2 months, that's it. Maybe I broke it in some way.

The funny thing is that sometimes when playing games my game will freeze and the AMD driver stops responding. I'm still on 14.4 and have been running it ever since it came out: no issues, then on Thursday and yesterday my games started crashing for no reason. In BF4, when I run around with nobody in front of me, it will freeze and I have to end-task it. The other thing is that sometimes on the desktop I get screen jumps, and I always set my refresh rate to the default 60Hz. So what can I do to test the monitor/card? I've done all the tests mentioned in this forum.

I did that block-pattern test and I see no frame skipping. So at this moment I don't know if 96Hz on my screen wasn't stable, or if my monitor started degrading faster than others, and so on. So please tell me what to do.


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Yes, it goes away fully on my old monitor. I had my monitor at 96Hz all the time; I only clocked it to 120Hz for about 2 months, that's it. Maybe I broke it in some way.
> 
> The funny thing is that sometimes when playing games my game will freeze and the AMD driver stops responding. I'm still on 14.4 and have been running it ever since it came out: no issues, then on Thursday and yesterday my games started crashing for no reason. In BF4, when I run around with nobody in front of me, it will freeze and I have to end-task it. The other thing is that sometimes on the desktop I get screen jumps, and I always set my refresh rate to the default 60Hz. So what can I do to test the monitor/card? I've done all the tests mentioned in this forum.
> 
> I did that block-pattern test and I see no frame skipping. So at this moment I don't know if 96Hz on my screen wasn't stable, or if my monitor started degrading faster than others, and so on. So please tell me what to do.


If the problem goes away fully with a different monitor, then the problem isn't your computer (GPU/drivers/etc.); the problem is your monitor.

Just because it shows every frame (no frame-skipping) doesn't mean that each frame is delivered at the right time. There may still be a problem even if there is no frame-skipping.

Are you _sure_ you've returned it to stock and you're not using a custom resolution for it through Windows?
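(To illustrate the frame-delivery point: two frame sequences can have the same average frame rate but feel completely different if the spacing between frames varies. A rough sketch of how pacing could be quantified from per-frame timestamps — the numbers below are made up for illustration, not measurements:)

```python
# Quantify frame pacing from a list of frame timestamps (seconds).
# Evenly spaced 60Hz frames arrive every ~16.7ms; large spread in
# the frame-to-frame deltas reads as stutter even with no frame skipped.

def frame_time_stats(timestamps):
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean = sum(deltas) / len(deltas)
    var = sum((d - mean) ** 2 for d in deltas) / len(deltas)
    return mean * 1000, var ** 0.5 * 1000  # mean ms, stddev ms

# Perfectly paced 60Hz: one frame every 1/60 s.
even = [i / 60 for i in range(61)]

# Same average rate, but frames alternate ~10ms / ~23.3ms apart.
uneven, t = [], 0.0
for i in range(60):
    t += 0.010 if i % 2 == 0 else 0.0233
    uneven.append(t)

print(frame_time_stats(even))    # ~16.7ms mean, ~0 jitter
print(frame_time_stats(uneven))  # similar mean, large jitter
```

Both sequences average close to 60fps, but the second has several milliseconds of frame-time deviation, which is what shows up as visible stutter.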


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> If the problem goes away fully with a different monitor, then the problem isn't your computer (GPU/drivers/etc.); the problem is your monitor.
> 
> Just because it shows every frame (no frame-skipping) doesn't mean that each frame is delivered at the right time. There may still be a problem even if there is no frame-skipping.
> 
> Are you _sure_ you've returned it to stock and you're not using a custom resolution for it through Windows?


Yes, I used DDU to remove the drivers, then reinstalled the drivers and nothing else. Is there anything else I must do to make sure everything is at defaults?


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> Yes, I used DDU to remove the drivers, then reinstalled the drivers and nothing else. Is there anything else I must do to make sure everything is at defaults?


Right-click on the desktop -> Screen Resolution -> Advanced Settings -> Monitor tab, and check that the refresh-rate drop-down box on this tab is set to 60Hz.


----------



## NeoReaper

I guess... can I be added for my bwetttyfewl GPU?


----------



## BruceB

Quote:


> Originally Posted by *NeoReaper*
> 
> I guess can I be added for my bwetttyfewl gpu?


Quote:


> Originally Posted by *Devildog83*
> 
> *To be added on the member list please post a picture of your GPU, brand, series and clocks you wish to be posted.*.


Rig pic please!


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> Right-click on the Desktop -> Screen Resolution -> Advanced Settings -> Monitor Tab, check that the drop-down box for refresh rate on this tab is set to 60Hz


OK, just checked and it's set at 60Hz. Like I said, I used DDU in safe mode and reinstalled the drivers, that's all, and when I launch BF4 everything is laggy and doesn't feel like a constant 60Hz.


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> OK, just checked and it's set at 60Hz. Like I said, I used DDU in safe mode and reinstalled the drivers, that's all, and when I launch BF4 everything is laggy and doesn't feel like a constant 60Hz.


Bad times. It sounds like your monitor's the problem, but I'm no expert with monitors. I'm sure someone in the Monitors and Displays forum can help you further!
Good luck!


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> Bad times. It sounds like your monitor's the problem, but I'm no expert with monitors. I'm sure someone in the Monitors and Displays forum can help you further!
> Good luck!


Thanks for the help, dude. Anyone else that can help me? I also think my monitor is the problem, so bad times. I just have to go back to my old sucky 1080p monitor, but ah well, I'll get used to it.


----------



## agrims

I guess I should be added:
Gigabyte R9 280X Rev 2, running at stock, 1100/1500.

It replaced an amazing-overclocking HD 7850; I was running it at 7950 speeds, the mystical 50% OC club: 1290/1500...


----------



## miraldo

Hey.

I wonder if there is any chance that my i5 2500K (stock 3.3GHz) bottlenecks an R9 290 Tri-X?

In Battlefield 4 my frames jump like crazy, from 40-100fps on Ultra settings with MSAA x4.

And what is wrong with Crysis 3?

If I put everything on max settings with MSAA x8, I get 25-35fps at the beginning of the game (the storm), lol.

This game is sick. Is this normal FPS? Anyone else with an R9 290 here?


----------



## Devildog83

*agrim* has been added. Welcome!!!


----------



## Devildog83

Intel and mantle? HMMMMM - http://www.overclock3d.net/articles/software/intel_approached_amd_to_access_mantle/1


----------



## BruceB

Quote:


> Originally Posted by *Devildog83*
> 
> Intel and mantle? HMMMMM - http://www.overclock3d.net/articles/software/intel_approached_amd_to_access_mantle/1


I think the most interesting news in that article was that AMD wants to bring Mantle to Linux?!! I love you, AMD!









Quote:


> Originally Posted by *miraldo*
> 
> Hey.
> I wonder if there is any chance that my i5 2500K (stock 3.3GHz) bottlenecks an R9 290 Tri-X?
> In Battlefield 4 my frames jump like crazy, from 40-100fps on Ultra settings with MSAA x4.
> And what is wrong with Crysis 3?
> If I put everything on max settings with MSAA x8, I get 25-35fps at the beginning of the game (the storm), lol.
> This game is sick. Is this normal FPS? Anyone else with an R9 290 here?


I don't have a 290, so I can't tell you about specific frame rates.
I'm pretty sure that an i5 2500K will bottleneck an R9 290. To balance an R9 290, you want to be looking at the i7 range.


----------



## ad hoc

Quote:


> Originally Posted by *BruceB*
> 
> AFAIK @Devildog83 and @Roaches run 270Xs in xfire. They can probably help you further


Thanks!


----------



## Arkanon

Quote:


> Originally Posted by *BruceB*
> 
> I think the most interesting news in that article was that AMD wants to bring Mantle to Linux?!! I love you, AMD!
> 
> 
> 
> 
> 
> 
> 
> 
> I don't have a 290, so I can't tell you about specific frame rates.
> I'm pretty sure that an i5 2500K will bottleneck an R9 290. To balance an R9 290, you want to be looking at the i7 range.


Uhm, wut? The only thing the i7 adds over the i5 is Hyper-Threading, which is pretty pointless in games. You won't see any significant improvement in games from using an i7. Rendering, however, might be another story.


----------



## BruceB

Quote:


> Originally Posted by *Arkanon*
> 
> Uhm, wut? The only thing the i7 adds over the i5 is Hyper-Threading, which is pretty pointless in games. You won't see any significant improvement in games from using an i7. Rendering, however, might be another story.


Err... OK. Apparently I'm not altogether up on Intel's line-up; thanks for correcting me. Disregard what I said.

Although in games that use more than 4 threads (like BF4) there appears to be an advantage.


----------



## DiceAir

I want to know: my PSU is 600W and it has 4x 12V rails rated at 18A each. Will that be enough to power one R9 280X? I'm just scared I will blow the other parts of the PC, or the PSU. The rest of the system is:

3770K, no overclock
8GB RAM at 1333MHz
1x HDD
2x fans
R9 280X

That's about it.
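(For a rough sanity check, the 12V budget can be totted up. The component wattages below are ballpark assumptions — Intel's 77W TDP for the 3770K, ~250W board power for a 280X — and note that a multi-rail PSU's combined 12V output is usually capped well below the sum of the per-rail ratings; the label's combined 12V figure is the one that matters:)

```python
# Back-of-envelope 12V budget for the system above.
# Component draws are rough assumptions, not measurements. A 4x18A
# multi-rail unit's combined 12V output is capped below the naive
# 4 * 18 * 12 = 864W -- check the label for the combined rating.

RAIL_AMPS = 18
RAILS = 4

# Very rough peak draws in watts (assumptions):
load_w = {
    "i7-3770K (stock)": 77,   # Intel TDP
    "R9 280X": 250,           # typical board power
    "motherboard/RAM": 50,
    "HDD + 2 fans": 15,
}

total = sum(load_w.values())
print(f"Estimated peak draw: ~{total}W")            # ~392W
print(f"Single 12V rail limit: {RAIL_AMPS * 12}W")  # 216W
```

The takeaway from numbers like these is that one 18A rail tops out around 216W, so the GPU's PCIe power connectors need to land on separate rails; with a multi-rail unit, the rail layout and cabling matter as much as the 600W sticker.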


----------



## Recr3ational

Quote:


> Originally Posted by *DiceAir*
> 
> I want to know: my PSU is 600W and it has 4x 12V rails rated at 18A each. Will that be enough to power one R9 280X? I'm just scared I will blow the other parts of the PC, or the PSU. The rest of the system is:
> 
> 3770K, no overclock
> 8GB RAM at 1333MHz
> 1x HDD
> 2x fans
> R9 280X
> 
> That's about it.


What PSU is it?


----------



## DiceAir

IP-P600CQ3-2

http://www.in-win.com.tw/Power/en/goods.php?act=view&id=CQ-Series.

BTW, I don't think it's that exact PSU, because that one only comes with 2x 6-pins, and I can see that's not enough for the R9 280X.


----------



## alani4837

Quote:


> Originally Posted by *Roaches*
> 
> You can crossfire as long both cards runs the same chip (Tahiti Silicon). Doesn't matter if both cards are from different vendors. It will work in crossfire.


yeah its the same chip Tahiti XTL, any suggestion for this crossfire?


----------



## anubis1127

Quote:


> Originally Posted by *DiceAir*
> 
> IP-P600CQ3-2
> 
> http://www.in-win.com.tw/Power/en/goods.php?act=view&id=CQ-Series.
> 
> BTW, I don't think it's that exact PSU, because that one only comes with 2x 6-pins, and I can see that's not enough for the R9 280X.


I think with a slightly better 600W PSU you would be fine. Having only 2x 6-pins would mean using some sort of Molex adapter to get enough power to the 280X, and that never seems like a good idea to me.

Quote:


> Originally Posted by *alani4837*
> 
> yeah its the same chip Tahiti XTL, any suggestion for this crossfire?


Make sure you have enough power, other than that, have fun.

Quote:


> Originally Posted by *ad hoc*
> 
> Does anyone here crossfire 270 or 270x's? I'm looking for some input.


I have some 270s in crossfire on my Pentium-K system right now. I ran some benches on it yesterday; I'll post the results in a few hours. I also want to put the 270s in my 4930K system to see how much the Pentium-K was gimping my results.


----------



## alani4837

Thank you m8, I have a Corsair HX1050... is that enough?

The MSI R9 280X is on a water loop and the XFX is stock; is that OK?


----------



## anubis1127

Quote:


> Originally Posted by *alani4837*
> 
> Thank you m8, I have a Corsair HX1050... is that enough?
> 
> The MSI R9 280X is on a water loop and the XFX is stock; is that OK?


Should be good. I ran 7970 Lightning cards (power-hungry buggers) in CFX off an HX850 and it worked fine.

Just put a CFX bridge on them and CCC should auto detect crossfire and enable it for you, and then notify you to go check the crossfire settings.


----------



## alani4837

Thank you very much, m8.


----------



## buttface420

Quote:


> Originally Posted by *DiceAir*
> 
> I want to know if my psu is 600w and it has 4x12v rails rated at 18a each. Will that be enough to power one r9 280x? I'm just scared i will blow the other parts of the pc or the psu. Rest of system is
> 
> 3770k no overclock
> 8GB ram 1333mhz
> 1x HDD
> 2x fans
> r9 280x
> 
> That's about it.


Not sure; I was always told that a 280X should be on at least a 750W PSU. The card alone uses something like 200 watts.


----------



## DiceAir

What about a Cougar 750W with 19A per 12V rail, 4 rails in total?


----------



## anubis1127

Quote:


> Originally Posted by *buttface420*
> 
> Not sure; I was always told that a 280X should be on at least a 750W PSU. The card alone uses something like 200 watts.


Yeah, they are right around 200W, a little more with a 280X because of the higher clocks. It would depend on the rest of the system: 750W would be better for efficiency and for keeping the PSU running cooler, so that's a good recommendation.

It could be done with a 600W PSU. It wouldn't necessarily be the best, but it would work for a system without a lot of extra fans/HDDs/watercooling gear: just a normal newer i5/i7 quad-core / mobo / SSD / HDD type of system.

Quote:


> Originally Posted by *DiceAir*
> 
> What about a Cougar 750W with 19A per 12V rail, 4 rails in total?


Should be good, better than your other option.


----------



## [CyGnus]

I have a 550W PSU and a 280X with a 4770K with no problems at all. You just have to be sure you get a quality PSU; even a good 450-500W will do. For PSU advice, though, Shilka is the guy you want to ask.

I doubt my system uses more than 300-350W.


----------



## DiceAir

Quote:


> Originally Posted by *anubis1127*
> 
> Yeah, they are right around 200W, a little more with a 280X because of the higher clocks. It would depend on the rest of the system: 750W would be better for efficiency and for keeping the PSU running cooler, so that's a good recommendation.
> 
> It could be done with a 600W PSU. It wouldn't necessarily be the best, but it would work for a system without a lot of extra fans/HDDs/watercooling gear: just a normal newer i5/i7 quad-core / mobo / SSD / HDD type of system.
> 
> Should be good, better than your other option.


This is only for a test; I'm just building it for my brother. Normal case with only 1 fan for now. The card won't get too hot, as 1080p is not that hard to drive in most games. He will be playing racing games connected to a Samsung TV. This is only going to be for about 1-2 months, and he doesn't play every day, mostly over weekends.


----------



## anubis1127

Quote:


> Originally Posted by *DiceAir*
> 
> This is only for a test; I'm just building it for my brother. Normal case with only 1 fan for now. The card won't get too hot, as 1080p is not that hard to drive in most games. He will be playing racing games connected to a Samsung TV. This is only going to be for about 1-2 months, and he doesn't play every day, mostly over weekends.


It'll be fine then. I'm running 2 270s off a 550W PSU with not much else in there, and it was using 355W at the wall running Heaven with both cards OC'd. I would think one 280X would be a bit less than those two 270s.


----------



## xutnubu

I'm going to replace the TIM on my card.

I'm in between the IC Diamond 7 or the Gelid GC-Extreme, which one would be better?


----------



## Recr3ational

Quote:


> Originally Posted by *xutnubu*
> 
> I'm going to replace the TIM on my card.
> 
> I'm in between the IC Diamond 7 or the Gelid GC-Extreme, which one would be better?


I've never used Gelid, but I have used IC Diamond and it dropped my temps by 10°C.


----------



## anubis1127

Quote:


> Originally Posted by *xutnubu*
> 
> I'm going to replace the TIM on my card.
> 
> I'm in between the IC Diamond 7 or the Gelid GC-Extreme, which one would be better?


I would vote Gelid GC-E, much easier to work with, and you don't really want diamonds scratching up your GPU core, do you?


----------



## Arkanon

I'm actually very positive about Gelid as well. I got a tube with my Tranquilo Rev 2.0 CPU cooler, and as far as I can tell it performs just a bit better than Arctic MX-2.


----------



## Roaches

I vouch for the Gelid GC-Extreme; my MX-2 doesn't come close to it. I have 2 Devils running, one on Gelid GC-E and one on MX-2, and the GC-E beats it by a mile during benchmarks.


----------



## Recr3ational

Quote:


> Originally Posted by *anubis1127*
> 
> I would vote Gelid GC-E, much easier to work with, and you don't really want diamonds scratching up your GPU core, do you?


I don't actually have any scratches on mine.


----------



## anubis1127

Quote:


> Originally Posted by *Recr3ational*
> 
> I don't actually have any scratches on mine.


I don't even know how that is possible. Got pics?


----------



## Recr3ational

Quote:


> Originally Posted by *anubis1127*
> 
> I don't even know how that is possible. Got pics?


I don't, sorry; it was on my old 7950s. I've also tried it on my CPU and never got scratches. :/
Is that bad?


----------



## anubis1127

Quote:


> Originally Posted by *Recr3ational*
> 
> I don't, sorry; it was on my old 7950s. I've also tried it on my CPU and never got scratches. :/
> Is that bad?


It's not bad, it's just amazing to me. I used the stuff once and it scratched the bejesus out of the IHS on my CPU, and the cooler I was using. Maybe I just got a bad batch or something.


----------



## Recr3ational

Quote:


> Originally Posted by *anubis1127*
> 
> It's not bad, it's just amazing to me. I used the stuff once and it scratched the bejesus out of the IHS on my CPU, and the cooler I was using. Maybe I just got a bad batch or something.


Oh, did you heat it up beforehand?


----------



## anubis1127

Quote:


> Originally Posted by *Recr3ational*
> 
> Oh did you heat it up before?


This was like three years ago, but I think I remember doing that: maybe putting it in a zip-lock bag and sticking it in boiling water, or something along those lines. It was a while ago, so I can't say for sure, but I'm 90% sure I had to, just to make it pliable enough to apply, if I'm remembering correctly.


----------



## Devildog83

Quote:


> Originally Posted by *xutnubu*
> 
> I'm going to replace the TIM on my card.
> 
> I'm in between the IC Diamond 7 or the Gelid GC-Extreme, which one would be better?


GC Extreme all the way.


----------



## xutnubu

Gelid it is, then.

I've never replaced the TIM on a GPU. I'm guessing the 'dot and heatsink-pressure method' doesn't work here? Is it better to spread it evenly?

Should I use more than what I'd use for a CPU?


----------



## anubis1127

Quote:


> Originally Posted by *xutnubu*
> 
> Gelid it is, then.
> 
> I've never replaced the TIM on a GPU. I'm guessing the 'dot and heatsink-pressure method' doesn't work here? Is it better to spread it evenly?
> 
> Should I use more than what I'd use for a CPU?


Personally, I just put a small BB-sized dot on the GPU core and let the heatsink spread it. The GPU core is a lot smaller than a CPU IHS, so I use less than I do on a CPU.


----------



## Recr3ational

Quote:


> Originally Posted by *anubis1127*
> 
> This was like three years ago, but I think I remember doing that: maybe putting it in a zip-lock bag and sticking it in boiling water, or something along those lines. It was a while ago, so I can't say for sure, but I'm 90% sure I had to, just to make it pliable enough to apply, if I'm remembering correctly.


Yeah, maybe I was just lucky. I'm currently using MX-4 on my 280X. I might try IC Diamond next time and see.


----------



## Dasboogieman

Quote:


> Originally Posted by *xutnubu*
> 
> I'm going to replace the TIM on my card.
> 
> I'm in between the IC Diamond 7 or the Gelid GC-Extreme, which one would be better?


Go with Gelid. I was discussing this with the IC Diamond guy on NotebookReview. IC Diamond 7 is optimized for bridging large gaps: its high filler loading (which is what gives the paste its incredible stability) is a liability when gaps thinner than 0.2mm need to be bridged, due to its poor spread and surface wetting. Only use it if you can confirm that your card has really poor contact, almost to the point of considering a shim.

Gelid has really high thermal conductivity; it's basically from the same OEM as the excellent Phobya HeGrease Extreme, which has been near the top of Skinneelabs' charts for a while now (second only to phase-change or liquid-metal TIMs). It isn't quite as brute-stable as IC Diamond 7, but you're pasting a GPU, which is reasonably easy to repaste, unlike the laptops that are the key usage scenario for Diamond 7.

By the way, it is extremely well documented that IC Diamond is abrasive; this was even confirmed by the rep himself. Again, it's due to IC Diamond's high filler-to-carrier ratio: it has the same abrasive power as gritty standard Colgate toothpaste. The best way to remove it is liberal amounts of something like Arctic cleaner and slow, careful TLC when rubbing.


----------



## Devildog83

Ran Valley with the new G3258 @ stock and, believe it or not, it's so close to my 8350 @ 4.8GHz that it's not even funny. I did notice that the 1st 16x slot hit a max of Gbs PCI-E link speed, where the CHVFZ and 8350 maxed at 5.0GHz. Very interesting. I will see where an overclock takes me.

8350 max score at 4.8GHz


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Ran Valley with the new G3258 @ stock and, believe it or not, it's so close to my 8350 @ 4.8GHz that it's not even funny. I did notice that the 1st 16x slot hit a max of Gbs PCI-E link speed, where the CHVFZ and 8350 maxed at 5.0GHz. Very interesting. I will see where an overclock takes me.
> 
> 8350 max score at 4.8GHz


I think that is expected in that particular bench, much like Heaven. That is why you'll see the i5s mingling with the i7s in that bench, something you won't see in other benches like Futuremark's.


----------



## anubis1127

Nice. Although Valley isn't really CPU intensive at all, so it's not surprising to me. I have my Pentium G3258 stress testing at 4.5 GHz right now, and I ran some benches at 4.4 GHz with it. The only one it really struggled with, out of the in-game benchmarks I posted last time, was Hitman: Absolution; that one must need more threads.


----------



## Roaches

Just got word that my 7990 Devil 13 has arrived. Hopefully it can fit in my FT02 so I can do some comparison benches against my 270X Devil Crossfire tonight or tomorrow.


----------



## BruceB

Quote:


> Originally Posted by *Roaches*
> 
> Just got word that my 7990 Devil 13 has arrived. Hopefully it can fit in my FT02 so I can do some comparison benches against my 270X Devil Crossfire tonight or tomorrow.


7990 Devil?!








Is it new?
Can't wait to see the benches on that bad boy!


----------



## Roaches

Quote:


> Originally Posted by *BruceB*
> 
> 7990 Devil?!
> 
> 
> 
> 
> 
> 
> 
> 
> Is it new?
> Can't wait to see the benches on that bad boy!


Not new, but the seller had it in near-mint condition with boxes and all accessories. He says one of the screws on the shroud is loose, but physically it's as close to new as it gets.

The card itself is in working condition and was never mined. 8 months old. He also mentioned the center fan started rattling not long ago, though it's nothing a lube job can't fix.









I'll post some pics when I get home.


----------



## xutnubu

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> I'm going to replace the TIM on my card.
> 
> I'm in between the IC Diamond 7 or the Gelid GC-Extreme, which one would be better?
> 
> 
> 
> Go with GELID, I was discussing this with the IC Diamond guy on Notebookreview. IC Diamond 7 is the most optimized for bridging large gaps, its high filler loading (which is what gives the paste the incredible stability) is a liability when gaps thinner than <0.2mm need to be bridged due to its poor spread and surface wetting. Only use if you can confirm that your card has really poor contact, almost to the point of considering using a shim.
> 
> GELID has really high thermal conductivity, its basically the same OEM as the excellent Phobya HeGrease Extreme which has been near the top of Skinneelabs for a while now (second only to either phase change or liquid metal TIMs). It isn't quite as brute stable as IC Diamond 7 but your pasting a GPU which is reasonably easy to repaste, unlike laptops which are the key usage scenario for Diamond 7.
> 
> By the way, it is extremely well documented that IC Diamond is abrasive, was even confirmed by the Rep himself. Basically, again, its due to IC Diamond's high filler to carrier ratio, its got the same abrasion power as that gritty standard Colgate toothpaste. Best way to remove is liberal amounts of something like Arctic cleanser and slow + careful TLC when rubbing.
Click to expand...

Thanks for the detailed explanation.

The cleaning process worries me a bit.

How am I supposed to remove the TIM from the small transistors around the core?

I still have Arctic cleaner; I don't know if it expires though, I've had it for a couple of years now. It's there somewhere. Should I just use isopropyl alcohol instead?


----------



## Roaches

Alcohol should work. If the residue on the SMD capacitors near the GPU core concerns you, I'd leave it alone.


----------



## dmfree88

Quote:


> Originally Posted by *xutnubu*
> 
> Thanks for the detailed explanation.
> 
> The cleaning process worries me a bit.
> 
> How am I supposed to remove the TIM from the small transistors around the core?
> 
> I still have Artic cleaner, I don't know if it expires though, I have had it for a couple of years now, it's there somewhere. Should I just use isopropyl alcohol instead?


I've used old Arctic cleaner with thermal residue on the cap and it still works fine (as long as you can wipe it clean and stage 2 is clean). A toothpick was my favorite tool for cleaning difficult spots between caps and such.


----------



## Roaches

+1 for toothpicks, though I use a precision flathead screwdriver to scrape the TIM off the edges of the die and SMDs.


----------



## buttface420

Quote:


> Originally Posted by *Devildog83*
> 
> Ran Valley with the new g3258 @ stock and believe it or not it's so close to my 8350 @ 4.8 Ghz it's not even funny. I did notice that the 1st 16x slot hit a max of Gbs PCI-link speed where the CHVFZ and 8350 maxed at 5.0Ghz. Very interesting. I will see where an overclock takes me.
> 
> 
> 
> 8350 max score at 4.8 Ghz


Wow, I can't believe it scored that well. I'm looking at upgrading my E5450 to something better for gaming; anyone know if the G3258 would perform better? I mainly play BF4, so I wonder if this would work better than my old quad.

Like, which bottlenecks my R9 280X worse, the E5450 or a G3258? I'll probably still go with an FX-8320 though.


----------



## anubis1127

Quote:


> Originally Posted by *buttface420*
> 
> wow i cant believe it scored that good. im looking at upgradin my e5450 to something better for gaming, anyone know if the g3258 would perform better? i mainly play BF4 so i wonder if this would work better than my old quad.
> 
> like which bottle necks my r9 280x worse,e5450 or a g3258? ill probably still go with a fx8320 tho.


I doubt it would do well in BF4: probably an OK average FPS, but the minimum-FPS spikes would likely kill the experience. I could install BF4 on my Z97/Pentium G3258 system and try it out though.


----------



## Roaches

Probably the biggest graphics card I've ever touched and held. It even dwarfs my GTX 680 SOCs; I'll take a size-comparison pic later once I install it in my main sig rig, as it's too long to fit in my FT02. After testing and benching I'll be putting this beast right on my display shelf as a collector's item.














Looking forward to seeing how this scales against my 270X Devil Crossfire.








Really wish these cards would come back on the market as an R9 280X2.


----------



## anubis1127

Dang. That's a nice looking card, great find.


----------



## Roaches

Thanks! Luckily my GPUs are mounted at 90 degrees; it's so heavy and delicate I'd be worried to see it sag mounted in a standard horizontal case. The PCB is noticeably thicker than its younger Devils'.


----------



## BruceB

Quote:


> Originally Posted by *Roaches*
> 
> ...After testing and benching I'll be putting this beast right on my display shelf as a collectors item.
> 
> 
> 
> 
> 
> 
> 
> 
> ...


?!?!?!!?!!?!?!?!?!?!?
..but, the gaming Overkill... ..it can't just sit on a shelf!


----------



## Dasboogieman

Quote:


> Originally Posted by *Roaches*
> 
> Probably the hugest graphics card I've ever touched and held on to. Even dwarfs the size of my GTX 680 SOCs, which I'll take a size comparison pic later once I install this in my main sig rig as its too long to fit in my FT02. After testing and benching I'll be putting this beast right on my display shelf as a collectors item.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looking forward on how this scales against my 270X Devil Crossfire.
> 
> 
> 
> 
> 
> 
> 
> 
> Really wish these cards came back in the market as R9 280X2.


That thing is monstrous. I thought the ASUS Mars with twin GTX 580s was epic; this thing is on a completely new level.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Ran Valley with the new g3258 @ stock and believe it or not it's so close to my 8350 @ 4.8 Ghz it's not even funny. I did notice that the 1st 16x slot hit a max of Gbs PCI-link speed where the CHVFZ and 8350 maxed at 5.0Ghz. Very interesting. I will see where an overclock takes me.
> 
> 
> 
> 8350 max score at 4.8 Ghz


Devil,

Have any oc'ing results yet? Also, if possible, usages in games. Thanks.


----------



## Dasboogieman

Quote:


> Originally Posted by *buttface420*
> 
> wow i cant believe it scored that good. im looking at upgradin my e5450 to something better for gaming, anyone know if the g3258 would perform better? i mainly play BF4 so i wonder if this would work better than my old quad.
> 
> like which bottle necks my r9 280x worse,e5450 or a g3258? ill probably still go with a fx8320 tho.


You want more cores for BF4. Obviously the ideal would be either Devil's Canyon or Ivy Bridge-E; however, those are way too expensive for the performance potential. If you can deal with the heat, the FX is probably superior bang for buck compared to a Devil's Canyon i5, but the i5 would give you more consistent performance across all game engines.

Probably best for the FX owners to chime in. The Pentium Anniversary Edition is great for older titles but will heavily bottleneck modern multi-threaded engines.


----------



## Roaches

Removed my Nvidia drivers this morning and installed this beast. It took forever getting the card to fit in my case: the PCI bracket was sagging under the card's weight, so I had a hard time lining it up with the support screw holes to hold it in with thumb screws.

I've taken some pics comparing it to my GTX 680 SOC. They go quite well together, but the SOC has a bigger, better heatsink than the Devil 13.






Gonna power it up, set up the AMD drivers, and give this one a test drive.


----------



## Roaches

Some quick 3DMark scores and comparison.

7990 Devil 13 (stock BIOS). Pretty odd that the results are flagged invalid; I have the latest driver (14.4) installed.
http://www.3dmark.com/fs/2415494

GTX 680 SOC (stock bios)
http://www.3dmark.com/fs/2393072

270X Devil CFX 1200 core / 1400 mem.
http://www.3dmark.com/fs/2352824

It's worth mentioning this card has coil whine. At idle, the whine resonates during web browsing and normal desktop activity; under load it gets louder, like a buzzing light bulb.


----------



## anubis1127

Quote:


> Originally Posted by *Roaches*
> 
> Some quick 3DMark scores and comparison.
> 
> 7990 Devil 13 (stock bios) Pretty odd the results are invalid, I have the latest driver installed 14.4
> http://www.3dmark.com/fs/2415494
> 
> GTX 680 SOC (stock bios)
> http://www.3dmark.com/fs/2393072
> 
> 270X Devil CFX 1200 core / 1400 mem.
> http://www.3dmark.com/fs/2352824
> 
> Its worth mentioning this card has Coil whine. When the card is idle the whine resonates during web browsing and normal desktop browsing activity. Under load, it gets louder like a buzzing light bulb.


Damn it @Roaches now I want some Dominator Platinums for my x79 WS board...

Was the 4930k at the same clock speed for both the 7990 and 680 SOC tests? Physics was a bit higher on the 680 SLI run, which is why I ask. I read somebody grumbling about NV cards getting higher physics scores (even though it should just be CPU load), which I thought at the time was nonsense, but now I'm curious.

I did a quick run on my 7870s at 1200 core / 1400 mem for a comparison w/ the same CPU:

http://www.3dmark.com/3dm/3505208?

I have 14.6 RC2 installed, so there's a bogus driver warning.

That is a bummer about the coil whine, but I did hear that was common on those Devil 13s, so it's not entirely surprising.


----------



## Roaches

Yeah, they match really well with a WS board, and they're worth it if money is no object.









Hmm, now that you mention it, I'll look into it with MSI AB logging charts enabled. When I ran the 680s the CPU was at 4.3 GHz; right now I'm at 4.2 GHz, since the weather is a bit hot today. For the time being I'm going to stick with the Devil 13 until the weekend (somehow I'm in love with it); I'm off work until tomorrow. I'll be sure to go back to 4.3 GHz and re-run 3DMark with the 7990.

Nice how your 7870 CFX can match my Devil 13 running stock, as I've yet to switch the BIOS to OC mode.

Coil whine isn't that bad; I might have exaggerated a bit about the load part, though it may be annoying to someone who is sound sensitive. I game on headphones, so it doesn't bother me a bit.









Some quick scores from Heaven and Valley


----------



## solar0987

Is there currently any way to unlock the voltage on a gigabyte 270x?


----------



## Roaches

@ Solar

I don't think there's much you can do. I have the latest MSI AB and I can't even access the voltage slider on either of my 270X Devils or my MSI 270. I have not tried TriXX yet to verify, though Anubis confirms that voltage can be adjusted with TriXX.

@ Anubis.

As requested, I did a re-run at 4.3 GHz with the 7990; the physics scores are pretty close to what I got with my 680s.

http://www.3dmark.com/compare/fs/2417516/fs/2393072

I kinda got carried away gaming with this card for the past few hours, and so far the Crossfire experience is freaking damn smooth on the Devil 13, even smoother than my 680 SLI in some games!
Planetside 2 ran as if it were on a single card, even though the MSI OSD showed both GPUs under load. My aiming was much smoother than it was on my SLI setup.


----------



## End3R

Quote:


> Originally Posted by *Roaches*
> 
> Removed my Nvidia drivers this morning and installed this beast, It took forever getting this card to fit in my case because the PCI-bracket was sagging due to the weight of the card had a hard time lining up with the support screw holes to hold it in with thumb screws.
> 
> I've taken some pics showing it compared to my GTX 680 SOC, they go quite well together, but the SOC has a better, bigger heatsink compared to the Devil 13.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna power it up, setup AMD drivers and give this one a test drive.


Why are you switching out your 680 for a 7990? The 680 is more powerful, and won't be crippled by physics.


----------



## Roaches

Quote:


> Originally Posted by *End3R*
> 
> Why are you switching out your 680 for a 7990? The 680 is more powerful, and won't be crippled by physics.


I'm simply doing it for fun; it doesn't hurt to play around with different hardware, as I hold no loyalty to hardware vendors or brands, nor do I care what's faster or slower.
I intentionally bought the Devil 13 as a collector's item, and I will be switching back to my 680s by the end of the weekend after testing and playing with it.
After that, the 7990 Devil 13 will be kept as a display model in my GPU collection, and if something goes wrong with my modern setup I can use it as a backup GPU.


----------



## End3R

Quote:


> Originally Posted by *Roaches*
> 
> I'm simply doing it for the fun and pleasure, doesn't hurt to play around with different hardware as I hold no loyalty to hardware vendors / brands nor do I care whats faster and slower.
> Intentionally bought the Devil 13 as a collectors item. I will be switching back to my 680s by the end of the weekend after testing and playing with it.
> The fate of my 7990 Devil 13 will be kept as display model for my GPU collection, and if something goes wrong with my modern setup. I can use it as a backup GPU.


Ah, that makes more sense. I had no loyalties, but since joining the red team, as much of a beast as my 270X is, it feels like a slap in the face that any amount of physics brings it to its knees just because it's an ATI card. Case in point: while playing Metal Gear Rising I get a constant 60 fps, but a few slices too many with my sword and I'll drop below 10 until all those bits and pieces despawn. So even though I use an ATI card now, chances are I'll be back to Nvidia for my next upgrade.


----------



## Roaches

Quote:


> Originally Posted by *End3R*
> 
> Ah that makes more sense, I had no loyalties, but since joining the red team, as much of a beast as my 270x is, it feels like a slap in the face that any amount of physics brings it to it's knees just because it's an ati (Case in point, while playing metal gear rising, I get a constant 60 fps, but a few slices too many with my sword and I'll drop to below 10 until all those bits and pieces despawn). So even though I use an ati card, chances are I'll be back to using nvidia my next upgrade.


I don't recall AMD cards having GPU physics, and games like Metal Gear Rising are usually console ports, which I believe are more likely to tax the CPU than the GPU.

My last post showed a toe-to-toe comparison of the physics scores of my 680 SLI vs the 7990 in Fire Strike. Both were running stock BIOS clocks, and the scores aren't far from each other.

http://www.3dmark.com/compare/fs/2417516/fs/2393072


----------



## End3R

Quote:


> Originally Posted by *Roaches*
> 
> I don't recall AMD cards having GPU physics.


Exactly. It really sucks, but regardless of how good the ATI card is, it can't handle physics (at least not compared to Nvidia), which Nvidia cards handle without breaking a sweat.


----------



## Roaches

Quote:


> Originally Posted by *End3R*
> 
> Exactly, it really sucks but regardless of how good the ati card is, it can't handle physics (at least not compared to nvidia), which nvidia cards can handle without breaking a sweat.


Mind you, Nvidia GPU physics is exclusive to titles that support PhysX. Games that don't support PhysX run their engine physics on the CPU; a good example is Havok.

I don't know if a game like Metal Gear Rising even supports PhysX. Your problem is more likely the result of a CPU bottleneck.


----------



## anubis1127

Quote:


> Originally Posted by *End3R*
> 
> Exactly, it really sucks but regardless of how good the ati card is, it can't handle physics (at least not compared to nvidia), which nvidia cards can handle without breaking a sweat.


That's not exactly how it works. I think you are confusing PhysX with a normal physics engine like Havok.

AMD cards can't display the extra NV-proprietary PhysX parts of a game, so you aren't seeing lower performance vs NV cards; they simply are not rendering the same content. (Well, without tricking the software to run it off your CPU, in which case yes, it's slow and shouldn't really be done.)

Havok, a software physics engine, mainly runs on CPU cores and generally performs fine with either GPU.


----------



## End3R

Quote:


> Originally Posted by *Roaches*
> 
> Mind you, Nvidia GPU physics are exclusive to game titles that support PhysX. Games that don't support PhysX accelerate their Engine Physics from the CPU processing. Good example is Havok.
> 
> I don't know if a game like Metal Gear Rising is even PhysX supported. Likely your problem is the result of a CPU bottleneck.


Metal Gear Rising uses Havok; it's not a matter of the game needing PhysX support. (Again, the game runs flawlessly until an enemy gets cut into too many pieces.)

Lol, did you even look at my CPU? It's not the result of a bottleneck (or rather it is, but physics processed by a CPU is ALWAYS slow). It's the result of using ATI over Nvidia: even games that use Havok slow down compared to Nvidia cards, because the Nvidia cards can still process the physics before offloading to the CPU.
Quote:


> Originally Posted by *anubis1127*
> 
> AMD cards can't display the extra NV proprietary Physx parts of a game. so you aren't seeing lower performance vs NV cards because they are not rendering the same content.


Actually they totally can, it just brings them to their knees. I've been able to turn on PhysX options in several games.


----------



## Roaches

I'd rather see some third-party folks here cook up some results with different CPUs and GPUs to back up your claims, though bear in mind console ports are generally too poorly threaded to reach their full potential on a modern multicore processor. Of course, the tide is changing with the current consoles on the market.

A GPU is fine rendering millions of polygons in real time, but the displacement of each moving object over time is computed by the CPU, especially with something like Havok.

And no, I didn't look at your sig rig.


----------



## End3R

Quote:


> Originally Posted by *Roaches*
> 
> I'd rather see some third party folks here to cook some results with different CPUs and GPUs to support your facts, though bear in mind console ports are generally poorly threaded to fully function to their full potential on a modern multicore processor.


I'd love for that to happen too


----------



## anubis1127

Quote:


> Originally Posted by *End3R*
> 
> Metal Gear Rising uses Havok, it's not a matter of that game needing to be PhysX supported. (again, the game runs flawlessly until an enemy gets cut into too many pieces)
> 
> Actually they totally can, it just brings them to their knees. I've been able to turn on "PhsyX" options in several games.


I'm guessing MGR is just poor coding / a terrible console port. I would be curious to see if NV cards have similar issues, or even other AMD cards with that game. I've never played it / don't own it, so I have no idea.

Yes, that is what I meant about tricking the software to run PhysX on your CPU. I've done it with a couple of games, BL2 and Batman: Arkham City, and it is bollocks running on the CPU, pretty much unplayable with a lot of stuff on screen, even with my 4930k.


----------



## End3R

Quote:


> Originally Posted by *anubis1127*
> 
> I'm guessing MGR is just poor coding / terrible console port. I would be curious to see if NV cards had similar issues, or even other AMD cards with that game. I've never played it / don't own it, so I have no idea.


Considering I get a very fluid 60 fps pretty constantly, I don't think it's a poor port, but who knows... I'm also curious to know what other people may have experienced.
Quote:


> Originally Posted by *anubis1127*
> 
> Yes, that is what I meant about tricking the software to run Physx on your CPU. I've done it with a couple games, BL2, Batman Arkham City, and it is bollocks running on the CPU, pretty unplayable with a lot of stuff on screen, even with my 4930k.


Yea


----------



## DiceAir

OK, so I decided to go with an Asus card now. I already have one Club3D R9 280X royalKing and can't get those here anymore. Will they crossfire together? Anything I need to tweak to make it work, or will it work out of the box?

Also, what do you guys think of the ASUS R9 280X DCUII TOP? Is it a reliable card? My Club3D broke twice on me, so I'm trying out another brand.

I still have the chance to cancel my order.


----------



## solar0987

TriXX only allows 1.2 V. How do I go beyond that? I'm used to Nvidia.


----------



## [CyGnus]

Quote:


> Originally Posted by *DiceAir*
> 
> Ok so i decided to go with asus card now. I already have 1 club3d r9 280x royalking and can't get those anymore here. will they crossfire together anything I need to tweak to make it work or will it work out of the box?
> 
> Also do you guys think the ASUS r9 280x DCUII top. Is this a reliable card as my one club3d broke 2 times on me so trying out another brand?
> 
> I still have the chance to cancel my order.


Actually, the Asus R9 280X is one of the best 280Xs out there. The cooler is excellent: very quiet, and it keeps the card at good temps. And yes, it will crossfire fine with the Club3D.


----------



## DiceAir

Quote:


> Originally Posted by *[CyGnus]*
> 
> Actually the Asus R9 280X is one of the best 280x's out there the cooler is excellent very quiet and keeps the card with good temps, and yes it will crossfire fine with the Club3D


But I heard about issues with artifacts.


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> Ok so i decided to go with asus card now. I already have 1 club3d r9 280x royalking and can't get those anymore here. will they crossfire together anything I need to tweak to make it work or will it work out of the box?
> Also do you guys think the ASUS r9 280x DCUII top. Is this a reliable card as my one club3d broke 2 times on me so trying out another brand?
> I still have the chance to cancel my order.


Club3D cards are made by a vendor called TUL Corp. (as are the cards from PowerColor and VTX3D) and are generally regarded as the low end (although I really like my PowerColor card).

Asus, AFAIK, make their own cards, which are generally regarded as being of high quality (I've never owned an Asus card myself).








I think it doesn't really matter which vendor your 280X comes from (unless you're going for a massive OC); as long as the brand has a good RMA dept./warranty, you'll be fine.
















Quote:


> Originally Posted by *solar0987*
> 
> Tri xx only allows 1.2 volts. How do i go beyond that? Im used to nvidia.


Which card have you got? If it's a 280X, then 1.2 V is the safe limit.


----------



## Recr3ational

Quote:


> Originally Posted by *BruceB*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Club3D Cards are made by a vendor called Tul Corp. (as are the Cards from Powercolor and VTX3D) and are generally regarded as the low-end (although I really like my Powercolor Card
> 
> 
> 
> 
> 
> 
> 
> )
> 
> Asus, AFAIK, make thier own Cards and are generally regarded as being of high Quality (I've never owned an Asus Card myself).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think it dosen't really matter from which vendor your 280X Comes (unless you're going for a massive OC), just as Long as the brand has a good RMA dept./warranty you'll be fine


+1
I used to swear by MSI and Asus, etc.
The 280X was the first PowerColor I ever got.
Such a great card: overclocks well, not voltage locked, etc.

I'm starting to like the "lower"-end cards.


----------



## [CyGnus]

Quote:


> Originally Posted by *DiceAir*
> 
> But i heard about issues with artifacts


Never had one with my card, but I had an XFX and a PowerColor that gave me artifacts at stock clocks. Some cards have little defects, like any other hardware can have. If you do not like Asus, take a look at MSI or Sapphire; they are also good cards.


----------



## DiceAir

Quote:


> Originally Posted by *[CyGnus]*
> 
> Never had one with my card, but i had a xfx and powercolor that gave me artifacts at stock clocks. Some cards have little defects like any other hardware can have. If you do not like Asus take a look at MSI or Sapphire they are also good card's


Never had ASUS or Sapphire, but I've had MSI cards. I'm just worried about what I read online, that you can get artifacts on the ASUS cards due to the memory not being cooled. I like the Sapphire Toxic cards, as the cooler does come in contact with the memory. To my understanding, you can never have too many heatsinks on a card, unless they make the card super heavy.


----------



## [CyGnus]

Go with the one you feel comfortable with


----------



## solar0987

I have a Gigabyte R9 270X OC; the voltage is already set at 1.2 V, and that's how it comes stock. I want to go above that like you can with an Nvidia card. Is there any way to do that? ATM my card won't OC at all.


----------



## Devildog83

Quote:


> Originally Posted by *solar0987*
> 
> I have a gigabyte r9 270xoc the voltage is already set at 1.2 thats how it comes stock. I want to go above that like you can with a nvidias card. Is there anyway to do that? Atm my card wont oc at all..


Try Sapphire Trixx or MSI Afterburner


----------



## Recr3ational

Do you guys think it's worth going triple 280X, paired with a 4770K?

They're really cheap now; I've seen some go for like £120. I have 3 weeks off, so I'm thinking about updating my rig.
Do you think I'll run into any major problems?


----------



## [CyGnus]

What res do you play at? For 1080p one is enough, two if you want all the eye candy, and even then it's overkill.


----------



## Crowe98

tfw you miss the delivery of your brand new RMA'd 280x

ring them up for a re delivery time thinking they could drop it round this morning

tfw they say next monday

*next monday*


----------



## Recr3ational

Quote:


> Originally Posted by *[CyGnus]*
> 
> What res do you play? For 1080p one is enough 2 if you want all eye candy and even so its overkill


Triple 1080p, soon to be triple 1440p when it's a bit cheaper.


----------



## [CyGnus]

In that case, don't think twice: go get it.


----------



## nvidiageek

Guys, is flashing a new BIOS version necessary? My GPU's all fine and good, but I really want it to be up to date. What are the chances of bricking a GPU? Does that happen randomly, or only if I screw up the flashing process?


----------



## anubis1127

Quote:


> Originally Posted by *nvidiageek*
> 
> Guys, is flashing a new BIOS version necessary? My GPU's all fine and good, but I so want it to be up to date. What are the chances of bricking a GPU, does it happen randomly or if I screw up the flashing process?


It is not necessary. Bricking a GPU during a BIOS flash is not very likely; I've never bricked one, at least. If for some reason there is a bad flash, you can always flash it again (provided you have another GPU / onboard video so you can actually see what you are doing).
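For anyone who does go ahead with a flash: the usual tool on AMD cards of this era is ATIFlash, run from a DOS boot drive or an elevated command prompt. The commands below are only a sketch, since flag names and adapter numbering can vary between ATIFlash versions, so check the tool's built-in help first; `backup.rom` and `new.rom` are placeholder filenames.

```shell
atiflash -i               # list detected adapters; note the index of the card to flash
atiflash -s 0 backup.rom  # save the current BIOS of adapter 0 first, so a bad flash is recoverable
atiflash -p 0 new.rom     # program new.rom to adapter 0
```

If the card no longer posts after a flash, boot from onboard video or a second GPU and flash `backup.rom` back the same way.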


----------



## Recr3ational

Quote:


> Originally Posted by *nvidiageek*
> 
> Guys, is flashing a new BIOS version necessary? My GPU's all fine and good, but I so want it to be up to date. What are the chances of bricking a GPU, does it happen randomly or if I screw up the flashing process?


Don't fix what isn't broken.


----------



## nvidiageek

Quote:


> Originally Posted by *Recr3ational*
> 
> Don't fix something that's not broken.


I think I'll go with that advice, thank you.


----------



## Dasboogieman

Quote:


> Originally Posted by *DiceAir*
> 
> Never had ASUS nad Sapphire but had MSI cards. I'm just worried about twhat i read online that you can get artifacts on the ASUS cards due to the memory not being cooled. I like the Sapphire toxic cards as the cooler does come in contact with memory. To my understanding you can never have to many heatsinks on a card unless it makes the card super heavy.


I've had a really sour taste from ASUS engineering of late. The ASUS 7870 DirectCU (I believe the same design was also recycled for the 270 and 270X) is a truly poor design: no dedicated VRM cooling, no VRAM plate, no backplate.
Seriously, even over-engineered Digi+ VRMs need dedicated heatsinking instead of just relying on airflow from the hot stack above; there is basically no overvoltage headroom at all. Hell, the temps were already in the 80s-90s at stock unless the fan speed is at a very noisy 75%. The lack of VRAM cooling was also extremely troubling, though not as lethal, since Pitcairn uses fairly low-speed ICs.
Don't get me started on the Hawaii DirectCU cards, lol. They could learn a lesson or two on good heatsink design by observing MSI or Sapphire before trying to justify the exorbitant prices on their cards.


----------



## DiceAir

Quote:


> Originally Posted by *Dasboogieman*
> 
> I've had a really sour taste with ASUS Engineering of late. The ASUS 7870 DirectCU (I believe the same design was also recycled for the 270 and 270X) has a truly poor design, no dedicated VRM cooling, no VRAM plate, no backplate.
> Seriously, even over engineered Digi+ VRMs need dedicated heatsinking instead of just relying on airflow from the hot stack above, there is basically no overvoltage headroom at all, hell the temps were already in the 80s-90s at stock unless the fan speed is at a very noisy 75%. The lack of VRAM cooling was also extremely troubling but not as lethal since Pitacirn uses fairly low speed ICs.
> Don't get me started on the Hawaii DirectCU cards lol. They could learn a lesson or 2 on good heatsink design by observing MSI or Sapphire before trying to justify the exorbitant prices on their cards.


Yes, I agree. The Club3D also doesn't have that cooling, but I see the Toxic does have cooling on everything. That's why I'm leaning towards the Sapphire cards.


----------



## Roaches

Quote:


> Originally Posted by *DiceAir*
> 
> But i heard about issues with artifacts


The ASUS 280X DirectCU II cooler neglects memory cooling (i.e., no base plate or thermal pads for the memory).
While the GPU heatsink is great, I'm guessing it suffers from artifacts due to the memory chips heating up (not sure if the issue is the same with the 780 DirectCU II).
Also, power delivery isn't the best on that card; the VRMs can heat up pretty badly. I wouldn't take ASUS's marketing as a high-end manufacturer seriously.

BruceB makes a good point. It boils down to the quality control and customer support of any brand.


----------



## jrizzz

Question: does the Asus 280X DirectCU II have a BIOS switch? When I open the card's original BIOS in VBE7 I get a message saying that I am editing a UEFI BIOS and any changes made will affect only the legacy BIOS. But I can't find any switch on the card, so I am assuming it only has one BIOS and it is UEFI.


----------



## Kuhl

Quote:


> Originally Posted by *jrizzz*
> 
> Question: does the Asus 280X DirectCU II have a BIOS switch? When I open the card's original BIOS in VBE7 I get a message saying that I am editing a UEFI BIOS and any changes made will affect only the legacy BIOS. But I can't find any switch on the card, so I am assuming it only has one BIOS and it is UEFI.


The DCU2 does not have a BIOS switch. I believe the Matrix does, though.


----------



## jrizzz

Quote:


> Originally Posted by *Kuhl*
> 
> The DCU2 does not have a BIOS switch. I believe the Matrix does, though.


OK, I thought so. So I guess flashing an edited BIOS won't do any harm.


----------



## Kuhl

Quote:


> Originally Posted by *jrizzz*
> 
> OK, I thought so. So I guess flashing an edited BIOS won't do any harm.


From posts around here you shouldn't have an issue flashing the BIOS. Why are you trying to flash the BIOS on your DCU2?


----------



## jrizzz

Quote:


> Originally Posted by *Kuhl*
> 
> From posts around here you shouldn't have an issue flashing the BIOS. Why are you trying to flash the BIOS on your DCU2?


I just want to disable the boost function so I can shoot for a lower voltage. It's not a big deal for me; I just want to experiment with it.


----------



## xutnubu

Quote:


> Originally Posted by *jrizzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kuhl*
> 
> The DCU2 does not have a BIOS switch. I believe the Matrix does, though.
> 
> 
> 
> OK, I thought so. So I guess flashing an edited BIOS won't do any harm.

You will lose UEFI support.

It has something to do with UEFI being compressed. There's a post on the VBE7 forum talking about that.

I suggest you back up the original BIOS first.


----------



## bios_R_us

Hello everyone,

I've got a question about overclocking the RAM on my 280X, or better yet about temps, frequencies and how safe it is.

Right now I'm running the card at 1150 GPU and 1750 RAM with 1.281V (which is something like 1.24V under load). No problems so far, but I'm worried by the fact that, from what I can see, one of the RAM chips on the card is not covered by the cooling block like all the others (you can see it in the picture). I'm wondering how safe it is to push up the RAM frequencies in this case. I'm not touching RAM voltage, as there's no control for it in any of the software I'm using; I wouldn't touch it anyway.

So, do you guys think it's OK to push the RAM speeds under these conditions? Are there any other 280Xs out there that don't cool all of the RAM chips, and if so, have you overclocked the RAM on them safely?

Any info is appreciated!
Thanks.


----------



## DiceAir

I know I asked this before, but just to finalize it: what would you guys suggest for 1080p 60fps gaming at, let's say, a minimum of medium settings? The PC will be connected to a TV. Everything else will be good enough except for the PSU.

I can't decide between the R9 270 and the R7 260. I'm leaning towards the R9 270, but the thing is I'm trying to convince my brother to get a PC, so I upgraded to a 4790K etc. and I will give him my old 3770K with the B75A-G45 Gaming board. So I don't want to spend a fortune on it. If I go with the R7 260X I don't have to spend on the PSU, as I already have one that will handle that card like a beauty, but if I go with the R9 270 I will have to spend some more money on the PSU.

Just one more thing: is it worth building the PC for him when he mostly plays racing games?


----------



## Roboyto

Quote:


> Originally Posted by *DiceAir*
> 
> I know I asked this before, but just to finalize it: what would you guys suggest for 1080p 60fps gaming at, let's say, a minimum of medium settings? The PC will be connected to a TV. Everything else will be good enough except for the PSU.
> 
> I can't decide between the R9 270 and the R7 260. I'm leaning towards the R9 270, but the thing is I'm trying to convince my brother to get a PC, so I upgraded to a 4790K etc. and I will give him my old 3770K with the B75A-G45 Gaming board. So I don't want to spend a fortune on it. If I go with the R7 260X I don't have to spend on the PSU, as I already have one that will handle that card like a beauty, but if I go with the R9 270 I will have to spend some more money on the PSU.
> 
> Just one more thing: is it worth building the PC for him when he mostly plays racing games?


What PSU do you have to give him? The 3770K is a fairly efficient CPU; you don't need anything extravagant to power it and a mid-grade GPU.

The R7 260 is a waste of time; you'll want to upgrade too soon. A 260X w/ 2GB would be the minimum IMO. If you want to convince him to get a PC, then you must have the ability to turn up the eye candy.

The question being: what is your price range?

The 270 and 270X are nearly identical cards, but the 270X has an additional power connector for higher clocks. The cheapest 270 on Newegg is a Sapphire Dual-X for $150. You can have the MSI 270X Gaming for $190 - $30 MIR. The 270X would probably allow high/max settings @ 1080p and give you a longer wait between upgrades. The MSI Gaming 270X comes with a roughly 22% faster core clock out of the box, 1120 vs. 920 MHz; a nice performance jump for $40 ($10 after MIR).

Sure, racing games are great. If he is a console gamer, you can always get a cheap $10 Bluetooth receiver and hook up a PS3/Xbox controller to the PC with MotionInJoy.
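For reference, the gain from that factory overclock is simple arithmetic on the two quoted clocks:

```python
# Factory core clocks quoted above (MHz)
stock_mhz, oc_mhz = 920, 1120

# Relative gain of the factory overclock
gain_pct = (oc_mhz - stock_mhz) / stock_mhz * 100
print(f"{gain_pct:.1f}% faster core clock")  # prints "21.7% faster core clock"
```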


----------



## DiceAir

Quote:


> Originally Posted by *Roboyto*
> 
> What PSU do you have to give him? The 3770K is a fairly efficient CPU; you don't need anything extravagant to power it and a mid-grade GPU.
> 
> The R7 260 is a waste of time; you'll want to upgrade too soon. A 260X w/ 2GB would be the minimum IMO. If you want to convince him to get a PC, then you must have the ability to turn up the eye candy.
> 
> The question being: what is your price range?
> 
> The 270 and 270X are nearly identical cards, but the 270X has an additional power connector for higher clocks. The cheapest 270 on Newegg is a Sapphire Dual-X for $150. You can have the MSI 270X Gaming for $190 - $30 MIR. The 270X would probably allow high/max settings @ 1080p and give you a longer wait between upgrades. The MSI Gaming 270X comes with a roughly 22% faster core clock out of the box, 1120 vs. 920 MHz; a nice performance jump for $40 ($10 after MIR).
> 
> Sure, racing games are great. If he is a console gamer, you can always get a cheap $10 Bluetooth receiver and hook up a PS3/Xbox controller to the PC with MotionInJoy.


Thanks for the info. He will be using a Logitech Driving Force GT steering wheel. He owns a PS3 but would like to play some newer racing games with better detail. So do you think the R9 270 will last long enough compared to the PS4? I know the PS4 will maybe last longer, but they always tune down graphics at a later stage anyway.

I don't really have a price range; it's just between those two GPUs, but for now actually the 270 and 270X. I will most probably try to overclock the GPUs a bit to make sure he reaches 60fps more often.


----------



## Roboyto

Quote:


> Originally Posted by *DiceAir*
> 
> Thanks for the info. He will be using a Logitech Driving Force GT steering wheel. He owns a PS3 but would like to play some newer racing games with better detail. So do you think the R9 270 will last long enough compared to the PS4? I know the PS4 will maybe last longer, but they always tune down graphics at a later stage anyway.
> 
> I don't really have a price range; it's just between those two GPUs, but for now actually the 270 and 270X. I will most probably try to overclock the GPUs a bit to make sure he reaches 60fps more often.


If I'm not mistaken the PS4 has nearly the same CU and shader count as a 270/270X. So they could be very similar in performance terms, but the PS4 uses a portion of its GPU for compute, so it's not getting 100% of its capabilities. I know the PS4 is already being pushed to 90%+ of its graphical capabilities.

The upside to the PC is that you can always upgrade the graphics card. If he is getting an i7 3770K, then that CPU will be good for gaming for years to come. Drivers are also constantly being updated and optimized, so that is free performance as well. You also have the enormous benefit of Steam on PC, where you can get games for a fraction of their cost even just months after release; popular console titles don't drop in price for long periods of time.

If you're going to want to overclock the GPU, then the 270X will have to be your choice. The 270 only has one PCI-E power connector and is very limited in the clock speeds it can reach.

You will want a 270X at least, if not a 280 for the 40% boost in stream processors and 50% more VRAM.
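To put the console comparison in rough numbers, here's a sketch of peak shader throughput. The inputs are the commonly published specs (PS4: 1152 shaders at 800 MHz; 270X: 1280 shaders at 1050 MHz), and this deliberately ignores the compute reservation and API overhead discussed above:

```python
def peak_tflops(shaders, clock_mhz):
    # Peak single-precision throughput on GCN: each shader can issue
    # one fused multiply-add (2 FLOPs) per cycle.
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(f"PS4 GPU: {peak_tflops(1152, 800):.2f} TFLOPS")   # 1.84 TFLOPS
print(f"R9 270X: {peak_tflops(1280, 1050):.2f} TFLOPS")  # 2.69 TFLOPS
```

So on paper the 270X has a comfortable lead, which is why the "beefed-up 7850" characterization of the PS4 GPU holds up.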


----------



## xutnubu

Quote:


> Originally Posted by *DiceAir*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> What PSU do you have to give him? The 3770K is a fairly efficient CPU; you don't need anything extravagant to power it and a mid-grade GPU.
> 
> The R7 260 is a waste of time; you'll want to upgrade too soon. A 260X w/ 2GB would be the minimum IMO. If you want to convince him to get a PC, then you must have the ability to turn up the eye candy.
> 
> The question being: what is your price range?
> 
> The 270 and 270X are nearly identical cards, but the 270X has an additional power connector for higher clocks. The cheapest 270 on Newegg is a Sapphire Dual-X for $150. You can have the MSI 270X Gaming for $190 - $30 MIR. The 270X would probably allow high/max settings @ 1080p and give you a longer wait between upgrades. The MSI Gaming 270X comes with a roughly 22% faster core clock out of the box, 1120 vs. 920 MHz; a nice performance jump for $40 ($10 after MIR).
> 
> Sure, racing games are great. If he is a console gamer, you can always get a cheap $10 Bluetooth receiver and hook up a PS3/Xbox controller to the PC with MotionInJoy.
> 
> 
> 
> Thanks for the info. He will be using a Logitech Driving Force GT steering wheel. He owns a PS3 but would like to play some newer racing games with better detail. So do you think the R9 270 will last long enough compared to the PS4? I know the PS4 will maybe last longer, but they always tune down graphics at a later stage anyway.
> 
> I don't really have a price range; it's just between those two GPUs, but for now actually the 270 and 270X. I will most probably try to overclock the GPUs a bit to make sure he reaches 60fps more often.

Go with the 270X and overclock it all you can.

The 270X will be good enough to run games at High while being conservative with the AA, to ensure 1080p60. If what your brother cares about is graphics, then it will do 1080p30 at Ultra with a nice overclock.

The card could last quite a while. Remember the GPU inside the PS4 is basically a beefed-up 7850; if we remove all the overhead and add the software improvements it will get over the years, it will probably reach 270X level.


----------



## buttface420

For the amount of money you're going to spend on an R9 270 you could buy a used card instead.

A HD 7950 goes for around 100-120 bucks on eBay, especially if you buy at auction on a Tuesday or Wednesday morning.

I got my OC'd R9 280X for $160.00.

The best bang for buck, though, is seriously a used HD 7950: 3GB of VRAM, and it can run pretty much any game all Ultra.


----------



## rdr09

Quote:


> Originally Posted by *buttface420*
> 
> For the amount of money you're going to spend on an R9 270 you could buy a used card instead.
> 
> A HD 7950 goes for around 100-120 bucks on eBay, especially if you buy at auction on a Tuesday or Wednesday morning.
> 
> I got my OC'd R9 280X for $160.00.
> 
> The best bang for buck, though, is seriously a used HD 7950: 3GB of VRAM, and it can run pretty much any game all Ultra.


^This. At only 1100MHz it is already as fast as a stock 280X.

http://www.3dmark.com/3dm11/7288276


----------



## Crowe98

My replacement for the defective MSI R9 280X Gaming is due to be delivered today, so if anyone is interested I can post some pictures of the new card.


----------



## neurotix

+1 to getting a used card instead of a new 270X. For the same price as a new 270X, you can get a used 7970, 280X, 7950 or 280. I'm leaning towards the 7970: might as well go all out, and I see some on eBay for under $200 Buy It Now. A 270X won't max out many new games on Ultra with low AA at 1080p; a 7970 most definitely will, and should still be a good card for a year or two. The only games you probably won't get 60 fps in are Far Cry 3 and Crysis 3. Everything else should run great. If you are willing to sacrifice a little bit of visual quality (post-processing), you can even get a steady 50+ fps in Crysis 3. (I know; I had a single 7970 in my rig for about a year, and it's now in my girlfriend's computer.) Still quite the powerful card.


----------



## DiceAir

Thanks, but I don't really like used cards. I'm from South Africa and, you know, people can be a bit dodgy here.


----------



## Dasboogieman

Quote:


> Originally Posted by *DiceAir*
> 
> Thanks, but I don't really like used cards. I'm from South Africa and, you know, people can be a bit dodgy here.


There are some decent sellers who might be willing to hook you up with international shipping, because a used 7950 at around $150 is the best possible performance for the price.

If you really must go brand new, then get the 270X if you can. The performance difference between the 270 and the 260X is huge compared to the price difference; the Bonaire core in the latter was really made for 1680x1050 or even 1440x900 resolutions, and it would struggle at 1080p.
The main difference between the 270 and 270X is that the former tends to have a weaker VRM assembly, sometimes with locked voltages, so you will have limited overvoltage (and thus OC) headroom, if any.
Get the 270X if the difference from the 270 is only about $20-$30.

That being said, the disadvantage of the Tahiti cards is power consumption. You would do well to have at least a 650W unit. With the Pitcairn 270X you can get away with a 500W unit.
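As a rough sanity check on those wattage recommendations, here's a minimal sketch. The load figures are my own ballpark assumptions (roughly 250W for a Tahiti 280X/7950-class card under load, 180W for a Pitcairn 270X, 100W for the CPU), not official specs:

```python
def recommended_psu_watts(gpu_w, cpu_w=100, rest_w=75, headroom=0.30):
    """Add up component draw and leave ~30% headroom for overclocking,
    capacitor aging, and efficiency at partial load."""
    return (gpu_w + cpu_w + rest_w) * (1 + headroom)

# Ballpark load figures (assumptions, not official TDPs):
print(f"Tahiti 280X build:   ~{recommended_psu_watts(250):.0f} W")
print(f"Pitcairn 270X build: ~{recommended_psu_watts(180):.0f} W")
```

With those assumptions the totals land in the mid-500s and mid-400s respectively, which lines up with the 650W/500W advice above once you allow for a cheaper unit's optimistic rating.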


----------



## ehafling

So, I've recently picked up a GIGABYTE Radeon R9 270X 4GB video card and a Corsair CS Series CS550M 550W 80 PLUS Gold modular PSU in a bundle for my build, and it was pretty awesome to start. However, recently I've been experiencing video blackouts during games at seemingly random temperatures (52°C-70°C), all occurring a little over an hour in.

These blackouts usually just kill the video feed and my monitor goes into sleep mode (because there is no longer a feed coming through the HDMI cable), but the sound keeps running in the background. I can't Ctrl+Alt+Delete or do anything to make anything pop up, so I'm forced to shut down. I've tried a bit of underclocking and I've seen no difference in the occurrence of these blackouts.

My drivers are all up to date (including the GPU's), I just updated my BIOS, and I have all of the latest Windows, Steam, and game updates.

First of all I'd like to figure out whether it's a PSU or GPU issue. I'm wondering if this may be an overheating issue? But more specifically I'd like to know if this is faulty hardware or improper installation.

Here are my basic specs

I'd love to see what you guys may know. Thanks in advance for any help or advice.


----------



## BruceB

Quote:


> Originally Posted by *ehafling*
> 
> So, I've recently picked up a GIGABYTE Radeon R9 270X 4GB video card and a Corsair CS Series CS550M 550W 80 PLUS Gold modular PSU in a bundle for my build, and it was pretty awesome to start. However, recently I've been experiencing video blackouts during games at seemingly random temperatures (52°C-70°C), all occurring a little over an hour in.
> These blackouts usually just kill the video feed and my monitor goes into sleep mode (because there is no longer a feed coming through the HDMI cable), but the sound keeps running in the background. I can't Ctrl+Alt+Delete or do anything to make anything pop up, so I'm forced to shut down. I've tried a bit of underclocking and I've seen no difference in the occurrence of these blackouts.
> My drivers are all up to date (including the GPU's), I just updated my BIOS, and I have all of the latest Windows, Steam, and game updates.
> First of all I'd like to figure out whether it's a PSU or GPU issue. I'm wondering if this may be an overheating issue? But more specifically I'd like to know if this is faulty hardware or improper installation.
> Here are my basic specs
> I'd love to see what you guys may know. Thanks in advance for any help or advice.


I think the Corsair PSUs shut off completely when there's too much power draw, which would mean the whole PC would go out and not just the GPU (correct me if I'm wrong about that). In that case I would guess a driver crash. Is your GPU or CPU overclocked?


----------



## ehafling

Both the CPU and the GPU are set to stock clock speeds (I've even tried to underclock a little to see if there was any difference). And I've triple-checked the current driver versions for basically everything on my PC.


----------



## BruceB

Quote:


> Originally Posted by *ehafling*
> 
> Both the CPU and the GPU are set to stock clock speeds (I've even tried to underclock a little to see if there was any difference). And I've triple-checked the current driver versions for basically everything on my PC.


550W should be more than enough for your system. Does it happen with a particular game, or is the time delay before it happens dependent on the game you play? I don't want to point out the obvious, but you have checked that all the power cables etc. are plugged in correctly?


----------



## ehafling

So far it has happened in the following: Ghost Recon Phantoms, Saints Row IV, DiRT 3, Hitman: Absolution, DayZ, XCOM and a few others; which seems to be any mildly graphically demanding game that I play. However, it displays all of them at 1920x1080 at almost 60FPS with no stutters.

Feel free to point out the obvious! But yes, my cables are all properly attached; I've checked internal and external. I've been considering reseating the GPU to see if I installed it improperly or something.

Maybe (hopefully!?) it's just my HDMI cables. I'm ordering some brand new 2.0 cables just to make sure.


----------



## Devildog83

Quote:


> Originally Posted by *Crowe98*
> 
> My replacement for the defective MSI R9 280X Gaming is due to be delivered today, so if anyone is interested I can post some pictures of the new card.


yep


----------



## Dasboogieman

Quote:


> Originally Posted by *ehafling*
> 
> So, I've recently picked up a GIGABYTE Radeon R9 270X 4GB video card and a Corsair CS Series CS550M 550W 80 PLUS Gold modular PSU in a bundle for my build, and it was pretty awesome to start. However, recently I've been experiencing video blackouts during games at seemingly random temperatures (52°C-70°C), all occurring a little over an hour in.
> 
> These blackouts usually just kill the video feed and my monitor goes into sleep mode (because there is no longer a feed coming through the HDMI cable), but the sound keeps running in the background. I can't Ctrl+Alt+Delete or do anything to make anything pop up, so I'm forced to shut down. I've tried a bit of underclocking and I've seen no difference in the occurrence of these blackouts.
> 
> My drivers are all up to date (including the GPU's), I just updated my BIOS, and I have all of the latest Windows, Steam, and game updates.
> 
> First of all I'd like to figure out whether it's a PSU or GPU issue. I'm wondering if this may be an overheating issue? But more specifically I'd like to know if this is faulty hardware or improper installation.
> 
> Here are my basic specs
> 
> I'd love to see what you guys may know. Thanks in advance for any help or advice.


Have you double-checked that your 6-pin cables are plugged in properly and not damaged? Also double-check that the 6-pin ports are not damaged (e.g. bad soldering, excessive flexing, etc.). I doubt the issue is your PSU not supplying enough wattage.
I'm able to run a 270X Hawk edition off a Silverstone PSU that can only supply 418W on the 12V rail, with a power-hungry i5-750.

My first instinct, if the issue isn't PSU related, is VRAM trouble; the GCN cards tend to blackscreen when the VRAM is unstable. In descending order of likelihood:
1. Faulty VRAM chips; it's rare but can happen
2. Overheating VRAM; can happen if the GPU has weak VRAM cooling or if there was a manufacturing defect in the cooler
3. A damaged VRAM VRM assembly on the GPU, not supplying the RAM chips with enough juice under heavy load
4. A damaged GPU IMC

The last one is quite rare and the other three are easy to test. Try underclocking the VRAM to see if the issue persists; if you still get problems then it's likely to be the VRAM. Then, try pushing a little more voltage to the GPU core and see if the problem persists; if it does, it is likely to be option 3 or 4, in which case it's RMA time. Additionally you can try running this application: https://dl.dropboxusercontent.com/u/105687884/memtestCL.zip
It's a fixed MemtestCL. It should theoretically show whether your GPU has unstable VRAM, but I haven't validated it against a user with known blackscreen issues yet. Plus, GDDR5 ECC tends to obfuscate the results. But give it a shot and see what you get.


----------



## anubis1127

Quote:


> Originally Posted by *DiceAir*
> 
> Thanks for the info. He will be using a Logitech Driving Force GT steering wheel. He owns a PS3 but would like to play some newer racing games with better detail. So do you think the R9 270 will last long enough compared to the PS4? I know the PS4 will maybe last longer, but they always tune down graphics at a later stage anyway.
> 
> I don't really have a price range; it's just between those two GPUs, but for now actually the 270 and 270X. I will most probably try to overclock the GPUs a bit to make sure he reaches 60fps more often.


Do you know which games? I have 7870s in my main rig right now, and I just tested a couple of the racing games I have installed.

My newest one is probably Assetto Corsa; that doesn't have an in-game benchmark, but I used Fraps to record a quick run around the practice track.



> 2014-07-14 21:14:15 - acs
> Frames: 20013 - Time: 254172ms - Avg: 78.738 - Min: 66 - Max: 91


That was 1080p with ultra/high settings and 8x AA on, on a single 7870 at 1100/1200. That game had the GPU pegged at 100%, so I would probably recommend slightly lower settings than what I was using for anything beyond the practice track.
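As a side note, the average in a Fraps log line is just frames divided by elapsed time, so you can verify it from the raw numbers:

```python
# Figures from the Fraps line above
frames = 20013
elapsed_ms = 254172

avg_fps = frames / (elapsed_ms / 1000)
print(f"Avg: {avg_fps:.3f}")  # Avg: 78.738, matching Fraps' reported average
```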

I also tried Dirt3 benchmark, highest in-game settings 1080p:



DiRT Showdown and GRID 2 ran well on it too; GRID 2 was getting close, but it was still able to maintain a steady 60 FPS with everything turned up. All four games I tested real quick with my keyboard/mouse were pretty playable @ 1080p, 60+ FPS.


----------



## DarthBaggins

The 7870s are strong mid-level cards; the best card I decided to buy next to the 270Xs (I want to get a couple of Sapphire Toxic 270Xs).


----------



## anubis1127

Quote:


> Originally Posted by *DarthBaggins*
> 
> The 7870s are strong mid-level cards; the best card I decided to buy next to the 270Xs (I want to get a couple of Sapphire Toxic 270Xs).


Yeah, I like the ones I picked up; the stock core clock on them is just under what most 270Xs boost to, 1100MHz. They also have unlocked voltage in Afterburner up to 1.3V, which most 270Xs don't allow. Call it a 7870, call it a 270X; I call it a nice little PPD producer.


----------



## DarthBaggins

Yeah, mine averaged 65-70k PPD apiece in Windows on Folding@home. It just sucks that you can't fold on them in Linux.


----------



## anubis1127

Quote:


> Originally Posted by *DarthBaggins*
> 
> Yeah, mine averaged 65-70k PPD apiece in Windows on Folding@home. It just sucks that you can't fold on them in Linux.


I see about 80-106k PPD depending on the WU.


----------



## Arkanon

Bit of a weird thing going on here, or at least I think it's weird.
I set my GPU fans to a fixed 60%, which comes to about 2400 RPM.
Every time my GPU is under load, the fan speeds seem to drop a bit, to 2277-23xx-ish.
When I close the game, the fan speed goes back up to 2400-ish.
No idea if it's normal or not, so is anyone around here able to help me out?


----------



## Arkanon

In addition to my previous post: there was something terribly wrong with those fans. Not only did the fan speed drop under load, they also made the 12V line drop to about 11.40V. I constructed a home-made fan bracket and ditched the MSI fans, and while I was at it I also replaced the TIM. The TIM replacement wasn't needed, because MSI actually did a mighty fine job there. Results: while the temps have remained the same with the home-made 2x 120mm fan bracket, the 12V line seems to be a lot more stable now. Me happy.
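For context, the ATX specification allows a ±5% tolerance on the 12V rail, and the droop described above sits right at that limit:

```python
nominal_v = 12.0
loaded_v = 11.40  # reading reported under load

droop_pct = (nominal_v - loaded_v) / nominal_v * 100
print(f"12V rail droop: {droop_pct:.1f}%")  # 5.0%, right at the ATX ±5% limit
```

So the fans weren't just underperforming; they were dragging the rail to the edge of what the spec considers acceptable, which explains why ditching them stabilized the line.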


----------



## xutnubu

Quote:


> Originally Posted by *Arkanon*
> 
> Bit of a weird thing going on here, or at least I think it's weird.
> I set my GPU fans to a fixed 60%, which comes to about 2400 RPM.
> Every time my GPU is under load, the fan speeds seem to drop a bit, to 2277-23xx-ish.
> When I close the game, the fan speed goes back up to 2400-ish.
> No idea if it's normal or not, so is anyone around here able to help me out?


Those fans have a terrible reputation; just visit Newegg.

Some other users reported lower RPM as well, followed by the fans dying.


----------



## v3n0m90

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Devildog83*
> 
> Official AMD R9 280X / 280/ 270X/270 Owners Club
> 
> A Gamers GPU
> 
> 
> 
> *To be added to the member list please post a picture of your GPU, brand, series and clock you wish posted.
> *
> 
> *R9 280X Members List*
> 
> *nastytime - MSI Twin Frozr- R9 280X-1020(BST 1050)/1500*
> 
> *Offender_Mullet - Gigabyte Windforce - R9 280X - 1000(BST 1100)/1500*
> 
> *D3TH.GRUNT - ASUS DirectCU II Top-1070(BST 1100)/1600*
> 
> *cyph3rz - Gigabyte Windforce - R9 280X - 1000(BST 1100)/1500*
> 
> *Theroty - ASUS Matrix Platinum -1100/1600*
> 
> *Resme - ASUS DirectCU II Top-1070(BST 1100)/1600*
> 
> *candy_van - ASUS DirectCU II Top-1070(BST 1100)/1600*
> 
> *eTheBlack - Gigabyte Windforce - R9 280X - 1000(BST 1100)/1500*
> 
> *DeviousAddict - XFX Double D - R9 280X - 850(BST 1000)/1500*
> 
> *brazilianloser - Sapphire Vapor-X - R9 280X - 950(BST 1070)/1500*
> 
> *Shurtugal - ASUS Matrix Platinum -1100/1600*
> 
> *1haxor - Sapphire Vapor-X - R9 280X - 1250/1600*
> 
> *TonytotheB- MSI Gaming Edition - R9 280X - 1050/1500 x2*
> 
> *Ashuiegi- Sapphire Toxic - R9 280X - 1250/1700*
> 
> *Genma- Sapphire Toxic - R9 280X - 1200/1750*
> 
> *NinjaToast - ASUS DirectCU II Top-1070(BST 1100)/1600*
> 
> *Switchblade1969- Sapphire Vapor-X - R9 280X - 1203/1550*
> 
> *stolojansky- Sapphire Vapor-X - R9 280X - 1100/1600*
> 
> *Miiksu- HIS IceQ X Turbo - R9 280X - 1200/1760*
> 
> *Archea47- Gigabyte Rev 2.02 - R9 280X - 1100/1600 x2*
> 
> *lillebj0rn - Asus R9 280x 3 GB "Direct CU II Top"*
> 
> *eclipsedude - Sapphire Toxic R9 280x - 1200/1800*
> 
> *Lisjak - Asus R9 280x "DC II TOP" - 1100/1620*
> 
> *madorax - Msi R9 280x Gaming - 1160/1650 Daily - 1200/1700 OC*
> 
> *gh0stfac3killa - Gigabyte - R9 280x X 2*
> 
> *kpo6969 - Asus R9 280x "DC II Top" 1070/1600*
> 
> *PerspexPC - Sapphire R9 280x Toxic - OC 1200/1800*
> 
> *DiceAir - Club3D R9 280x X 2 - 1150/1500, OC 1150/1750*
> 
> *cr4p - MSI R9 280x Gaming - 1140/1560*
> 
> *danilon62 - Gigabyte R9 280x 1100/1500*
> 
> *uaedroid - Gigabyte R9 280x 1100/1500 OC 1195/1800*
> 
> *cremelo - Sapphire R9 280x Toxic 1150/1600*
> 
> *ILLmatik94 - Asus R9 280x "DCII Top" - 1180/1650*
> 
> *[email protected] - Asus R9 280x "DCII Top" - 1100/1650*
> 
> *Frogeye - Asus R9 280x "DCII" - 1100/1600*
> 
> *Butternut101 - Visiontek R9 280x*
> 
> *JCH979 - Asus R9 280x "DCII Top" - 1070/1500, 1200/1666*
> 
> *Jaffi - Asus R9 280x "DCII Top" 1150/1700*
> 
> *Pis - Sapphire Toxic R9 280x - 1150/1600*
> 
> *F3ERS 2 ASH3S - XFX DD R9 280x - 1070/1500, OC 1100/1850*
> 
> *kersoz2003 - Sapphire Vapor-X R9 280x - 1240/1800*
> 
> *Z0K1 - Sapphire Toxic R9 280x - 1250/1660*
> 
> *hamzta09 - XFX DD Ghost R9 280x X2*
> 
> *ricko99 - Asus R9 280x DCII Top - 1070/1600*
> 
> *Cinnamoncider - Asus R9 280x DCII Top - 1070/1600*
> 
> *Clexzor - MSI R9 280x - 1250/1650*
> 
> *Tugz - MSI R9 280x X2 - 1020/1500*
> 
> *nubk11la - Sapphire R9 280x Vapor X - 1070/1550*
> 
> *M1kuTheAwesome - Powercolor R9 280x Turbo Duo - 1219/1700*
> 
> *Devilangel - Gigabyte R9 280x WF - 100/1500*
> 
> *[CyGnus] - Asus R9 280x DCII Top - 1100/1600*
> 
> *Hacker90 - XFX Double D BE - R9 280X - 1100/1550*
> 
> *Linuxfueled - MSI R9 280x Gaming - 1202/1603*
> 
> *KauBoy - Sapphire R9 280x Vapor X, 1200/1600*
> 
> *Dabby91 - Asus R9 280x DCII Top - 1070/1600*
> 
> *cookiesown - Powercolor R9 280x TurboDuo X2 - 1100/1675*
> 
> *Recr3ational - Powercolor R9 280x Turbo Duo - 1250/1600*
> 
> *Shogin - Diamond - R9 280x Boost X2 - 850/1500*
> 
> *mAs81 - MSI Gaming R9 280x - 1020/1500 - OC 1200/1600*
> 
> *JetSet Chilli - MSI Gaming R9 280x - 1100/1500*
> 
> *CravinR1 - Sapphire Dual X R9 280x X3 - 1080/1500*
> 
> *JaredLaskey - Asus R9 280x DCII Top X2 - 1100/1600*
> 
> *Tobe404 - Gigabyte R9 280x rev. 2 - 1085/1485*
> 
> *omalinkin - XFX R9 280x TDBD rev. 2.2*
> 
> *eAT5 - Powercolor R9 280x Turbo Duo X2 - 1040/1500*
> 
> *GuestVeea - XFX R9 280x DD Black Edition - 1150/1600*
> 
> *NBAasDOGG - MSI R9 280x Gaming X2 - 1220/1600*
> 
> creationsh - Diamond R9 280x - 1175/1500
> 
> *JeremyFenn - Sapphire Vapor X R9 280x - 1100/1600*
> 
> *BruceB - Powercolor R9 280x Turbo Duo - 1030/1500*
> 
> *CaptianDyson - Sapphire R9 280x Dual X OC*
> 
> *bios-R-us - Sapphire R9 280x Dual X - 1100/1600*
> 
> *Neocoolzero - XFX R9 280x DD Black Edition*
> 
> *c3p0c3p0 - MSI R9 280x Gaming - 1020/1500*
> 
> *xutnubu - MSI R9 280x Gaming - 1050/1500*
> 
> *agrims - Gigabyte R9 280x Rev. 2 - 1100/1500*
> 
> *bios_R_us - Sapphire R9 280x Dual X - 1150/1750*
> 
> *R9 280Members List*
> 
> *lluukkman - HIS IceQ R9 280 - 953/1250*
> 
> *R9 270X Members List*
> 
> *Devildog83 - Powercolor R9 270x Devil/HD 7870 Devil, 1200/1400 X-Fire- OC 1235/1590*
> 
> *maxti6- XFX Double D - R9 270X-1050/1400*
> 
> *bokchoi- Sapphire Vapor-X - R9 270X-1168/1504*
> 
> *Dimaggio1103- PowerColor Devil - R9 270X-1250/1550*
> 
> *benjamen50- Gigabyte Windforce 3X - R9 270X-1150/1500*
> 
> *neurotix- Sapphire Vapor-X - R9 270X-1280/1500 - 1300/1500 OC*
> 
> *jamponget9 - Sapphire Toxic R9 270x - 1150/1500*
> 
> *robster84 - XFX R9 270x boost - 1050/1400*
> 
> *G20415 - Sapphire Dual X R9 270x - 1070/1400 OC 1170/1500*
> 
> *GTR Mclaren - Gigabyte R9 270x OC - 1160 /1520*
> 
> *Falconx50 - Powercolor R9 270x Devil - 1180/1400*
> 
> *Delphi - HIS IceQ R9 270x X2 - 1100/1450*
> 
> *Necrochain - Sapphire Dual-X R9 270x - 1070/1400 - 1200/1600*
> 
> *Goldswimmerb - XFX DD R9 270x - 1100/1400*
> 
> *Pesmerrga - Sapphire Dual-X R9 270x - 1250/1575*
> 
> *mikemykeMB - XFX R9 270x - 1175/1500*
> 
> *dexterprog - HIS R9 270x ICE-Q X2 TB - 1140/1400*
> 
> *staryoshi - Asus R9270x DCU II - 1120/1400*
> 
> *Repo-Man - Asus R9 270x DCII Top -*
> 
> *Tiharo - Gigabyte R9 270X 4GB OC CFX - 1100/1400*
> 
> *Chubz727 - Powercolor R9 270x Devil - 1080/1500*
> 
> *bartledoo - Sapphire Toxic R9 270x - 1100/1500 - 1275/1575 OC*
> 
> *drieg500 - MSI R9 270x Hawk - 1250/1575*
> 
> *turbobooster - Sapphire Toxic R9 270x - 1100/1500*
> 
> *HCS01 - Powercolor - R9 270x Devil - 1180/1500*
> 
> *HunnoPT - Gigabyte Windforce R9 270x - 1100/1400 - OC 1200/1600 / MSI Gaming R9 270x - 1030/1400*
> 
> *Farih - Sapphire Toxic - R9 270x - 1362/1550*
> 
> *orlfman - MSI Gaming R9 270x*
> 
> *Arkanon - MSI Gaming R9 270x - 1360/1500*
> 
> *kzone - Sapphire Dual X R9 270x - 1070/1400*
> 
> *Pionir - Asus R9 270x DCII Top*
> 
> *KingT - Powercolor PCS+ R9 270x - 1100/1425*
> 
> *Spade616 - HIS ICE-Q x2 R9 270x - 1140/1400*
> 
> *runelotus - HIS R9 270x Ice-Q - 1140/1400*
> 
> *Bagmup - MSI R9 270x Gaming X2*
> 
> *Roaches - Powercolor R9 270x Devil*
> 
> *uroshnish - Sapphire Vapor-X R9 270x*
> 
> *Praston - Asus R9 270x DCII Top - 1020/1400*
> 
> *microkid21 - Powercolor R9 270x Devil - 1140/1400*
> 
> *mxfreek09 - Gigabyte R9 270x - 1200/1500*
> 
> *Anonizer - Sapphire Vapor-X R9 270x*
> 
> *Bruska - Sapphire Vapor-X R9 270x - 1200/1500*
> 
> *Roboyto - Powercolor Devil R9 270x*
> 
> *trabunco - Asus R9 270x DCII Top - 1300/1500*
> 
> *Destro - Gigabyte R9 270x 4GB Windforce - 1100/1400*
> 
> *Nfsdude0125 - Sapphire Dual X R9 270x 4 GB*
> 
> *amateurbuilder - XFX R9 270x 1050/1400*
> 
> *R9 270 Members list*
> 
> *Insidejob - XFX R9 270 DD - 1025/1556*
> 
> *Jawwwwish - XFX R9 270 - 1000/1450*
> 
> *D34TH - Asus R9 270 DCII OC 1000/1500*
> 
> *bardacuda - Gigabyte Windforce R9 270 OC - 1270/1500 - OC, Asus DCII OC R9 270 - 1270/1600*
> 
> *Pionir - Gigabyte R9 270 OC*
> 
> *Majentrix - Gigabyte R9 270 OC - 1175/1400*
> 
> *lightsout - Sapphire DualX R9 270 X 2*
> 
> *Anubus1127 - MSI Gaming R9 270 X2 - 1100/1500*
> 
> *To be added to the member list, please post a picture of your GPU, and the brand, series and clocks you wish to be posted.*
> 
> 
> 
> AMD Drivers & Software Link
> 
> *Catalyst Software Suite*
> 
> *AMD Catalyst™ Display Driver 13.12 WHQL*
> 
> *AMD Catalyst™ Display Driver 13.11 WHQL*
> 
> *AMD Beta Drivers*
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> 
> *AMD Catalyst™ 13.11 Beta9.5 for Windows® *
> 
> 
> Spoiler: Driver Info!
> 
> 
> 
> *Resolves the issue of AMD Overdrive missing in the AMD Catalyst Control Center for the AMD Radeon™ R9 290 Series graphics cards
> Resolves intermittent flickering seen on some AMD Radeon R9 270x graphics cards
> Resolves graphics corruption seen in Starcraft®
> Improves frame pacing results in AMD Quad CrossFire™ configurations for the following: Hitman: Absolution, and Total War™: Rome 2*
> 
> 
> 
> *AMD Catalyst™ 13.11 Beta 9.4 Driver for Windows®*
> 
> 
> Spoiler: Driver Info!
> 
> 
> 
> *May resolve intermittent black screens or display loss observed on some AMD Radeon™ R9 290X and AMD Radeon R9 290 graphics cards
> Improves AMD CrossFire™ scaling in the multi-player portion of Call of Duty®: Ghosts
> 
> AMD Enduro Technology Profile updates:
> XCOM: Enemy Unknown
> Need for Speed Rivals*
> 
> 
> 
> *AMD Catalyst™ 13.11 Beta 9.2 for Windows®*
> 
> 
> Spoiler: Driver Info!
> 
> 
> 
> *Call of Duty®: Ghost - Improves anti-aliasing performance, and updates the AMD CrossFire™ profile*
> *AMD Radeon™ R9 290 Series - PowerTune update to reduce variance of fan speed / RPM
> Resolves intermittent crashes seen in legacy DirectX® 9 applications*
> 
> 
> 
> *AMD Catalyst™ 13.11 Beta 8 for Windows® *
> 
> 
> Spoiler: Driver Info!
> 
> 
> 
> *Resolves intermittent crashes experienced with Battlefield 4 on Windows 8 based systems*
> 
> 
> 
> *AMD Catalyst™ 13.11 Beta 7 Driver for Windows®*
> 
> 
> Spoiler: Driver Info!
> 
> 
> 
> *Increases AMD CrossFire™ scaling up to an additional 20% for Battlefield 4*
> 
> 
> 
> *AMD Catalyst™ 13.11 Beta 6 Driver for Windows®*
> 
> 
> Spoiler: Driver Info!
> 
> 
> 
> Includes support for the new products:
> *AMD Radeon™ R9 290X
> AMD Radeon R9 290*
> 
> Performance improvements
> * Batman: Arkham Origins - improves performance up to 35% with MSAA 8x enabled
> Total War™: Rome 2 - improves performance up to 10%
> Battlefield 3 - improves performance up to 10%
> GRID 2 - improves performance up to 8.5%
> DiRT Showdown - improves performance up to 10%
> Formula 1™ 2013 - improves performance up to 8%
> DiRT 3 - improves performance up to 7%
> Sleeping Dogs - improves performance up to 5%
> Automatic AMD Eyefinity Configuration
> Automatic "plug and play" configuration of supported Ultra HD/4K tiled displays*
> 
> 
> 
> *AMD Catalyst™ 13.10 Beta 2 Driver for Windows*
> 
> 
> Spoiler: Driver Info!
> 
> 
> 
> NOTE! This Catalyst Driver is provided "AS IS", and under the terms and conditions of the End User License Agreement provided therewith.
> 
> Feature Highlights of The AMD Catalyst 13.10 Beta2 Driver for Windows
> ◾AMD Catalyst 13.10 Beta2 includes all of the features/fixes found in AMD Catalyst 13.10 Beta
> ◾Includes a single GPU, and AMD Crossfire™ game profile for Battlefield 4™
> ◾Total War™: Rome 2 AMD CrossFire profile update
> ◾AMD CrossFire frame pacing improvements for CPU-bound applications
> ◾Resolves image corruption seen in Autodesk® Inventor® 2014
> ◾Resolves intermittent black screen when resuming from a S3/S4 sleep-state if the display is unplugged during the sleep-state on systems supporting AMD Enduro™ technology
> ◾Updated AMD Enduro technology application profiles for AMD Radeon HD notebook users
> ◦Profile Highlights:
> •Total War: Rome 2
> •Battlefield 4
> •Saints Row 4
> •Splinter Cell® Blacklist™
> •FIFA 14
> 
> 
> 
> Link to 700-750W PSU reviews, posted by "Shilka" - http://www.overclock.net/t/1482157/best-fully-modular-700-750-watts-psu#post_22109815
> 
> Thought this might be helpful, since that's about the recommended wattage for a 280x and will also serve well for 2 x 270/270x.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *USEFUL SOFTWARE*
> 
> 
> 
> 
> 
> 
> 
> 
> Links Below
> 
> *AMD Catalyst*
> 
> *Nvidia & AMD Drivers Un-install Utility*
> 
> *RadeonPro*
> 
> *Atiman Uninstaller v.7.0.2.msi*
> 
> *TechPowerUp GPU-Z*
> 
> *MSI Afterburner*
> 
> *Sapphire TriXX*
> 
> *ASUS GPU Tweak*
> 
> *OCCT*
> 
> *Fraps*
> 
> *How to Uninstall GPU Drivers by OCN member BradleyW*
> 
> *How To Uninstall Your ATI GPU Drivers*
> 
> *How To Uninstall Your NVIDIA GPU Drivers*
> 
> HELP WITH MANTLE
> 
> http://www.overclock.net/t/1429303/amd-mantle-discussion-thread
> 
> http://www.overclock.net/t/1464054/mantle-bug-thread
> 
> Wear the signature proudly if you like.
> 
> Code:
> 
> 
> [CENTER] [B]:devil:  [Official] AMD R9 280X / 280 & 270X Owners Club :devil:[/B] [/CENTER]






ASUS ROG Matrix Platinum 280X
GPU Core - 1250 MHz OC
I was stable at 1260 MHz, but there was a pretty big performance drop in all of the benchmarks I ran, so I went back down to 1250.
Memory Clock - 6600 MHz OC

I also don't use the stock cooler on this. I have an NZXT Kraken G10 with a Corsair H55 set up in push-pull to cool it. I never really go above 41-42°C.


----------



## Arkanon

Quote:


> Originally Posted by *xutnubu*
> 
> Those fans have a terrible reputation, just visit Newegg.
> 
> Some other users reported lower RPM as well followed by the death of the fans.


Care to link me to that page?
As long as everything worked fine those were actually good fans; they kept my card around 75°C with the voltage cranked all the way up to 1.4 V+.
Then they started dropping under load, and right before I replaced them I couldn't get them over 1200 RPM anymore at 100% fan speed. They were vibrating very oddly and terribly, and completely messing up my 12 V line, which dropped to 11.40 V under load. I might just look at replacing the 92 mm fans on the original bracket, but so far I haven't found any replacement parts. On the plus side: GPU temperature remained about the same, while the VRM temps dropped by 10°C and don't go over 85°C anymore, lol. Also managed to bench stable at 1370 MHz.
3DMark 11 graphics score of 10400, which is a pretty high score for a 270X. I should rerun that bench with tessellation off and compare it to overclocked GTX 760s now.








And yes, I could RMA the card, but I highly doubt that I'll get another card that runs 1350 MHz+, and as far as I know the newer cards are voltage locked. Meh.
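For reference, that 11.40 V reading under load sits right at the edge of spec: the ATX standard allows a ±5% tolerance on the 12 V rail. A quick illustrative check (the tolerance figure is the standard ATX one, not something from the post):

```python
# ATX spec allows +/-5% on the 12 V rail, so the lowest in-spec reading
# is 12.0 * 0.95 = 11.40 V -- a droop to exactly 11.40 V is borderline.
def rail_within_spec(measured_volts, nominal=12.0, tolerance=0.05):
    """True if a supply rail reading is inside the +/- tolerance window."""
    return abs(measured_volts - nominal) / nominal <= tolerance

print(rail_within_spec(11.40))  # True, but only just: exactly -5%
print(rail_within_spec(11.30))  # False: out of spec
```

Whether the droop was caused by the failing fans or by the PSU itself is a separate question, but a rail hovering at the -5% floor under load is worth keeping an eye on.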


----------



## xutnubu

Quote:


> Originally Posted by *Arkanon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Those fans have a terrible reputation, just visit Newegg.
> 
> Some other users reported lower RPM as well followed by the death of the fans.
> 
> 
> 
> Care to link me to that page?
> As long as everything worked fine those were actually good fans; they kept my card around 75°C with the voltage cranked all the way up to 1.4 V+.
> Then they started dropping under load, and right before I replaced them I couldn't get them over 1200 RPM anymore at 100% fan speed. They were vibrating very oddly and terribly, and completely messing up my 12 V line, which dropped to 11.40 V under load. I might just look at replacing the 92 mm fans on the original bracket, but so far I haven't found any replacement parts. On the plus side: GPU temperature remained about the same, while the VRM temps dropped by 10°C and don't go over 85°C anymore, lol. Also managed to bench stable at 1370 MHz.
> 3DMark 11 graphics score of 10400, which is a pretty high score for a 270X. I should rerun that bench with tessellation off and compare it to overclocked GTX 760s now.
> 
> 
> 
> 
> 
> 
> 
> 
> And yes, I could RMA the card, but I highly doubt that I'll get another card that runs 1350 MHz+, and as far as I know the newer cards are voltage locked. Meh.
Click to expand...

Quote:


> Cons: Fans, Fans, Fans. Bought 2 cards, first one the fan died and basically is only running at 50% max speed. Before It would run at 3200 RPM or so at 100%. Now only 1800 or so. The card even idles at 75C with no load! That's outrageous! The card hit 95C with a fan pointed right at the video card while gaming, and fan set to 100%. The fans still spin, but don't run at full speed.


http://www.newegg.com/Product/Product.aspx?Item=N82E16814127759&cm_re=r9_280x-_-14-127-759-_-Product

Almost all bad reviews have to do with the fans failing.

I hope mine don't. I still have my old Sapphire card; I think the fans are the same size, just in case I need to replace them.


----------



## manofsteele87

Sapphire Vapor-X R9 270X "OC"

http://imgur.com/Hq68M1n


----------



## HBizzle

Does anyone know the standard amperage an R9 280x is supposed to pull? Running TechPowerUp GPU-Z, my readings are all over the place, from around 30 A to 120 A on the VDDC Current sensor. Does anyone know if this is OK?


----------



## neurotix

Quote:


> Originally Posted by *manofsteele87*
> 
> Sapphire Vapor-X R9 270X "OC"
> 
> http://imgur.com/Hq68M1n


Nice choice! I have this card too. However, it's just sitting in the box in my storage room, it's a backup/benching card. How do you like it? What clocks are you able to obtain?


----------



## BurninPurp561

How's this 1175/ 1600 gigabyte r9 270??? I'm a super noob so if it's bad sorry I thought it was worth bragging about lol


----------



## agrims

Quote:


> Originally Posted by *BurninPurp561*
> 
> 
> 
> How's this 1175/ 1600 gigabyte r9 270??? I'm a super noob so if it's bad sorry I thought it was worth bragging about lol


Windows NT?? Also, 1368x768? Windowed mode?? Try running 1920x1080, full screen, 2-4x AA, moderate to no tessellation, ultra. I don't think anyone on a desktop runs 768 anymore; that's laptop territory on res. Give us that and we'll tell you if it's good or not!


----------



## anubis1127

Quote:


> Originally Posted by *agrims*
> 
> Windows NT?? Also, 1368x768? Windowed mode?? Try running 1920x1080, full screen, 2-4x aa, moderate to no tessellation, ultra. I don't think anyone on a desktop runs 768 anymore, that's laptop territory on res. give us that and we'll tell you if it's good or not!


Windows NT 6.2 is Windows 8, or Server 2012.

But yes, I agree, try 1080p extreme preset.


----------



## Celisuis

Hello All,

Managed to OC my R9 280X to a fairly respectable level (or so I've been told!)

*Core*: 1150MHz
*Memory*: 1575MHz
*Core Voltage*: 1300mV
*Memory Voltage*: 1600mV
*Power Limit*: 20%

*Screenshot of Desktop*


*GPU-Z Validation*: Here

*Heaven Score*



So, what do you all think? Honest answers please


----------



## BurninPurp561

Quote:


> Originally Posted by *agrims*
> 
> Windows NT?? Also, 1368x768? Windowed mode?? Try running 1920x1080, full screen, 2-4x aa, moderate to no tessellation, ultra. I don't think anyone on a desktop runs 768 anymore, that's laptop territory on res. give us that and we'll tell you if it's good or not!


OK, I tried what you said, but my monitor does not support 1080p.

So I went with the highest res it would take: 1440x900 with 4x AA and no tessellation.

Now I kind of feel like tinkering again... I can OC higher


----------



## solar0987

OK, so how do I get any of the OC software to let me put more than 1.2 V into my card? I have asked three different times now in this forum with no answer.


----------



## anubis1127

Quote:


> Originally Posted by *solar0987*
> 
> OK, so how do I get any of the OC software to let me put more than 1.2 V into my card? I have asked three different times now in this forum with no answer.


I believe on the newer GB cards they have locked voltage. Which OC utilities have you tried?

Another thing you may be able to do is edit the vbios with VBE7, I read some were able to undervolt GB R9 270s, not sure if it will work for overvolting a 270X though.


----------



## BruceB

Can anyone tell me what the _Power Limit_ slider in MSI Afterburner does?









Thanks!


----------



## Dasboogieman

Quote:


> Originally Posted by *BruceB*
> 
> Can anyone tell me what the _Power Limit_ slider in MSI Afterburner does?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!


All AMD cards since Cayman (perhaps even Evergreen) have implemented a PCB TDP power-limit circuit: essentially, the PCB is not allowed to draw more wattage than a preset limit.
Granted, this limit is actually quite high; it usually takes FurMark to breach it at stock settings. However, under heavy overvoltage it is also possible to reach it.

That Power Limit slider basically raises the board's specified TDP limit, allowing the card to draw more power before throttling occurs.
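The slider's effect can be sketched as a toy model (illustrative only, not AMD's actual PowerTune algorithm): the slider scales the board's power ceiling, and clocks are only throttled once draw exceeds that ceiling.

```python
# Toy model of the Power Limit slider: it scales the board TDP ceiling,
# and the card throttles only when measured draw exceeds that ceiling.
def effective_tdp(base_tdp_watts, power_limit_pct):
    """Power ceiling after the slider, e.g. +20% -> 1.20x base TDP."""
    return base_tdp_watts * (1 + power_limit_pct / 100)

def would_throttle(draw_watts, base_tdp_watts, power_limit_pct=0):
    return draw_watts > effective_tdp(base_tdp_watts, power_limit_pct)

# A heavily overvolted card pulling 280 W on a 250 W board:
print(would_throttle(280, 250))      # True at stock
print(would_throttle(280, 250, 20))  # False with the slider at +20% (300 W ceiling)
```

This is why raising the slider does nothing at stock clocks but matters once overvolting pushes the draw past the stock ceiling.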


----------



## neurotix

Quote:


> Originally Posted by *Dasboogieman*
> 
> All AMD cards since Cayman (perhaps even Evergreen) have implemented a PCB TDP power limit circuit. Essentially, the PCB is not allowed to draw more wattage than a preset limit.
> Granted, this limit is actually quite high, it usually requires Furmark to breach it under stock settings. However, under heavy over voltage conditions, it is also possible to reach it.
> 
> That power limiter tab basically increases the specified TDP limit of the board, thus allowing more power to be drawn by the card before throttling occurs.


Great explanation, rep+

Basically, if you're overclocking and overvolting at all, you want to max out the power limit slider. (It will be either 20% or 50% depending on your card.) If you don't, you will probably throttle and downclock under load.


----------



## solar0987

I have tried every OC software currently out.
Can someone point me in the right direction for modding my BIOS? I want to go over 1.2 V by a good bit.


----------



## agrims

Google modded bios for your card. If you have a gigabyte card there is a linky to the bios file in this thread a few pages back...


----------



## End3R

Quote:


> Originally Posted by *solar0987*
> 
> I have tried every OC software currently out.
> Can someone point me in the right direction for modding my BIOS? I want to go over 1.2 V by a good bit.


Make sure you back up your current vBIOS though, because if you do it wrong and don't have a backup to revert to, you can brick your card.


----------



## manofsteele87

Quote:


> Originally Posted by *neurotix*
> 
> Nice choice! I have this card too. However, it's just sitting in the box in my storage room, it's a backup/benching card. How do you like it? What clocks are you able to obtain?

@neurotix Thanks! It's a step up from my old HD6850 for sure. Honestly, I haven't tried yet. I might have to do some reading up and give it a try...this is, after all, overclock.net. haha.


----------



## neurotix

Quote:


> Originally Posted by *manofsteele87*
> 
> Nice choice! I have this card too. However, it's just sitting in the box in my storage room, it's a backup/benching card. How do you like it? What clocks are you able to obtain?
> 
> @neurotix Thanks! It's a step up from my old HD6850 for sure. Honestly, I haven't tried yet. I might have to do some reading up and give it a try...this is, after all, overclock.net. haha.


Should be roughly twice as powerful as a 6850 in benchmarks.


----------



## Hagleigh

Hey guys, is it normal for the third fan not to be running at all?
I just built this PC http://uk.pcpartpicker.com/user/Hagleigh/saved/8gbCmG.

I've been playing Hitman: Absolution on ultra and Battlefield 4 on high. I assumed these two would push my system to its limits and thus require all my fans to be running. However, when I looked I could only see the two big ones working. I thought this was quite odd; do I have a defective graphics card? I've updated the drivers and everything.


----------



## Crowe98

Quote:


> Originally Posted by *Hagleigh*
> 
> Hey guys, is it normal for the third fan not to be running at all?
> I just built this PC http://uk.pcpartpicker.com/user/Hagleigh/saved/8gbCmG.
> 
> I've been playing hitman absolution on ultra and battlefield 4 on high. I assumed these two would push my system to the limits and thus require all my fans to be running. However when I looked I could only see the 2 big ones working. I thought this was quite odd, do I have a defective graphics card? I've updated the drivers and everything.


No matter what (unless you've adjusted the fan speed yourself), all of the fans should be running, even when not under load. If you can, check with the card still installed in your system: follow the fan cables back to their connectors and make sure they're connected. If not, plug them in.

Other than that, you could apply for an RMA, if defective fans are covered by your warranty, which they should be.


----------



## Hagleigh

Fan connectors?

I thought the only thing you plug into the GPU was the PCI cables?

I have the sapphire toxic version by the way


----------



## SeanOMatic

The fans plug into connectors on the GPU itself and these can come loose.


----------



## BruceB

Quote:


> Originally Posted by *Hagleigh*
> 
> Hey guys, is it normal for the third fan not to be running at all?
> I just built this PC http://uk.pcpartpicker.com/user/Hagleigh/saved/8gbCmG.
> I've been playing hitman absolution on ultra and battlefield 4 on high. I assumed these two would push my system to the limits and thus require all my fans to be running. However when I looked I could only see the 2 big ones working. I thought this was quite odd, do I have a defective graphics card? I've updated the drivers and everything.


Most cards have a target temperature around 75°C, so if the temps are lower it may lower fan speeds (although I've never heard of fans being turned off entirely).

When you boot your PC the GPU should run all the fans at 100% for a few seconds (usually until the PC POSTs); if a fan doesn't spin then, it's either not connected or broken.

[EDIT]:
Does anyone know what the deal with the current _Don't Settle Forever_ bundle is? I got my Card about 4 months ago and I didn't get a code with it, am I entitled to one?


----------



## Hagleigh

Cheers for the advice guys. Turns out the 2-pin bit of one of the 8-pin connectors wasn't in properly. I guess worse things could have happened on my first build...


----------



## Hagleigh

Quote:


> Originally Posted by *BruceB*
> 
> Most cards have a target temperature around 75°C, so if the temps are lower it may lower fan speeds (although I've never heard of fans being turned off entirely).
> 
> When you boot your PC the GPU should run all the fans at 100% for a few seconds (usually until the PC POSTs); if a fan doesn't spin then, it's either not connected or broken.
> 
> [EDIT]:
> Does anyone know what the deal with the current _Don't Settle Forever_ bundle is? I got my Card about 4 months ago and I didn't get a code with it, am I entitled to one?


I bought mine from ebuyer UK and my code came in an envelope outside the GPU packaging. I believe the manufacturer gives it to the retailers to distribute. I'd ask whoever you bought it from.


----------



## BruceB

Quote:


> Originally Posted by *Hagleigh*
> 
> I bought mine from ebuyer UK and my code came in an envelope outside the GPU packaging. I believe the manufacturer gives it to the retailers to distribute. I'd ask whoever you bought it from.


Thanks. I phoned the shop I bought it from today and the guy said he'll speak to his boss, who would email the code to me. I'll keep you updated on how that pans out!


----------



## mtcn77

I recently noticed that the Sapphire 280x Toxic blows the GTX 780's socks off. The price discrepancy is astounding, imo.

MSI GTX 780
Sapphire R9 280X Toxic


----------



## Dasboogieman

Quote:


> Originally Posted by *Crowe98*
> 
> No matter what (unless you've adjusted the fan speed yourself) all of the fans should be running, even not at load. If you can check with the card still installed in your system follow the fan cables back to their connectors and make sure they're connected. If not, plug it in.
> 
> Other than that, you could maybe apply for an RMA, if defective fans is covered by your warranty; which it should be.


If you have a good overclocker and are unwilling to risk getting a crap card back from an RMA, you can buy the OEM fans yourself. They're pretty cheap on eBay. I changed the broken fan on a $50 second-hand GTX 560 Ti Twin Frozr II last month. Piece of cake, and it only cost $25.
Quote:


> Originally Posted by *mtcn77*
> 
> I recently noticed that the Sapphire 280x Toxic blows the GTX 780's socks off. The price discrepancy is astounding, imo.
> 
> MSI GTX 780
> Sapphire R9 280X Toxic


I dunno, the benchmarks on AnandTech seem to show a discrepancy of about 16%-31% in favor of the 780. This is before accounting for the fact that maybe 10% can be recovered by the Toxic design, so it should bring parity at best. Bear in mind too that the 780 can be overclocked pretty well thanks to the extra headroom from its lower operating temperatures, especially with the Skyn3t BIOS. However, I do strongly agree on the price: the difference is more than 70% for a relatively small performance gain.
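The "parity at best" arithmetic above can be made explicit, using the post's own figures (illustrative, not fresh benchmarks):

```python
# If a stock 780 leads a stock 280X by 16-31%, and the Toxic design
# recovers roughly 10%, the residual 780 lead is lead / 1.10.
toxic_uplift = 1.10
for lead in (1.16, 1.31):
    residual = lead / toxic_uplift
    print(f"residual 780 lead: {residual:.2f}x")  # ~1.05x and ~1.19x
```

So even with the Toxic's uplift, roughly 5% to 19% remains in the 780's favor; only at the low end of the range does the Toxic approach parity.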


----------



## UnrealEdge

Sapphire R9 280x in crossfire with a MSI 7970 ghz edition. I know I have to figure out something to do about those gpu power wires lol. 

Rest of system is ASUS M5a99fx r2.0 board, 16 gb ram, AMD 8350 black edition CPU, Corsair H100i cpu cooler, EVGA 1000w gold power supply. I'm thinking about switching the 280x to the primary card, but not sure if I'd even see a gain of any kind.


----------



## mtcn77

Quote:


> Originally Posted by *Dasboogieman*
> 
> I dunno, the benchmarks on Anandtech seem to show a discrepancy of about 16%-31% in favor of the 780. This is before accounting the fact that maybe 10% can be recovered by the Toxic design so it should bring parity at best. Bear in mind too that the 780 can be overclocked pretty well due to the lower operating temperature headroom, especially with the Skyn3t BIOS. However, I do strongly agree on the price, the price difference is more than 70% for the relatively small performance boost.


The issue, here, is - there is no relativity.
I would rather pick AMD over Nvidia for superior render sampling, any day.
290 series do throttle, hence no custom cooler makes for a card worth their name in overclocking, but this is reliable and fast at the same time, imo.
Heat from hell
290's also broil your system like radiant copper plates, some even reaching 117°C, in which case the heat output doesn't even translate into an overclocking advantage, unfortunately. Some brand models are definitely more offensive than others, though regardless I find this notion disconcerting.
Scratch that: some 280X'es are no different themselves, even though the best 7970 passes with flying colors.
I cannot recommend MSI ubiquitously, then.
I hope the quality has remained the same through first and second generation Tahiti timeline.


----------



## neurotix

Quote:


> Originally Posted by *mtcn77*
> 
> The issue, here, is - there is no relativity.
> I would rather pick AMD over Nvidia for superior render sampling, any day.
> 290 series do throttle, hence no custom cooler makes for a card worth their name in overclocking, but this is reliable and fast at the same time, imo.
> Heat from hell
> 290's, also, broil your system as radiant copper plates; some even reaching 117 celcius in which case, the heat output doesn't even translate into an overclocking advantage, unfortunately. Some brand models are definitely more offensive than others, though regardless I find this notion disconcerting.
> Scratch that: some 280X'es are no different themselves; eventhough the best 7970 passes with flying colors.
> I cannot recommend MSI ubiquitously, then.
> I hope the quality has remained the same through first and second generation Tahiti timeline.


What the hell are you talking about?

Your 290 won't throttle if you 1) have a decent air cooler (Tri-X, Vapor-X or PCS+) or 2) put it under water. Go look at reviews of the air coolers like the Tri-X and PCS+. The professional reviews back this up.

I run both of mine (Tri-X) at 1200/1500mhz with 100% fan and in most games they don't even pass 75C under load- and they're safe up to 95C. I don't care about the loud fan noise because I have ear covering headphones, and when gaming I can't hear anything but the game.

Do you even OWN a 290, let alone one with a decent cooler? Are you qualified to talk about 290s or even the 280X when the newest card in your sig rig is an OLD 6870 from 2010?


----------



## mtcn77

Quote:


> Originally Posted by *neurotix*
> 
> What the hell are you talking about?
> 
> Your 290 won't throttle if you 1) have a decent air cooler (Tri-X, Vapor-X or PCS+) or 2) put it under water. Go look at reviews of the air coolers like the Tri-X and PCS+. The professional reviews back this up.
> 
> I run both of mine (Tri-X) at 1200/1500mhz with 100% fan and in most games they don't even pass 75C under load- and they're safe up to 95C. I don't care about the loud fan noise because I have ear covering headphones, and when gaming I can't hear anything but the game.
> 
> Do you even OWN a 290, let alone one with a decent cooler? Are you qualified to talk about 290s or even the 280X when the newest card in your sig rig is an OLD 6870 from 2010?


Should I remove my post just because you say so?
I'm sorry if my previous post offends you. I'm trying to distill various reviews into something constructive, rather than defending individual preferences. My intention is formal, even though I'm excited.
If you were careful, you would have noticed that none of the references say anything in bad faith about the brand you prefer.


----------



## Dasboogieman

Quote:


> Originally Posted by *mtcn77*
> 
> I recently noticed the fact that Sapphire 280x Toxic blows GTX 780's socks off. The price discrepany is astounding, imo.
> 
> MSI GTX 780
> Sapphire R9 280X Toxic


Quote:


> Originally Posted by *mtcn77*
> 
> The issue, here, is - there is no relativity.
> I would rather pick AMD over Nvidia for superior render sampling, any day.
> 290 series do throttle, hence no custom cooler makes for a card worth their name in overclocking, but this is reliable and fast at the same time, imo.
> Heat from hell
> 290's, also, broil your system as radiant copper plates; some even reaching 117 celcius in which case, the heat output doesn't even translate into an overclocking advantage, unfortunately. Some brand models are definitely more offensive than others, though regardless I find this notion disconcerting.
> Scratch that: some 280X'es are no different themselves; eventhough the best 7970 passes with flying colors.
> I cannot recommend MSI ubiquitously, then.
> I hope the quality has remained the same through first and second generation Tahiti timeline.


I understand your excitement; AMD have fantastic performance vs price.
What you have to notice on those graphs is that those 100+ temperatures are for the VRM assembly, not the core itself. This is where cooling varies massively between manufacturers; the XFX DD has a known issue with VRM thermal management (ask @Sgt Bilko), which explains the 117-degree temperatures. It is quite normal for temperatures to be in the 80-90 degree zone for a 5+1+1 design powering a 250W GPU. Even NVIDIA cards would be in the same ballpark; the only difference is that the NVIDIA boards use a 6+1+1 design (thus slightly lower temperatures) with no temperature sensors.

There is a common misconception about the Hawaii reference cooler. The GPU is not hot because of poor design; people tend to forget that this is a 250W GPU with a small die area. What people also fail to realize is that the stock cooler is an amazingly cost-effective solution (AMD aren't exactly swimming in cash for R&D). It has the same thermal design capacity as the one for the Tahiti cards; the only difference is it has to dissipate more energy due to a hotter GPU. The solution? Design the transistors on the Hawaii chip so that they can withstand 95 degrees. That is why Hawaii cards run at 95 degrees: so that more energy can be dissipated without having to redesign the reference cooler. For the record, the reference cooler on Hawaii is capable of keeping the VRM temperatures below 80 degrees, which is excellent; it actually surpasses some of the weaker aftermarket solutions. The throttling is simply to keep the fan from spinning up on the reference cooler; most owners report that the throttling disappears once the fan speed is dialed up to 55 or 60%.
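That fan-cap behavior can be sketched as a toy decision rule (a simplification for illustration, not AMD's actual fan-control loop):

```python
# Toy version of the Hawaii reference behavior described above: the card
# holds a 95 C target, and only throttles clocks once the fan has already
# hit its user-set cap and the temperature is still at the target.
def must_throttle(temp_c, fan_pct, fan_cap_pct, target_c=95):
    """Throttle only when the capped fan can no longer hold the target."""
    return temp_c >= target_c and fan_pct >= fan_cap_pct

print(must_throttle(95, 40, 40))  # True: fan capped, still at 95 C
print(must_throttle(90, 55, 60))  # False: a higher cap holds temps below target
```

This matches the observation that raising the fan cap to 55-60% makes the throttling disappear: the controller regains headroom before clocks ever need to drop.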

If you want a more in-depth idea on how aftermarket solutions perform on the Hawaii cards, head over to the 290/X owners lounge.

I was unaware there was still a discrepancy in image quality between NVIDIA and AMD solutions; I was under the impression this had reached parity during the Tahiti-Kepler era.


----------



## anubis1127

Quote:


> Originally Posted by *mtcn77*
> 
> I recently noticed the fact that Sapphire 280x Toxic blows GTX 780's socks off. The price discrepany is astounding, imo.
> 
> MSI GTX 780
> Sapphire R9 280X Toxic


Lol, nope. I own cards from both camps, and can honestly say there is a noticeable difference between a 780 and a 7970/280x, and it's not in the 280x's favor.


----------



## mtcn77

Quote:


> Originally Posted by *anubis1127*
> 
> *Lol*, nope. I own cards from both camps, and can honestly say there is a noticeable difference between a 780 and a 7970/280x, and it's not in the 280x's favor.


Hah! Somebody with some sense of humour left remaining. That is the proper discrimination I would expect.
That Sapphire Toxic one is the first of its name. I contend for an exemption.


----------



## Devildog83

Hey All,

I have been absent for a bit, sorry about that.

I have added *v3n0m90, manofsteele87 and BurninPurp561*. Please read the 1st post: if you wish to be added, post the make/model/a pic/clocks. There are a couple of others that I did not add due to lack of info.

Welcome to those I did add and if you are missing any info please post it.

Thanks,

DD


----------



## anubis1127

You have been absent DD, tisk, tisk, jk.

Did you ever get to testing/OCing that Pentium G3258?


----------



## bartledoo

I am having a little trouble with my overclocking. Even though I modded my BIOS to 1280/1150, AB shows I have stock clocks during benching and gaming.


----------



## Devildog83

Quote:


> Originally Posted by *anubis1127*
> 
> You have been absent DD, tisk, tisk, jk.
> 
> Did you ever get to testing/OCing that Pentium G3258?


Somewhat, I got it to 4.4 GHz stable and it seemed pretty snappy.

In Valley I got very close to my AMD set-up, but it's not CPU intensive; in 3DMark11 the Physics and Combined were total crap. It stayed very cool in my system, but the CPU was watercooled.
The ASRock Z97 Pro4 mobo was very impressive for $110. High-end build, but slightly narrower than most full ATX. I just have not had time to do more extensive overclocking and benching yet, but from what I can figure out about Intel overclocking it's very easy with the OC tool that comes with the board.
It has M.2 capability and decent sound too.

Good on ASRock; IMO they're not all budget anymore. They stepped it up. My full review is at Newegg under my name Bart H.


----------



## anubis1127

Quote:


> Originally Posted by *Devildog83*
> 
> Somewhat, I got it to 4.4 GHz stable and it seemed pretty snappy.
> 
> In Valley I got very close to my AMD set-up, but it's not CPU intensive; in 3DMark11 the Physics and Combined were total crap. It stayed very cool in my system, but the CPU was watercooled.
> The ASRock Z97 Pro4 mobo was very impressive for $110. High-end build, but slightly narrower than most full ATX. I just have not had time to do more extensive overclocking and benching yet, but from what I can figure out about Intel overclocking it's very easy with the OC tool that comes with the board.
> It has M.2 capability and decent sound too.
> 
> Good on ASRock; IMO they're not all budget anymore. They stepped it up. My full review is at Newegg under my name Bart H.


Nice. I am running mine at 4.2 GHz right now; it doesn't do much, so that is plenty. I have a 4.5 GHz profile saved for benching/gaming, if I ever use it for gaming, which isn't likely with the 4930K sitting next to it.

Glad to hear the ASRock board fared well. I have a cheap MSI Z97 board to test out too; haven't even unboxed it yet. I'm not expecting much (the board cost me $40), but if I can take the 4.2 GHz settings I have on the Gigabyte board and get them stable on the MSI board, I will be happy. Sorry for all the off-topic, heh.

Slightly on topic, I picked up another Pitcairn card..



It's a 7870 though, so not truly on topic. I haven't been very impressed with it; there is no cooling on the VRM area other than the fan blowing through the GPU's heatsink.


----------



## Arkanon

Quote:


> Originally Posted by *Devildog83*
> 
> Good on Asrock, IMO not all budget anymore. They stepped it up. My full review is at NewEgg under my name Bart H.


Since ASRock broke free of ASUS, the quality of their mainboards is about as good as any other good manufacturer's, really.


----------



## SeanOMatic

Quote:


> Originally Posted by *Arkanon*
> 
> Since ASRock broke free of ASUS, the quality of their mainboards is about as good as any other good manufacturer's, really.


The only knock I have on ASRock is their paper-thin PCBs. The boards themselves are good quality. I actually liked the Fatal1ty P67 board.


----------



## anubis1127

Quote:


> Originally Posted by *SeanOMatic*
> 
> The only knock I have on ASRock is their paper-thin PCBs. The boards themselves are good quality. I actually liked the Fatal1ty P67 board.


Their implementation of LLC is pretty wicked too, at least it was on my p67 extreme4, haven't tried another ASRock board since.


----------



## SeanOMatic

Quote:


> Originally Posted by *anubis1127*
> 
> Their implementation of LLC is pretty wicked too, at least it was on my p67 extreme4, haven't tried another ASRock board since.


That's what I noticed with their P67 Fatal1ty. It was very nice in terms of BIOS features, layout and overall build quality. The Extreme stuff is good too. ASRock is a great value. If you can get over the kinda goofy looks of their boards and the thin PCBs, they are a great buy!


----------



## UnrealEdge

Forgot a bunch of info when I posted lol. 

Sapphire R9 280x Vapor-X at the clocks it ships with (1,100 MHz core, 1,500 MHz memory) in CrossFire with an MSI 7970 GHz Edition (1,050 MHz core, 1,375 MHz memory).

Ran Sky Diver in 3DMark for proof of clocks and hardware. Everything is at the clocks it shipped to me with; the CPU (FX 8350) is in turbo mode (if that does anything lol). Also on the 14.2 AMD driver.

Link proof
http://www.3dmark.com/3dm/3619405?


----------



## anubis1127

I don't know if Sapphire made the heatsink long enough on that 280x. xD


----------



## UnrealEdge

It's not touching my hard drives yet, so I'd say they could go longer.


----------



## crazymania88

Hi Guys,
I am new here, I just got the Gigabyte r9 280x windforce OC, using it stock 1100/1500.

I've couple of questions:
I cannot see VRM temperatures on such high-end card, so it is software related? I had those with my 7870 XT

Second,
Card runs 41C 29% fan speed idle, it's brand new is it fine? I can bump it if needed, I don't think tho...


it is 70C under load 70% fan speed, so I think it is fine.
Also I cannot change fan speed in OC guru, but I can with Sapphire Trixx.

Anyone knows if Gigabyte missing VRM Temperature Sensors or I just have a software issue?
It'll be hard to Overclock if I don't know those.


----------



## DiceAir

Quote:


> Originally Posted by *crazymania88*
> 
> Hi guys,
> I am new here. I just got the Gigabyte R9 280X WindForce OC, running it stock at 1100/1500.
> 
> I have a couple of questions:
> First, I cannot see VRM temperatures on such a high-end card; is it software related? I had those readings with my 7870 XT.
> 
> Second, the card runs at 41C with 29% fan speed at idle. It's brand new, is that fine? I can bump the fan if needed, but I don't think it's necessary.
> 
> It is 70C under load at 70% fan speed, so I think it is fine.
> Also, I cannot change the fan speed in OC Guru, but I can with Sapphire Trixx.
> 
> Does anyone know if the Gigabyte is missing VRM temperature sensors, or do I just have a software issue?
> It'll be hard to overclock if I can't monitor them.


Same issue here, running 2x Club3D R9 280X. I tested both cards individually and can't see VRM temps; the only way I can see them is by running the cards in CrossFire. Very weird, but for me at least I'm not too worried, as two of them are good enough and I don't need to overclock. Try updating the BIOS.


----------



## crazymania88

Quote:


> Originally Posted by *DiceAir*
> 
> Same issue here, running 2x Club3D R9 280X. I tested both cards individually and can't see VRM temps; the only way I can see them is by running the cards in CrossFire. Very weird, but for me at least I'm not too worried, as two of them are good enough and I don't need to overclock. Try updating the BIOS.


So they are visible in CF but not without it? LOL!
The BIOS is the latest, so I have nothing to update.

Also, I cannot seem to change the fan speed in OC Guru, but I can in Sapphire Trixx; not that I need it, but it bothers me.
Other than that the GPU is great! Really great! I am impressed after the Sapphire 7870 XT.
My ASIC quality is 60.5%, which would be kind of disappointing, but the card runs at 1100MHz, 69C, 70% fan speed, so now I'm starting to wonder whether ASIC quality is really the issue, LOL.

Anyone with a fix? I don't want to edit the BIOS of a brand-new GPU just to see if it works, LOL.

edit:
And now the OC Guru fan speed control works, LOL!
edit2:
And now I cannot click on Fan Settings again. I want to set it back to default. Buggy software, LOL.
edit3:
Now I can click again; there is a pattern, and I am going to find it, LOL!


----------



## DiceAir

Quote:


> Originally Posted by *crazymania88*
> 
> So they are visible in CF but not without it? LOL!
> The BIOS is the latest, so I have nothing to update.
> 
> Also, I cannot seem to change the fan speed in OC Guru, but I can in Sapphire Trixx; not that I need it, but it bothers me.
> Other than that the GPU is great! Really great! I am impressed after the Sapphire 7870 XT.
> My ASIC quality is 60.5%, which would be kind of disappointing, but the card runs at 1100MHz, 69C, 70% fan speed, so now I'm starting to wonder whether ASIC quality is really the issue, LOL.
> 
> Anyone with a fix? I don't want to edit the BIOS of a brand-new GPU just to see if it works, LOL.
> 
> edit:
> And now the OC Guru fan speed control works, LOL!
> edit2:
> And now I cannot click on Fan Settings again. I want to set it back to default. Buggy software, LOL.
> edit3:
> Now I can click again; there is a pattern, and I am going to find it, LOL!


Yes, in CF it works, but not without CrossFire. I think you just can't see the VRM temps on a single card.


----------



## Devildog83

Quote:


> Originally Posted by *crazymania88*
> 
> Hi guys,
> I am new here. I just got the Gigabyte R9 280X WindForce OC, running it stock at 1100/1500.
> 
> I have a couple of questions:
> First, I cannot see VRM temperatures on such a high-end card; is it software related? I had those readings with my 7870 XT.
> 
> Second, the card runs at 41C with 29% fan speed at idle. It's brand new, is that fine? I can bump the fan if needed, but I don't think it's necessary.
> 
> It is 70C under load at 70% fan speed, so I think it is fine.
> Also, I cannot change the fan speed in OC Guru, but I can with Sapphire Trixx.
> 
> Does anyone know if the Gigabyte is missing VRM temperature sensors, or do I just have a software issue?
> It'll be hard to overclock if I can't monitor them.


Try HWiNFO64 for monitoring software; it should show everything you need.


----------



## crazymania88

Quote:


> Originally Posted by *Devildog83*
> 
> Try HWiNFO64 for monitoring software; it should show everything you need.


I have it; it also doesn't show VRM temperatures.

How can I find out if the GPU actually has the sensors? Should I email Gigabyte?
Not a big deal though, I am not going to OC an already-1100/1500 GPU; I am just curious how cool the VRMs run.


----------



## anubis1127

Quote:


> Originally Posted by *crazymania88*
> 
> I have it; it also doesn't show VRM temperatures.
> 
> How can I find out if the GPU actually has the sensors? Should I email Gigabyte?
> Not a big deal though, I am not going to OC an already-1100/1500 GPU; I am just curious how cool the VRMs run.


I am in the same boat with my XFX 7870: no VRM temps from software. Looking into an IR thermometer, heh.


----------



## neptunus

Quote:


> Originally Posted by *crazymania88*
> 
> I cannot see VRM temperatures on such a high-end card; is it software related?


Never buy AMD cards from Gigabyte. It's not software related: Gigabyte for some reason used non-reference components such as the ADP4100 voltage controller, which does not support any I2C communication (therefore no telemetry about anything).


----------



## crazymania88

Quote:


> Originally Posted by *neptunus*
> 
> Never buy AMD cards from Gigabyte. It's not software related: Gigabyte for some reason used non-reference components such as the ADP4100 voltage controller, which does not support any I2C communication (therefore no telemetry about anything).


The GPU is great when it comes to performance, noise and temperatures, so I don't think the voltage controller they used is a big deal; people around overclock.net have also unlocked the voltage so they can change it.

So can you please enlighten me: is the ADP4100 really that bad? I have no knowledge of it and I am curious.

@anubis
How do you use an IR thermometer? You take temps from the VRM heatsink, right?
I don't know how it works either, lol.


----------



## solar0987

I cannot unlock the voltage on my Gigabyte 270X no matter what I try. It's stuck at 1.2, which is the stock voltage of the card.


----------



## anubis1127

Quote:


> Originally Posted by *crazymania88*
> 
> The GPU is great when it comes to performance, noise and temperatures, so I don't think the voltage controller they used is a big deal; people around overclock.net have also unlocked the voltage so they can change it.
> 
> So can you please enlighten me: is the ADP4100 really that bad? I have no knowledge of it and I am curious.
> 
> @anubis
> How do you use an IR thermometer? You take temps from the VRM heatsink, right?
> I don't know how it works either, lol.


People generally measure the temps from the back of the PCB. One could also take readings from the heatsinks if they were exposed. I mainly want it to measure temps on the XFX 7870 vs my 270s and other 7870s. The XFX card doesn't have any VRM heatsinks, so I'm curious what the temperature difference would be vs the 7870s that do.


----------



## crazymania88

Quote:


> Originally Posted by *solar0987*
> 
> I cannot unlock the voltage on my Gigabyte 270X no matter what I try. It's stuck at 1.2, which is the stock voltage of the card.


I didn't try it, but people on overclock.net used this method to unlock 280Xs, and I used it to unlock my 7870 XT's voltage:

Change the voltage value in the VGA BIOS, then flash it; now you can use Sapphire Trixx to play with it.


----------



## neptunus

You can't unlock the voltage. Only ASUS GPU Tweak supports the ADP4100 (IIRC); other than that, the only way to change voltage is to flash a modified BIOS, and even VBE won't let you pick exact values, as it only has presets (0.03V steps) available.

I can't imagine the background you are coming from if you are calling this card "great". It is very loud at anything above 50% fan, and at 24C ambient with the fan set to auto (20-100% IIRC) it gets up to 75C (constant 99% GPU load, playing Witcher 2).

As much as you want to like this card, don't lie to yourself. My friend has a 280X by HIS: you won't be able to hear it, and it never goes above 75C. Gigabyte, with this ****ty thin PCB that bends when you hold it with one hand, and 3 heat pipes instead of the 5 they use on the Rev 2.0 cooler for their NVIDIA cards... they don't deserve your money.


----------



## crazymania88

Quote:


> Originally Posted by *neptunus*
> 
> You can't unlock the voltage. Only ASUS GPU Tweak supports the ADP4100 (IIRC); other than that, the only way to change voltage is to flash a modified BIOS, and even VBE won't let you pick exact values, as it only has presets (0.03V steps) available.
> 
> I can't imagine the background you are coming from if you are calling this card "great". It is very loud at anything above 50% fan, and at 24C ambient with the fan set to auto (20-100% IIRC) it gets up to 75C (constant 99% GPU load, playing Witcher 2).
> 
> As much as you want to like this card, don't lie to yourself. My friend has a 280X by HIS: you won't be able to hear it, and it never goes above 75C. Gigabyte, with this ****ty thin PCB that bends when you hold it with one hand, and 3 heat pipes instead of the 5 they use on the Rev 2.0 cooler for their NVIDIA cards... they don't deserve your money.


I have the Rev 2.0 version, so they used 3 pipes on the Rev 2.0 AMD coolers and 5 on the Rev 2.0 NVIDIA ones, fine.
My card is not loud at all, the black PCB looks fine, and it runs 69-70C in a 28C room.

I used a Sapphire Dual-X before this one, and I can say the exact things you say about the Giga 280X about the Dual-X 7870 XT.

So are you sure they are all the same? I am curious now; even at 100% fan speed the card doesn't make that much noise. HardOCP called the card silent and TechPowerUp said the opposite.
When I got the card I put it under a stress test for 20 minutes in a 28C room, and I didn't really hear any noise.
So I started to think they're not all the same, and it was just 70C.


----------



## Internet Swag

Through OCing alone I got my Unigine Heaven benchmark to go from 705 to 786. Is that good?

Core Clock: 900-1050
Mem Clock: 1200-1250

I can't push my core clock further than 1050 (the slider won't go further), and when I try to push the memory clock to 1300 I start to get some issues.

So that's pretty impressive, I think, since I haven't touched core voltage or power limit % yet.

Is it worth doing those things?
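For reference, a quick sanity check on those numbers; this is just arithmetic on the figures quoted above, nothing card-specific:

```python
# Percentage gains from the post above: Heaven score 705 -> 786,
# core clock 900 -> 1050 MHz.

def pct_gain(before: float, after: float) -> float:
    """Relative gain in percent."""
    return (after - before) / before * 100

print(round(pct_gain(705, 786), 1))   # score gain, about 11.5 %
print(round(pct_gain(900, 1050), 1))  # core clock gain, about 16.7 %
```

An ~11.5% score gain from a ~16.7% core overclock is a fairly typical return before touching voltage or power limit, since the benchmark isn't purely core-bound.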


----------



## Recr3ational

Has anyone tried an Eyefinity setup with a fourth monitor as an auxiliary monitor? I'm so confused about how it works...


----------



## Roboyto

Quote:


> Originally Posted by *Recr3ational*
> 
> Has anyone tried an Eyefinity setup with a fourth monitor as an auxiliary monitor? I'm so confused about how it works...


3 monitors are hooked up with DVI/HDMI connection.

The 4th monitor must use DisplayPort natively, or you must have an active DisplayPort adapter. I believe with the 4th auxiliary monitor you can run 3 wide eyefinity with a game and still have your desktop displayed on the 4th monitor; not 100% certain however.

6 monitors are possible with a special MST hub. This allows you to drive 3 monitors from the single DisplayPort connector, as well as the 3 monitors from DVI/HDMI.

http://support.amd.com/en-us/search/faq/170


----------



## Recr3ational

Quote:


> Originally Posted by *Roboyto*
> 
> 3 monitors are hooked up with DVI/HDMI connection.
> 
> The 4th monitor must use DisplayPort natively, or you must have an active DisplayPort adapter. I believe with the 4th auxiliary monitor you can run 3 wide eyefinity with a game and still have your desktop displayed on the 4th monitor; not 100% certain however.
> 
> 6 monitors are possible with a special MST hub. This allows you to drive 3 monitors from the single DisplayPort connector, as well as the 3 monitors from DVI/HDMI.
> 
> http://support.amd.com/en-us/search/faq/170


Yeah, I have the three already working fine with an active DisplayPort adapter.

My 4th one, however, is not showing the display fully; it's got black borders around it. It's an inch bigger, does that make a difference? Thanks mate.


----------



## dabby91

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah, I have the three already working fine with an active DisplayPort adapter.
> 
> My 4th one, however, is not showing the display fully; it's got black borders around it. It's an inch bigger, does that make a difference? Thanks mate.


Are you able to adjust underscan/overscan settings in CCC when the monitor is connected through displayport?


----------



## Recr3ational

Quote:


> Originally Posted by *dabby91*
> 
> Are you able to adjust underscan/overscan settings in CCC when the monitor is connected through displayport?


Yeah I literally just saw that haha. Thanks mate! It's finally looking good. REP+


----------



## dabby91

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah I literally just saw that haha. Thanks mate! It's finally looking good. REP+


Cheers, I wasn't too sure myself, as I thought those settings might have been just for HDMI.


----------



## Recr3ational

Quote:


> Originally Posted by *dabby91*
> 
> Cheers, I wasn't too sure myself, as I thought those settings might have been just for HDMI.


My fourth is through HDMI; I never saw the overscan setting before today. I saw it literally a second before reading your post, haha.


----------



## techjesse

Add me

Powercolor R9 280x TurboDuo X2 & Gigabyte R9 280x OC x2

Water Cooled Powercolor R9 280x TurboDuo X2

Gigabyte R9 280x OC x2 spare rig

YEAH!


----------



## Devildog83

Quote:


> Originally Posted by *techjesse*
> 
> Add me
> 
> Powercolor R9 280x TurboDuo X2 & Gigabyte R9 280x OC x2
> 
> Water Cooled Powercolor R9 280x TurboDuo X2
> 
> Gigabyte R9 280x OC x2 spare rig
> 
> YEAH!


*techjesse* has been added. You got some clocks? Everyday or overclocks, either is good.


----------



## techjesse

Yes, I run them all at 1100/1500.

Thanks


----------



## Falcorion

Hi guys, just got myself an MSI Twin Frozr OC R9 280X and have been playing with the clocks and voltage. I have it stable at 1150/1650 with no voltage bump or power limit increase, but when I try to increase the voltage or power limit to milk some more out of the card, it either artifacts right away or locks up/crashes. Any ideas?


----------



## smoke2

Quote:


> Originally Posted by *DiceAir*
> 
> Same issue here, running 2x Club3D R9 280X. I tested both cards individually and can't see VRM temps; the only way I can see them is by running the cards in CrossFire. Very weird, but for me at least I'm not too worried, as two of them are good enough and I don't need to overclock. Try updating the BIOS.


Try HWiNFO, and then Sensors.
Please then post the VRM temps on your *Club3D Royal Queen*, because I'm wondering whether to buy one.

Also, I would like to know the VRM temps under load on the *HIS 280X IceQ X2* from some owner.


----------



## DiceAir

Will post VRM temps now. Just going to check while playing BF4 in the test range.

OK: with vsync off at 2560x1440 @ 95Hz and ultra in the test range, the highest VRM temps in CF were 86C and 74C.


----------



## smoke2

Quote:


> Originally Posted by *DiceAir*
> 
> Will post VRM temps now. Just going to check while playing BF4 in the test range.


Thanks.
Please check the max temps.
You could also try some benchmark, but if not, never mind.


----------



## DiceAir

OK: with vsync off at 2560x1440 @ 95Hz and ultra in the test range, the highest VRM temps in CF were 86C and 74C.


----------



## smoke2

Quote:


> Originally Posted by *DiceAir*
> 
> OK: with vsync off at 2560x1440 @ 95Hz and ultra in the test range, the highest VRM temps in CF were 86C and 74C.


Have you tested a single card, not in CF? CF can increase the temperatures.
Are the temps even higher in a synthetic benchmark?


----------



## DiceAir

Quote:


> Originally Posted by *smoke2*
> 
> Have you tested a single card, not in CF? CF can increase the temperatures.
> Are the temps even higher in a synthetic benchmark?


Haven't tested synthetic benchmarks; will do that tomorrow. But I can't test a single card, as I can only see the temps in CrossFire for some reason.


----------



## smoke2

Quote:


> Originally Posted by *DiceAir*
> 
> OK: with vsync off at 2560x1440 @ 95Hz and ultra in the test range, the highest VRM temps in CF were 86C and 74C.


Quote:


> Originally Posted by *DiceAir*
> 
> Haven't tested synthetic benchmarks; will do that tomorrow. But I can't test a single card, as I can only see the temps in CrossFire for some reason.


Thanks. A synthetic benchmark would be great.
Have you tried HWiNFO for VRM temps on a single card?


----------



## smoke2

Quote:


> Originally Posted by *DiceAir*
> 
> OK: with vsync off at 2560x1440 @ 95Hz and ultra in the test range, the highest VRM temps in CF were 86C and 74C.


What temperature did the GPU core reach?
Do you remember?


----------



## DiceAir

Quote:


> Originally Posted by *smoke2*
> 
> What temperature did the GPU core reach?
> Do you remember?


GPU core was 82C at the hottest, at 70% fan speed. And no, I haven't tried HWiNFO, and I'm not going to take the card out now.


----------



## End3R

Quote:


> Originally Posted by *smoke2*
> 
> Also would like to know VRM temps in load on *HIS 280X IceQ X2* from some owner.


Mine is a 270X, but it's the same cooler. I've never seen my card go over 77C, and that was after running the Valley benchmark for about 30 minutes straight on the Extreme preset. Even after hours of gaming it won't go over 77C. It's at stock speeds, 1050/1400.


----------



## smoke2

Quote:


> Originally Posted by *DiceAir*
> 
> GPU core was 82C at the hottest, at 70% fan speed. And no, I haven't tried HWiNFO, and I'm not going to take the card out now.


Is 70% the stock fan speed, or do you have a manually set fan curve?


----------



## DiceAir

Quote:


> Originally Posted by *smoke2*
> 
> Is 70% the stock fan speed, or do you have a manually set fan curve?


Manual fan curve; the default manual fan curve in Afterburner.
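For anyone unsure what a manual fan curve actually does: it is just linear interpolation between user-set (temperature, fan %) points. A sketch with made-up points (these are not Afterburner's defaults):

```python
# Linear interpolation between fan-curve points, the way Afterburner-style
# manual curves work. The points below are invented for illustration.

CURVE = [(30, 30), (50, 40), (70, 70), (85, 100)]  # (temp C, fan %)

def fan_speed(temp: float) -> float:
    """Fan % for a given temperature, clamped to the curve's endpoints."""
    if temp <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp <= t1:
            return s0 + (s1 - s0) * (temp - t0) / (t1 - t0)
    return CURVE[-1][1]  # above the last point: pin at max

print(fan_speed(60))  # halfway between the 50C and 70C points: 55.0
```

Dragging a point in the curve editor just changes one of those pairs; the controller interpolates everything in between.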


----------



## Daveros

My first AMD card, Sapphire Vapor-X 280X, been running it for 36 hours, since moving from my GTX 580 Lightning, and I am very much enjoying the move.

Had a bit of a play with OCing at stock VID (1.162v) and managed to get a reasonable result. Tried a higher OC with higher volts, but couldn't seem to get much from it without a major bump, so am leaving it at 1150/1750 for now.




Question, though, does anyone know if you can turn off the illuminated Sapphire logo on the card?


----------



## neurotix

Quote:


> Originally Posted by *Daveros*
> 
> My first AMD card, Sapphire Vapor-X 280X, been running it for 36 hours, since moving from my GTX 580 Lightning, and I am very much enjoying the move.
> 
> Had a bit of a play with OCing at stock VID (1.162v) and managed to get a reasonable result. Tried a higher OC with higher volts, but couldn't seem to get much from it without a major bump, so am leaving it at 1150/1750 for now.
> 
> 
> 
> 
> Question, though, does anyone know if you can turn off the illuminated Sapphire logo on the card?


Is that score for Fire Strike or 3dmark11? In either case it looks low for a 280X, *even with* an i7 920.

Afaik there's no way to disable the light-up logo on the side of the card. You may have to remove the cooler and cut the connection to the LED, which would probably void your warranty if you have one.


----------



## Roboyto

Quote:


> Originally Posted by *Daveros*
> 
> Question, though, does anyone know if you can turn off the illuminated Sapphire logo on the card?


There should be a separate plug for the LED lighting it; that's how it was on my 4890 Vapor-X. You'll probably have to remove the cooler to get at the plug.


----------



## Daveros

Quote:


> Originally Posted by *neurotix*
> 
> Is that score for Fire Strike or 3dmark11? In either case it looks low for a 280X, *even with* an i7 920.


Oh good. Well, that's nice to know. What should I be seeing, do you think? (Fire Strike, btw)

This is my OC result (the one I meant to post in the first place):


NB: Thanks everyone for the lighting info.


----------



## neurotix

Quote:


> Originally Posted by *Daveros*
> 
> Oh good. Well, that's nice to know. What should I be seeing, do you think? (Fire Strike, btw)
> 
> This is my OC result (the one I meant to post in the first place):
> 
> 
> NB: Thanks everyone for the lighting info.


I would suggest messing with and possibly lowering your memory clock if it's really at 1750mhz. Too high a memory OC can reduce your fps in *any* bench because error checking kicks in. Try putting your memory to 1600mhz and see if your score increases. Additionally, you want core clock most of all, so see if you can hit 1200mhz.

7600 is a much better score and is in line with other 280X/7970 owners. Realistically, even my 7870XT gets 7000 in Fire Strike. http://hwbot.org/submission/2543590_neurotix_3dmark___fire_strike_radeon_hd_7870_%28tahiti_core%29_7038_marks This is with an i7 4770k though, which gets a much better physics score.

Moreover, your combined score is extremely low; this has plagued other users (@Devildog83) for some time. The solution is to install Service Pack 1 if you're running Windows 7. I used to have low combined scores until I installed SP1. However, *please avoid* the Windows Update "Platform Update KB2670838". After installing that update and another one I don't have on hand (both called "Platform Updates" and changing Direct3D), I lost about 2000 points in my Fire Strike runs at the same clocks.

So make sure you have SP1 if on Windows 7 but avoid those updates, and try lowering your memory clocks to see if your score improves. You can also try raising CPU clocks if possible.
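The error-checking effect neurotix describes can be pictured with a toy model: GDDR5's error detection catches bad transfers and retries them, so past some point the retry rate eats the bandwidth the extra clock bought. The clock threshold, slope, and function below are invented purely to show the shape, not measured from any card:

```python
# Toy GDDR5 error-check model: effective bandwidth = clock * (1 - retry_rate).
# The threshold (~1600MHz) and slope are invented; real chips vary.

def effective_bandwidth(clock_mhz: float) -> float:
    """Relative bandwidth after error-driven retries (illustrative only)."""
    retry_rate = max(0.0, (clock_mhz - 1600) / 500)  # errors grow past 1600 here
    retry_rate = min(retry_rate, 0.95)
    return clock_mhz * (1 - retry_rate)

for mhz in (1500, 1600, 1750):
    print(mhz, round(effective_bandwidth(mhz)))
```

In this toy model, 1750MHz actually delivers less effective bandwidth than 1600MHz, which is why backing the memory clock down can raise a benchmark score without anything crashing.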


----------



## Daveros

Quote:


> Originally Posted by *neurotix*
> 
> I would suggest messing with and possibly lowering your memory clock if it's really at 1750mhz. Too high a memory OC can reduce your fps in *any* bench because error checking kicks in. Try putting your memory to 1600mhz and see if your score increases. Additionally, you want core clock most of all, so see if you can hit 1200mhz.
> 
> 7600 is a much better score and is in line with other 280X/7970 owners. Realistically, even my 7870XT gets 7000 in Fire Strike. http://hwbot.org/submission/2543590_neurotix_3dmark___fire_strike_radeon_hd_7870_%28tahiti_core%29_7038_marks This is with an i7 4770k though, which gets a much better physics score.


Thanks for your help. I'm a bit confused, though, because out of all 124 recorded Fire Strike benchmarks with an i7 920 + a single 280X, my result is #5? That doesn't seem too bad?

(though, for some reason it's not recording my overclocks on CPU or GPU on the result?)


----------



## neurotix

Quote:


> Originally Posted by *Daveros*
> 
> Thanks for your help. I'm a bit confused, though, because out of all 124 recorded Fire Strike benchmarks with an i7 920 + a single 280X, my result is #5? That doesn't seem too bad?
> 
> (though, for some reason it's not recording my overclocks on CPU or GPU on the result?)


I didn't even look at that, but if you're #5 overall then your score must be better than I thought. I'd say you should be able to do 8k, but maybe that's pushing it...

Honestly, your CPU is going to make more of a difference in overall score, and is definitely your weakest link. Remember, the IPC jump from Bloomfield to Sandy Bridge was pretty high, then there's another 15% or so from Ivy and then Haswell. You probably realize this though. Going by your sig rig, if you could run your i7 at 4ghz (or more), even just while benching, your score would improve. Same for running your 280X at 1200mhz or above. When I bench I run my CPU at 4.6ghz even though it isn't stable at all, and I've passed XTU at 4.7ghz. But for daily clocks and gaming, I run at 4.5ghz.

Actually, I have a 7970 Vapor-X in my backup rig that I need to re-bench, and I also need to re-bench my 270X Vapor-X. If your 280X has a light-up logo, it might be basically the same card. (Afaik there are actually *TWO* 280X Vapor-X cards: one is the older style with a light-up Sapphire logo, the other is newer and has the Tri-X cooler and blue accents.)



That's the 7970 I have.


----------



## Daveros

Quote:


> Originally Posted by *neurotix*
> 
> Honestly, your CPU is going to make more of a difference in overall score, and is definitely your weakest link.


Indeed, you are quite correct. It's just holding steady till X99 hits and then I can chuck it in the bin. It's served me well, but it's time...

Mine is the older Vapor-X, I believe, as I got it second hand (though new from RMA) in a trade. The light up "Sapphire" logo is white, and it has a blue-lit button next to the logo, which I assume is for dual bios or OC mode (not sure which?). This is the card:


I might not bother pushing it any harder until I make the move from 1080p, because with that bottleneck it's all superfluous at the moment. I think I'm happy with 1150 on stock volts.

PS: I like your set up.


----------



## neurotix

Quote:


> Originally Posted by *Daveros*
> 
> Indeed, you are quite correct. It's just holding steady till X99 hits and then I can chuck it in the bin. It's served me well, but it's time...
> 
> Mine is the older Vapor-X, I believe, as I got it second hand (though new from RMA) in a trade. The light up "Sapphire" logo is white, and it has a blue-lit button next to the logo, which I assume is for dual bios or OC mode (not sure which?). This is the card:
> 
> 
> I might not bother pushing it any harder until I make the move from 1080p, because with that bottleneck it's all superfluous at the moment. I think I'm happy with 1150 on stock volts.
> 
> PS: I like your set up.


Yeah, that's the older Vapor-X, and it's identical to my 7970.

The blue button is for dual bios and the bios should be the same.

Thanks.


----------



## kersoz2003

My good older motherboard died, so I purchased the cheapest new motherboard I could find for my 4790K as a temporary solution, updated the BIOS, and it works like a charm. However, when I tried the same overclock on my R9 280X, 1200/1700 @ 1.281 V, it runs 3-4 degrees warmer than it did on my older motherboard, starts to show artifacts, and gives a DirectX error after some time. When I lower it to 1150/1650 @ 1.225 V there is no error or flickering and it works perfectly (still 3-4 degrees hotter than on the older motherboard at these values). So why does 1200/1700 cause such problems on the new board? Is it because it is the cheapest board (MSI H81M-P33)? What could the cause be?


----------



## Majentrix

Power delivery probably won't be as solid as on a high end board. You'll most likely get higher clocks with a better board, though that's just my guess.


----------



## kersoz2003

Quote:


> Originally Posted by *Majentrix*
> 
> Power delivery probably won't be as solid as on a high end board. You'll most likely get higher clocks with a better board, though that's just my guess.


So is this extra heat caused by poor power delivery stability? I thought the same. Maybe the board is too low-end to give proper support to such a card. It's a $40 board and I have a 4790K and an R9 280X on it.

Maybe too much for it, right? Or would it have nothing to do with the mobo? Can someone who is an expert on this clarify for us?


----------



## kersoz2003

Also, I reapplied the GPU's thermal paste before starting games on the new motherboard. I checked everything; the thermal pads on the VRAM etc. are perfectly seated.


----------



## Gh0sT169

Hi everybody! I started a separate discussion thread specially for R7 series owners and I would like to present it to you. We can talk about everything you want about the R7 series graphics cards.

Linky: http://www.overclock.net/t/1504512/amd-r7-series-discussion-thread/0_30


----------



## Devildog83

Quote:


> Originally Posted by *Gh0sT169*
> 
> Hi everybody! I started a separate discussion thread specially for R7 series owners and I would like to present it to you. We can talk about everything you want about the R7 series graphics cards.
> 
> Linky: http://www.overclock.net/t/1504512/amd-r7-series-discussion-thread/0_30


I have edited the OP to include a link to the R7 club and replaced the pic.


----------



## BruceB

Quote:


> Originally Posted by *kersoz2003*
> 
> Also, I reapplied the GPU's thermal paste before starting games on the new motherboard. I checked everything; the thermal pads on the VRAM etc. are perfectly seated.


I'd blame the MB; having a good MB is vital for CPU OC'ing, and I'd expect it to be similar for a GPU. Bad MB = low OC.
Quote:


> Originally Posted by *Devildog83*
> 
> I have edited the OP to include a link to the R7 club and replaced the pic.


----------



## kersoz2003

Guys, if you have a Sapphire Vapor-X R9 280X you should download the new BIOS.

It is perfect. I had the old 0.41 BIOS, then a modded 0.39, which was also good. But I've now upgraded to 0.42 and the card runs cooler (especially if you set a manual fan percentage when you overclock), and I get more FPS, 55-70 with my rig. That's about a 5 FPS boost over the other BIOSes I tried.

Here is the link:

http://www.sendspace.com/file/ydznox


----------



## End3R

Quote:


> Originally Posted by *kersoz2003*
> 
> Guys, if you have a Sapphire Vapor-X R9 280X you should download the new BIOS.
> 
> It is perfect. I had the old 0.41 BIOS, then a modded 0.39, which was also good. But I've now upgraded to 0.42 and the card runs cooler (especially if you set a manual fan percentage when you overclock), and I get more FPS, 55-70 with my rig. That's about a 5 FPS boost over the other BIOSes I tried.
> 
> Here is the link:
> 
> http://www.sendspace.com/file/ydznox


Unrelated, you totally look like my brother when he was younger.


----------



## smoke2

I have an opportunity to buy a Sapphire 280X Dual-X card, but I can't find practically any reviews of it.
I'd like to ask: is it a noisy card under load and at idle?
Second question: does Sapphire cover the coil whine issue under warranty?


----------



## jrizzz

Is this a good Firestrike score for my setup? I see other people with higher scores and was wondering if it's just my CPU holding back the score. My 280x was at 1175/1500 when I ran the benchmark. My CPU turbos to 4.0 on 1 core and 3.8 on all cores. http://www.3dmark.com/fs/2496363


----------



## BruceB

Quote:


> Originally Posted by *jrizzz*
> 
> Is this a good Firestrike score for my setup? I see other people with higher scores and was wondering if it's just my CPU holding back the score. My 280x was at 1175/1500 when I ran the benchmark. My CPU turbos to 4.0 on 1 core and 3.8 on all cores. http://www.3dmark.com/fs/2496363


Looks good to me; I get ~6600 marks with my 280X @ stock (1030/1500MHz) and my Phenom II X4 965 @ 3.9GHz.


----------



## Devildog83

Quote:


> Originally Posted by *jrizzz*
> 
> Is this a good Firestrike score for my setup? I see other people with higher scores and was wondering if it's just my CPU holding back the score. My 280x was at 1175/1500 when I ran the benchmark. My CPU turbos to 4.0 on 1 core and 3.8 on all cores. http://www.3dmark.com/fs/2496363


Maybe a bit, here is mine with an 8350.



Trust me I have never been able to explain why my combined score is so low and I have spent loads of time and effort trying.


----------



## jrizzz

Quote:


> Originally Posted by *Devildog83*
> 
> Maybe a bit, here is mine with an 8350.
> 
> 
> 
> Trust me I have never been able to explain why my combined score is so low and I have spent loads of time and effort trying.


Yea that is strange even with higher scores. I scored 7392 with my 7950 at 1175/1600 (cpu was 100mhz higher in this run)


I thought the 280x at the same core clock would scale a bit higher. I guess my cpu is holding back the score compared to other results. I might try what Neurotix said and uninstall those two optional platform updates.


----------



## smoke2

Quote:


> Originally Posted by *DiceAir*
> 
> Manual fan curve. The default manual fan curve in afterburner


Did you run a synthetic benchmark test?
Do you have it at stock clocks?


----------



## DiceAir

Quote:


> Originally Posted by *smoke2*
> 
> Did you test synthetic benchmark test?
> Do you have it on stock clocks?


I'll test synthetic benchmarks tonight: do a Valley run and then post results. I'm also running stock clocks. I play BF4 on roughly medium settings (some high, some low, some medium) because I want my 95 FPS at all times at 2560x1440, so on the test range my usage goes to 50% on both cards, averaging about 35%. My GPU core temps were about 50-60C with my case fans on low. Then I decided to run my fans at full speed and my temps dropped 3-5C. I'd say if I raise the graphics settings and make my GPUs work harder, I'd see a bigger difference with the fans at 100%.

My case is the corsair air 540 with 3x cougar vortex fans in the front, 1x af140 at the back and my h100 blowing air inside on the top in push configuration.

Anyway, I will post tonight when I get home from work. I'll post my scores as well as GPU and VRM temps.
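The manual fan curve being discussed here boils down to a mapping from GPU temperature to fan duty, with linear interpolation between the points you set. A minimal sketch of that idea (the curve points below are hypothetical, not Afterburner's actual defaults):

```python
# Hypothetical manual fan curve as (temp C, fan %) points, linearly
# interpolated between them - the same shape of curve you would drag
# out in a tool like MSI Afterburner.
CURVE = [(30, 20), (50, 35), (65, 55), (75, 80), (85, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Return fan duty (%) for a GPU temperature using linear interpolation."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the first point: curve floor
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the last point: full speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # halfway between the 65C and 75C points
```

Raising the whole right-hand side of the curve (or pinning it at 100%) is what buys the 3-5C drop mentioned above, at the cost of noise.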


----------



## smoke2

Ok, please post max. VRM and GPU temps. Thanks.


----------



## Devildog83

Quote:


> Originally Posted by *jrizzz*
> 
> Yea that is strange even with higher scores. I scored 7392 with my 7950 at 1175/1600 (cpu was 100mhz higher in this run)
> 
> 
> I thought the 280x at the same core clock would scale a bit higher. I guess my cpu is holding back the score compared to other results. I might try what Neurotix said and uninstall those two optional platform updates.


Actually, at those clocks you could be a bit higher, but I don't think drastically; the CPU is holding you back some on the overall score, but the graphics score is not bad for those clocks. My Devils are on par with or better than most R9 290s as far as graphics score goes, but I have issues with AMD not working so well on Futuremark benches.


----------



## NameUnknown

Anyone know why 280X prices are dropping right now? Is there a 3xxX series coming soon?


----------



## anubis1127

Quote:


> Originally Posted by *NameUnknown*
> 
> Anyone know why 280X prices are dropping right now? Is there a 3xxX series coming soon?


Probably because they are almost 3 years old at this point?

It doesn't hurt that the GPU mining craze is over either.


----------



## NameUnknown

I guess I'll still wait and see how low they go before I buy a second one. I wasn't sure if there was some other factor, like a new GPU lineup coming, that was causing the drop.


----------



## anubis1127

Quote:


> Originally Posted by *NameUnknown*
> 
> I guess I'll wait still and see how low they go before I buy a second one. Wasn't sure if there was some other factor like a new GPU lineup coming that was causing the drop.


I don't think we'll see any new GPUs from AMD until they get a die shrink. At least I hope not, a rebadge of a rebadge would be really unnecessary, heh.


----------



## SeanOMatic

Quote:


> Originally Posted by *NameUnknown*
> 
> Anyone know why 280X prices are dropping right now? Is there a 3xxX series coming soon?


The 280X was way overpriced for quite a while due to GPU mining. These cards are years old and based on old technology. Had they not been inflated all along, chances are they'd be even cheaper now.


----------



## SeanOMatic

Quote:


> Originally Posted by *NameUnknown*
> 
> I guess I'll wait still and see how low they go before I buy a second one. Wasn't sure if there was some other factor like a new GPU lineup coming that was causing the drop.


There is going to be a new GPU lineup.


----------



## anubis1127

Quote:


> Originally Posted by *SeanOMatic*
> 
> There is going to be a new GPU lineup.


When? More 28nm refreshes? I did read something about a new "Tonga" 28nm core to replace Tahiti Pro in the R9 280 series. Lame.

20nm is not happening in 2014.


----------



## NameUnknown

Another refresh isn't worth waiting for; I may not even upgrade again until a die shrink gets refreshed/revised. Given the games I play there's no real reason, heck, Xfire is hard to justify.


----------



## smoke2

Quote:


> Originally Posted by *DiceAir*
> 
> I'll test synthetic benchmarks tonight: do a Valley run and then post results. I'm also running stock clocks. I play BF4 on roughly medium settings (some high, some low, some medium) because I want my 95 FPS at all times at 2560x1440, so on the test range my usage goes to 50% on both cards, averaging about 35%. My GPU core temps were about 50-60C with my case fans on low. Then I decided to run my fans at full speed and my temps dropped 3-5C. I'd say if I raise the graphics settings and make my GPUs work harder, I'd see a bigger difference with the fans at 100%.
> 
> My case is the Corsair Air 540 with 3x Cougar Vortex fans in the front, 1x AF140 at the back and my H100 blowing air inside at the top in push configuration.
> 
> Anyway, I will post tonight when I get home from work. I'll post my scores as well as GPU and VRM temps.


You haven't forgotten about me, have you, my friend?


----------



## DiceAir

Quote:


> Originally Posted by *smoke2*
> 
> Ok, please post max. VRM and GPU temps. Thanks.


Sorry for not posting; we had no power last night. Just remind me via PM later to do it tonight if I have power. I'm so sorry.


----------



## BruceB

Quote:


> Originally Posted by *NameUnknown*
> 
> Another refresh isn't worth waiting for, I may not even upgrade again until a die shrink gets refreshed/revised. Given the games I play there is no real reason, heck Xfire is hard to justify


True. I'm hoping to hold on to my 280X for 2 or 3 years, then I'll get the 480X or 580X or whatever it's called by then (knowing AMD's naming system it'll be a 5967PPQ1... ...X)


----------



## DiceAir

OK, what do you guys think of this card? It's the cheapest one I can find here in South Africa.

http://www.powercolor.com/Global/products_features.asp?id=524

I'm planning to do this for my old pc that I'm setting up for my racing games.

i7-3770k
8gb ram
My old SSD and secondary HDD.
750w cougar psu

This will also be connected to my old 1080p 60Hz monitor. I'm sure it should do the job, but what about the R7 260X? We're talking about racing games, after all.


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> Ok what do you guys think of this card. Cheapest one i can find here in South Africa.
> 
> http://www.powercolor.com/Global/products_features.asp?id=524
> 
> I'm planning to do this for my old pc that I'm setting up for my racing games.
> i7-3770k
> 8gb ram
> My old SSD and secondary HDD.
> 750w cougar psu
> 
> This will also be connected to my old 1080p 60hz monitor. I'm sure it should do the job but what about the r7 260x. We talking about racing games after all


I've got a Powercolor 280X and it runs cool and quiet; it's not as heavily factory OC'd as some other brands' cards, but that's OK with me.









If you've got the money get the 270X, it's just better. If you buy the 260x you might end up buying a new GPU next year or whenever you decide that a 260x isn't powerful enough anymore. Just get the 270X and call it a day!









If budget is a consideration I'd suggest an i5-4670K instead of the i7-3770K; the i5 won't bottleneck a 270X and it's 60EUR cheaper than the i7. You could then put the 60EUR towards a more powerful GPU (like a 270X or even a 280X).









[EDIT]
What I suggested only holds for the euro zone; you'll have to check prices in your region to see if swapping the i7 for an i5 is worth it!


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> I've got a Powercolor 280X and it runs cool and quiet; it's not as heavily factory OC'd as some other brands' cards, but that's OK with me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you've got the money get the 270X, it's just better. If you buy the 260x you might end up buying a new GPU next year or whenever you decide that a 260x isn't powerful enough anymore. Just get the 270X and call it a day!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If budget is a consideration I'd suggest an i5-4670K instead of the i7-3770K; the i5 won't bottleneck a 270X and it's 60EUR cheaper than the i7. You could then put the 60EUR towards a more powerful GPU (like a 270X or even a 280X).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [EDIT]
> What I suggested is only true for the euro-zone, you'll have to check prices in your region to see if swapping the i7 for an i5 is worth it!


OK, thanks for the info. I just upgraded my PC to the following.

i7-4790k
corsair h100i
r9 280x crossfire
16GB corsair 1600mhz ram
2x SSD and 1x HDD
corsair air 540
corsair ax850w
and so on

So this is what I have from some old parts I got from the office and my old PC:

i7-3770k
8GB 1333Mhz ram
1x SSD and 1xHDD(2TB)
standard office PC case with 250mm GPU space
cougar 750w PSU

So, as you can see, I'm already on a beast PC. Now I was thinking of setting this up as a racing PC with my G27 steering wheel, which I mounted to a "Wheel Stand Pro". This will save me the effort of having to unpack the thing whenever I switch between racing games and anything else. I can just go play a racing game and, when I'm tired of racing, pause the race and continue on my main PC with more demanding games, etc. This is why I'm asking if the R7 260X would be enough.

The price difference is not super huge, but it's significant here in South Africa; in our currency it's about R930 between the R7 260X and R9 270X. I can get another brand's R9 270, but it's more expensive than the Powercolor R9 270X, so I see no point going for the R9 270. Also, the R9 265 is out of stock.


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> ok thanks for the info. I just upgraded my pc to the following.
> 
> i7-4790k
> corsair h100i
> r9 280x crossfire
> 16GB corsair 1600mhz ram
> 2x SSD and 1x HDD
> corsair air 540
> corsair ax850w
> and so on
> 
> So this is what i have from some old parts I got from the office and my old pc
> 
> i7-3770k
> 8GB 1333Mhz ram
> 1x SSD and 1xHDD(2TB)
> standard office PC case with 250mm GPU space
> cougar 750w PSU
> 
> 
> so as you can see I'm already on a beast pc. Now i was thinking setting this up as racing pc with my g27 steering wheel that I mounted to a "wheel stand pro". this will save me the effort of having to unpack the thing when I'm playing racing games and want to go back to any other game. So I can just go play racing and when I'm tired of racing can just pause the race and continue on my main pc and play some more demanding games etc etc. This is why I'm asking if the r7 260x should be enough.
> 
> The price difference is not Super huge but it's somewhat huge here in South Africa. in our currency it's about +-R930 difference between the r7 260x and r9 270x. I can get any other brand r9 270 but it's more expensive than the powercolor r9 270x so i see no point going for the r9 270. Also no stock in the r9 265


I understand.
In that case it depends on which racing games you'll be playing; check out some benchmarks of the 260X and 270X in those games and see if the additional performance of the 270X is worth the money. If the 260X can output 60+ FPS then there's no reason to buy the 270X! However, if the 260X is only getting 40 FPS, I would consider the 270X. At the end of the day it's really up to you to decide if it's worth it, though.









I'm not sure which games you have in mind but personally I'd be inclined to get the 270x anyway for future proofing.

Tell us what you find and what you decide!


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> I understand.
> In that case it depends on which racing games you'll be playing, check out some benchmarks for those games of the 260x and 270x and see if the additional performance of the 270x is worth the money. If the 260x can output 60+FPS then there's no reason to buy the 270x! However if the 260x is only getting 40FPS I would consider the 270x. At the end of the day it's really up to you to decide if it's worth it though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not sure which games you have in mind but personally I'd be inclined to get the 270x anyway for future proofing.
> 
> Tell us what you find and what you decide!


I was thinking of Project CARS, but some other games too: DiRT 2 + 3, GRID 2 + Autosport, the F1 games, NFS Shift 1 + 2, and so on.


----------



## NameUnknown

Quote:


> Originally Posted by *BruceB*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> Another refresh isn't worth waiting for, I may not even upgrade again until a die shrink gets refreshed/revised. Given the games I play there is no real reason, heck Xfire is hard to justify
> 
> 
> 
> True. I'm hoping to hold on to my 280X for 2 or 3 years, then I'll get the 480X or 580X or whatever it's called by then (knowing AMD's naming system it'll be a 5967PPQ1... ...X
> 
> 
> 
> 
> 
> 
> 
> )

I'll probably end up with Crossfired 280X's or Trifired once I have money available. That will likely tide me over until it really is the 5967PPQ1


----------



## BruceB

Quote:


> Originally Posted by *DiceAir*
> 
> I was thinking of project cars but some other games too. Games like dirt 2+3, Grid 2 + autosport, The f1 games, NFS shift 1 + 2 and so on.


I can't say from personal experience, but I found this link for DiRT 3 (skip back to page 5 for testing method & settings) and this link for F1 2012; the 260X performs surprisingly well in both. I didn't find any benchmarks for _NFS Shift_ or _Project Cars_, but the 260X is a strong performer for the money.

I'm afraid that's as far as I can help you; it's up to you to decide if the 270X is worth the extra money or not.


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> I can't say from personal experience, but I found this link for DiRT 3 (skip back to page 5 for testing method & settings) and this link for F1 2012; the 260X performs surprisingly well in both. I didn't find any benchmarks for _NFS Shift_ or _Project Cars_, but the 260X is a strong performer for the money.
> 
> I'm afraid that's as far as I can help you; it's up to you to decide if the 270X is worth the extra money or not.


OK, so I made up my mind: I'll go with the 270X, since I'd rather have performance to spare. It's also not as if the card is going to break the bank. I'll just be more comfortable with something that I can say almost for sure will run 60fps in all those games. For 1080p I think 2GB should be enough, and I'll turn down AA if my FPS isn't at 60.

Thanks for the help


----------



## anubis1127

@DiceAir The R7 260X also has TrueAudio, a bonus feature that may be cool to have, if for nothing more than to test the feature and hear it in person.

Edit, nvm seems you made your decision.


----------



## dabby91

Quote:


> Originally Posted by *DiceAir*
> 
> I was thinking of project cars but some other games too. Games like dirt 2+3, Grid 2 + autosport, The f1 games, NFS shift 1 + 2 and so on.


Having a quick look at some benchmarks (see links below), the R9 270X is able to run GRID Autosport at 60fps (Ultra settings, no AA), while the R7 260X will average around 40-45fps. The older games will play no problem on both cards, but the R7 260X will be forced to run at lower graphics settings to hold decent frame rates in future games. For this reason, I would say get the R9 270X, since it will keep achieving playable fps for longer, meaning you won't have to upgrade until at least 2-3 years down the line.

http://www.gamersnexus.net/game-bench/1548-grid-autosport-gpu-benchmark
http://www.tomshardware.co.uk/radeon-r9-280x-r9-270x-r7-260x,review-32796-12.html

E: Oops, took so long writing the reply I see you've decided anyway xD


----------



## Dasboogieman

Quote:


> Originally Posted by *NameUnknown*
> 
> Anyone know why 280X prices are dropping right now? Is there a 3xxX series coming soon?


Probably because 290/X cards are going for cheap as chips. Otherwise, there's not much point in buying a Tahiti card if a Hawaii card is only $20 or so away, considering the performance is better by 35-40%.


----------



## DiceAir

Quote:


> Originally Posted by *anubis1127*
> 
> @DiceAir The R7 260X also has TrueAudio, a bonus feature that may be cool to have, if for nothing more than to test the feature and hear it in person.
> 
> Edit, nvm seems you made your decision.


You have a point there, but I don't see many games using TrueAudio, so it would be a waste of money anyway. I've been hearing good things about Powercolor cards and this will be my first one. These are some of the cheapest cards you can find in South Africa, so if this works out great I might buy them from now on.


----------



## Aleckazee

bought a 280x yesterday







Should be here in a few days (although I'm not certain I made the right choice). Anyway, I'm just wondering if anyone has had any experience running these cards on a lower-power PSU? I've been told (in another thread) that my 450W ST45SF-G will work fine, but I thought I'd just ask here as well. I'll be replacing the i3 with my 2500k as soon as I get a new mobo for it.


----------



## boot318

Quote:


> Originally Posted by *Aleckazee*
> 
> bought a 280x yesterday
> 
> 
> 
> 
> 
> 
> 
> should be here in a few days (although I'm not certain I made the right choice). anyway, I'm just wondering if anyone has had any experience running these cards on a lower power psu? I've been told (in another thread) that my 450W ST45SF-G will work fine but I thought I'd just ask here as well. I'll be replacing the i3 with my 2500k as soon as I get a new mobo for it.


Never seen my 3570k and 280x (both OC'd) hit over 350 watts. It might've peaked over that sometimes, but I have never seen it.


----------



## Aleckazee

Ahk, thanks, that's good to hear


----------



## Roboyto

Quote:


> Originally Posted by *Aleckazee*
> 
> bought a 280x yesterday
> 
> 
> 
> 
> 
> 
> 
> should be here in a few days (although I'm not certain I made the right choice). anyway, I'm just wondering if anyone has had any experience running these cards on a lower power psu? I've been told (in another thread) that my 450W ST45SF-G will work fine but I thought I'd just ask here as well. I'll be replacing the i3 with my 2500k as soon as I get a new mobo for it.


Running an OC'd R9 290 and OC'd 4770k off of a Rosewill Capstone 450W in my HTPC. The 290 is clocked at 1075/1375 +37mV, the 4770k at 4.3GHz, with 8GB of 2133 RAM. It's also powering a few fans, drives, and (2) AIO pumps.

http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/0_20

You should be OK as long as you're not expecting to push both pieces of hardware to the brink and then run FurMark and IBT simultaneously
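The sizing logic above can be sanity-checked with a back-of-the-envelope power budget: sum rough board-power figures per component and compare against the PSU rating. The numbers below are ballpark TDP-class estimates I'm assuming for illustration, not measurements:

```python
# Rough worst-case system power budget against a 450W PSU.
# These are ballpark estimates (an R9 280X is rated around 250W board
# power; an overclocked quad-core CPU roughly 100W), not measured draw -
# real gaming load is usually well below this, as the posts above note.
parts = {
    "R9 280X (board power)": 250,
    "CPU (OC'd quad core)": 100,
    "Motherboard + RAM": 40,
    "SSD/HDD + fans + pumps": 25,
}

total = sum(parts.values())
headroom = 450 - total
print(f"Estimated worst-case draw: {total} W, headroom on 450 W PSU: {headroom} W")
```

Even this pessimistic total leaves some margin on a quality 450W unit, which matches the measured sub-350W figure reported above; it's only combined stress tests like FurMark + IBT that approach the worst case.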


----------



## DiceAir

So I have my Powercolor R9 270X Turbo Duo; just waiting to get home from work to install it. Should be awesome in my build now. I hope it can handle 1080p at 60fps well. I'm tempted to try out SteamOS but it's missing 2 features:

1. No video player (unless you install one in desktop mode)

2. Unable to install games on a second hard drive. This is very important if you're running an SSD as the main OS drive and have a secondary drive installed.

If they included these 2 features it would be much easier to justify installing it.

I'm actually surprised at how small the card is, and I'm happy about that, because now it will fit for sure.


----------



## End3R

Quote:


> Originally Posted by *DiceAir*
> 
> So I have my Powercolor r9 270x turbo duo. Just waiting to get home from work to install it. Should be awesome in my build now. I hope it can handle 1080p 60fps well.


Here are some benches I ran of some games @ 1080p using the rig in my sig (r9 270x and fx 8320) both stock speeds. (The settings are all max, AA adjusted on each game)


----------



## DiceAir

Quote:


> Originally Posted by *End3R*
> 
> Here are some benches I ran of some games @ 1080p using the rig in my sig (r9 270x and fx 8320) both stock speeds. (The settings are all max, AA adjusted on each game)


Thanks for the info. I can clearly see that I'm not used to 60fps anymore. I'm on 2560x1440 @ 96Hz and 60Hz feels laggy, but that's just me. I've only tested BF4 so far, so maybe it's just that game. I'll test some more games tomorrow as it's late here.


----------



## Aleckazee

My card arrived yesterday and I'm really happy with it, performance- and noise-wise, but every so often I seem to be getting these strange artifacts where the screen jerks for a split second, maybe 3 times in a row. It happens so fast I don't even know how to describe it, but I think the screen goes skewed with colorful lines cutting across. It's noticeable enough to be annoying. I'm on driver 14.4. I've tried using DVI and mini DisplayPort and there's no difference. I'll go try a different driver now, but do I have to completely uninstall 14.4 first to install an older version? Is this a known/fixable problem?


----------



## End3R

Quote:


> Originally Posted by *Aleckazee*
> 
> My card arrived yesterday and I'm really happy with it, performance and noise wise but every so often I seem to be getting these strange artifacts where the screen jerks for a split second maybe 3 times in a row, it happens so fast I don't even know how to describe it but i think the screen goes skewed with colorful lines cutting across. It's noticeable enough to be annoying. I'm on driver 14.4. I've tried using dvi and mini display port and there's no difference. I'll go try a different driver now but do I have to completely uninstall 14.4 first to install an older version? Is this a known/fixable problem?


I had a similar issue with the first r9 I got; it seems to be a common issue with r9s. I had to RMA it, but I haven't had any issues like that with my new card and I've had it for months now.


----------



## DarthBaggins

Yes, completely remove the 14.4 driver before you install a new one, but it sounds like a card issue, not a driver issue.


----------



## Aleckazee

Damn :/ so should I just RMA it and not even bother with the different drivers?


----------



## End3R

Quote:


> Originally Posted by *Aleckazee*
> 
> Damn :/ so should I just RMA it and not even bother with the different drivers?


By all means still troubleshoot the issue; I didn't RMA mine until I had tried every available driver version, re-seating the card, and flashing then re-flashing the vbios.


----------



## crazymania88

Quote:


> Originally Posted by *Aleckazee*
> 
> My card arrived yesterday and I'm really happy with it, performance and noise wise but every so often I seem to be getting these strange artifacts where the screen jerks for a split second maybe 3 times in a row, it happens so fast I don't even know how to describe it but i think the screen goes skewed with colorful lines cutting across. It's noticeable enough to be annoying. I'm on driver 14.4. I've tried using dvi and mini display port and there's no difference. I'll go try a different driver now but do I have to completely uninstall 14.4 first to install an older version? Is this a known/fixable problem?


Is it an Asus R9 280X, m8?
If yes, you shouldn't have got it in the first place, because it's a well-known issue.
Anyway, they say Asus has an updated BIOS fixing this issue; if it's an Asus, check it out.


----------



## Aleckazee

Nah, it's an MSI Radeon 3GB Gaming. I'll try reseating it and installing other drivers when I get home.


----------



## crazymania88

Quote:


> Originally Posted by *Aleckazee*
> 
> Nah its a MSI radeon 3gb gaming. I'll try reseating it and installing other drivers when I get home.


From what I've seen in benchmarks and heard from people, MSI also have a heat issue (apparently a new BIOS fixes it? not sure). Selecting a 280X is really tricky, m8. I said "Yeah, Asus quality!", then I learned they are bad.

I said "Sapphire has a good cooler!", then I heard some people have a hard time cooling the VRM and some even burned the card (I have seen 2 such threads within a day).

The Powercolor R9 280X seems really good; some say it coil whines, some say it doesn't, so some do and some don't.
Gigabyte's R9 280X seems fine; I am using one myself, nothing is wrong, and the noise isn't that high, at least with mine it's under 30dBA.

I have no knowledge about XFX's or others.

Wish you luck finding out why it artifacts. If you can just return it, I would return it.


----------



## Aleckazee

Yeah, I'll probably return it. I had no idea it's so hard to find a good 280X. In the couple of games I've played so far it hasn't gone over 60C, but those were just MotoGP 13 (only game I had installed) and GTA 3. I have read about heating issues with the MSI, but I thought those had been fixed, and I'm not too concerned about it because I plan on putting a block on it sometime soon, hopefully. I did notice coil whine, but it's only really loud when the card is drawing thousands of frames in game menus, which isn't a problem since I cap all my games at 60fps anyway.


----------



## neurotix

Quote:


> Originally Posted by *crazymania88*
> 
> As I've seen from benchmarks and heared from people, MSI also have a heat issue, (apperantly new bios fixes it? not sure?),
> Selecting a 280x is really tricky m8
> 
> 
> 
> 
> 
> 
> 
> , I said "Yeah asus quality!" then, I've learned they are bad,
> 
> I said Sapphire has good cooler!, Then I've heared some people have hard time cooling VRM and some even burned the card.
> (have seen 2 threads within a day)
> 
> it seems Powercolor r9 280x is really good, some say it coil whines, some say it doesn't. So some does some doesn't.
> Gigabyte's r9 280x seems fine, I am using one myself and nothing is wrong and the noise isn't that high, well at least with mine it's under 30dbA.
> 
> I have no knowledge about XFX's or others
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wish you luck finding why it artifacts, if you can just return it, I would return it.


What Sapphire card? Can you provide links to the threads? I'm interested.


----------



## crazymania88

Quote:


> Originally Posted by *Aleckazee*
> 
> Yeah, I'll probably return it. I had no idea it's so hard to find a good 280X. In the couple of games I've played so far it hasn't gone over 60C, but those were just MotoGP 13 (only game I had installed) and GTA 3. I have read about heating issues with the MSI, but I thought those had been fixed, and I'm not too concerned about it because I plan on putting a block on it sometime soon, hopefully. I did notice coil whine, but it's only really loud when the card is drawing thousands of frames in game menus, which isn't a problem since I cap all my games at 60fps anyway.


Of course you'll have coil whine at high FPS.
I was talking about the Powercolor R9 280X. Some say it coil whines, some say it doesn't.
But it looks like the best 280X out there, at the best price (Hynix chips, backplate, good cooler).

If you return it and get your cash back, search around a little. Some people dislike the Gigabyte R9 280X Windforce, some dislike Powercolor, some say "Sapphire da best" and then find they can't watercool it under warranty, and it has some issues with VRM cooling. MSI, well, you have one.

But I didn't hear of any big issues with the Gigabyte R9 280X or the Powercolor 280X TurboDuo.
You could consider getting one, but make sure the Gigabyte one is Rev 2.0.

Rev 1.0 has a lower-quality PCB and higher noise, and (I didn't check myself, so I'm not sure) they say Rev 2.0 has better VRMs.
The card is not a good overclocker: it comes at 1100/1500 from the factory and at most goes to around 1230 with good luck and without extra voltage. Mine can go to 1190, so I'm on the lucky side (you can unlock voltage; I didn't).

@neurotix
They were on a Turkish forum, so linking them would be pointless.
But do a Google search about it; this just came up first, so I'm sharing it:
http://www.techpowerup.com/forums/threads/solution-is-found-to-sapphire-vapor-x-r9-280x-vrm-over-heating-and-throttling-problem.196972/

And I think there was a thread on OCN too...

Also, my Sapphire 7870 XT had the same VRM temperature issues.


----------



## Aleckazee

I'm happy to stay with the MSI unless there is a brand that doesn't suffer from artifacts, or do they all have that problem? Or if there's a card that's slightly shorter I'll also choose it over the MSI.


----------



## End3R

Quote:


> Originally Posted by *Aleckazee*
> 
> I'm happy to stay with the MSI unless there is a brand that doesn't suffer from artifacts, or do they all have that problem? Or if there's a card that's slightly shorter I'll also choose it over the MSI.


When I was researching the issue, it seemed fairly widespread across all the R9s. Some people say it's due to the clocks being set too high, or the VRM not being cooled; regardless of the cause, it seems to be luck of the draw whether the R9 you get is a good one. That being said, when you do get a good one, they're solid cards. (Despite that, I prefer Nvidia for the PhysX goodies.)


----------



## crazymania88

Quote:


> Originally Posted by *Aleckazee*
> 
> I'm happy to stay with the MSI unless there is a brand that doesn't suffer from artifacts, or do they all have that problem? Or if there's a card that's slightly shorter I'll also choose it over the MSI.


So try getting another MSI card.
I didn't hear of many artifacting issues with the MSI R9 280X; there was just the overheating, and that seems fixed with yours.
So maybe you just got a bad one.


----------



## neurotix

Quote:


> Originally Posted by *crazymania88*
> 
> Of course you'll have coil whine at high FPS.
> I was talking about the Powercolor R9 280X. Some say it coil whines, some say it doesn't.
> But it looks like the best 280X out there, at the best price (Hynix chips, backplate, good cooler).
> 
> If you return it and get your cash back, search around a little. Some people dislike the Gigabyte R9 280X Windforce, some dislike Powercolor, some say "Sapphire da best" and then find they can't watercool it under warranty, and it has some issues with VRM cooling. MSI, well, you have one.
> 
> But I didn't hear of any big issues with the Gigabyte R9 280X or the Powercolor 280X TurboDuo.
> You could consider getting one, but make sure the Gigabyte one is Rev 2.0.
> 
> Rev 1.0 has a lower-quality PCB and higher noise, and (I didn't check myself, so I'm not sure) they say Rev 2.0 has better VRMs.
> The card is not a good overclocker: it comes at 1100/1500 from the factory and at most goes to around 1230 with good luck and without extra voltage. Mine can go to 1190, so I'm on the lucky side (you can unlock voltage; I didn't).
> 
> @neurotix
> They were on a Turkish forum, so linking them would be pointless.
> But do a Google search about it; this just came up first, so I'm sharing it:
> http://www.techpowerup.com/forums/threads/solution-is-found-to-sapphire-vapor-x-r9-280x-vrm-over-heating-and-throttling-problem.196972/
> 
> And I think there was a thread on OCN too...
> 
> Also, my Sapphire 7870 XT had the same VRM temperature issues.


I don't know about the newer R9 280X with the Tri-X cooler (this one), but my 7970 Vapor-X has absolutely no issues with this; even under heavy load, folding at 1200 MHz, I didn't see temps pass 75°C on the VRM and 60°C on the core. There are actually two 280X Vapor-X models. One is the old style, identical to my 7970 but with a backplate, with the light-up Sapphire logo on the left side. The other is the newer one with the Tri-X cooler, the same cooler that's on the new 290X Vapor-X.

If you look here at Kitguru, the new Vapor-X has VRM cooling directly on the heatsink. It's the purple line on the side of the cooler. These directly touch the VRMs and cool them. Additionally, here you can see the older style 280X, which also has VRM cooling in the form of passive heatsinks on the VRM assembly itself. Further, both cards have direct memory cooling with thermal pads directly touching the heatsink assembly.

These are both quality cards, and in regular operating conditions they should run extremely cool. You have to consider the things you're reading. What are the ambient temperatures these people are running? Does their house even have air conditioning? What was the fan speed set to? (I put mine to 100% while gaming.) What were they doing to make the card so hot (e.g. mining)? Does their case have proper airflow with positive pressure? Is their CPU adequately cooled and set up to exhaust hot air (a hot CPU will probably mean hotter cards)?

Moreover, do you have experience with these cards yourself?

I also have a Sapphire 7870XT and it has absolutely no issues with VRM cooling, and it's in a very poor quality case with only a weak intake and exhaust fan. (see my rig "Green").

If I were going to recommend any 280X I would most certainly recommend the Sapphire Toxic 280X, or the newer style Vapor-X with the Tri-X cooler.

Just because you read that some other people have problems with them doesn't mean everyone does, and even then (from the link you posted) Sapphire provided a fix via a BIOS update.


----------



## crazymania88

Quote:


> Originally Posted by *neurotix*
> 
> I don't know about the newer R9 280X with the Tri-X cooler (this one), but my 7970 Vapor-X has absolutely no issues with this, even under heavy load at 1200mhz folding I didn't see temps pass 75C on the VRM and 60C on the core. There are actually TWO 280X Vapor-X, one is the old style, identical to my 7970 but with a backplate, with the light up Sapphire logo on the left side. The other is the newer one with the Tri-X cooler. The same cooler that's on the new 290X Vapor-X.
> 
> If you look here at Kitguru, the new Vapor-X has VRM cooling directly on the heatsink. It's the purple line on the side of the cooler. These directly touch the VRMs and cool them. Additionally, here you can see the older style 280X, which also has VRM cooling in the form of passive heatsinks on the VRM assembly itself. Further, both cards have direct memory cooling with thermal pads directly touching the heatsink assembly.
> 
> These are both quality cards and in regular operating conditions, they should run extremely cool. You have to consider the things you're reading. What are the ambient temperatures these people are running? Does their house even have air conditioning? What was the fan speed set to? (I put mine to 100% while gaming) What were they doing to make the card so hot (E.g. Mining)? Does their case have proper airflow with positive pressure? Is their CPU adequately cooled and set up to exhaust hot air (e.g. a hot CPU will probably mean hotter cards)?
> 
> Moreover, do you have experience with these cards yourself?
> 
> I also have a Sapphire 7870XT and it has absolutely no issues with VRM cooling, and it's in a very poor quality case with only a weak intake and exhaust fan. (see my rig "Green").
> 
> If I were going to recommend any 280X I would most certainly recommend the Sapphire Toxic 280X, or the newer style Vapor-X with the Tri-X cooler.
> 
> Just because you read some other people have problems with them doesn't mean everyone does, and even then (from the link you posted) Sapphire provided a fix via a bios update.


I've seen 2 VRM-burned Sapphire 280X Tri-X cards myself, from gaming, not mining.
I was unable to cool the VRMs of my OWN 7870 XT even with a 2x120 mm PCI slot cooler at 1800 RPM, in a case cooled by six 1800 RPM fans with good airflow.
And yeah, I said some of them, not all. Some of them have this issue. By the way, air conditioning or ambient temps shouldn't be the deciding factor; we shouldn't have to freeze in our houses just to keep our GPUs cool.

I love the Asus DirectCU II, but I will never recommend an Asus 280X DirectCU II to anyone, because it could turn out to be one of the bad ones.

So why should we take the risk, when all 280Xs perform about the same?

In any case, I've seen another Sapphire 290 with the same cooler that has no issues at all, so it's obvious that only some cards have this weird VRM thing.

I am not going to fight or argue with you; it's personal choice, after all.
I always take the safest road and tell people to do the same when they ask me for help.

I was going to get the Asus 280X DirectCU II, and I still wish I could, but I couldn't take the risk.
I want that DCUII cooler so bad.


----------



## BruceB

Quote:


> Originally Posted by *Aleckazee*
> 
> I'm happy to stay with the MSI unless there is a brand that doesn't suffer from artifacts, or do they all have that problem? Or if there's a card that's slightly shorter I'll also choose it over the MSI.


Artifacting usually means you're pushing the memory too hard or letting it get too hot. If you're getting a new card with memory OCing in mind, look for one with Hynix memory.

Quote:


> Originally Posted by *crazymania88*
> 
> I've seen 2 VRM burned Sapphire 280x tri-x myself, in gaming not mining.
> I was unable to cool vrms of my OWN 7870XT with even 2x120MM PCI cooler at 1800rpm in a case being cooled by 6 1800rpm fans with nice air flow.
> and Yeah, I said some of them, not all.. Some of them have this issue, btw air conditioner or the ambient temps shouldn't be the case, nor we have to freeze in our houses to incase keep our GPUs cool.
> I love Asus directCU II but I will never recommend an asus 280x directCU II to anyone, because incase it can be the one of the bad ones.
> So, why should we take the risk, while the whole 280x' are the same?
> 
> in case, I've seen another Sapphire 290 with same cooler, it has no issues at all, so it is obvious some cards have this weird vrm thing.
> I am not going to fight or argue with you, personal choices after all,
> I always go from safest road and tell people to do the same when they seek help from me.
> I was going to get Asus 280x directCU II, I still wish I could but I couldn't take the risk.
> I want that DCUII cooler so bad.


Unless you want a record-breaking OC, the brand doesn't matter. All cards should work fine at stock clocks, and if there's a problem with your card, the manufacturer will replace it. That's what the RMA system is for.

If you want extra security on your purchase, I would recommend buying it from a shop instead of over the internet; then you have the additional warranty from the shop. (This is what I do. It cost me an extra 10 EUR, but the shop gives me a 2-year warranty independent of the manufacturer, and to me that's worth 10 EUR.) This will make a much larger difference than buying a different brand.


----------



## crazymania88

Quote:


> Originally Posted by *BruceB*
> 
> Artifacting usually means you're pushing the memory too hard or letting it get too hot. If you're getting a new card with memory OCing in mind, look for one with Hynix memory.
> 
> Unless you want a record-breaking OC, the brand doesn't matter. All cards should work fine at stock clocks, and if there's a problem with your card, the manufacturer will replace it. That's what the RMA system is for.
> 
> If you want extra security on your purchase, I would recommend buying it from a shop instead of over the internet; then you have the additional warranty from the shop. (This is what I do. It cost me an extra 10 EUR, but the shop gives me a 2-year warranty independent of the manufacturer, and to me that's worth 10 EUR.) This will make a much larger difference than buying a different brand.


Yep, I buy my stuff from shops instead of over the internet.

But the Asus R9 280Xs are a disaster; a guy RMA'd his card 3 times and they all had artifacts. Some say Asus fixed it later on.
I just avoid stuff with known factory-related issues and the chance of getting one; it helps a lot.


----------



## BruceB

Quote:


> Originally Posted by *crazymania88*
> 
> Yep, I buy my stuff from shops instead of internet.
> But the Asus R9 280Xs are a disaster; a guy RMA'd his card 3 times and they all had artifacts. Some say Asus fixed it later on.
> I am just avoiding stuff with known factory related issues and chance of getting one , it helps a lot.


I find it hard to believe that someone got 3 cards one after the other, all with the same problem, and it _wasn't_ his fault.

When it comes to manufacturing problems, brands don't mean much, because many (if not most) of the brands don't make their own cards. The company that actually makes the cards is called a _vendor_, and this can usually be checked via Google. (E.g. Powercolor, Club3D and VTX3D all have the same vendor: Tul Corporation. Their cards are technically the same.)

Seriously though, ASUS is a respectable vendor, and as a brand their RMA policy is (AFAIK) good; I don't think you'd regret buying one of their cards (that DCII cooler really does look good!).

Try not to be put off particular brands because some people had a bad experience. In this case, there is an overwhelming number of people who have bought these cards and had no problems whatsoever; you just don't hear from them because they have nothing to complain about!

If you're looking at other brands, I would recommend Powercolor: the price is good and their RMA policy is top notch, though they're not known for being great OC'ers...


----------



## Aleckazee

Quote:


> Originally Posted by *BruceB*
> 
> Artifacting usually means you're pushing the memory too hard or letting it get too hot. If you're getting a new card with memory OCing in mind, look for one with Hynix memory.
> 
> Unless you want a record-breaking OC, the brand doesn't matter. All cards should work fine at stock clocks, and if there's a problem with your card, the manufacturer will replace it. That's what the RMA system is for.


Strange thing is that it never happens while I'm gaming, only when I'm on the desktop or browsing online, so I don't think it's a heat issue. Either way I'm going to RMA it. In the meantime I've been modding my SG05 and test fitting...

(Phone camera)

I only just managed to squeeze the front panel onto the case; it's actually slightly pushing against the end of the PCB. I might make a small build log when I get the new card and a mobo for my 2500K. After that comes the water.


----------



## BruceB

Quote:


> Originally Posted by *Aleckazee*
> 
> Strange thing is that it never happens while I'm gaming, only when I'm on the desktop or browsing online, so I don't think it's a heat issue. Either way I'm going to RMA it. In the meantime I've been modding my SG05 and test fitting...
> 
> (Phone camera)
> 
> I only just managed to squeeze the front panel onto the case; it's actually slightly pushing against the end of the PCB. I might make a small build log when I get the new card and a mobo for my 2500K. After that comes the water.


I think an RMA is a good idea at this point. Just a tip: phone them to see if they've got a partner near you; then you can do the whole thing in a day, in person.

That build looks cool, I like these smaller rigs!

Just as a side note, for my own knowledge, to anyone who can answer: is it possible that @Aleckazee's GPU is undervolting its memory too much in its idle state?


----------



## Aleckazee

Quote:


> Originally Posted by *BruceB*
> 
> I think an RMA is a good idea at this point. Just a tip: phone them to see if they've got a partner near you; then you can do the whole thing in a day, in person.
> 
> That build looks cool, I like these smaller rigs!
> 
> Just as a side note, for my own knowledge, to anyone who can answer: is it possible that @Aleckazee's GPU is undervolting its memory too much in its idle state?


Thanks! It will look a lot better once my loop is back in and I've organized the cables; I'm really running out of room now, haha.
I was thinking the same sort of thing. I noticed in Afterburner that the core clock is 300 MHz and the memory 150 MHz at idle, but occasionally they spike to 500 and 1500 MHz respectively. I don't know if that's normal or not?


----------



## BruceB

Quote:


> Originally Posted by *Aleckazee*
> 
> I was thinking the same sort of thing, I noticed in afterburner the core clock is 300Mhz and memory 150 MHz at idle but occasionally it will spike to 500 and 1500MHz respectively, don't know if that's normal or not?


Yeah, that's normal. The card sits in its lowest power state to save power until it's given something to do, then it steps up a power state to deliver enough performance for the task. They tend to have 3 states: _Idle_, _Light Load_ and _Full Load_ (these aren't the real names, but you get the idea).

On my Powercolor, _idle_ is 300 MHz core, _light load_ is 501 MHz core and _full load_ is 1030 MHz core. The memory gets underclocked too, but I don't know the values off the top of my head.

However, it doesn't just change the speed; it also changes the voltage to save power and generate less heat.
What I was thinking is: if your memory works fine when gaming (_full load_ = full voltage) but is unstable when browsing the web (_idle_ = reduced voltage), it could be that your memory is stable at full voltage but isn't stable at the reduced _idle_ voltage.
This is just conjecture though, unless someone can confirm that it's possible?
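As an aside, the state-stepping idea above can be sketched as a toy model. The clocks are the ones quoted in this thread; the voltages and load thresholds are made-up illustrative numbers, and real cards handle this in the driver/BIOS, not in user code:

```python
# Toy model of a GPU's power states: each state pairs clocks with a
# voltage, and the driver picks the lowest state that covers the
# current load. Clocks are from this thread; voltages and the load
# thresholds are invented purely for illustration.

POWER_STATES = [
    # (name, core MHz, memory MHz, core voltage)
    ("idle",       300,  150, 0.95),
    ("light_load", 501,  900, 1.05),
    ("full_load", 1030, 1500, 1.20),
]

def pick_state(gpu_load_percent):
    """Return the lowest power state sufficient for the current load."""
    if gpu_load_percent < 10:
        return POWER_STATES[0]
    elif gpu_load_percent < 60:
        return POWER_STATES[1]
    return POWER_STATES[2]

# Desktop/browsing is low load, so you get low clocks AND a lower
# voltage -- which is why memory that is stable at full voltage could
# still artifact at idle.
print(pick_state(5))   # idle state
print(pick_state(95))  # full-load state
```

The point of the model: the voltage drops together with the clocks, so stability has to be checked per state, not just at full load.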

[EDIT]
Have you got a build log for that rig?


----------



## Aleckazee

Quote:


> Originally Posted by *BruceB*
> 
> Yeah, that's normal. The card sits in its lowest power state to save power until it's given something to do, then it steps up a power state to deliver enough performance for the task. They tend to have 3 states: _Idle_, _Light Load_ and _Full Load_ (these aren't the real names, but you get the idea).
> 
> On my Powercolor, _idle_ is 300 MHz core, _light load_ is 501 MHz core and _full load_ is 1030 MHz core. The memory gets underclocked too, but I don't know the values off the top of my head.
> 
> However, it doesn't just change the speed; it also changes the voltage to save power and generate less heat.
> What I was thinking is: if your memory works fine when gaming (_full load_ = full voltage) but is unstable when browsing the web (_idle_ = reduced voltage), it could be that your memory is stable at full voltage but isn't stable at the reduced _idle_ voltage.
> This is just conjecture though, unless someone can confirm that it's possible?
> 
> [EDIT]
> Have you got a build log for that rig?


Ah, I see. Is there an easy way to raise the idle voltage to see if it would help? I haven't really got much experience overclocking GPUs.
This was my build log when I first got the case, but I've made a lot of changes since then. Like I said, I'll put another one together once I get the last few parts.


----------



## BruceB

Quote:


> Originally Posted by *Aleckazee*
> 
> Ah, I see.. is there an easy way to up the idle voltage to see if it would help? I haven't really got much experience overclocking GPUs.
> This was my build log when I first got the case but I've made a lot of changes since then. Like I said, I'll put another one together once I get some last parts.


Cool, I'll check it out!
I think those voltage settings are in the BIOS; OC tools like MSI Afterburner only modify the _Full Load_ settings. In any case, if it's giving you problems at stock clocks, you should get it RMA'd!


----------



## bigpoppapump

My 270 is making a quiet buzzing sound when it's under full load. It's not coming from the fans and as far as I can tell the card is performing completely fine. Is it something I should worry about?


----------



## BruceB

Quote:


> Originally Posted by *bigpoppapump*
> 
> My 270 is making a quiet buzzing sound when it's under full load. It's not coming from the fans and as far as I can tell the card is performing completely fine. Is it something I should worry about?


That could be _coil whine_. Without hearing it myself I can't say exactly, but if it is coil whine, it's nothing to worry about.

Which 270 do you have?


----------



## zila

I just got an MSI R9 270 Gaming 2G. Gonna install it in a few minutes and play some BF4 to see how she goes; it's an upgrade from an HD 5870.


----------



## bigpoppapump

Quote:


> Originally Posted by *BruceB*
> 
> That could be _coil whine_. Without hearing it myself I can't say exactly, but if it is coil whine, it's nothing to worry about.
> 
> Which 270 do you have?


That's what I was hoping. It's this guy, specifically: http://www.newegg.com/Product/Product.aspx?Item=N82E16814131545R


----------



## crazymania88

Quote:


> Originally Posted by *BruceB*
> 
> I find it hard to believe that someone got 3 cards one after the other, all with the same problem, and it _wasn't_ his fault.
> 
> When it comes to manufacturing problems, brands don't mean much, because many (if not most) of the brands don't make their own cards. The company that actually makes the cards is called a _vendor_, and this can usually be checked via Google. (E.g. Powercolor, Club3D and VTX3D all have the same vendor: Tul Corporation. Their cards are technically the same.)
> 
> Seriously though, ASUS is a respectable vendor, and as a brand their RMA policy is (AFAIK) good; I don't think you'd regret buying one of their cards (that DCII cooler really does look good!).
> 
> Try not to be put off particular brands because some people had a bad experience. In this case, there is an overwhelming number of people who have bought these cards and had no problems whatsoever; you just don't hear from them because they have nothing to complain about!
> 
> If you're looking at other brands, I would recommend Powercolor: the price is good and their RMA policy is top notch, though they're not known for being great OC'ers...


With the Asus R9 280X, I assure you, it wasn't "some people" but over half of the buyers who had artifacting. It was because of the poor memory chips they used on those cards.
Otherwise I would have gotten the Asus R9 280X, as I said. You have to see the Asus ROG forum; it's sad and hilarious at the same time.

Powercolor is a good brand and their new cards look awesome, and it seems they'll OC well too because of the Hynix memory. My Gigabyte has Elpida; I didn't even try to OC it.

My old never-buying-again brand is Sapphire, after my 7870 XT (because of their warranty policies and their bad cooling; yes, bad, we've talked about it, mine was bad too, though not all of them are).
It could do 1700 MHz on the Elpida memory, but I didn't see any effect on performance. Nothing was different with my old GTS 250 either, so I don't really care about memory OC anyway.


----------



## BruceB

Quote:


> Originally Posted by *crazymania88*
> 
> With asus r9 280x, I assure you, not "some people" but "Over the half of the buyers" had artifacting. It was because of poor memory chips they've used on the cards.
> Otherwise I would get the asus r9 280x as I said, you've to see the Asus rog forum it is sad and hilarious same time.
> Powercolor is good brand and their new Cards look awesome, and they seem they gonna OC awesome too because of Hynix memory, my Gigabyte has Elpida, I didn't even try to oc.
> my old will never buy again brand Sapphire 7870 XT (because of their warranity policies, their bad cooling yes bad, we've talked about it mine was bad too, not all are bad)
> could make 1.700mhz with Elpida, but I didn't see an effect on performance, also with my old gts 250 nothing was different so I don't really care about memory OC anyway.


I stand corrected.

I've spent some time looking around on the internet. There does (or did; I'm not sure if it's ongoing, the latest source I could find was from April 2014) appear to be a problem with the memory on the ASUS cards.

Take a look at the 3DMark top single-GPU benchmarks for the 280X (LINK here). You can see that in the top 50 or so, the most common vendors are _PC Partner_ (its brands are Sapphire for AMD and Zotac for nVidia), _ASUSTek_ (brand: ASUS) and _Micro-Star International_ (MSI), plus a couple from _VisionTek_ (under the brand VisionTek, which I hadn't heard of, but they seem to be cheap and very good OC'ers). Tul Corporation (which makes Powercolor, VTX3D and Club3D) doesn't get a look in until way down the list.

I think, given these 3DMark results, it's fair to say that if you're looking for a good OC'er, get a Sapphire or (if you have the patience to RMA it until you get a good one) an ASUS. VisionTek and MSI show up a couple of times too, but I don't know what VisionTek's customer service is like, so I'd say go MSI as a second choice (based on these benchmarks).

I like my Powercolor card and I like Powercolor as a brand; they have great customer service. But I _did_ have to RMA my first card from them due to poorly applied TIM, and they really aren't known for their OC ability.
If I could have _any_ 280X I would get a Sapphire Toxic, because it's simply the best factory-OC'd card. However, money is a factor for me, so if I were to spend on a 280X (again) I would go Powercolor (again) because of the price/performance ratio (Club3D is also a good choice) and the customer service.


----------



## BruceB

Quote:


> Originally Posted by *bigpoppapump*
> 
> That's what I was hoping. It's this guy, specifically: http://www.newegg.com/Product/Product.aspx?Item=N82E16814131545R


Then it's nothing to worry about.

The Powercolor cards either have slightly more coil whine than other cards, or (as I like to think) their cooler is quiet enough for you to hear it. My Powercolor 280X has noticeable coil whine at full load.


----------



## crazymania88

Quote:


> Originally Posted by *BruceB*
> 
> I stand corrected.
> 
> I've spent some time looking around on the internet. There does (or did; I'm not sure if it's ongoing, the latest source I could find was from April 2014) appear to be a problem with the memory on the ASUS cards.
> 
> Take a look at the 3DMark top single-GPU benchmarks for the 280X (LINK here). You can see that in the top 50 or so, the most common vendors are _PC Partner_ (its brands are Sapphire for AMD and Zotac for nVidia), _ASUSTek_ (brand: ASUS) and _Micro-Star International_ (MSI), plus a couple from _VisionTek_ (under the brand VisionTek, which I hadn't heard of, but they seem to be cheap and very good OC'ers). Tul Corporation (which makes Powercolor, VTX3D and Club3D) doesn't get a look in until way down the list.
> 
> I think, given these 3DMark results, it's fair to say that if you're looking for a good OC'er, get a Sapphire or (if you have the patience to RMA it until you get a good one) an ASUS. VisionTek and MSI show up a couple of times too, but I don't know what VisionTek's customer service is like, so I'd say go MSI as a second choice (based on these benchmarks).
> 
> I like my Powercolor card and I like Powercolor as a brand; they have great customer service. But I _did_ have to RMA my first card from them due to poorly applied TIM, and they really aren't known for their OC ability.
> If I could have _any_ 280X I would get a Sapphire Toxic, because it's simply the best factory-OC'd card. However, money is a factor for me, so if I were to spend on a 280X (again) I would go Powercolor (again) because of the price/performance ratio (Club3D is also a good choice) and the customer service.


Actually, the 280X Toxic and the Gigabyte Windforce have equal factory OCs, just telling you.
I'll avoid Sapphire based on my experiences with them; maybe they're not all the same everywhere.
A friend of mine, a heavy Sapphire fan who always says "Get Sapphire", "Buy Sapphire", "Sapphire have the best GPUs", even turned against Sapphire. Just to give you an idea of how much of an issue was going on.

I love Powercolor. They're getting better and better; their latest cards are top-notch, the look and quality are really amazing, and the price? It's hilariously low for those GPUs.

And if you read my earlier posts, you'll see I do my research before I buy anything.
I said "Some Powercolor R9 280Xs have coil whine, some don't", so there you have it.
My ears are sensitive, so it would drive me crazy (I can hear the noise of a TV that's off from outside the apartment... czzzzzzzz in my head; LCDs really saved me). I would still like to have that 280X TurboDuo.


----------



## Mr.N00bLaR

Does anyone know why my Sapphire R9 280X doesn't support core voltage control in MSI Afterburner, but does in Sapphire TriXX? I have confirmed it by observing that overclocks are more stable (duh) and with GPU-Z. Sapphire TriXX is not my favorite; MSI Afterburner is, and I would like to use that application.

EDIT: The unofficial overclocking and voltage adjustment options have been enabled/set and I still do not get vcore adjustment capability.


----------



## crazymania88

Quote:


> Originally Posted by *Mr.N00bLaR*
> 
> Does anyone know why my Sapphire R9 280X doesn't support core voltage control in MSI Afterburner, but does in Sapphire TriXX? I have confirmed it by observing that overclocks are more stable (duh) and with GPU-Z. Sapphire TriXX is not my favorite; MSI Afterburner is, and I would like to use that application.
> 
> EDIT: The unofficial overclocking and voltage adjustment options have been enabled/set and I still do not get vcore adjustment capability.


If you don't have the voltage bar at all, it's obvious MSI Afterburner doesn't support your card's voltage controller. With my old 7870 XT I had to change some settings in Afterburner's config files to get it working.

Make a search on the web; I am sure you'll find it.
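For what it's worth, the config tweaks people usually mean here live in `MSIAfterburner.cfg` in Afterburner's install folder. The entries below are recalled from memory, so treat the exact section and key names as an assumption and check them against your own file before editing:

```ini
; MSIAfterburner.cfg fragment -- key names from memory, verify in your copy
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

Restart Afterburner after editing. If the voltage slider still doesn't appear, the card's voltage controller probably just isn't supported by Afterburner, and TriXX is the way to go.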

Edit:
By the way, I don't advise overvolting at all. You won't gain much performance; actually you'll gain nothing, in fact, because with a 280X you'll be CPU-bound in any game when it comes to low FPS, and high FPS isn't really relevant.

There's still plenty of time before we'll need to overvolt an R9 280X to improve FPS.


----------



## Aleckazee

Quote:


> Originally Posted by *crazymania88*
> 
> 
> 
> Actually, the 280X Toxic and the Gigabyte Windforce have equal factory OCs, just telling you.
> I'll avoid Sapphire based on my experiences with them; maybe they're not all the same everywhere.
> A friend of mine, a heavy Sapphire fan who always says "Get Sapphire", "Buy Sapphire", "Sapphire have the best GPUs", even turned against Sapphire. Just to give you an idea of how much of an issue was going on.
> 
> I love Powercolor. They're getting better and better; their latest cards are top-notch, the look and quality are really amazing, and the price? It's hilariously low for those GPUs.
> 
> And if you read my earlier posts, you'll see I do my research before I buy anything.
> I said "Some Powercolor R9 280Xs have coil whine, some don't", so there you have it.
> My ears are sensitive, so it would drive me crazy (I can hear the noise of a TV that's off from outside the apartment... czzzzzzzz in my head; LCDs really saved me). I would still like to have that 280X TurboDuo.


Me too :/ I can even hear the tiny LED flashing on and off on my laptop if I put it to sleep. I always have to hibernate/shut down and unplug the charger, otherwise it drives me nuts trying to get to sleep. My MSI card has coil whine even at idle; it's not really noticeable if there's a bit of background noise or I have headphones on, but I'm hoping the replacement won't be so bad :/


----------



## BruceB

Quote:


> Originally Posted by *crazymania88*
> 
> Actually the 280X Toxic and the Gigabyte Windforce have equal factory overclocks, just saying.
> I'll avoid Sapphire based on my experiences with them; maybe they're not all the same everywhere.
> A friend of mine, a heavy Sapphire fan who always says "Get Sapphire", "Buy Sapphire", "Sapphire have the best GPUs", even turned against Sapphire, just to give you an idea of how big the issues were.
> 
> I love Powercolor; they're getting better and better. Their latest cards are top-notch, the look and quality are really amazing, and the price? It's hilariously low for those GPUs.
> And if you read my earlier posts, you'll see I do my research before I buy anything.
> I said "Some Powercolor R9 280Xs have coil whine, some don't", so there you have it.
> My ears are sensitive, so it would drive me crazy (I can hear the noise of an OFF TV from outside the apartment... Czzzzzzzzz in my HEAD; LCDs really saved me). I would like to have that 280X TurboDuo.


TBH, you can only hear coil whine with the Powercolor cards because the cooler is just _so_ quiet. I'd bet many of the other 7970/280Xs have coil whine at load that simply gets drowned out by the sound of the cooler.

If you have an un-decoupled HDD in your PC, the noise from that alone will drown out the Powercolor's coil whine; just as a reference for how loud it is.


----------



## hatchet_warrior

I've had an XFX 5770 since the day it came out. Needless to say, an upgrade was necessary, so I picked up a stupid-cheap Gigabyte 280X OC (hooray for the crash of mining!)

With a little tweaking I now have it running at 1150/1575. http://www.techpowerup.com/gpuz/uupfp/
I think it has a lot more room to grow, but I like that the fans don't have to scream. It gets 1989 in Valley with temps staying under 70°C.


----------



## Pursuit of OC

My first GPU for my first build; got it from a friend who never used it and needed the money. Sold it to me for $120.

Using MSI Afterburner I could only get it to 1145 MHz core and 1450 MHz memory; with Sapphire Trixx I was able to boost the voltage and got it to 1250 core and 1485 memory, from the stock 1050 and 1400.

XFX DD R9 270X


----------



## HoneyBadger84

I will be joining this fine club soon ^_^ Got these on the way, should be here Thursday or Friday. Very excited.

They're for [email protected] but it'll still be neat to see what they can do in testing before I put 'em to work:


Quote:


> Originally Posted by *Pursuit of OC*
> 
> 
> 
> My first GPU for my first build; got it from a friend who never used it and needed the money. Sold it to me for $120.
> 
> Using MSI Afterburner I could only get it to 1145 MHz core and 1450 MHz memory; with Sapphire Trixx I was able to boost the voltage and got it to 1250 core and 1485 memory, from the stock 1050 and 1400.
> 
> XFX DD R9 270X


Welcome to the "dark side" that is AMD GPUs ^_^ $120 is quite a steal indeed; is it a 280 or 280X, or which? I like the aesthetics of the XFX DD cards, but I've heard their actual cooling performance is pretty meh compared to the likes of Sapphire's Tri-X & Vapor-X coolers.


----------



## End3R

welcome


----------



## bigpoppapump

Quote:


> Originally Posted by *BruceB*
> 
> Then it's nothing to worry about.
> 
> The Powercolor cards either have slightly more coil whine than other cards or (as I like to think) their cooler is quiet enough for you to hear it. My Powercolor 280X has noticeable coil whine at full load.


Awesome. Thanks man.


----------



## Pursuit of OC

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I will be joining this fine club soon ^_^ Got these on the way, should be here Thursday or Friday. Very excited.
> 
> They're for [email protected] but it'll still be neat to see what they can do in testing before I put 'em to work:
> 
> Welcome to the "dark side" that is AMD GPUs ^_^ $120 is quite a steal indeed; is it a 280 or 280X, or which? I like the aesthetics of the XFX DD cards, but I've heard their actual cooling performance is pretty meh compared to the likes of Sapphire's Tri-X & Vapor-X coolers.


Thanks man, it's a pretty midrange cooler but it gets the job done: it never exceeded 60°C running Unigine Heaven, and never goes past 70°C at a voltage of 1.285 V. I plan on getting another one when the new 300 series comes out; it will be like two for the price of one.


----------



## crazymania88

Quote:


> Originally Posted by *hatchet_warrior*
> 
> I've had an XFX 5770 since the day it came out. Needless to say, an upgrade was necessary, so I picked up a stupid-cheap Gigabyte 280X OC (hooray for the crash of mining!)
> 
> With a little tweaking I now have it running at 1150/1575. http://www.techpowerup.com/gpuz/uupfp/
> I think it has a lot more room to grow, but I like that the fans don't have to scream. It gets 1989 in Valley with temps staying under 70°C.


Yours runs cooler than mine:
on a 30°C summer day, mine hits 68-70°C in Valley with fans at roughly 2600 ±300 RPM without OC.
Mine is Rev 2.0; I don't know about yours, but it doesn't make much noise. You can only hear it if you run it at 100%.
Can you tell me your GPU temp, room temp and fan speed? I'm curious.

You have the 0.41 BIOS, I have 0.38, and when I check for updates there's nothing for me.
Maybe our cards are different?

OC Guru says mine is the F60 BIOS.
Gigabyte does that for different revisions, so is yours Rev 1.0, or do we both have Rev 2.0s with different sub-revisions?


----------



## hatchet_warrior

Quote:


> Originally Posted by *crazymania88*
> 
> Yours runs cooler than mine:
> on a 30°C summer day, mine hits 68-70°C in Valley with fans at roughly 2600 ±300 RPM without OC.
> Mine is Rev 2.0; I don't know about yours, but it doesn't make much noise. You can only hear it if you run it at 100%.
> Can you tell me your GPU temp, room temp and fan speed? I'm curious.
> 
> You have the 0.41 BIOS, I have 0.38, and when I check for updates there's nothing for me.
> Maybe our cards are different?
> 
> OC Guru says mine is the F60 BIOS.


I have a Rev 2.0 with Hynix memory. Ambient is normally between 20°C and 30°C; right now it's closer to 20°C and I run Valley at 59°C. The fans stay about the same, never reaching 3000 RPM. I only see BIOS .38 on the Gigabyte site too, yet GPU-Z's database says mine is the most recent. I'm a little confused by it myself.

Where did you find the BIOS version in OC Guru?


----------



## crazymania88

Quote:


> Originally Posted by *hatchet_warrior*
> 
> I have a Rev 2.0 with Hynix memory. Ambient is normally between 20°C and 30°C; right now it's closer to 20°C and I run Valley at 59°C. The fans stay about the same, never reaching 3000 RPM. I only see BIOS .38 on the Gigabyte site too, yet GPU-Z's database says mine is the most recent. I'm a little confused by it myself.
> 
> Where did you find the BIOS version in OC Guru?


Press "online support" and you'll see it. Apparently yours is F70 with Hynix and mine is F60 with Elpida.
I wonder if they have any other differences, maybe in the coolers?

Your GPU is probably better than mine then; I have Elpida memory, and at 60-70% fan speed it's 69-70°C in Valley.
But mine doesn't make such high noise at high RPMs.

F12 and F3 Rev 2.0 have this -> PowerTune Limit: 200 to 295
Ours have this -> 220 to 265
So F12 and F3 overclock better out of the box.
Our cards are recent ones.
Even though I bought mine 2 weeks ago, it seems to be older than yours, because Gigabyte apparently started using Hynix memory instead of Elpida after F60. Unlucky me.

I wish I could get one with Hynix, but I've never had a single device with Hynix, always Elpida. Phones, tablet PCs, GPUs...

edit:
Wait, what? I've just run Valley and my GPU core clock is stuck at 1000; even though it shows 1100 in GPU-Z, it doesn't go to 1100 :/ Driver related? What is it :/
edit2:
I know what it is, LOL at myself: I'd forgotten a Flash video was running... Be aware of this too, guys.
If Flash is running in your browser, your GPU will not reach its boost clock; at least mine don't, both the 7870 XT and the 280X.

edit3:
This is mine in Valley:
http://gpuz.techpowerup.com/14/08/06/2x3.png 68-69°C, never seen 70°C so far
(it's morning here so not that hot yet, but I am sweating).
My ASIC quality is 60.7%, but I don't think that really matters anymore after seeing this Gigabyte 280X.
It runs cool on air and overclocks well at stock voltage (1190 MHz for me).


----------



## hatchet_warrior

Quote:


> Originally Posted by *crazymania88*
> 
> I wish I could get one with Hynix, but I've never had a single device with Hynix, always Elpida. Phones, tablet PCs, GPUs...


Honestly, complete luck. I picked it up off eBay for $160. The seller claimed to have bought it in January, and I was expecting Elpida due to the Hynix fire, so I was quite shocked when I saw Hynix. My ASIC score is 70.6%. OC Guru reports F70 for my BIOS, with no updates found.


----------



## crazymania88

Quote:


> Originally Posted by *hatchet_warrior*
> 
> Honestly, complete luck. I picked it up off eBay for $160. The seller claimed to have bought it in January, and I was expecting Elpida due to the Hynix fire, so I was quite shocked when I saw Hynix. My ASIC score is 70.6%. OC Guru reports F70 for my BIOS, with no updates found.


It makes me feel depressed; I'll RMA the GPU.
My ASIC is a bummer, my memory is a bummer, my temps are a bummer, and F70 is obviously better at mining, so it's obviously the better GPU after all.
(740-755 kH/s on F60 vs 780-815 kH/s on F70 at max OC)

You got one good card for $160. I don't buy used stuff, to be on the safe side.

edit:
Apparently they've used 2 kinds of Hynix and 1 kind of Elpida on those cards.
One Hynix is the best, one Hynix is the worst, and they rank like this:
AFR Hynix - Elpida - MFR Hynix
Great - Fine - Okay

I guess I'm not that badly off after all.

I also found a modded BIOS for better memory management; I'll try it later.


----------



## hatchet_warrior

So apparently I have the good kind of Hynix. This card just keeps getting better!


----------



## crazymania88

Quote:


> Originally Posted by *hatchet_warrior*
> 
> So apparently I have the good kind of Hynix. This card just keeps getting better!


ROFL, and now I don't feel as depressed as I was.


----------



## neurotix

Sapphire R9 270X Vapor-X 1270/1500 MHz - Core i7 4770K 4.5 GHz - DDR3 CL10 2400 MHz

3dmark11


3dmark11 Extreme


3dmark Fire Strike


3dmark Fire Strike Extreme


All the rest of the benches with this card are on my hwbot profile. http://hwbot.org/user/neurotix

Enjoy.


----------



## Particle

I made the transition today to 270X owner when I purchased a pair of second-hand Diamond 270X cards off eBay. This coming week I'll hopefully also make the transition from Radeon 6970 user to 270X user. Mostly, I think the time is right to own something new. I was determined to skip the 7000 series when it came out. Having been a 6970 user since December of 2010, I think I've waited long enough.

The power should be nice--they're 45% faster for 35% less power than what I've got. At idle they pull even less. Each 6970 pulls about 30W at idle while each 270X should pull about 4W. Each watt present 24/7 like in my system equates to about $1/yr in power, so this change alone will save me $50/yr in power. This should be a fun way to spend some time until the 300 series comes out next year.
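The dollar-per-watt-year rule of thumb above checks out with simple arithmetic. A quick sketch (the ~$0.114/kWh electricity rate is an assumed figure, not something stated in the post; plug in your own utility rate):

```python
# Back-of-the-envelope check of the "$1 per watt-year" rule of thumb.
# The electricity rate (~$0.114/kWh) is an assumption; adjust to taste.
def yearly_cost(watts, dollars_per_kwh=0.114):
    """Cost of drawing `watts` continuously for one year."""
    kwh_per_year = watts * 24 * 365 / 1000  # 1 W for a year = 8.76 kWh
    return kwh_per_year * dollars_per_kwh

# Two idle 6970s (~30 W each) vs. two idle 270Xs (~4 W each):
savings = yearly_cost(2 * 30) - yearly_cost(2 * 4)
print(f"${savings:.0f}/yr saved at idle")  # roughly $50/yr, as stated
```

At that assumed rate, one watt running 24/7 comes out to just about a dollar a year, which is where the ~$50/yr savings figure comes from.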


----------



## G2O415

I'm here to update my membership status,

I've swapped out my Sapphire R9 270X Dual-X 4GB for an EVGA GTX 780 Dual FTW.

Won't be using my 270X and will most likely sell it off. It's been a good run with the card though, my first owned video card ever


----------



## HoneyBadger84

Got muh cards in. I'll post some pics & test results today, including some TriFire 280X numbers


----------



## crazymania88

Quote:


> Originally Posted by *G2O415*
> 
> I'm here to update my membership status,
> 
> I've swapped out my Sapphire R9 270X Dual-X 4GB for an EVGA GTX 780 Dual FTW.
> 
> Won't be using my 270X and will most likely sell it off. It's been a good run with the card though, my first owned video card ever


You've got a really good GPU now.
I'm happy you didn't fall for the fanboy hype ("2GB is ENUUFFF, NO GAMES GONNA USE OVER 2GGGBB") and got the 4GB version.

Because, hey, why do we have 2GB in our cards now when 1GB was enough 4 years ago?
But of course they'll all keep yelling at you, LOL.

Happy for you: you got the 4GB version and didn't spend your money on a 2GB card in this day and age.


----------



## GoLDii3

Quote:


> Originally Posted by *crazymania88*
> 
> You've got a really good GPU now.
> I'm happy you didn't fall for the fanboy hype ("2GB is ENUUFFF, NO GAMES GONNA USE OVER 2GGGBB") and got the 4GB version.
> 
> Because, hey, why do we have 2GB in our cards now when 1GB was enough 4 years ago?
> But of course they'll all keep yelling at you, LOL.
> 
> Happy for you: you got the 4GB version and didn't spend your money on a 2GB card in this day and age.


The problem isn't whether we need over 2 GB (I agree the claim that we don't is false), but rather whether the card can provide enough horsepower to make use of all that memory.

That's why even if you get a 4 GB 270X you are never going to be able to use over 2 GB, so you are wasting money.


----------



## Particle

Video memory in my experience is one of the most limiting factors these days for how long a card remains useful. I had to punt quad 4850s back in the day not because they didn't have enough raw computational power but because the 512 MB frame buffer ensured I'd have a choppy experience with data constantly paging in and out.


----------



## crazymania88

Quote:


> Originally Posted by *GoLDii3*
> 
> The problem isn't whether we need over 2 GB (I agree the claim that we don't is false), but rather whether the card can provide enough horsepower to make use of all that memory.
> 
> That's why even if you get a 4 GB 270X you are never going to be able to use over 2 GB, so you are wasting money.


I assure you, I don't need that information.
Did I say anything that made you think I don't know it?

A 4GB 270X is ridiculous,
and a 2GB 770 is ridiculous.

The 270X should be 2GB,
and the 770 should be 4GB.

The companies are ridiculous, but they know they'll always find plenty of people to buy their stuff.

Where THE HELL would a 270X use 4GB of memory? I'm still curious what those engineers are smoking.

@Particle
And yeah, Nvidia plays that game SO WELL.
They even shipped a BEAST GPU with 1536 MB of memory; they must be joking.
A 1.5GB GTX 480! Seriously? 1.5GB?

Look at those good guys working at Nvidia, trying as hard as they can to bring a high-end product down to lower prices! If it were 2GB it would be so expensive!!!

Even thinking about it makes me angry. I'm an honest person; I can't stand stuff like this.


----------



## HoneyBadger84

Anyone else running TriFire with these (R9 280Xs) and having issues getting it to activate? It says I have the bridges on wrong, but I have both connected: one running from card 1 to 2, the other from card 2 to 3. Very odd; I ran the bridges exactly like this on my 7970s in TriFire and they had no issues.


----------



## HoneyBadger84

Got it working; had to rearrange the bridges a few times and shut down twice. I was beginning to think I had a bad bridge or two, thank goodness that's not the case.

Pictures for all to enjoy (warning: they're big if you view them at original size, taken with my phone, which I'm pretty sure has an 8 MP camera or so):

Should I post a picture with my name in it, or is that good enough for membership?


----------



## Recr3ational

How is it running triple 280X?
I've got 2 and want to go triple; thing is, is it worth it, given I'd have to buy blocks too?


----------



## End3R

Quote:


> Originally Posted by *Recr3ational*
> 
> How is it running triple 280x?
> I got 2 and want to go triple, thing is, is it worth it as i have to buy blocks too.


I can't think of any practical reason to; two is already more than you need for any game out there. Hell, just one 270X (my rig) can play every game out there on max settings at 1080p. You'll see a boost in benchmarks for sure, but I wouldn't expect any practical benefit. But if you want to, it's your rig.


----------



## Recr3ational

Quote:


> Originally Posted by *End3R*
> 
> I can't think of any practical reason to; two is already more than you need for any game out there. Hell, just one 270X (my rig) can play every game out there on max settings at 1080p. You'll see a boost in benchmarks for sure, but I wouldn't expect any practical benefit. But if you want to, it's your rig.


Hmm thanks.


----------



## DarthBaggins

Go triple, 'nuff said.

For me, crossfired 270X/7870 is great; I just wish I could get a full-cover block for my cards.


----------



## Recr3ational

Quote:


> Originally Posted by *DarthBaggins*
> 
> Go triple, 'nuff said.
> 
> For me, crossfired 270X/7870 is great; I just wish I could get a full-cover block for my cards.


It's so tempting, man. Thing is, I just found out I have a 7950 in my cupboard with a block, but the block's a different colour, so I'd have to get a new one. I might just go triple 7950s for the fun of it.


----------



## Particle

I know I mentioned it a while back, but I'm very much looking forward to my dual 270X setup this coming week. I'm coming from a pair of Radeon 6970s. Anyone else use 270X's in crossfire?


----------



## Majentrix

I'm interested in it. Apparently Sapphire's 270X Toxic supports three- and even four-way CrossFire, but no one's ever tested it, or even knows if the drivers support it.
I will certainly test it out one day.


----------



## DarthBaggins

The Toxic 270X uses the 280X Toxic's PCB.


----------



## crazymania88

Quote:


> Originally Posted by *Particle*
> 
> I know I mentioned it a while back, but I'm very much looking forward to my dual 270X setup this coming week. I'm coming from a pair of Radeon 6970s. Anyone else use 270X's in crossfire?


270X CF > 280X OC; probably equal to a 290X performance-wise.
But they say CF stutters; I've never used it.


----------



## HoneyBadger84

Figured out the flickering issue I was having in TriFire benchmarks: a bad CrossFire bridge. I swapped through a few combinations and found one that runs benchmarks fine.

Got 9503 on FireStrike Extreme (with my CPU @ 4.2GHz, cards at stock): http://www.3dmark.com/fs/2574595

& X9778 on 3DMark 11 Extreme: http://www.3dmark.com/3dm11/8602347


----------



## Majentrix

Quote:


> Originally Posted by *DarthBaggins*
> 
> The Toxic 270X uses the 280X Toxic's PCB.


I know that; I messaged a Sapphire rep on Facebook about it and he confirmed it. Still, several sources say it works ~in theory~, but I can't be certain until I actually have it up and running.


----------



## Devildog83

*Pursuit of Overclock* Has been added !!! WELCOME

Anyone else please post the card type, a pic and clocks.

Thanks DD.


----------



## slick2500

How do I get mine added? I posted it back in January.


----------



## BruceB

Quote:


> Originally Posted by *slick2500*
> 
> How do I get mine added I posted it back in January.


Did you post a picture and clocks? If so, could you post a link to that post?

This sounds like a case for @Devildog83.


----------



## crazymania88

-


----------



## BruceB

Quote:


> Originally Posted by *crazymania88*
> 
> I think I didn't post a pic yet, so I doubt I am in the list.
> 
> I've Gigabyte r9 280x Windforce R2.0 F60 Elpida
> 1100/1500
> 40C İdle, 70C 100% load temps in Summer.
> 
> 
> Spoiler: Warning: Spoiler!


That is one _nice_ looking Card!


----------



## slick2500

Quote:


> Originally Posted by *slick2500*


Is there anything that I missed?


----------



## Devildog83

I have a conference call in a few but I will update the OP with the new members shortly.


----------



## crazymania88

Quote:


> Originally Posted by *BruceB*
> 
> That is one _nice_ looking Card!


Some say it has a thin PCB. I don't know about older revisions, but mine is apparently a good, strong card.
You can see it in the case in the pic :/ Is that PCB thin? (I'm genuinely asking.)


----------



## BruceB

Quote:


> Originally Posted by *crazymania88*
> 
> Some say it has a thin PCB. I don't know about older revisions, but mine is apparently a good, strong card.
> You can see it in the case in the pic :/ Is that PCB thin? (I'm genuinely asking.)


Hard to say, can you measure it? That'd be far more accurate than my guessing!


----------



## crazymania88

Quote:


> Originally Posted by *BruceB*
> 
> Hard to say, can you measure it? That'd be far more accurate than my guessing!


Well, it wasn't that important, but I'll measure it next time I open up my case.


----------



## HoneyBadger84

I have 3 Sapphire Vapor-X R9 280Xs clocked at 1070/1550 stock, as pictured previously:


----------



## D3VIL82

-[Repost]
Hi

I just bought an ASUS R9 280X but I'm facing a problem.

When I first boot the system I get the Intel splash screen. If I leave the system alone, it goes to a BLACK screen (not a blank screen) with a '0_' in the bottom-right corner. On the motherboard itself, I get '00' on the LED display.

What I have tried is booting to a bootable USB and/or a bootable DVD.

I simply select the USB or DVD, and as soon as I press Enter, a little '0' pops up in the bottom-right corner of the monitor. The system does not even attempt to process the command to boot from the media. It's as if it's predetermined that it will not boot no matter what I do.

I was told that '00' on the LED means the board is in a normal boot state and that there may be a problem with the board. I have no other tricks up my sleeve when it comes to making this board boot. I have removed all unnecessary devices and tried the most basic boot, with no luck.

Processor -i7 2600k
Motherboard - DZ68ZV
Graphics Card - ASUS r9 280X DirectCU II TOP
PSU - AX750


----------



## BruceB

Quote:


> Originally Posted by *D3VIL82*
> 
> -[Repost]
> Hi...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I just bought an ASUS R9 280X but I'm facing a problem.
> 
> When I first boot the system I get the Intel splash screen. If I leave the system alone, it goes to a BLACK screen (not a blank screen) with a '0_' in the bottom-right corner. On the motherboard itself, I get '00' on the LED display.
> 
> What I have tried is booting to a bootable USB and/or a bootable DVD.
> 
> I simply select the USB or DVD, and as soon as I press Enter, a little '0' pops up in the bottom-right corner of the monitor. The system does not even attempt to process the command to boot from the media. It's as if it's predetermined that it will not boot no matter what I do.
> 
> I was told that '00' on the LED means the board is in a normal boot state and that there may be a problem with the board. I have no other tricks up my sleeve when it comes to making this board boot. I have removed all unnecessary devices and tried the most basic boot, with no luck.
> 
> Processor -i7 2600k
> Motherboard - DZ68ZV
> Graphics Card - ASUS r9 280X DirectCU II TOP
> PSU - AX750


Normally 'FF' (which apparently stands for 'Fully Functional') is what is shown on your MB's LED display if there's no problem. Take a look in the manual and see what the code '00' means. I'm guessing by the unreadable model name it's a Gigabyte board?

[EDIT]
An Intel board?! Interesting...
Anyway, the code '00' means all good on that MB, so that's a good start, right?
Have you got another GPU with which to try it?

[EDIT EDIT]
Here's a link to a thread where someone had the same problem with a 7950 (a very similar card):
https://communities.intel.com/thread/52204
He solved it by booting the PC with a second GPU as the main GPU, installing the 7950's drivers, then removing the other GPU.

[EDIT EDIT EDIT]
This guy: http://www.tomshardware.co.uk/forum/331453-33-560ti-functioning-build was also having the same problem; a new BIOS fixed it for him.


----------



## D3VIL82

Quote:


> Originally Posted by *BruceB*
> 
> Normally 'FF' (which apparently stands for 'Fully Functional') is what is shown on your MB's LED display if there's no problem. Take a look in the manual and see what the code '00' means. I'm guessing by the unreadable model name it's a Gigabyte board?
> 
> [EDIT]
> An Intel board?! Interesting...
> Anyway, the code '00' means all good on that MB, so that's a good start, right?
> Have you got another GPU with which to try it?
> 
> [EDIT EDIT]
> Here's a link to a thread where someone had the same problem with a 7950 (a very similar card):
> https://communities.intel.com/thread/52204
> He solved it by booting the PC with a second GPU as the main GPU, installing the 7950's drivers, then removing the other GPU.
> 
> [EDIT EDIT EDIT]
> This guy: http://www.tomshardware.co.uk/forum/331453-33-560ti-functioning-build was also having the same problem; a new BIOS fixed it for him.


- Yes, code '00' means all OK.

- It boots up fine with the 9500; I tried both cards at the same time.
The 9500 is all OK,
but the 280X shows the same problem, only '0_'.

- No BIOS update!
The latest is already installed :/


----------



## BruceB

Quote:


> Originally Posted by *D3VIL82*
> 
> - Yes, code '00' means all OK.
> 
> - It boots up fine with the 9500; I tried both cards at the same time.
> The 9500 is all OK,
> but the 280X shows the same problem, only '0_'.
> 
> - No BIOS update!
> The latest is already installed :/


From what I could find on Google, the answer is one of: update the BIOS (done); install a second GPU as the main GPU (closest to the CPU, using the video out from that card), boot, and install drivers for the 280X; or RMA.
You can try booting with both cards again (with the 9500 as the main card) and see if you can get into Windows (the guy on the other thread said it showed the '_0' screen for a little while before continuing to boot).
Have you got another PC in which to test the 280X?


----------



## D3VIL82

Quote:


> Originally Posted by *BruceB*
> 
> From what I could find on Google, the answer is one of: update the BIOS (done); install a second GPU as the main GPU (closest to the CPU, using the video out from that card), boot, and install drivers for the 280X; or RMA.
> You can try booting with both cards again (with the 9500 as the main card) and see if you can get into Windows (the guy on the other thread said it showed the '_0' screen for a little while before continuing to boot).
> Have you got another PC in which to test the 280X?


I also did that.
In both slots the 9500 works fine and the 280X just gives the error.
Can you please provide a driver link?

I did RMA; this is my 2nd card. The shopkeeper checked it in front of me, and it booted up fine!

Yes, I am able to get there with the 9500.
I kept the PC on with that open setup for nearly 1 hr, but no luck.

Nope, I don't have any other PC, nor do my friends have a recent motherboard...


----------



## NeoReaper

Hey guys, check out my thread:
http://www.overclock.net/t/1507363/support-ticket-sent-to-xfx-and-this-is-what-i-got/0_100


----------



## Devildog83

*crazymania88, Slick2500 and HoneyBadger84 have been added to the club* Welcome all !!!

Sorry for being so absent lately, very busy with work and life at the moment but I promise to be around more.


----------



## redfaction95

Just bought an R9 280X; wanted to share the unboxing:
http://www.pakgamers.com/forums/f139/unboxing-msi-r9-280x-oc-205941/


----------



## Devildog83

Quote:


> Originally Posted by *redfaction95*
> 
> Just bought a R9 280X radeon, just wanted to share the unboxing:-
> http://www.pakgamers.com/forums/f139/unboxing-msi-r9-280x-oc-205941/


Nice job, nice card. What clocks are you running, if you're overclocking at all?


----------



## redfaction95

Thanks a lot, man.

I haven't been running this card at all because I have a Thermaltake Litepower 600 unit, which people told me would blow up my system. (I just saw this thread, and luckily found out that crazymania88, a member here, is using the exact same PSU on his Gigabyte 280X; I have already sent him a PM about it.)
As I live in a very far-off place, I have already ordered a Rosewill Capstone 550, but it will take a month to arrive. Here is some info, in case you can help:

Thermaltake Litepower LP-600

It has multiple 12 V rails with only 36.7 A combined on 12 V, giving just 440 W on the 12 V side.

PS: despite the model name "LP-600", it is NOT 600 W; it is 500 W continuous and 600 W peak.
Do you have any word on it? It would help a lot.
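For what it's worth, the 440 W figure above follows directly from the rail rating: watts are just volts times amps. A quick sketch of the headroom math (the ~250 W card load and ~150 W rest-of-system load are illustrative assumptions, not measurements):

```python
# Power available on a PSU's 12 V side: P = V * A.
def rail_watts(amps, volts=12.0):
    """Wattage a rail can deliver at the given current rating."""
    return amps * volts

# The LP-600's combined 12 V rating, as quoted above:
available = rail_watts(36.7)  # 440.4 W on the 12 V side

# Illustrative load: ~250 W for a stressed 280X plus ~150 W for the CPU
# and the rest of the system (assumed figures, not measurements).
load = 250 + 150
headroom = available - load   # ~40 W of margin left on the 12 V side
```

Under those assumed loads the PSU is close to its 12 V limit, which is why people recommend a stronger unit even though it nominally "works".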


----------



## redfaction95

Quote:


> Originally Posted by *Devildog83*
> 
> Nice job, nice card. What clocks are you running, if you're overclocking at all?


Sorry, I didn't know how to quote as I'm new.
Refer to post #7181.


----------



## HoneyBadger84

Lawl dat PSU doe. That's so silly


----------



## redfaction95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Lawl dat PSU doe. That's so silly


So I should not try plugging in my 280x then?


----------



## HoneyBadger84

Quote:


> Originally Posted by *redfaction95*
> 
> So I should not try plugging in my 280x then?


Oh, it should be okay if you can plug one connector into each rail. They aren't really that power hungry: I only saw about 850 W max load at the wall with 3 280Xs running [email protected] at 1111 MHz core (no voltage increase). And keep in mind my system idles at about 190-230 W.


----------



## redfaction95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Oh, it should be okay if you can plug one connector into each rail. They aren't really that power hungry: I only saw about 850 W max load at the wall with 3 280Xs running [email protected] at 1111 MHz core (no voltage increase). And keep in mind my system idles at about 190-230 W.


Got it.
But how am I supposed to "plug one connector into each rail"? Somebody else suggested that too, but how do I actually do it?
Is it not as simple as plugging one 8-pin and one 6-pin (my PSU already has 1x 8-pin and 1x 6-pin connector) into the card as it demands? Sorry if I'm sounding like a noob.


----------



## HoneyBadger84

Quote:


> Originally Posted by *redfaction95*
> 
> Got it.
> But how am I supposed to "plug one connector into each rail"? Somebody else suggested that too, but how do I actually do it?
> Is it not as simple as plugging one 8-pin and one 6-pin (my PSU already has 1x 8-pin and 1x 6-pin connector) into the card as it demands? Sorry if I'm sounding like a noob.


Let me look up that unit real quick...

I can't find anything too specific, but if it only has two PCIe plugs in total, it should put one plug on each rail automatically. I'd go ahead and give it a go, and run something stressful on the GPU like 3DMark Fire Strike, Unigine Valley or similar. If it shuts down, you know your PSU doesn't have enough juice.


----------



## redfaction95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Let me look up that unit right quick...
> 
> I can't find anything too specific, but it should have one plug on each rail automatically if it only has 2 plugs period. I'd go ahead & give it a go, and run something stressful on the GPU like 3DMark FireStrike, Unigine Valley or similar. If it shuts down, you know your PSU doesn't have enough juice.


Oh OK, that's fine. But are you sure it will just shut itself down rather than kill anything? Anyway, I'll be upgrading to a Rosewill Capstone 550, which gives 546W (45.5A) on a single 12V rail, within 20 days (I've already ordered it).


----------



## HoneyBadger84

Quote:


> Originally Posted by *redfaction95*
> 
> Oh OK, that's fine. But are you sure it will just shut itself down rather than kill anything? Anyway, I'll be upgrading to a Rosewill Capstone 550, which gives 546W (45.5A) on a single 12V rail, within 20 days (I've already ordered it).


I doubt the unit is so bad as to fry something instead of just shutting down due to OCP. If you don't want to chance it, I'd just wait.


----------



## crazymania88

As I told him in private messages, I'll post it here too.
I have the same PSU. With my ~200W FX-8320 it shuts itself down only RARELY (not even under full GPU + CPU stress at the same time); otherwise it works.

It just occasionally trips into protection.
I'm going to switch to a 1000W PSU that someone will give me for free; otherwise I'll get an EVGA SuperNOVA 750 or a Seasonic G-750.

But that PSU also caused me to "fry" my motherboard. Actually not really, but something like this: I had a short in the case, and while looking for it I punched the case (not that hard). Things got strange on screen, and when I restarted I saw the (AMD) socket temperature stuck at 120°C+. I thought it was a sensor issue, but two minutes later, when I booted into Windows, I got that smell. (Yeah, the damn M5A97 R2.0 let the PC run at 120°C!)

I shut it down, removed the mobo from the case, and ran it outside the case. The mobo still works; nothing is wrong (I'm even running a 1.4V LLC overclock on that 120°C smelly socket). Every time I turn the PC on after a long while, a smell comes from the case. The question is: what did I actually burn? (The CPU socket, at 120°C+?) And how does it still run at 4.3GHz/1.4V if I have a burned CPU socket or VRM?

I seriously need a detective. It's open for bets; go on.


----------



## redfaction95

Quote:


> Originally Posted by *crazymania88*
> 
> As I told him in private messages, I'll post it here too.
> I have the same PSU. With my ~200W FX-8320 it shuts itself down only RARELY (not even under full GPU + CPU stress at the same time); otherwise it works.
> 
> It just occasionally trips into protection.
> I'm going to switch to a 1000W PSU that someone will give me for free; otherwise I'll get an EVGA SuperNOVA 750 or a Seasonic G-750.


Yeah, that was really helpful; can't wait to test mine ASAP on that LP 600.
And why 750? If you're planning on a single GPU, don't go above ~650W, or you'll sit at a less efficient point on the load curve.


----------



## crazymania88

Quote:


> Originally Posted by *redfaction95*
> 
> Yeah, that was really helpful; can't wait to test mine ASAP on that LP 600.
> And why 750? If you're planning on a single GPU, don't go above ~650W, or you'll sit at a less efficient point on the load curve.


The FX-8320 will go 200W+; I want to be sure.
And the EVGA SuperNOVA 750 and Seasonic G-750 are cheap considering the other junk around here: $170 for excellent voltage regulation, excellent ripple, Gold rated.

So instead of getting a Corsair C(rap)X for $140, I'd rather get the EVGA or the Seasonic G-750.
It's practically impossible to find a good 600W PSU here; they're all junk, or priced right alongside the EVGA 750W and the G-750.

I really wish I could find a good 600W unit here. The only one is the Seasonic M12II-620 (not the EVO version), and HardOCP says its voltage regulation is bad and it's a poor unit...

It costs around $110 here.


----------



## redfaction95

Quote:


> Originally Posted by *crazymania88*
> 
> The FX-8320 will go 200W+; I want to be sure.
> And the EVGA SuperNOVA 750 and Seasonic G-750 are cheap considering the other junk around here: $170 for excellent voltage regulation, excellent ripple, Gold rated.
> 
> So instead of getting a Corsair C(rap)X for $140, I'd rather get the EVGA or the Seasonic G-750.
> It's practically impossible to find a good 600W PSU here; they're all junk, or priced right alongside the EVGA 750W and the G-750.
> 
> I really wish I could find a good 600W unit here. The only one is the Seasonic M12II-620 (not the EVO version), and HardOCP says its voltage regulation is bad and it's a poor unit...
> 
> It costs around $110 here.


Oh yes, the Amazon problem.
750W + Gold + Seasonic G series or EVGA would be great; the G2 and P2 are very solid for a power-hungry 280X and your beast of a 200W CPU.


----------



## Roaches

Mind if I ask whether a 630W unit can handle an FX-8350 and a 7970 Lightning? My PSU is a Rosewill Green 630W; I own a few of them from the 2013 holiday sales...

http://www.hardwaresecrets.com/article/Rosewill-Green-Series-630-W-RG630-S12-Power-Supply-Review/881

Just want to be sure I won't run into any shutdowns in case I run the FX overclocked during some bench tests I'll be doing later this month.


----------



## DarthBaggins

I don't think you'll have issues with the 630 psu


----------



## Roaches

Quote:


> Originally Posted by *DarthBaggins*
> 
> I don't think you'll have issues with the 630 psu


Thanks. I tested it the other day paired with a 270 and a 270X and it did fine; I haven't paired it with a 7970 yet out of concern about its power draw.


----------



## slick2500

Quote:


> Originally Posted by *Devildog83*
> 
> *crazymania88, Slick2500 and HoneyBadger84 have been added to the club* Welcome all !!!
> 
> Sorry for being so absent lately, very busy with work and life at the moment but I promise to be around more.


I'm running a Vapor-X


----------



## HoneyBadger84

Quote:


> Originally Posted by *Roaches*
> 
> Mind if I ask whether a 630W unit can handle an FX-8350 and a 7970 Lightning? My PSU is a Rosewill Green 630W; I own a few of them from the 2013 holiday sales...
> 
> http://www.hardwaresecrets.com/article/Rosewill-Green-Series-630-W-RG630-S12-Power-Supply-Review/881
> 
> Just want to be sure I won't run into any shutdowns in case I run the FX overclocked during some bench tests I'll be doing later this month.


630W is enough for pretty much any single-GPU, single-CPU system. Unless you're getting up into the ridiculous range on the CPU OC and overvolting the crap out of your GPU, I can't see you getting anywhere near 630W.
Quote:


> Originally Posted by *Roaches*
> 
> Thanks. I tested it the other day paired with a 270 and a 270X and it did fine; I haven't paired it with a 7970 yet out of concern about its power draw.


The power draw difference between a 270X and a 7970 is noticeable but not enough to be concerned about; we're talking maybe 30-40W at most, nothing big.


----------



## Roaches

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 630W is enough for pretty much any single-GPU, single-CPU system. Unless you're getting up into the ridiculous range on the CPU OC and overvolting the crap out of your GPU, I can't see you getting anywhere near 630W.
> The power draw difference between a 270X and a 7970 is noticeable but not enough to be concerned about; we're talking maybe 30-40W at most, nothing big.


Thanks for the heads up, 4.5 Ghz is my target clock...Just wanted to be sure it can handle a factory OC 7970 during a CPU overclock.


----------



## HoneyBadger84

Well, folding earlier today on a 290X and two 280Xs only hit about 650W at the wall, and all three were mildly OCed while my (power-hungry) CPU was at 4.2GHz. So I doubt even a good OC on your system will hit that much actual PSU draw, seeing as 650W at the wall is roughly 550W of actual load for my PSU.
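If you want to do that wall-to-load conversion yourself, here's a minimal sketch; the flat 85% efficiency figure is my own assumption for a roughly Gold-rated unit, not a number measured in this thread:

```python
# Convert AC watts measured at the wall into the DC load the PSU delivers.
# Assumption: a flat 85% efficiency; real efficiency varies with load.
def psu_load_from_wall(wall_watts: float, efficiency: float = 0.85) -> float:
    """DC watts delivered for a given AC draw at the wall."""
    return wall_watts * efficiency

print(psu_load_from_wall(650))  # 552.5, close to the ~550W figure above
```

Plug in your own meter reading and an efficiency guess from your unit's rating; it's ballpark math, but it's enough to sanity-check whether a PSU is actually near its limit.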


----------



## Particle

It depends on the apps of course. I'd like to harken back to the good old days of me running through the Battlefield 4 singleplayer campaign with quad 6970s and hitting 1100W. Curiously, BF4 draws substantially more power in the menus than in the game.


----------



## redfaction95

Can anybody tell me: if I downclock my core clock from the default 1020MHz to 800MHz or less in MSI Afterburner, will it decrease the power/amperage draw? Assume I touch only the core clock and nothing else; everything, including voltages, stays at stock.
Kindly reply ASAP.


----------



## HoneyBadger84

If you're going to underclock you might as well undervolt as well; turning down the clock isn't going to lessen the power draw much without also turning down the voltage.
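As a rough back-of-the-envelope illustration (my own numbers, not HoneyBadger84's): dynamic power scales roughly with frequency times voltage squared, which is why dropping the clock alone saves much less than dropping clock and voltage together:

```python
# Relative dynamic power versus an assumed 1020MHz / 1.2V stock point.
# P ~ f * V^2 is a rough model for switching power only; it ignores leakage.
def relative_dynamic_power(freq_mhz: float, volts: float,
                           base_freq: float = 1020.0,
                           base_volts: float = 1.2) -> float:
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

print(relative_dynamic_power(800, 1.2))   # underclock only: ~0.78x
print(relative_dynamic_power(800, 1.05))  # underclock + undervolt: ~0.60x
```

The 1.05V figure is a hypothetical undervolt for illustration; how far a given card can actually go varies chip to chip.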


----------



## redfaction95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> If you're going to underclock you might as well undervolt as well, turning down the clock isn't going to lessen the power draw too much without also turning down the voltage.


Actually I don't want to touch voltages, as I've never overclocked a GPU before. And how many volts should show in MSI Afterburner? My voltage graph keeps jumping while I'm in game, because of the crap PSU I think; it stays at 0.8V while I'm just roaming around Windows, but under load it touches 1.1V.

And temps are very good: I have a case with nearly zero ventilation and a very high ambient temperature, and I still get 44°C idle at 18% fan.


----------



## HoneyBadger84

Voltage is going to vary a bit in game based on the clock & GPU effort required to run said game. That's perfectly normal.


----------



## redfaction95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Voltage is going to vary a bit in game based on the clock & GPU effort required to run said game. That's perfectly normal.


Actually I ran a game for just 10 minutes. After the first 3 minutes, as the GPU came under load, the PC froze with that familiar "buzz" from the speakers, and I had to hold the power button for a complete hard shutdown. I think it was PSU-related (or the game, but I haven't had this kind of freeze in 2 years, so I doubt a game crash). After starting up again I downclocked to 800MHz and then played 7 more minutes as smooth as silk; but as you said, unless I also undervolt, the power draw will stay about the same.


----------



## xutnubu

Quote:


> Originally Posted by *redfaction95*
> 
> Just bought a R9 280X radeon, just wanted to share the unboxing:-
> http://www.pakgamers.com/forums/f139/unboxing-msi-r9-280x-oc-205941/


Congrats, mate.

From what I see, you and I both got the same batch, *31SB*, even manufactured on the same date.

This batch supposedly uses a custom VRM that is voltage locked, but in MSI AB you can adjust a kind of offset that works; I've managed to reduce my temperatures by undervolting.

It's weird though: if you check the sensors with HWiNFO64 you'll see that the VDDC is always locked to 1.2V, but the VRM voltage is the one that changes.


----------



## redfaction95

Quote:


> Originally Posted by *xutnubu*
> 
> Congrats, mate.
> 
> From what I see, you and I both got the same batch, *31SB*, even manufactured on the same date.
> 
> This batch supposedly uses a custom VRM that is voltage locked, but in MSI AB you can adjust a kind of offset that works; I've managed to reduce my temperatures by undervolting.
> 
> It's weird though: if you check the sensors with HWiNFO64 you'll see that the VDDC is always locked to 1.2V, but the VRM voltage is the one that changes.


Thanks a lot, mate.

Nice to know we share the same batch, and that's useful info. BTW, do you also think that decreasing only the core clock won't reduce the power draw? I'm currently using a not-so-good PSU and have already ordered a new one; until then I'm stuck with the crap one. I'm getting game hangups in NFS Rivals and Most Wanted that I never had with my less power-hungry 560 Ti, though I played FIFA 13 smooth as silk for an hour with no hangup.


----------



## Devildog83

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 630W is enough for pretty much any single-GPU, single-CPU system. Unless you're getting up into the ridiculous range on the CPU OC and overvolting the crap out of your GPU, I can't see you getting anywhere near 630W.
> The power draw difference between a 270X and a 7970 is noticeable but not enough to be concerned about; we're talking maybe 30-40W at most, nothing big.


I ran a 7870/270X crossfire plus an 8350, all overclocked, with Valley and Prime95 running at the same time, and drew only just over 600W from the wall. Actual load on the PSU was only about 550W.

The main things about PSUs are quality and performance. A 280X and 8320 should run all day long on a 550W PSU, provided it's a good one, even under heavy load. My 8350 @ 4.8 under Prime95 draws nowhere near 200W by itself.


----------



## xutnubu

Quote:


> Originally Posted by *redfaction95*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xutnubu*
> 
> Congrats, mate.
> 
> For what I see, you and I both got the same batch *31SB*, even manufactured on the same date.
> 
> This batch supposedly uses one custom VRM that is voltage locked, but using MSI AB you can adjust some type of offset that works, because I've managed to reduce my temperatures by undervolting.
> 
> It's weird though, if you check the sensors with HWiNFO64 you'll see that the VDDC is always locked to 1.2v but the VRM Voltage is the one that changes.
> 
> 
> 
> Thanks a lot mate
> 
> 
> 
> 
> 
> 
> 
> 
> Nice to know we share the same batch
> 
> 
> 
> 
> 
> 
> 
> and that is one useful info. And btw do you also think that decreasing "only" core clock will not help in any less power draw...?
> Actually I am currently using not-so-good PSU and already ordered a new one, till then I will be using the crap one. So I am experiencing game hangups like in NFS rivals and most wanted that I did not experienced with my less power hungry 560ti. But played FIFA 13 as smooth as silk for an hour with no hangup.

Decreasing the core clock alone is not going to do much to relieve PSU stress.

You have to undervolt, which is safer than overvolting anyway.

I'd enable voltage control in MSI AB, drag the slider left to -50mV, and then set the core clock to 800-900MHz.


----------



## Particle

The two 270X's are in now. My motherboard's VRM heatsink retention clip broke during the swap though. Stupid plastic push pins. I swapped out the GA-990FXA-UD3 for a 990FXA-GD80V2 and everything seems to work. Power draw is impressive during idle. My system pulls about 60 watts less at idle than it did with two 6970s in it.

The only peculiar thing to me is how both GPUs show up in device manager and GPU-Z says crossfire is enabled, but neither CCC nor AOD seem to see both cards.


----------



## HoneyBadger84

Did you redo your drivers with a cleaning in between?


----------



## Particle

I updated to the latest beta after posting and the issue ceased. After the upgrade I could see both GPUs in MSI Afterburner and had CrossFire options in CCC. I now believe the card was working correctly even before the driver upgrade, despite the visibility issue, since the frame rate in a game I tested was the same before and after, with high utilization reported on both GPUs once I could see them.


----------



## robmcrock

Can I get added to the club. Running a R9 270X vapor-x 1150/1500.

Cheers


----------



## M1kuTheAwesome

I've been off OCN for too long.
Since PSUs seem to be a popular topic: would my 750W Seasonic M12 II EVO handle a second 280X with my 8350 at 4.8GHz, or will I need a new one? I've seen a few second-hand cards on sale for half the price of a new one and I'm tempted. But if that means spending money on a new PSU as well, I simply can't afford it.


----------



## Recr3ational

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> I've been off OCN for too long.
> Since PSUs seem to be a popular topic, would my 750W Seasonic M12 II EVO handle a second 280X with my 8350 at 4.8GHz or will I need a new one? I've seen a few second hand cards on sale for half the price of a new one and I'm tempted. But if that means spending money on a new PSU as well I simply won't afford it.


Yeah that would be able to power 2x 280x and 8350.


----------



## Devildog83

*robmcrock* has been added, welcome to the club.

Nice little system there. I love the Obsidian series cases. I have the 750D, the big brother to your 450D and when I first saw it in person it was incredible. The pics and videos do not do these cases justice.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Recr3ational*
> 
> Yeah that would be able to power 2x 280x and 8350.


Even if I overvolt both cards to 1.3?
Even if not, I think I'm still going for it. I can live without the overvolt since 2 cards will still run everything completely maxed out for a pretty long time. No need to upgrade and I can finally spend my money elsewhere. Unless Crysis 4 gets made.


----------



## Devildog83

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Even if I overvolt both cards to 1.3?
> Even if not, I think I'm still going for it. I can live without the overvolt since 2 cards will still run everything completely maxed out for a pretty long time. No need to upgrade and I can finally spend my money elsewhere. Unless Crysis 4 gets made.


2 of them might be pushing it just a tad if you overclock at all. If it were me I would start looking at a slightly bigger PSU, but if you don't push too hard and it's a good unit you will most likely be OK. Even mine, as good as it is, I've been looking at upgrading to a Platinum 760 just for an extra 100W of headroom. From what I've read, most PSUs run most efficiently at 50 to 80% load, and I'm at the high end of that under extreme load. Is it worth the extra money? I don't know for sure, but I'm thinking about it.
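To see where a build sits relative to that 50-80% band, the arithmetic is just load over rating. The 550W-on-a-750W-unit example below is a hypothetical number for illustration, not a measurement from this thread:

```python
# Fraction of a PSU's rated capacity that a given DC load represents,
# compared against the commonly cited 50-80% efficiency sweet spot.
def load_fraction(load_watts: float, rated_watts: float) -> float:
    return load_watts / rated_watts

frac = load_fraction(550, 750)             # hypothetical 550W load, 750W unit
print(round(frac, 2), 0.5 <= frac <= 0.8)  # inside the sweet spot or not
```

Swap in your own load estimate and rating; if the fraction lands well above 0.8, that's the case where extra headroom starts to make sense.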


----------



## Recr3ational

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Even if I overvolt both cards to 1.3?
> Even if not, I think I'm still going for it. I can live without the overvolt since 2 cards will still run everything completely maxed out for a pretty long time. No need to upgrade and I can finally spend my money elsewhere. Unless Crysis 4 gets made.


I have run 2x 280X plus a single 7950 with a 4770K at 4.5GHz on an 850W Bronze PSU.
I've got pumps, fans, etc., and it was fine.
Buy the cards first, see if you can overclock etc., then buy the PSU?


----------



## DarthBaggins

The PSU should be fine; you could always check your rig's power consumption:thumb: That will give you an idea.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Devildog83*
> 
> 2 of them might be pushing it just a tad if you overclock at all. If it were me I would start looking to get a slightly bigger PSU but if you don't push too hard and it's a good one you will most likely be OK. Even mine as good as it is I have been looking to upgrade to a platinum 760 just for an extra 100w of headroom. From what I have read most PSU's run most efficiently at 50 to 80%, I am at the high end of that when under extreme load. Is it worth the extra money? I don't know for sure but am thinking about it.


Pretty much what I imagined, thanks for clearing it up. I thought if need be I might even try undervolting. I will probably look around for a new PSU though.
Quote:


> Originally Posted by *Recr3ational*
> 
> I have run 2x 280X plus a single 7950 with a 4770K at 4.5GHz on an 850W Bronze PSU.
> I've got pumps, fans, etc., and it was fine.
> Buy the cards first, see if you can overclock etc., then buy the PSU?


That will be the battle plan. Hopefully I'll be fine until at least Christmas with my current PSU though.
Quote:


> Originally Posted by *DarthBaggins*
> 
> The PSU should be fine, you could always check your rigs power consumption:thumb: And that will give you an idea


I haven't come across a power meter here yet, but I'll do some research, and if need be I'll order one from abroad.


----------



## Maximus Knight

hey guys!

i sold off my GTX 780 and just purchased this, as i plan to game only on weekends.

felt there wasn't a need for that much graphics horsepower, since i only use my PC to watch YT, reply to emails, write proposals and trade the markets.

hope i can join the club!

PowerColor R9 280 TurboDuo


----------



## robmcrock

Quote:


> Originally Posted by *Devildog83*
> 
> *robmcrock* has been added, welcome to the club.
> 
> Nice little system there. I love the Obsidian series cases. I have the 750D, the big brother to your 450D and when I first saw it in person it was incredible. The pics and videos do not do these cases justice.


Exactly how I felt. I didn't like the look of the case until I saw it at my local store. Cheers for the add.


----------



## BruceB

Quote:


> Originally Posted by *Maximus Knight*
> 
> hey guys!
> 
> i sold off my GTX 780 and have just purchased this as i plan to game only on weekends.
> 
> felt there wasn't a need for such graphic horsepower as i only use my PC to watch YT, reply emails, write proposals and trade the markets.
> 
> hope i can join the club!
> 
> *PowerColor R9 280 TurboDuo*


You sir have fine taste (and a damn clean system).

Stock clocks for now, or are you diving right in with a massive OC?


----------



## Particle

Do I need anything else to be part of the club? Pictured are two 270X GPUs. They're Diamond branded which I picked because they had the same output options as my Radeon 6970s. No new adapters needed.


----------



## HoneyBadger84

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> I've been off OCN for too long.
> Since PSUs seem to be a popular topic, would my 750W Seasonic M12 II EVO handle a second 280X with my 8350 at 4.8GHz or will I need a new one? I've seen a few second hand cards on sale for half the price of a new one and I'm tempted. But if that means spending money on a new PSU as well I simply won't afford it.


The most power consumption I saw from TriFire 280Xs was about 1000W at the wall, and that was with my CPU @ 4.6GHz, so I don't see a system with 2 cards taking more than 700W max unless you overvolt them a lot.


----------



## Maximus Knight

Quote:


> Originally Posted by *BruceB*
> 
> You sir have fine taste (and a damn clean system).
> 
> Stock clocks for now, or are you diving right in with a massive OC?


i just bought it so i haven't played with the clocks yet.

will play around tonight but i need to learn how hahaha

only familiar with over volt and bios flashing for my old 780 heehee

thanks for your compliment tho!


----------



## redfaction95

Quote:


> Originally Posted by *Maximus Knight*
> 
> i just bought it so i haven't played with the clocks yet.
> 
> will play around tonight but i need to learn how hahaha
> 
> only familiar with over volt and bios flashing for my old 780 heehee
> 
> thanks for your compliment tho!


Holy moly, extremely sick PC, congrats man. Can I ask which case and sleeving you're using?


----------



## Maximus Knight

Quote:


> Originally Posted by *redfaction95*
> 
> Holy moly, extremely sick PC, congrats man. Can I ask which case and sleeving you're using?


hahah sure thank you!

NZXT Phantom 630 Ultra Gunmetal

i'm using Silverstone extensions.

paracord ftw! =)


----------



## Maximus Knight

here is my GPU-Z!

my ASIC is just a mere 52, haha.. a good overvolter..?

i can't seem to download Afterburner; the link is dead..???


----------



## TopicClocker

I've been playing with an MSI R9 280. The R9 280 could be the best mid-range card right now for the price because of its overclocking capability; that's two cards from AMD with insane price-to-performance, the R9 280 and the R9 290!
The price drops on AMD cards have really done them well.

Although I'm unsure how well the R9 290 overclocks on average, I've seen a couple match the 290X.

Quote:


> Originally Posted by *Maximus Knight*
> 
> here is my GPU-Z!
> 
> 
> 
> my ASIC is just a mere 52 haha..good overvolter..?
> 
> i can't seem to DL afterburner as the link is dead..???


It might not be too bad. I got one with a 60-something ASIC to 1150MHz with ease; I haven't tried any higher yet.
That card looks lovely in your system, btw.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *HoneyBadger84*
> 
> The most power consumption I saw from TriFire 280Xs was about 1000W at the wall and that was with my CPU @ 4.6GHz so I don't see a system with 2 cards taking more than 700W max unless you overvolt them a lot.


So I should be fine. Thanks.
Now, what can you tell me about Gigabyte 280X cards? There's one on sale for only 130€, but I have no experience at all with Gigabyte cards. How's the cooler? Are they voltage locked? Etc. I have yet to confirm the revision of the card, but in general, what can you tell me about them?


----------



## Maximus Knight

Quote:


> Originally Posted by *TopicClocker*
> 
> I've been playing with an MSI R9 280, the R9 280 could be the best mid-range card currently for the price you pay because of it's overclocking capability, that's two cards from AMD that have insane price per performance, the R9 280 and the R9 290!
> The price drops on AMD cards have really done them well.
> 
> Although I'm unsure of how well the R9 290 overclocks on average, I've seen a couple matching the 290X.
> It might not be too bad, I got one that had 60 or 60 something ASIC to 1150MHz with ease, I haven't tried any higher yet.
> That card looks lovely in your system btw.


haha thank you very much!


----------



## agrims

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> So I should be fine. Thanks.
> Now, what can you tell me about Gigabyte 280X cards? There's one on sale for only 130€, but I have no experience at all with Gigabyte cards. How's the cooler? Are they voltage locked? Etc. I have yet to confirm the revision of the card, but in general, what can you tell me about them?


Gigabyte has an awesome cooler. Mine runs nice and cool, and at 1100/1500 the card runs like a top. It's also very quiet; just make sure you get the rev 2.0. They are voltage locked, but a simple BIOS mod will net you 1.325V, and you can slide the voltage down from there.


----------



## murzyn

I don't know if I should write here or make a new thread, but anyway:
I just bought a Gigabyte 280X, used for 5 months, and I have a problem with OC.
Testing in FurMark, the max stable core clock is 1140MHz; anything above that and I get artifacts.
Also, when I'm playing BF4 the only stable core clock is 1100MHz; if I add even 10MHz I get artifacts...
Temperatures are super low: after 2 hours spent in BF4 I had 64 degrees.
Is this normal? Looking at my temperatures I thought I would get a super high OC, but I fail every time.
[Sorry for my English, I'm from Poland.]


----------



## TopicClocker

Quote:


> Originally Posted by *murzyn*
> 
> I don't know if I should write here or make a new thread, but anyway:
> I just bought a Gigabyte 280X, used for 5 months, and I have a problem with OC.
> Testing in FurMark, the max stable core clock is 1140MHz; anything above that and I get artifacts.
> Also, when I'm playing BF4 the only stable core clock is 1100MHz; if I add even 10MHz I get artifacts...
> Temperatures are super low: after 2 hours spent in BF4 I had 64 degrees.
> Is this normal? Looking at my temperatures I thought I would get a super high OC, but I fail every time.
> [Sorry for my English, I'm from Poland.]


It could mean that your card isn't a great overclocker, but then again I'm unsure how well the 280X OCs on average.
Have you adjusted any voltages, and does this only happen when you overclock?


----------



## murzyn

Quote:


> Originally Posted by *TopicClocker*
> 
> It could mean that your card isn't a great overclocker, but then again I'm unsure of how well the 280X OCs.
> Have you adjusted any voltages or anything, and does this only happen when you overclock?


I only used Power Limit +20. GPU-Z says the card is running at 1.2V.
Without an overclock I don't have any problems with this card.


----------



## TopicClocker

Quote:


> Originally Posted by *murzyn*
> 
> I only used Power Limit +20. GPU-Z says the card is running at 1.2V.
> Without an overclock I don't have any problems with this card.


I've read that some Gigabyte 280Xs are voltage locked and that you may be able to unlock them through a BIOS flash; that could be what you need for a higher overclock, if you're willing to flash a BIOS (it can be risky).
Do you know the model number or the revision of your card, like rev 2.0?
Quote:


> Originally Posted by *agrims*
> 
> Gigabyte has an awesome cooler. Mine runs good and cool, and a 1100/1500 card is running like a top. It is also very quiet, just make sure you get the rev 2. Also they are voltage locked but a simple bios mod will net you 1.325 volts, and you can slide the voltage down from there.


----------



## murzyn

I've got rev 2.0, so should I flash my BIOS to get voltage control?
Which BIOS specifically should I download?


----------



## TopicClocker

Quote:


> Originally Posted by *murzyn*
> 
> I've got rev 2.0, so should I flash my BIOS to get voltage control?
> Which BIOS specifically should I download?


I believe so.
I'm unsure which bios exactly, but someone in the thread should be able to help you, good luck!


----------



## Devildog83

*Maximus Knight and Particle* have been added, welcome!!

Particle, can we see a pic of your cards please.

Maximus Knight, very nice system you have there. Some say red and black is overdone and I say if it looks good it looks good. I like my red and black.


----------



## Devildog83

Quote:


> Originally Posted by *Devildog83*
> 
> *Maximus Knight and Particle* have been added, welcome!!
> 
> Particle, can we see a pic of your cards please.
> 
> Maximus Knight, very nice system you have there. Some say red and black is overdone and I say if it looks good it looks good. I like my red and black.


You may have heard the old adage, mine is "Black, White and Red all over" just like a newspaper.


----------



## murzyn

Okay guys, does anyone know which BIOS I should use on my Gigabyte 280X rev 2.0 to unlock voltage control?


----------



## anubis1127

Quote:


> Originally Posted by *murzyn*
> 
> Okay guys, does anyone know which BIOS I should use on my Gigabyte 280X rev 2.0 to unlock voltage control?


I believe those are voltage locked.


----------



## End3R

If it's anything like my R9 270X, the voltage is locked in MSI Afterburner but NOT in TriXX.


----------



## TopicClocker

They're voltage locked but apparently you can flash a bios to unlock them.
Quote:


> Originally Posted by *agrims*
> 
> Gigabyte has an awesome cooler. Mine runs good and cool, and a 1100/1500 card is running like a top. It is also very quiet, just make sure you get the rev 2. Also they are voltage locked but a simple bios mod will net you 1.325 volts, and you can slide the voltage down from there.


----------



## CptnSlow

Do I qualify?

Here's my card.


----------



## agrims

Quote:


> Originally Posted by *murzyn*
> 
> Okay guys, does anyone know which BIOS I should use on my Gigabyte 280X rev 2.0 to unlock voltage control?


www.overclock.net/t/1450456/gigabyte-r9-280x-v2-voltage-lock-bypass-through-bios-mod

The answers you seek are in the rabbit hole above. Good luck!!


----------



## GreenJavelin

In da club!


----------



## Junyou

Got my XFX DoubleD 270x. Pics to follow.


----------



## murzyn

Thanks for the help, guys. Now at 1.3V I've got a stable 1160MHz core and a 1600MHz memory clock; that suits me.


----------



## Devildog83

Quote:


> Originally Posted by *CptnSlow*
> 
> Do I qualify
> 
> 
> 
> 
> 
> 
> 
> Here's my card.
> 
> 
> Spoiler: Warning: Spoiler!


You have been added. How about some clocks?


----------



## Devildog83

Quote:


> Originally Posted by *GreenJavelin*
> 
> In da club!
> 
> 
> Spoiler: Warning: Spoiler!


You have also been added. Again, how about those clocks? Please don't just say "stock clocks", I don't know what every card's stock clocks are. Another Obsidian in the club, nice!!


----------



## CptnSlow

Quote:


> Originally Posted by *Devildog83*
> 
> You have been added. How about some clocks.


I plan to, once I get my X-fire setup.


----------



## CptnSlow

Here's my current stock clock on my Gigabyte R9 280X OC 2.0


Spoiler: Warning: Spoiler!


----------



## redfaction95

Guys, can anyone help me a little bit?
I recently migrated from an NVIDIA 560 Ti to a 280x (bought nearly 25 days used, with warranty, in totally untouched condition: not a single speck of dust, sealed accessories). I did not reinstall the OS (Windows 8.1 x64), and I'm currently running the GPU on a weak PSU that will be replaced within 10 days; my Thermaltake Lite Power gives me [email protected] according to its specs. The problem is that I can play FIFA 13 smoothly for as long as I like, but in NFS Rivals, Most Wanted, or Battlefield 4 the game hangs completely after about a minute. Since I play in full screen I didn't know what was happening until I happened to run BF4 in windowed mode and saw a pop-up saying "display driver has stopped working and has been recovered".
I did a fresh install of the AMD 14.4 drivers after removing the previous NVIDIA ones (uninstalled them and then cleaned the registry from cmd), and after many more game hang-ups I also ran the latest Driver Sweeper in Safe Mode.
PS: there is also some screen tearing after that pop-up which remains until I restart my PC.
Temps are totally under control, even though I live in a relatively hot area.






Any help would be appreciated.
Thanks


----------



## xutnubu

Don't use Driver Sweeper, it's old and no longer supported.

Driver Fusion is the new iteration, but I would use Display Driver Uninstaller first.

Also do this, after using DDU:

http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers

And then install the latest WHQL drivers.

Edit:

Oh and since you had NVIDIA before, remove those drivers first using DDU, then proceed with what I typed above.

Always in Safe Mode, as DDU recommends.


----------



## Chargeit

Hey, I just noticed this owners club.

I have a HIS R9 270x in my AMD/AMD rig. The thing is a beast at this price point.



It's the one on the right.










*A picture I posted in another post showing my 780 next to my 270x.



*A post showing the nice clean stock OC running heaven 4.0 max @ 1080p.


----------



## CptnSlow

Quote:


> Originally Posted by *redfaction95*
> 
> Guys, can anyone help me a little bit?
> I recently migrated from an NVIDIA 560 Ti to a 280x (bought nearly 25 days used, with warranty, in totally untouched condition: not a single speck of dust, sealed accessories). I did not reinstall the OS (Windows 8.1 x64), and I'm currently running the GPU on a weak PSU that will be replaced within 10 days; my Thermaltake Lite Power gives me [email protected] according to its specs. The problem is that I can play FIFA 13 smoothly for as long as I like, but in NFS Rivals, Most Wanted, or Battlefield 4 the game hangs completely after about a minute. Since I play in full screen I didn't know what was happening until I happened to run BF4 in windowed mode and saw a pop-up saying "display driver has stopped working and has been recovered".
> I did a fresh install of the AMD 14.4 drivers after removing the previous NVIDIA ones (uninstalled them and then cleaned the registry from cmd), and after many more game hang-ups I also ran the latest Driver Sweeper in Safe Mode.
> PS: there is also some screen tearing after that pop-up which remains until I restart my PC.
> Temps are totally under control, even though I live in a relatively hot area.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any help would be appreciated.
> Thanks


What 280x are you using?


----------



## aaronsta1

here is my crossfire setup
1 MSI r9 270x Gaming OC and 1 MSI r9 270 Gaming OC





I wish they were both X models, but I don't have any issues, they seem to work well together, and I got the 2nd card really cheap.


----------



## anubis1127

I have two of the regular MSI 270s, and wish they were the X version.


----------



## aaronsta1

Quote:


> Originally Posted by *anubis1127*
> 
> I have two of the regular MSI 270s, and wish they were the X version.


The only real difference is the heatsink and the two power plugs.
Both of mine bench the same separately.


----------



## GreenJavelin

1 Ghz each


----------



## anubis1127

Quote:


> Originally Posted by *aaronsta1*
> 
> the only real difference is the heatsink and the 2 power plugs.
> both of mine bench the same separately.


Yeah, the X has that nice VRM sink on it, and obviously the extra 6-pin for more powwwaaa. I believe the X has an extra VRM phase too.

Both my 270s fold at a pretty steady 1100MHz, so they aren't terrible I don't think, but my 7870s can do 1200MHz+. The 270s using less power isn't necessarily a bad thing for my power bill though, as they are running 24/7.


----------



## Majentrix

R9 270 died thanks to faulty memory, and I couldn't be bothered going through the nightmare that is Gigabyte's RMA process again, so I bought this Sapphire R9 270x, the Vapor-X version specifically.


It barely fits in my case! Have it clocked at 1150/1600.


----------



## itzvenom

Hello everyone, is it possible to flash the BIOS of my Asus Radeon R9 280 DirectCU II TOP 3GB GDDR5 PCI-E to a 280x?

Thanks a lot,


----------



## murzyn

One more question: my card is now overclocked all the time and the voltage is 1.3V. What can I do if I want the card to downclock when it's not in use?


----------



## redfaction95

Quote:


> Originally Posted by *xutnubu*
> 
> Don't use Driver Sweeper, is old and no longer supported.
> 
> Driver Fusion is the new iteration but I would use Display Driver Uninstaller first.
> 
> Also do this, after using DDU:
> 
> http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers
> 
> And then install the latest WHQL drivers.
> 
> Edit:
> 
> Oh and since you had NVIDIA before, remove those drivers first using DDU, then proceed with what I typed above.
> 
> Always in Safe Mode, as DDU recommends.


Quote:


> Originally Posted by *CptnSlow*
> 
> What 280x are you using?


I am using an MSI R9 280X and yes, I used DDU in safe mode. I want to ask: are you guys sure this is related to drivers/software rather than hardware/PSU/GPU?
This is the one:
http://www.pakgamers.com/forums/f139/unboxing-msi-r9-280x-oc-205941/


----------



## Aleckazee

Got my new 280x, seems to be working ok, or at least it did until I broke the pc power button







so not sure what to do about that..
MSI R9 280x Gaming
stock 1200/1500 clocks for now.



If I fix the power button somehow and do a little cable management I should be able to secure the cover on haha.


----------



## Recr3ational

Quote:


> Originally Posted by *Aleckazee*
> 
> Got my new 280x, seems to be working ok, or at least it did until I broke the pc power button
> 
> 
> 
> 
> 
> 
> 
> so not sure what to do about that..
> MSI R9 280x Gaming
> stock 1200/1500 clocks for now.
> 
> 
> 
> If I fix the power button somehow and do a little cable management I should be able to secure the cover on haha.


You can buy a power-on switch cable that plugs into the motherboard header, and run the wire through a hole in the back or something.

You can get something like this. There's loads of them with custom switches.
I just found this one quickly.









http://www.amazon.co.uk/Replacement-Power-Button-Switch-Computer/dp/B008LT2Q6I

Edit: I love your small form factor rig btw. I want one. Your GPU looks massive.


----------



## Aleckazee

Quote:


> Originally Posted by *Recr3ational*
> 
> you can buy a power on cable that goes on the motherboard and you can have the wire leading to a hole in the back or something
> 
> You can get something like this. Theres loads of them with custom switches.
> I just found this quickly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.amazon.co.uk/Replacement-Power-Button-Switch-Computer/dp/B008LT2Q6I
> 
> Edit: I love your small factor rig btw. I want one. Your gpu looks massive.


Thanks, might get something like that, although I was hoping I could just replace the whole front I/O wiring, but I'm not sure how/where to get it.


----------



## TopicClocker

Quote:


> Originally Posted by *Majentrix*
> 
> 
> 
> It barely fits in my case! Have it clocked at 1150/1600.


I tried fitting an XFX R9 280 into a small mATX case, it was hell!
Quote:


> Originally Posted by *Aleckazee*
> 
> Got my new 280x, seems to be working ok, or at least it did until I broke the pc power button
> 
> 
> 
> 
> 
> 
> 
> so not sure what to do about that..
> MSI R9 280x Gaming
> stock 1200/1500 clocks for now.
> 
> 
> 
> If I fix the power button somehow and do a little cable management I should be able to secure the cover on haha.


Lovely build that is! I love Silverstone's SFF cases!


----------



## Recr3ational

Quote:


> Originally Posted by *Aleckazee*
> 
> thanks, might get something like that. altho I was hoping I could just replace the whole front io wiring but not sure how/where to get it.


If you want the stock on/off switch you'll have to ask the manufacturer; they might sell them or be able to replace one. Not worth it I think, mate.


----------



## Aleckazee

I'm just using a switch from a different case for now.
I can't seem to install the drivers correctly for the 280x though; the fans are running full speed all the time and HWMonitor sees it as a 7970. I used the onboard GPU to uninstall all previous NVIDIA and AMD drivers following the guide I found in another thread, but I still get the same issue. I've also tried reducing the fan speed in MSI Afterburner, which didn't do anything.
Am I missing something? I really don't want to reinstall Windows at this stage..


----------



## anubis1127

Quote:


> Originally Posted by *Aleckazee*
> 
> I'm just using a switch from a different case for now.
> I can't seem to be able to install the drivers correctly for the 280x tho, the fans are running full speed all the time and hw monitor is seeing it as a 7970. I used the onboard gpu to uninstall all previous nvidia and amd drivers following the guide I found in another thread but still get the same issue. I've also tried reducing the speed in msi which didn't do anything.
> Am I missing something? I really don't want to reinstall windows at this stage..


Was that a new card, or a new to you card (used)?


----------



## Aleckazee

New, this one here. I had to RMA the first one they sent me because it had artifacting issues.


----------



## hatchet_warrior

I'm having an issue that I was hoping fellow 280x users can help with. It all started with abysmal FPS in WildStar. I was getting sub-20 FPS in the open world in some places and well below 8 in large cities. I opened a support ticket, and after sending many system reports (dxdiag, Speccy, GPU-Z) it was determined that my card is only running at x8 despite being in a x16 slot. At first I called BS, as I doubt x8 vs x16 makes THAT big of a difference. But I was told to reseat the GPU and see if that made a difference. It did, but not in a good way. The first 2 boots caused a system lock-up. The 3rd boot was fine and stable. But now GPU-Z shows the card at x2. To check stability and performance I ran Valley and got a solid 1900 score and 45.5fps. That is less than 2fps lower than my max OC.

So my main question is: is my card really at x2, or is GPU-Z messed up? And how can I make it run at the x16 it should? At this point I don't care too much for WildStar support despite the effort they have put into helping me.


----------



## TopicClocker

Quote:


> Originally Posted by *hatchet_warrior*
> 
> I'm having an issue that I was hoping fellow 280x users can help with. It all started with abysmal FPS in WildStar. I was getting sub-20 FPS in the open world in some places and well below 8 in large cities. I opened a support ticket, and after sending many system reports (dxdiag, Speccy, GPU-Z) it was determined that my card is only running at x8 despite being in a x16 slot. At first I called BS, as I doubt x8 vs x16 makes THAT big of a difference. But I was told to reseat the GPU and see if that made a difference. It did, but not in a good way. The first 2 boots caused a system lock-up. The 3rd boot was fine and stable. But now GPU-Z shows the card at x2. To check stability and performance I ran Valley and got a solid 1900 score and 45.5fps. That is less than 2fps lower than my max OC.
> 
> So my main question is: is my card really at x2, or is GPU-Z messed up? And how can I make it run at the x16 it should? At this point I don't care too much for WildStar support despite the effort they have put into helping me.


WildStar is quite demanding on the CPU since it's an MMO. When I played the beta on my Phenom II X4 B55 @ 3.9GHz my frames were a bit low too. The game's performance may have changed since release and patches, though; how do other games perform for you?


----------



## hatchet_warrior

Everything else is great. Even WildStar stays below 60% CPU and below 25% GPU usage. I play Titanfall on insane, Sniper Elite 3 on max, Wolfenstein on high (maxed, it fluctuated too much), and Skyrim with quite a few mods to make it look pretty. I know I'm in need of a CPU upgrade, but I can't afford it right after a big GPU purchase.


----------



## buttface420

Would it be worth it to get a second 280x, or just wait it out for the new 300 series? Or do both, and get a second 280x for now until then?


----------



## buttface420

Also, I saw a YouTube vid of some dude who crossfired an HD 7950 with the 280x. He said his PC saw them both as a 7970. It was showing real good FPS in BF4.


----------



## hatchet_warrior

Quote:


> Originally Posted by *buttface420*
> 
> his pc saw them both as a 7970


I'm starting to discover that most PCs have no clue what is going on.


----------



## Recr3ational

Quote:


> Originally Posted by *buttface420*
> 
> also, i saw a youtube vid of some dude who crossfired a hd 7950 with the 280x. said his pc saw them both as a 7970. was showing real good fps on bf4


I tri-fired my 7950 with 2x 280x and it said 7900 series. How did he manage to get it to say 7970?


----------



## buttface420

Quote:


> Originally Posted by *Recr3ational*
> 
> I tri fired my 7950 with 2 x 280x and said it was 7900 series. How did he manage to get it to say 7970?


I don't know how. He was using an MSI Twin Frozr OC 7950, and I heard those run at about 7970 level when overclocked.

Here's the vid:

http://www.youtube.com/watch?v=IcC7zDWAeGY

Around 3:00 in he says his benchmarks, like 3DMark or whatever, recognize it as a 7970.
He's probably full of le poop


----------



## Arkanon

Quote:


> Originally Posted by *hatchet_warrior*
> 
> I'm having an issue that I was hoping fellow 280x users can help with. It all started with abysmal FPS in WildStar. I was getting sub-20 FPS in the open world in some places and well below 8 in large cities. I opened a support ticket, and after sending many system reports (dxdiag, Speccy, GPU-Z) it was determined that my card is only running at x8 despite being in a x16 slot. At first I called BS, as I doubt x8 vs x16 makes THAT big of a difference. But I was told to reseat the GPU and see if that made a difference. It did, but not in a good way. The first 2 boots caused a system lock-up. The 3rd boot was fine and stable. But now GPU-Z shows the card at x2. To check stability and performance I ran Valley and got a solid 1900 score and 45.5fps. That is less than 2fps lower than my max OC.
> 
> So my main question is: is my card really at x2, or is GPU-Z messed up? And how can I make it run at the x16 it should? At this point I don't care too much for WildStar support despite the effort they have put into helping me.


PCI-E bandwidth and speed drop when the card is in an idle state. When idle, it's normal for it to report the PCI-E state as x2. Check whether it still happens under load; under load it should bump up to x16.


----------



## TopicClocker

Quote:


> Originally Posted by *hatchet_warrior*
> 
> Everything else is great. Even WildStar stays below 60% CPU and below 25% GPU usage. I play Titanfall on insane, Sniper Elite 3 on max, Wolfenstein on high (maxed, it fluctuated too much), and Skyrim with quite a few mods to make it look pretty. I know I'm in need of a CPU upgrade, but I can't afford it right after a big GPU purchase.


Yeah, I know what you mean. That CPU will give you a good couple more years, don't worry, but MMOs won't run as well as they could.


----------



## Recr3ational

Quote:


> Originally Posted by *buttface420*
> 
> i dont know how. he was using a msi frozer OC 7950. i heard those overclocked run at about a 7970.
> 
> heres the vid:
> 
> http://www.youtube.com/watch?v=IcC7zDWAeGY
> 
> around the 3:00 minutes in he says on his benchmarks like 3dmark or whatever it recognizes it as a 7970.
> he's probably full of le poop


Oh well, when it's overclocked it does run about the same as a 7970. I had the same card. They were awesome under water.


----------



## rdr09

Quote:


> Originally Posted by *buttface420*
> 
> i dont know how. he was using a msi frozer OC 7950. i heard those overclocked run at about a 7970.
> 
> heres the vid:
> 
> http://www.youtube.com/watch?v=IcC7zDWAeGY
> 
> around the 3:00 minutes in he says on his benchmarks like 3dmark or whatever it recognizes it as a 7970.
> he's probably full of le poop


He is right. I did that in the past (7970/7950) . . .

http://www.3dmark.com/3dm11/6233890

Those 7950s can be had for $100 on the bay.

@Rec3, your 280s are both @ 1250. Wow.


----------



## Recr3ational

Quote:


> Originally Posted by *rdr09*
> 
> he is right. i did that in the past (7970/7950) . . .
> 
> http://www.3dmark.com/3dm11/6233890
> 
> those 7950s can be had for $100 in the bay.
> 
> @Rec3, your 280s are both @ 1250. Wow.


Thanks dude, water helps.


----------



## hatchet_warrior

Quote:


> Originally Posted by *Arkanon*
> 
> PCI-E bandwidth and speed drop when the card is in an idle state. When idle, it's normal for it to report the PCI-E state as x2. Check whether it still happens under load; under load it should bump up to x16.


It doesn't seem to. It only changes from PCI-E 1.1 to 2.0, which I believe is because my CPU only supports up to 2.0.

Last night I got it back to x16 after trying different slots with no luck. It was stable and benchmarked a few fps higher. I put the computer to sleep rather than shutting it down. This morning I woke it only to get a bluescreen. GPU-Z now says it is back at x2.

I'm just confused as to how, with zero movement, the number of lanes can change from 16 to 2.


----------



## rdr09

Quote:


> Originally Posted by *hatchet_warrior*
> 
> It doesn't seem to be. It only changes from PCI-E 1.1 to 2.0 which I believe is because my CPU only supports up to 2.0.
> 
> Last night I got it back to 16x after trying different slots with no luck. It was stable and bench marked a few fps higher. I put the computer to sleep rather than shutting it down. This morning I woke it only to get a bluescreen. GPU-Z is now saying it is back at 2x.
> 
> I'm just confused as to how, with zero movement, the number of lanes can change from 16 to 2.


Arkanon meant that at load it should only change from 1.1 to 2.0, unless both the GPU and the PCIe slot support 3.0, in which case it should show 3.0 - afaik.

Current cards, I've read, don't saturate 2.0, so there won't be much difference between 3.0 and 2.0. Supposedly it's only with high-end cards in multi-GPU setups of more than 2 cards that 3.0 starts to matter, i.e. when a performance difference becomes apparent.

The bluescreen, though, needs to be diagnosed. You can upload your dump file for analysis. Here is the forum for it . . .

http://www.overclock.net/f/17986/crash-analysis-and-debugging

edit: BTW, do you have 2 separate cables connected to your GPU?
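For a rough sense of why x16 vs x2 matters (and why 2.0 vs 3.0 often doesn't), the nominal per-lane PCIe rates can be worked out directly. The figures below are the standard published per-lane numbers, not measurements from any card in this thread:

```python
# Nominal usable bandwidth per PCIe lane, in MB/s.
# PCIe 1.x/2.0 use 8b/10b encoding; 3.0 uses 128b/130b,
# which is why 3.0 is roughly double 2.0 per lane.
PER_LANE_MB_S = {"1.1": 250, "2.0": 500, "3.0": 985}

def link_bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s."""
    return PER_LANE_MB_S[gen] * lanes / 1000

for gen in ("1.1", "2.0", "3.0"):
    for lanes in (2, 8, 16):
        print(f"PCIe {gen} x{lanes}: ~{link_bandwidth_gb_s(gen, lanes):.2f} GB/s")
```

So a 2.0 x16 link gives roughly 8 GB/s one way, while a drop to x2 leaves only about 1 GB/s, which is why a genuinely stuck x2 link would be worth chasing even if one benchmark barely noticed.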


----------



## buttface420

Nevermind, I'm too dumb to post a pic of my 280x


----------



## hatchet_warrior

Quote:


> Originally Posted by *rdr09*
> 
> Arkanon meant that at load it should only change to 2.0 from 1.1 unless both GPU and PCIe lane supports 3.0, then it should show 3.0 - afaik.
> 
> Current cards, i read, do not saturate 2.0, so there won't be much difference between the 3.0 and 2.0. i read that only highend cards in multi-gpu setups above 2 cards is when the 3.0 will start to matter. meaning performance becomes apparent.
> 
> edit: BTW, you have 2 separate cables connected to your GPU?


Yeah, I have 2 separate cables. I switched them and it seems to be holding at x16 through a few restarts. I'll keep an eye on it.

Thanks for the link to the BSOD forum. I never knew that existed. I'll post over there tonight.


----------



## Snucks

Can anyone tell me why I'm getting 60-90 FPS in Counter-Strike: GO with an MSI R9 280x and i5 4690k... Please help


----------



## End3R

Quote:


> Originally Posted by *Snucks*
> 
> Can anyone tell me why im getting 60-90FPS in Counter Strike GO with an MSI R9 280x and i5 4690k... Please help


Can you tell me why you're qqing about above 60 fps?


----------



## Snucks

People with similar specs to mine are getting 200+ fps.. and my GPU is not performing as it should... I think


----------



## hatchet_warrior

Try benchmarking it with Valley. Using CS to determine if you have a bad card is like saying "that car looks faster than mine, mine must be broken."

Also, some GPU-Z screenshots would help, and fill in your system specs in the system builder so we can always see what you're running.


----------



## buttface420

Quote:


> Originally Posted by *Snucks*
> 
> People with similar specs to me are getting 200+ fps.. and my gpu is not performing as it should... i think


Check your CPU for parked cores. Optimize your OS for gaming, run CCleaner to clear all temp files, cut off all background programs, and overclock your CPU. 60-90 frames is actually good.


----------



## Snucks

Please come to my thread, I have everything listed along with usage and temps... no one is helping me in that thread, all the info is there

http://forums.guru3d.com/showthread.php?p=4897648#post4897648


----------



## hatchet_warrior

Quote:


> Originally Posted by *Snucks*
> 
> please come to my thread i have everything listed along with usage and temps... no one is helping me in thread, all info is there
> 
> http://forums.guru3d.com/showthread.php?p=4897648#post4897648


I still would like to see some GPU-Z and even CPU-Z screenshots. Those tell us how your computer is running, not just what parts are in it. It is up to you if you want to provide them, but don't say no one is helping when you ignore requests for information. There are thousands of reasons you could be seeing low fps. But again, FPS means little on its own because it changes so much from game to game. I don't have CS:GO, so I can't even relate it to your system.

If you run almost any benchmark, people will be able to tell you if something is wrong with your card.

Also, this is Overclock.net, so I'll answer your questions on Overclock.net


----------



## Snucks

I'll post screenshots on my thread, h/o
I really appreciate/need the help <3


----------



## buttface420

With your setup you should get great fps. I'm not sure about CS:GO, but 60-90 is great tho. You are running the 14.4 Catalyst drivers; upgrading to 14.7 does help with some games like BF4 and adds Mantle.

If you have not un-parked your cores, then more than likely Windows has them parked, meaning one of your 4 cores runs at full speed while the other 3 are capped. There's an easy program you can download that checks whether they are parked, and if they are it can un-park them.

Also....you only have a 600 watt power supply, and an R9 280x calls for at least 750 watts. Not saying it can't work with what you have..but it may be underpowered, which could prevent the performance you should have.

You should download a benchmark program such as Unigine Heaven or Valley and tell us what you are getting; your 60-90 fps might be about right


----------



## Snucks

Quote:


> Originally Posted by *hatchet_warrior*
> 
> I still would like to see some GPU-Z and even CPU-Z screenshots. Those tell us how your computer is running, not just what parts are in it. It is up to you if you want to provide them, but don't say no one is helping when you ignore requests for information. There are thousands of reasons you could be seeing low fps. But again, FPS means little on its own because it changes so much from game to game. I don't have CS:GO, so I can't even relate it to your system.
> 
> If you run almost any benchmark, people will be able to tell you if something is wrong with your card.
> 
> Also, this is Overclock.net, so I'll answer your questions on Overclock.net


I am unable to post links to my thread yet, can I PM them to you?


----------



## Snucks

Quote:


> Originally Posted by *buttface420*
> 
> with your setup you should get great fps,im not sure about cs go.60-90 is great tho. you are running 14.4 catalyst, upgrading it to 14.7 does help with some games like bf4 and adds mantle.
> 
> if you have not un-parked your cores, then more than likely your windows has them parked. meaning one of your 4 cores are full speed while the other 3 are capped. theres a easy program to download that searches to see if they are parked,and if they are it can un park them.
> 
> also....you only have a 600 watt power supply, a r9 280x calls for at least a 750 watt. not saying it cant work with what you have..but it may be under powered which could prevent the performance you should have.
> 
> you should download a benchmark program such as unigine heaven or valley, and tell us what you are getting, your 60-90 fps might be about right


Everywhere I read said 600W was enough?  I'll download Heaven now


----------



## Snucks

Quote:


> Originally Posted by *buttface420*
> 
> with your setup you should get great fps,im not sure about cs go.60-90 is great tho. you are running 14.4 catalyst, upgrading it to 14.7 does help with some games like bf4 and adds mantle.
> 
> if you have not un-parked your cores, then more than likely your windows has them parked. meaning one of your 4 cores are full speed while the other 3 are capped. theres a easy program to download that searches to see if they are parked,and if they are it can un park them.
> 
> also....you only have a 600 watt power supply, a r9 280x calls for at least a 750 watt. not saying it cant work with what you have..but it may be under powered which could prevent the performance you should have.
> 
> you should download a benchmark program such as unigine heaven or valley, and tell us what you are getting, your 60-90 fps might be about right


Also, I have downloaded that tool to unpark cores and it didn't help too much. 60-90 is pretty low compared to other people's, though; people with the same specs or lower get 200+, that's why I'm worried. Plus I only get like 25-30 fps in DayZ


----------



## Snucks

Quote:


> Originally Posted by *Snucks*
> 
> everywhere i read said 600w was enough?  ill download heaven now


What settings should I run the Heaven benchmark at (1080p, anti-aliasing, etc.)?


----------



## buttface420

Actually, I believe you are right that a 600 watt should be enough power. Run it on the Extreme preset, 1080p if you can


----------



## redfaction95

A quality 600W is actually overkill for a single 280X. Way more than enough; I really do not understand why people exaggerate wattage requirements nowadays
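Rough numbers back this up. A minimal sketch of the estimate, where the per-component wattages are loose assumptions for a typical single-280X rig rather than measured values:

```python
# Rough worst-case draw estimates in watts (assumptions, not measurements).
COMPONENTS = {
    "R9 280X (board power)": 250,
    "quad-core CPU under load": 100,
    "motherboard + RAM": 50,
    "drives + fans": 30,
}

def psu_ok(psu_watts: float, headroom: float = 1.25) -> bool:
    """True if the PSU covers the estimated draw plus a 25% safety margin."""
    return psu_watts >= sum(COMPONENTS.values()) * headroom

total = sum(COMPONENTS.values())
print(f"Estimated peak draw: {total} W")  # 430 W
print("600 W PSU sufficient:", psu_ok(600))
print("500 W PSU sufficient:", psu_ok(500))
```

Even with a 25% margin on top of pessimistic component figures, a quality 600W unit clears the bar; vendor "750W recommended" figures assume cheap PSUs that can't deliver their label rating.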


----------



## Snucks

Quote:


> Originally Posted by *buttface420*
> 
> actually i believe you are right about a 600watt should be enough power. run it on the extreme preset,1080p if you can


Alright, I am running Valley right now on Extreme HD.. I'll post results when it's done.. it looks like I'm not even getting past 16fps... ***ack


----------



## Snucks

I don't think that is very good


----------



## buttface420

No, that is horrible, something is very wrong.

I have a 280x and a CPU half as good as yours, and I scored 1721: avg fps 41.1, min 16.2, max 81.4. Your score should be well over 2000.

Don't quote me on this, but you might have to RMA that card.

In CS:GO you should be getting 150-300 fps easy..maybe even close to 400 at times, and in DayZ around 60.
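The comparison being made here can be sketched as a simple check: take a known-good score for the same card and preset and flag anything far below it. The ~2000-point Valley reference and the hypothetical low score below come from this thread's discussion, not from any official baseline:

```python
def underperforming(score: float, reference: float, tolerance: float = 0.15) -> bool:
    """True if `score` falls more than `tolerance` (as a fraction) below `reference`."""
    return score < reference * (1 - tolerance)

# Rough Valley Extreme HD reference for an R9 280X, per this thread.
REFERENCE = 2000

print(underperforming(900, REFERENCE))   # hypothetical low score -> flagged
print(underperforming(1900, REFERENCE))  # within 15% of reference -> fine
```

The 15% tolerance is an arbitrary cushion for CPU, driver, and run-to-run variation; a score that misses even that wide a band points at a configuration problem or a faulty card rather than normal variance.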


----------



## Snucks

Quote:


> Originally Posted by *buttface420*
> 
> No, that is horrible, something is very wrong.
> 
> I have a 280x and a CPU half as good as yours, and I scored 1721: avg fps 41.1, min 16.2, max 81.4. Your score should be well over 2000.
> 
> Don't quote me on this, but you might have to RMA that card.
> 
> In CS:GO you should be getting 150-300 fps easy..maybe even close to 400 at times, and in DayZ around 60.


like i said man... this is a very sad day for the kingdom ;(


----------



## Snucks

Quote:


> Originally Posted by *Snucks*
> 
> like i said man... this is a very sad day for the kingdom ;(


When I ran CS:GO and that benchmark my GPU was at 99% usage too... I knew something was up.. I mean, I am not sure what else I should do.


----------



## End3R

Quote:


> Originally Posted by *Snucks*
> 
> like i said man... this is a very sad day for the kingdom ;(


Do you have supersampling/edge detect etc. forced on in CCC? If so, that can cause performance issues, especially if you're also using another program like Afterburner in conjunction with CCC.


----------



## Snucks

Quote:


> Originally Posted by *End3R*
> 
> Do you have supersampling/edge detect etc forced on in CCC? If so that can cause performance issues, especially if you're also using another program like afterburner (in conjunction with ccc).


What do you mean by CCC?


----------



## End3R

Quote:


> Originally Posted by *Snucks*
> 
> what do mean ccc?


Catalyst Control Center


----------



## buttface420

That score is so low it's almost like he is using the integrated graphics on his CPU and not the GPU at all, but it shows an R9 200 series..that is weird, bro.


----------



## Snucks

Quote:


> Originally Posted by *End3R*
> 
> Catalyst Control Center


What should I do? Go to CCC, turn off supersampling/edge detect and whatnot, and try the benchmark again?


----------



## buttface420

Quote:


> Originally Posted by *Snucks*
> 
> what should i do? go to ccc and turn of edge sampling and whatnot and try benchmark again?


Yes dude... open your CCC.. go to Gaming / 3D Application Settings.. check your anti-aliasing method.. it should be on multisampling. If it's on supersampling, that is your problem


----------



## End3R

Quote:


> Originally Posted by *Snucks*
> 
> what should i do? go to ccc and turn of edge sampling and whatnot and try benchmark again?


Make it look like this. You can choose to turn morphological filtering off if you want; I find it makes next to no difference and increases quality, so I never turn it off.


----------



## Snucks

Quote:


> Originally Posted by *End3R*
> 
> Make it look like this, you can choose to turn morphological filtering off if you want, I find it makes near to no difference and increases quality so I never turn it off.


alright i put those settings in... imma try benchmark again brb


----------



## buttface420

yeah, i just turned on supersampling in ccc and ran the bench and got about the same score as you did.. i got a 602


----------



## Snucks

Quote:


> Originally Posted by *Snucks*
> 
> alright i put those settings in... imma try benchmark again brb

alright this is the benchmark with the morphological filter off and the setting you listed put in


----------



## buttface420

Quote:


> Originally Posted by *Snucks*
> 
> alright i put those settings in... imma try benchmark again brb
> 
> alright this is the benchmark with the morphological filter off and the setting you listed put in

your fps problem should be over dude go play cs go now!


----------



## Snucks

Quote:


> Originally Posted by *buttface420*
> 
> your fps problem should be over dude go play cs go now!


+REP for both of you!


----------



## End3R

Quote:


> Originally Posted by *Snucks*
> 
> do either of ya two have steam?


Sure you can pm me for it if you wanna add me


----------



## Snucks

Quote:


> Originally Posted by *End3R*
> 
> Sure you can pm me for it if you wanna add me


pm'd


----------



## redfaction95

Repped both of you buttface and end3r








Btw I also get an 1800 score and 43 fps with my MSI 280X at 1020MHz, no OC, with a full-stock i5 2500K + 4GB 1333MHz RAM








It's quite close to your beasty CPU.


----------



## Snucks

Oh wow, nice... I've seen someone score as high as 2100-something with our GPU and an unlocked i5 4670K... crazy..


----------



## Snucks

Oh yeah, I have another question since it has to do with this thread... is coil whine normal with MSI R9 280X cards? I get it as soon as my GPU sees any game; the higher the fps, the louder the noise... is that normal?


----------



## DasThunder

If by normal you mean "is it common?", then yes. Is it supposed to happen? That depends on the luck of your draw. Coil whine, so long as it is just coil whine, is normal. It will not do any harm to your system or the card.

It will just drive you nuts if you do not wear a headset.







I had a coil whining card before I got my Vapor x 270x and it performed for 4.5 years with the coil whine perfectly fine.

Could always ask MSI to give you a new one so you don't have to put up with it.


----------



## TopicClocker

Quote:


> Originally Posted by *End3R*
> 
> Make it look like this, you can choose to turn morphological filtering off if you want, I find it makes near to no difference and increases quality so I never turn it off.


Quote:


> Originally Posted by *buttface420*
> 
> yes dude...openm your ccc..go to gaming/3d app settings.. check your anti aliasing method..it should be on multisampling. if its on super sampling that is your problem


This is lovely, glad to see you guys helped him out!


----------



## redfaction95

Any info on this error?
PS: I installed a fresh, fully updated 8.1 x64 and the 14.4 suite with original BF3, no bloatware/Afterburner; I get this error after 8-10 min of gameplay.


----------



## End3R

Shot in the dark but directx errors usually mean you need to update directx.


----------



## redfaction95

Quote:


> Originally Posted by *End3R*
> 
> Shot in the dark but directx errors usually mean you need to update directx.


Yeah, I tried running directx web installer and it says that it has detected a newer version or equivalent and ends on Finish.


----------



## End3R

Quote:


> Originally Posted by *redfaction95*
> 
> Yeah, I tried running directx web installer and it says that it has detected a newer version or equivalent and ends on Finish.


Hm, you may want to clear them out and start over from 9 and work your way up? Sometimes older versions have files the newer ones don't on the assumption you had the older one first.


----------



## buttface420

Quote:


> Originally Posted by *redfaction95*
> 
> Any info on this error
> PS Have installed fresh 8.1 x64bit fully update, 14.4 suite installed with original BF3 and no bloat ware/ afterburner, I get this error after 8-10min of gameplay.


this is a very common problem with bf3, could be one of many things

one cause is Xfire's ingame chat feature... switch it off and you shouldn't have any issues

to disable Xfire support: in the Xfire options, under games installed, click bf3 and click "disable Xfire support"

or

enable .NET Framework 3.5 in Windows; 8.1 has it disabled for some reason

or

update/reinstall your drivers

you can also try to downclock. these may or may not help you, it's just a crappy error bf3 likes to throw... are you using a GTX GPU?
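For the .NET Framework 3.5 step: on Windows 8/8.1 it can usually be enabled from an elevated command prompt with DISM. A sketch, not verified on every SKU; the `D:` media path in the offline variant is an assumption (substitute your mounted install media):

```shell
REM Enable the built-in .NET Framework 3.5 feature on Windows 8/8.1
REM (run from an elevated command prompt; pulls files from Windows Update)
DISM /Online /Enable-Feature /FeatureName:NetFx3 /All

REM If the machine has no internet access, point DISM at the install media
REM instead (D: assumed to be the mounted Windows 8.1 ISO/DVD):
DISM /Online /Enable-Feature /FeatureName:NetFx3 /All /LimitAccess /Source:D:\sources\sxs
```

You can also do the same thing through Control Panel > Programs > Turn Windows features on or off.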


----------



## redfaction95

Quote:


> Originally Posted by *buttface420*
> 
> this is a very common problem with bf3, could be one of many things
> 
> the cause is xfires ingame chat feature ....switch it off and you shouldn't have any issues
> 
> disable xfire support in xfire options games installed clicked bf3 and clicked disable xfire support
> 
> or
> 
> enable .net framework 3.5 in your windows, 8.1 has it disabled for some reason
> 
> or
> update drivers/reinstall drivers
> 
> you can also try to downclock. these may or may not help you, its just a crappy error bf3 likes to have...are you using gtx gpu?


Thanks
I did not encounter this error throughout my 560 Ti experience though.
And I am getting the same error in NFS rivals.
I will be trying fifa 13 in a while then report back.


----------



## redfaction95

I am using msi R9 280X OC


----------



## buttface420

are you running the card overclocked? if so, you may have to reset the MSI settings to default, or increase the voltage a tiny bit. you could also update the drivers to 14.7 beta and see if that helps.

a faulty psu could also cause this (not likely)


----------



## redfaction95

Quote:


> Originally Posted by *buttface420*
> 
> are you running the card overclocked? you may have to go and reset msi settings to default if so,or increase the voltage a tiny bit, you could also update the drivers to 14.7 beta and see if that helps.
> 
> a faulty psu could also cause this (not likely)


I will be replacing the PSU in 2 days and will check then; the card is @ full stock, no Afterburner, nothing installed except the drivers and the CCC that comes with it.


----------



## Crockturtle566

Hey guys, I just got a used 280X and it has been giving me nothing but issues all day. This is my first day owning it. When I try to play a game it will work for a second, but then the computer just reboots: no blue screen, no errors when I log into Windows. I checked HWMonitor and found this.
Those voltages, I'm pretty sure, come from the PSU, if I'm right? Is that most likely what is causing me pain?
Thanks, and if I get the issue resolved I'll be a proud member of the 280X club


----------



## itzvenom

Hello everyone, is it possible for me to flash my Asus Radeon R9 280 DirectCU II TOP 3GB GDDR5 PCI-E BIOS to a 280X?

Thanks a lot,


----------



## seanp177




----------



## aaronsta1

i think im getting some bad performance from my AMD system.



what do you guys think?
i have crossfire r9 270x clocked to 1120/1500 on a Gigabyte 990FXA board with an FX 8350 clocked to 4400


----------



## BruceB

Quote:


> Originally Posted by *Crockturtle566*
> 
> Hey guys I just got a used 280x and it has been giving me nothing but issues all day. This is the first day of me owning it. Now when I try to play a game it will work for a second but then the computer will just reboot no blue screen, no errors when I login into windows. I checked HW monitor and found this
> 
> 
> 
> 
> 
> 
> 
> 
> Those voltages I'm pretty sure are the psu if I'm right? Is that is what is causing me pain most likely?
> Thanks and if I get the issue resolved Ill be a proud member of the 280x club


It could be a number of things.
From what you've written here I'd say it's probably either the PSU shutting down due to too much power draw or the GPU overheating and forcing a shutdown.

Use _SpeedFan_ or _MSI Afterburner_ or something similar to check the GPU's temps/volts.
Could you also tell us more about your System? It'd be best to fill out a rig in your Profile and add it to your sig (then you won't get asked about it all the time!).









Quote:


> Originally Posted by *redfaction95*
> 
> I will be replacing the PSU in 2 days, will check then, the card is @ full stock, no afterburner nothing installed except the drivers and the CCC that comes with it.


This error usually appears when the card crashes from pushing the OC too far, as @buttface420 said. If you were OC'ing I would recommend either upping the voltage or dropping the OC.

Since you're at stock, I'd say try running _FurMark_ for a couple of minutes (*watch the temps!*); this will simulate max power draw. If you don't get an error then, I doubt it's the PSU. Either way, get in contact with your manufacturer and ask what they recommend; they might RMA it for you.


----------



## OmnesPotens

Hi, I made an account just for this problem that I've been having after building my first pc and getting everything set up correctly. So here's my problem (please excuse the length of the text, I've tried a ton of things.)

Ok so heres my system build: http://pcpartpicker.com/p/fjwKsY

For the most part the gpu was running great, getting very good and stable fps even on highest settings (usually 60fps if not better). I noticed one afternoon that the fps dropped considerably and was unstable. After hours of searching and troubleshooting I found that the most likely culprit were the drivers. So I did a complete driver uninstall and reinstalled the latest AMD drivers.

It worked even better and more stable than before on the highest settings for any game I tried. I even ran 3DMark firestrike 2 or 3 times to make sure it wasnt just a fluke, it wasnt. The next day (today) I come home to absolutely horrid performance even under full load. I ran 3DMark Firestrike to see if maybe it was just some kind of system error after rebooting and I still got god awful fps (I'm talking 0.5 - 2 fps).

I have not OCed anything. I have tried reseating the GPU, and constantly checked that cables were plugged in correctly, they were. My temps are always stable and according to MSI Afterburner the GPU clock speeds are consistent when running OCCT, 3DMark and Hitman Absolution Benchmark, but I get horrid fps.

*I've determined, after a full clean install of windows that no matter what drivers I use for my R9 270x the fps will be stunted awfully if I power on my pc from an off-state. But as soon as I restart my performance goes up to blazing speeds. I'm talking rock solid 60 fps on max settings as well as on default settings (I never turn on MSAA though).*

If you'd like to look through the thread for my specific troubleshooting process on Tom's Hardware its here: http://www.tomshardware.com/answers/id-2267060/radeon-270x-giving-issues-inconsistent-performance.html#14019104

I don't know what else to do. I think an RMA may be the only way to get the performance I should be getting with a regular boot up from off-state. I just don't want to RMA my GPU unless I know for sure that it is the source of my problem and not something else.

Help please!


----------



## buttface420

Quote:


> Originally Posted by *OmnesPotens*
> 
> Hi, I made an account just for this problem that I've been having after building my first pc and getting everything set up correctly. So here's my problem (please excuse the length of the text, I've tried a ton of things.)
> 
> Ok so heres my system build: http://pcpartpicker.com/p/fjwKsY
> 
> For the most part the gpu was running great, getting very good and stable fps even on highest settings (usually 60fps if not better). I noticed one afternoon that the fps dropped considerably and was unstable. After hours of searching and troubleshooting I found that the most likely culprit were the drivers. So I did a complete driver uninstall and reinstalled the latest AMD drivers.
> 
> It worked even better and more stable than before on the highest settings for any game I tried. I even ran 3DMark firestrike 2 or 3 times to make sure it wasnt just a fluke, it wasnt. The next day (today) I come home to absolutely horrid performance even under full load. I ran 3DMark Firestrike to see if maybe it was just some kind of system error after rebooting and I still got god awful fps (I'm talking 0.5 - 2 fps).
> 
> I have not OCed anything. I have tried reseating the GPU, and constantly checked that cables were plugged in correctly, they were. My temps are always stable and according to MSI Afterburner the GPU clock speeds are consistent when running OCCT, 3DMark and Hitman Absolution Benchmark, but I get horrid fps.
> 
> *I've determined, after a full clean install of windows that no matter what drivers I use for my R9 270x the fps will be stunted awfully if I power on my pc from an off-state. But as soon as I restart my performance goes up to blazing speeds. I'm talking rock solid 60 fps on max settings as well as on default settings (I never turn on MSAA though).*
> 
> If you'd like to look through the thread for my specific troubleshooting process on Tom's Hardware its here: http://www.tomshardware.com/answers/id-2267060/radeon-270x-giving-issues-inconsistent-performance.html#14019104
> 
> I don't know what else to do. I think an RMA may be the only way to get the performance I should be getting with a regular boot up from off-state. I just don't want to RMA my GPU unless I know for sure that it is the source of my problem and not something else.
> 
> Help please!


could be a virus, or even overheating due to dust. worst case would be faulty vram on your gpu. check the cpu for overheating also.


----------



## OmnesPotens

I've been checking temps with HW Monitor, OCCT and MSI Afterburner on screen display no problems show up, all temps are well within safe zone. I've scanned my system for all possible things with Malwarebytes, AVG and Windows Defender and nothing comes up. It's a brand new build and I haven't seen any dust accumulating. I even did a full reinstall of Windows 8.1 last night and the problem persists regardless :/


----------



## buttface420

ahhh okay. then if it's not temperature, virus, or drivers, one thing i would suggest is trying a different video card to see if the problem keeps going, or trying your video card in another pc. it may be that your gpu is having issues.

it could also be a motherboard issue, or a memory leak.


----------



## OmnesPotens

Hmm, how would I determine if it's the mobo or a memory leak? I don't really have any other pc that I can check the GPU with, unfortunately.


----------



## redfaction95

Quote:


> Originally Posted by *buttface420*
> 
> ahhh okay.then if its not temperature,virus,or drivers then one thing i would suggest is trying a different video card to see if the problem keeps goin or try your video card in another pc. it may be your gpu is having issues.
> 
> also could be a motherboard issue, or a memory leak.


Bro, here is a bit of an update:
Watch Dogs on full ultra runs extra smooth, and it runs forever with no crash; the temps do NOT exceed 75C even at 99% usage. As another mate suggested, I checked by running FurMark
Quote:


> Originally Posted by *BruceB*
> 
> It could be a number of things.
> From what you've written here I'd say it's probably either the PSU shutting down due to too much power draw or the GPU overheating and forcing a shutdown.
> 
> Use _SpeedFan_ or _MSI Afterburner_ or something similar to check the GPU's temps/volts.
> Could you also tell us more about your System? It'd be best to fill out a rig in your Profile and add it to your sig (then you won't get asked about it all the time!).
> 
> 
> 
> 
> 
> 
> 
> 
> This error is usually given when the Card crashes due to pushing the OC too far, as @buttface420 said. If you were OC'ing I would reccomend either upping the voltage or dropping the OC.
> 
> Since you're at stock I'd say try running _FurMark_ for a couple of mins (*watch the temps!*), this will simulate max powerdraw. If you don't get an error then, I doubt it's the PSU. Either way, get in contact with your manufacturer and ask what they recommend, they might RMA it for you.


Quote:


> Originally Posted by *buttface420*
> 
> ahhh okay.then if its not temperature,virus,or drivers then one thing i would suggest is trying a different video card to see if the problem keeps goin or try your video card in another pc. it may be your gpu is having issues.
> 
> also could be a motherboard issue, or a memory leak.


As you said, I tried FurMark and some more games, so here is the update:
Watch Dogs runs damn fine with everything @ ultra 1080p; GRID Autosport and FIFA 13 also.
FurMark runs damn fine without a single crash, no prob.
The problem comes up in BF3, BF4, NFS Rivals and Most Wanted.
Here is a 10min run:


And here are some snaps of the error that comes in *stated games*


----------



## aaronsta1

Quote:


> Originally Posted by *redfaction95*
> 
> Bro got here is bit of an update:
> Watch dogs on full ultra runs Extra smooth, and it runs like forever noi crash and the temps do NOT exceed 75C with even 99% usage, as said by another mate, I checked running Furmark
> 
> As you said, I tried furmark and some more game so here is the update:
> Watch dogs runs damn fine on everything @ ultra 1080p, grid autosport and fifa 13 also.
> Furmark runs damn fine without even a single crash no prob.
> This problem come in BF3 BF4 NFS rivals and mostwanted.
> Here is a 10min run:
> 
> 
> And here are some snaps of the error that comes in *stated games*


its a bad OC.

ive had the same issues..
you can run unigine or 3dmark with no problems, even some games work great..

but BF4, or fill-in-the-blank game, crashes.. that error with the badly formed commands also pops up when you have a bad CPU OC..

since you are running at 80c, i wouldnt suggest upping the voltage..

lower the gpu clock to like 980 and see if it still crashes; if your CPU is OC'd then try lowering that as well.


----------



## redfaction95

Quote:


> Originally Posted by *aaronsta1*
> 
> its a bad OC.
> 
> ive had the same issues..
> you can run unigine, or 3dmark with no problems even some games work great..
> 
> but BF4 or fill in the blank game crashes.. that error with the badly formed commands also pops up when you have a bad CPU OC too..
> 
> since you are running at 80c, i wouldnt suggest upping the voltage..
> 
> lower the gpu clock to like 980 and see if it still crashes, if your CPU is OC then try to lower that as well.


Suggestion noted, will do it and report back
Thanks.
PS RMA will be my nightmare.


----------



## aaronsta1

do you guys suggest upping the idle memory clocks?

these cards have a 150mhz idle memory clock, and while browsing the web with chrome i get a ton of screen flicker when i scroll..

its most annoying..

i dont notice it so much on my 7870 cards, but those cards have a 300mhz idle clock. it still does it tho..

has anyone found a good idle clock that doesnt flicker??


----------



## Recr3ational

Quote:


> Originally Posted by *aaronsta1*
> 
> do you guys suggest upping the idle memory clocks?
> 
> these cards have a 150mhz memory idle clock and while browsing web with chrome i get a ton of screen flickers when i scroll..
> 
> its most annoying..
> 
> i dont notice it so much on my 7870 cards but those card have a 300 mhz idle clock. it still does it tho..
> 
> has anyone found a good idle clock that doesnt flicker??


Just up it using afterburner. My idle memory clock is at 1650 lol.


----------



## buttface420

Quote:


> Originally Posted by *redfaction95*
> 
> Suggestion noted, will do it and report back
> Thanks.
> PS RMA will be my nightmare.


so this is ONLY happening when you try to play Origin (EA) games? if so, it might not be a hardware problem

Open Origin, click on the Origin drop-down menu, select Application Settings, then select the Origin In Game tab.

De-select the Origin In Game checkbox, exit Origin, and restart the pc. if that doesn't work, also try unchecking "enable cloud storage for all games"

or

you could also try running Origin/the game as administrator


----------



## TechnoVixen

Can't be arsed to take a pic of the card inside my system right now but it is a Gigabyte R9 270 OC Windforce.


----------



## redfaction95

Quote:


> Originally Posted by *buttface420*
> 
> so this is ONLY happening when you try to play origin(EA) games? if so it might not be a hardware problem
> 
> Open Origin, click on Origin drop down menu, Select Application Settings, Select the Origin in Game TAB.
> 
> De-select the origin in game checkbox. exit origin restart pc. if that doesnt work also try uncheck the "enable cloud storage for all games
> or
> 
> you could also try running origin /game as administrator


Mm, precisely; it happens in the original Origin game BF3, and beyond that it happens in my BlackBox non-Origin cracked NFS Most Wanted, Rivals and BF4.
Let me try one more Origin game to get some results. I also have FIFA World; I will try it and report back on stock clocks as before, and then with the GPU underclocked


----------



## nitrubbb

I have an MSI R9 270X 4G; where can I get the latest BIOS for it?

I opened GPU-Z and on the BIOS line chose "Save to file", which caused the picture to freeze and, a few seconds later, the monitor to go into stand-by mode. Why?


----------



## redfaction95

Quote:


> Originally Posted by *aaronsta1*
> 
> its a bad OC.
> 
> ive had the same issues..
> you can run unigine, or 3dmark with no problems even some games work great..
> 
> but BF4 or fill in the blank game crashes.. that error with the badly formed commands also pops up when you have a bad CPU OC too..
> 
> since you are running at 80c, i wouldnt suggest upping the voltage..
> 
> lower the gpu clock to like 980 and see if it still crashes, if your CPU is OC then try to lower that as well.


Problem solved by this method ^^ Repped you, and special thanks to @buttface420
Downclocked to 1000MHz (AMD's stock clock) because some games do not get along with the factory OC (in my case MSI OC'ed it to 1020MHz out of the box, unlike the original 1000MHz from AMD)
PS: I used the default AMD OverDrive in Catalyst Control Center to put the card at the stock 1000MHz.
This place is CHAMP.
Thanks


----------



## buttface420

add me to the 280x club bros , sapphire dual-x r9 280x oc @ 1050/1500


----------



## mcdoubleyou

that cable management though...


----------



## aaronsta1

Quote:


> Originally Posted by *redfaction95*
> 
> Problem Solved by this method ^^ Repped you and special thanks to @buttface420
> Downclocked to 1000mhz (AMD's stock clock) because some games do not support the OC that is usually factory done by (for example in my case MSI OC'ed it to 1020mhz out of the box unlike original 1000mhz from AMD)
> PS I used default AMD overdrive from Catalyst Control Center to put the card at stock 1000mhz.
> This place is CHAMP.
> Thanks


ok, so now we've determined the cause of the crashes; we have to figure out why.

MSI cards can usually run at the default speed.. actually, if you got the MSI Gaming 3G 280X i think it's rated at 1050 in OC mode.

you said it was running hot and showed a pic of 80c. if the card is running that hot, the vrms might be throttling the clocks down in-game and causing the game to crash.

possible causes of heat: poor ventilation in the case.. some cases aren't very good..
you can use MSI's Afterburner program to set a custom fan profile that will speed the fans up more than default.. by default these cards are set to run silent..

you can run gpu-z with logging to a file while running bf4 and check how hot it's getting while playing a game.

the 20mhz won't hurt performance one bit, but there is still a possibility of a bad card; it could also be your power supply..
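On the GPU-Z logging point: "Log to file" writes a comma-separated sensor log, so finding the peak temperature after a session is a one-liner. A minimal sketch; the sample rows and the assumption that temperature sits in column 3 are hypothetical (check your own log's header, since columns vary by card and GPU-Z version):

```shell
# Hypothetical three-line excerpt of a GPU-Z sensor log (headers vary by card):
cat > gpuz-sample.csv <<'EOF'
Date , GPU Core Clock [MHz] , GPU Temperature [C] , Fan Speed (%)
2014-08-30 20:01:00 , 1020.0 , 71.0 , 45
2014-08-30 20:01:01 , 1020.0 , 78.0 , 52
2014-08-30 20:01:02 , 501.0 , 80.0 , 55
EOF

# Peak temperature = max of column 3, skipping the header row
awk -F',' 'NR > 1 && $3+0 > max { max = $3+0 } END { print max }' gpuz-sample.csv   # prints 80
```

Point the awk line at your real log file and adjust the column number to match whichever sensor you want the peak of.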


----------



## OmnesPotens

I believe I may have figured out the cause of my troubles. It seems my Gigabyte R9 270X OC Edition GPU refuses to run properly in its supposedly "supported" PCIe Gen3 mode. If it runs in Gen3 mode the fps is stunted to around 6 fps, and even then it is extremely unstable. I'm not sure if this is due to my Gigabyte GA-H97M-HD3 motherboard or if it's a GPU issue, but from what I can tell the R9 270X is supposed to run in PCIe Gen3 mode whenever a graphics-intensive program/application is running. So, any ideas on what I should do about this issue?


----------



## BruceB

Quote:


> Originally Posted by *OmnesPotens*
> 
> I believe I may have figured out the cause of my troubles. It seems like my Gigabyte R9 270x OC Edition GPU refuses to run in its supposedly "supported" PCIe Gen3 mode. If it runs in Gen3 mode the fps is stunted to around 6 fps and even then it is extremely unstable. I'm not sure if this is due to my Gigabyte GA-H97M-HD3 motherboard or if its a GPU issue but from what I can tell the R9 270x is supposed to run at PCIe Gen3 mode whenever a graphics-intensive program/application is running. So any ideas on what I should do about this issue?


IMO it doesn't sound like a GPU-side problem; the only difference between PCIe 2.0 and 3.0 is the bandwidth (PCIe 2.0 x16 has more than enough for a 270X anyway), if that's what you mean by 'Gen3'?
I don't want to point out the obvious, but have you updated your MB drivers/BIOS to the latest version?


----------



## BruceB

Quote:


> Originally Posted by *buttface420*
> 
> so this is ONLY happening when you try to play origin(EA) games? if so it might not be a hardware problem
> Open Origin, click on Origin drop down menu, Select Application Settings, Select the Origin in Game TAB.
> De-select the origin in game checkbox. exit origin restart pc. if that doesnt work also try uncheck the "enable cloud storage for all games or
> you could also try running origin /game as administrator


Good call, the BF4 info thread is full of people having problems with Origin in-game.








Quote:


> Originally Posted by *redfaction95*
> 
> Problem Solved by this method ^^ Repped you and special thanks to @buttface420
> Downclocked to 1000mhz (AMD's stock clock) because some games do not support the OC that is usually factory done by (for example in my case MSI OC'ed it to 1020mhz out of the box unlike original 1000mhz from AMD)
> PS I used default AMD overdrive from Catalyst Control Center to put the card at stock 1000mhz.
> This place is CHAMP.
> Thanks


^^Glad to hear you got it sorted!


----------



## TopicClocker

Quote:


> Originally Posted by *redfaction95*
> 
> Problem Solved by this method ^^ Repped you and special thanks to @buttface420
> Downclocked to 1000mhz (AMD's stock clock) because some games do not support the OC that is usually factory done by (for example in my case MSI OC'ed it to 1020mhz out of the box unlike original 1000mhz from AMD)
> PS I used default AMD overdrive from Catalyst Control Center to put the card at stock 1000mhz.
> This place is CHAMP.
> Thanks


Shouldn't it be stable out of the box though? That is what they're selling it as.


----------



## redfaction95

Quote:


> Originally Posted by *TopicClocker*
> 
> Shouldn't it be stable out of the box though? That is what they're selling it as.


Yeah, it should be, and it was in benchmarks and stress tests; but especially in DICE games the GPU gives BAD commands. maybe there is some conflict between the games' log files, as the AMD reference cards are at 1000MHz unlike these aftermarket ones.


----------



## redfaction95

Quote:


> Originally Posted by *BruceB*
> 
> Good call, the BF4 info thread is full of People having Problems with origin ingame.
> 
> 
> 
> 
> 
> 
> 
> 
> ^^Glad to hear you got it sorted!


Thanks a lot


----------



## redfaction95

Quote:


> Originally Posted by *aaronsta1*
> 
> ok so now we determined the cause of the crashes, we have to figure out why.
> 
> MSI cards can usually run at the default speed.. actually if you got the MSI Gaming 3g 280x i think its rated at 1050 in OC mode.
> 
> you said it was running hot. showed a pic of 80c. if the card is running that hot, the vrms might be throttling down the clocks in the game and causing the game to crash.
> 
> possible causes of heat.. poor ventilation in the case.. some cases arent very good..
> you can use MSIs afterburner program to set a custom fan profile that will speed the fans up more then default.. by default these cards are set to run silent..
> 
> you can run gpu-z with it logging to a file while running bf4 and check to see how hot its getting while playing a game.
> 
> the 20 mhz wont hurt performance one bit, but there still is a possibility of having a bad card, it could also be your power supply..


Yeah sure, I will try to identify the cause and let you know once I get the new PSU.


----------



## OmnesPotens

Quote:


> Originally Posted by *BruceB*
> 
> IMO it doesn't Sound like a GPU-side Problem, the only difference between PCIe 2.0 and 3.0 is the bandwidth (PCIe2.0 x16 has more than enough for a 270X anyway), if that's what you mean with 'Gen3'?
> I don't want to Point out the obvious but have you updated your MB Drivers/BIOS to the latest Version?


Yes, that's what I mean; in the BIOS it labels them as Gen1, Gen2 and Gen3. Regardless of whether or not it is enough for the card, shouldn't the card still perform just as well, if not better, in PCIe 3.0 x16? From what I can see, the BIOS I have is version F4, and when I go to the Gigabyte page for my mobo there are no BIOS updates or versions listed. Any idea where/how else I can check for updates?


----------



## xutnubu

Quote:


> Originally Posted by *redfaction95*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaronsta1*
> 
> ok so now we determined the cause of the crashes, we have to figure out why.
> 
> MSI cards can usually run at the default speed.. actually if you got the MSI Gaming 3g 280x i think its rated at 1050 in OC mode.
> 
> you said it was running hot. showed a pic of 80c. if the card is running that hot, the vrms might be throttling down the clocks in the game and causing the game to crash.
> 
> possible causes of heat.. poor ventilation in the case.. some cases arent very good..
> you can use MSIs afterburner program to set a custom fan profile that will speed the fans up more then default.. by default these cards are set to run silent..
> 
> you can run gpu-z with it logging to a file while running bf4 and check to see how hot its getting while playing a game.
> 
> the 20 mhz wont hurt performance one bit, but there still is a possibility of having a bad card, it could also be your power supply..
> 
> 
> 
> Yeah sure, I will try to identify the cause and let you know once I get the new PSU.

You can only check VRM temps with HWiNFO64, GPU-Z doesn't show them for this card.

I'd appreciate it if you post your temps after a period of gaming, to see how they compare to mine.

My card reaches around 75C on the core, VRM 1 is almost 5-7C below core, so around 67-70C and VRM 2 almost never changes at 52C. I actually don't know why, it's always at 51-53C idle or load.


----------



## BruceB

Quote:


> Originally Posted by *OmnesPotens*
> 
> Yes thats what I mean, in the bios it labels them as Gen1, Gen2 and Gen3. Regardless on whether or not it is enough for the card shouldnt the card still perform just as well if not better in PCI-e 3.0 x16? According to what i can see the bios I have is version F4 and when i go to the Gigabyte page for my mobo there are no bios updates or versions listed. Any idea where/how else I can check for updates?


Your card should run exactly the same in both PCIe 2.0 and PCIe 3.0 slots (assuming everything's working properly!)
Your BIOS (F4) is no longer the newest; here's the download link to the latest BIOS (F5): http://www.gigabyte.com/products/product-page.aspx?pid=4965#bios

Gigabyte have a really cheap-looking website for some reason; clearly they don't pay their web designers enough!

Update to the F5 BIOS and then tell us what happens!

[EDIT]
Please check that the BIOS is for the correct MB before installing it!!


----------



## OmnesPotens

Quote:


> Originally Posted by *BruceB*
> 
> Your Card should run exactly the same in both PCIe 2.0 and PCIe 3.0 Slots (assuming everything's working properly!)
> Your BIOS (F4) is no longer the newest; here's the download link to the latest BIOS (F5): http://www.gigabyte.com/products/product-page.aspx?pid=4965#bios
> 
> Gigabyte have a really cheap-looking website for some reason; clearly they don't pay their web designers enough!
> 
> Update to the F5 BIOS and then tell us what happens!
> 
> [EDIT]
> Please check that the BIOS is for the correct MB before installing it!!


Ah wow, yeah, you were right. Seems like the drop-down menu where it says "drivers" on my mobo page wouldn't load, so it didn't show the BIOS section at all. I updated and it seems to be working fine. Thanks for the help, haha.

UPDATE: Just kidding... the same thing happens if I power on from an off state: fps is stunted at around 5 fps and it says it's running at PCIe 3.0. Sigh... oh, and now one restart doesn't fix the issue, but two restarts do.

I'm now going to shut down, reload into the BIOS, load the "optimized defaults" and set the RAM speed back to 1600 (I've been running at 1600 MHz RAM speed but am disabling fast boot). Let's see if this fixes it.


----------



## BruceB

Quote:


> Originally Posted by *OmnesPotens*
> 
> Ah wow, yeah, you were right. Seems like the drop-down menu where it says "drivers" on my mobo page wouldn't load, so it didn't show the BIOS section at all. I updated and it seems to be working fine. Thanks for the help, haha.


All good!

I've been looking around the web for other people with the same problem and found your thread on Tom's Hardware! I'll post a link here because it has quite a lot of relevant info that you'd otherwise have to repeat:
http://www.tomshardware.co.uk/answers/id-2267060/radeon-270x-giving-issues-inconsistent-performance.html

I searched for 'bad gpu performance after sleep' and Google gave me a huge number of pages, so it looks like you're not alone with this problem!

[EDIT]
Here's another one with an nVidia GPU:
http://superuser.com/questions/460771/degraded-video-performance-after-wake-from-sleep-windows-7
This even happens on OSX:
http://forum.netkas.org/index.php?topic=2013.0

It appears just to be a thing that happens, probably something to do with GPU power states and drivers... more experimentation is needed! I'll try it on my 280X during the week, maybe I can get the same effect.


----------



## OmnesPotens

Looks like every other boot-up lets the GPU perform slightly better, but the GPU doesn't reach max performance (60 fps) until a restart is done.


----------



## OmnesPotens

Quote:


> Originally Posted by *BruceB*
> 
> All good!
> 
> I've been looking around the web for other people with the same problem and found your thread on Tom's Hardware! I'll post a link here because it has quite a lot of relevant info that you'd otherwise have to repeat:
> http://www.tomshardware.co.uk/answers/id-2267060/radeon-270x-giving-issues-inconsistent-performance.html
> 
> I searched for 'bad gpu performance after sleep' and Google gave me a huge number of pages, so it looks like you're not alone with this problem!
> 
> [EDIT]
> Here's another one with an nVidia GPU:
> http://superuser.com/questions/460771/degraded-video-performance-after-wake-from-sleep-windows-7
> This even happens on OSX:
> http://forum.netkas.org/index.php?topic=2013.0
> 
> It appears just to be a thing that happens, probably something to do with GPU power states and drivers... more experimentation is needed! I'll try it on my 280X during the week, maybe I can get the same effect.


Alright sounds good, I'll try talking to tech support again and see what they say.


----------



## OmnesPotens

So I talked to both Gigabyte and Intel tech support. After asking me to do a few things, Gigabyte said it could be a problem with the CPU's power management, so I went to Intel and we tested the CPU and it checked out fine. Then Intel said my PSU's wattage isn't high enough; they recommend an 800W PSU... correct me if I'm wrong, but that's complete bull****, is it not? (Not sure if profanity is allowed in these forums, so excuse my French.)

Note: this is my full build: http://pcpartpicker.com/user/Iramis/saved/GrzFf7. Notice the estimated wattage (I know this is probably not 100% accurate, but honestly, even under 100% full load my system would pull at most around 80% of the 600W, so it should be more than enough, shouldn't it?)


----------



## bigpoppapump

Quote:


> Originally Posted by *OmnesPotens*
> 
> So I talked to both Gigabyte and Intel tech support. After asking me to do a few things, Gigabyte said it could be a problem with the CPU's power management, so I went to Intel and we tested the CPU and it checked out fine. Then Intel said my PSU's wattage isn't high enough; they recommend an 800W PSU... correct me if I'm wrong, but that's complete bull****, is it not? (Not sure if profanity is allowed in these forums, so excuse my French.)
> 
> Note: this is my full build: http://pcpartpicker.com/user/Iramis/saved/GrzFf7. Notice the estimated wattage (I know this is probably not 100% accurate, but honestly, even under 100% full load my system would pull at most around 80% of the 600W, so it should be more than enough, shouldn't it?)


Hahaha yeah 800w is insane overkill. That was a total "it's not in the manual, time to pass the buck" answer.

Maybe try forcing PCIe 2 through BIOS?


----------



## OmnesPotens

Quote:


> Originally Posted by *bigpoppapump*
> 
> Hahaha yeah 800w is insane overkill. That was a total "it's not in the manual, time to pass the buck" answer.
> 
> Maybe try forcing PCIe 2 through BIOS?


I've tried that; last time I did, it seemed to give me a stable 58-60 fps. If PCIe 3.0 isn't working at the same performance or better than PCIe 2.0, then there's something wrong. Even if the GPU performs perfectly fine at PCIe 2.0, I don't want to find out down the line that some issue I could've had fixed destroys my GPU or my system as a whole.


----------



## xutnubu

Perhaps something to do with your house's electrical system?


----------



## OmnesPotens

Quote:


> Originally Posted by *xutnubu*
> 
> Perhaps something to do with your house's electrical system?


What do you mean?


----------



## buttface420

This problem could be due to sudden losses of power, either from too much power being needed or a faulty PSU.

BUT

this seems to be a problem with other owners of the motherboard you have (GA-H97M-HD3). They have GPUs randomly losing performance; say, it works great and then suddenly drops down to 10 fps. They also said they got a new power supply and that didn't fix the problem.

Apparently it is some defect of the PCIe on those motherboards.

If nothing else helps, you may want to look into trying a different motherboard to see if it still happens.


----------



## hatchet_warrior

Quote:


> Originally Posted by *OmnesPotens*
> 
> I've tried that; last time I did, it seemed to give me a stable 58-60 fps. If PCIe 3.0 isn't working at the same performance or better than PCIe 2.0, then there's something wrong. Even if the GPU performs perfectly fine at PCIe 2.0, I don't want to find out down the line that some issue I could've had fixed destroys my GPU or my system as a whole.


I'm not 100% up to date on Intel motherboards, but on AMD an issue like that is normally due to a faulty chipset on the motherboard. Check temps first; if they're OK, you could try bumping the chipset voltage a bit and see if things smooth out.

If you don't have the option to try your GPU in a different system, then I would suggest replacing the motherboard first. It seems to be the cheaper option.


----------



## BruceB

Quote:


> Originally Posted by *OmnesPotens*
> 
> Looks like every other boot-up lets the GPU perform slightly better, but the GPU doesn't reach max performance (60 fps) until a restart is done.


This is just a problem everyone has, regardless of hardware or even OS (see my previous post, or search it on Google).

Gigabyte telling you it was your CPU was just passing the buck because they didn't know; then Intel passed the buck when they didn't know either... standard tech support.

There must be a reason for it, but it's not your PSU or any other hardware. Don't worry about it and just do a reboot every now and then.

I'll see if I can find time today to try it out. I have hibernation turned off to save space on my SSD.

Does it also happen when you sleep/standby the PC, or only when you hibernate it?


----------



## DiceAir

Quote:


> Originally Posted by *BruceB*
> 
> This is just a problem everyone has, regardless of hardware or even OS (see my previous post, or search it on Google).
> 
> Gigabyte telling you it was your CPU was just passing the buck because they didn't know; then Intel passed the buck when they didn't know either... standard tech support.
> 
> There must be a reason for it, but it's not your PSU or any other hardware. Don't worry about it and just do a reboot every now and then.
> 
> I'll see if I can find time today to try it out. I have hibernation turned off to save space on my SSD.
> 
> Does it also happen when you sleep/standby the PC, or only when you hibernate it?


Many people like to blame the PSU for not being enough. Once I had issues with my CrossFire R9 280Xs and everybody told me that my AX850 is not enough. I was just laughing, because my PC doesn't even use 850W, and it's a Corsair PSU that's working just fine. They told me to get a 1 kW PSU. I'm starting to understand why some people go for 1 kW+ PSUs: then nobody can blame the PSU for not being strong enough.

I would really want Intel to prove why an 800W Corsair PSU is not enough. Corsair PSUs can handle their rated power continuously. I have a PowerColor R9 270X in my other system and it's just fine.


----------



## rdr09

Quote:


> Originally Posted by *OmnesPotens*
> 
> Yes, that's what I mean: in the BIOS it labels them as Gen1, Gen2 and Gen3. Regardless of whether or not it is enough for the card, shouldn't the card still perform just as well, if not better, in PCIe 3.0 x16? From what I can see, the BIOS I have is version F4, and when I go to the Gigabyte page for my mobo there are no BIOS updates or versions listed. Any idea where/how else I can check for updates?


If this is for your motherboard, update to F5:

http://www.gigabyte.com/products/product-page.aspx?pid=4965#bios


----------



## BruceB

Quote:


> Originally Posted by *OmnesPotens*


I just tried it on my machine (in sig): I put it into energy-saving mode, woke it back up again, and ran a benchmark (_Fire Strike_). The difference is practically nothing.

Do you only get this problem when you hibernate your PC?


----------



## Devildog83

*GigabitPony and buttface420* have been added; welcome to the club.

I will be checking in a lot more often now that my son moved out of my mancave and I have unfettered access to my main rig.


----------



## OmnesPotens

Quote:


> Originally Posted by *BruceB*
> 
> I just tried it on my machine (in sig): I put it into energy-saving mode, woke it back up again, and ran a benchmark (_Fire Strike_). The difference is practically nothing.
> 
> Do you only get this problem when you hibernate your PC?


I've never run my system in any kind of sleep or hibernate mode/setting. I've never needed to, and I was honestly just going to disable it completely to save space, like you've done with your SSD.

My problem only occurs when I fully shut down the PC using the Windows shutdown button and then wait a while before powering the system back on. When I do this, the fps is complete crap, but as soon as I restart, the fps shoots up to a stable 60.


----------



## Devildog83

Quote:


> Originally Posted by *OmnesPotens*
> 
> I've never run my system in any kind of sleep or hibernate mode/setting. I've never needed to, and I was honestly just going to disable it completely to save space, like you've done with your SSD.
> 
> My problem only occurs when I fully shut down the PC using the Windows shutdown button and then wait a while before powering the system back on. When I do this, the fps is complete crap, but as soon as I restart, the fps shoots up to a stable 60.


Do you have Afterburner? Even if not, could you show us the settings you have there? In CCC, turn off graphics OverDrive if it's on. It could be that you need to do a complete removal of the driver and install it again; some cards work better with older versions. I hope we can help you.


----------



## OmnesPotens

Quote:


> Originally Posted by *Devildog83*
> 
> Do you have Afterburner? Even if not, could you show us the settings you have there? In CCC, turn off graphics OverDrive if it's on. It could be that you need to do a complete removal of the driver and install it again; some cards work better with older versions. I hope we can help you.


Okay, I'll screenshot some pics of the MSI Afterburner settings I have, as well as the CCC settings (which are all at default with OverDrive turned off); just give me a few minutes. I've tried driver versions 13.2, 14.4, and 14.7 Beta.

One thing I've noticed is that in my BIOS, under the section that controls the PCIe slot configuration, there is a setting called "DMI Gen2 Speed". Does this actually have anything to do with the speed the GPU will run at? Because if so, there is no DMI Gen3 option and I can only enable or disable DMI Gen2 Speed.


----------



## Devildog83

If you are talking about the mobo BIOS, I have never changed any settings there, and I don't recommend doing it unless you know exactly what you are doing.


----------



## OmnesPotens

Quote:


> Originally Posted by *Devildog83*
> 
> If you are talking about the mobo bios I have never changed any settings and don't recommend doing it unless you know exactly what you are doing.


I haven't changed them besides setting the RAM speed to 1600 MHz (the problem persists regardless of the RAM speed). I just got off the phone with an actually helpful, knowledgeable tech support person. We went through the BIOS settings, checked Display Adapters, and checked power settings (this time the guy didn't pull some BS and say "you need an 800-watt PSU"). After seeing that I'd updated the mobo BIOS, tried multiple display drivers and even freshly re-installed Windows 8.1, he finally admitted that without another system or GPU to test with, he can only narrow it down to a motherboard or GPU issue. Which is obvious at this point, but hey, it's something. So he recommended I try to have the GPU replaced by Amazon, and if not, to RMA it with Gigabyte. I'm fine with that since I need to focus more on school anyway; it's just a pain that after all this trouble the only solution is a replacement.


----------



## OmnesPotens

So the replacement R9 270X that Amazon sent does no better than my last one. It does the same thing: runs great after I use the restart option in Windows (60 fps stable), but after shutting down and then powering back on, the fps drops tremendously to around 5 fps, unstable. So is it actually the mobo's PCIe slot?

Note: I had been testing things with the audio off for the most part; with the audio on, it sometimes crackles or sounds very grainy, almost corrupted. Not sure how to explain it. I'm thinking the whole problem stems from something on the motherboard. If it's not the motherboard, then I have no idea what it is, because I've tested the CPU with Intel's tech support and the Processor Diagnostic Tool. I've made sure it was properly seated and all that, and the temps never get anywhere out of the safe zone, so idk.


----------



## hatchet_warrior

Quote:


> Originally Posted by *OmnesPotens*
> 
> So the replacement R9 270X that Amazon sent does no better than my last one. It does the same thing: runs great after I use the restart option in Windows (60 fps stable), but after shutting down and then powering back on, the fps drops tremendously to around 5 fps, unstable. So is it actually the mobo's PCIe slot?
> 
> Note: I had been testing things with the audio off for the most part; with the audio on, it sometimes crackles or sounds very grainy, almost corrupted. Not sure how to explain it. I'm thinking the whole problem stems from something on the motherboard. If it's not the motherboard, then I have no idea what it is, because I've tested the CPU with Intel's tech support and the Processor Diagnostic Tool. I've made sure it was properly seated and all that, and the temps never get anywhere out of the safe zone, so idk.


I'm almost positive that you have a faulty motherboard. The only other option could be a faulty PSU delivering crappy power. I can't remember if you checked that. I think you should just replace that motherboard. Seems to be the cheapest option right now.


----------



## OmnesPotens

Luckily I'm still within the return/replace period, so as long as the company doesn't try to screw me over I'll be able to either get a refund or get it replaced for no charge. I don't have any other PSU to test with, and I've watched the voltages on both a regular boot and a restart; they don't change between the two, so if that's any indicator of consistent power output, it looks fine. There is a slight change in one of the voltages every now and again; I'm not sure what it's monitoring. It goes from 7.94 to 8V, but it rarely does this and is always either at 7.94 for a long time or at 8 for a long time.


----------



## hatchet_warrior

Quote:


> Originally Posted by *OmnesPotens*
> 
> Luckily I'm still within the return/replace period, so as long as the company doesn't try to screw me over I'll be able to either get a refund or get it replaced for no charge. I don't have any other PSU to test with, and I've watched the voltages on both a regular boot and a restart; they don't change between the two, so if that's any indicator of consistent power output, it looks fine. There is a slight change in one of the voltages every now and again; I'm not sure what it's monitoring. It goes from 7.94 to 8V, but it rarely does this and is always either at 7.94 for a long time or at 8 for a long time.


Your BIOS should give you accurate readings for idle voltages; HWMonitor should be pretty close if you want to check load voltages. I'm still convinced the issue is related to the motherboard, so the 12V numbers are the most important. Ideally the rail should stay within the ATX spec's 5% tolerance, so no lower than about 11.4V at any point.
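For a quick sanity check of those rail readings: the ATX design guide allows roughly a ±5% tolerance on the main rails (the tolerance value here is from the ATX guideline, not from this thread). A minimal sketch:

```python
# Nominal ATX rail tolerance: roughly +/-5% on +12V, +5V and +3.3V.
ATX_TOLERANCE = 0.05

def rail_ok(nominal, measured, tolerance=ATX_TOLERANCE):
    """True if a measured rail voltage is within tolerance of nominal."""
    return abs(measured - nominal) <= nominal * tolerance

# The 11.952 V reading reported below is comfortably within spec:
print(rail_ok(12.0, 11.952))  # True
print(rail_ok(12.0, 11.3))    # False: below the ~11.4 V floor
```

The same check applies under load: if a logged 12V minimum dips under ~11.4V while gaming, the PSU (or the board's sensing) deserves a closer look.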


----------



## OmnesPotens

According to the Bios, all the voltages are constant. In particular, that +12V stays at a solid 11.952 V


----------



## hatchet_warrior

I would check them under load just to make sure, but it sounds like your PSU is OK. So either RMA that board or buy a new one.


----------



## buttface420

I've seen a few other people with the same exact motherboard have the same problem; replacing the PSU or GPU did not fix their problem. They replaced their motherboard.


----------



## DarthBaggins

After reading all about your issues I'd definitely say your mobo is going bad. I'd recommend an RMA on it, and sell it for a different mobo like the Gaming 5/7.


----------



## OmnesPotens

Yeah I'm in the process of trying to get a refund for the mobo because I had no idea that this mobo had so many problems.


----------



## neurotix

Skimmed this thread again.

I'm gonna give all you new guys some advice.



Make your CCC settings look like this, and then run Valley or Heaven with your card overclocked.

You *WILL* get a much better score.

Trust me, I'm a bencher and top 200 in the US as well as #6 on the OCN United States team.

As a bonus, with these settings, you will also get much greater fps in games, if you can live without tessellation. Give it a try. I don't know what the hell tessellation even is other than it slows AMD cards to half their potential fps while I don't notice any difference in image quality whatsoever. Also, in whatever games you play try and use FXAA if it's available. It's much faster and more modern than MSAA, which is slow because it's from like 1999.

Hope this helps some of you with just a single 270X or 280X on a single monitor.


----------



## radier

Aniso 2x?? No thank you...

Sent from my GT-N7000 using Tapatalk


----------



## nuiproblema

Hello everyone. I'm new here. I own a Gigabyte R9 270 (this exact model: GV-R927OC-2GD). I need some help with overclocking. After staying up one night I finally found out how to unlock the voltage, and I have two sweet spots:
a) core voltage -19 mV => core clock 1100 MHz and mem 1500 (temperatures around 63-64C in Metro LL)
b) core voltage +31 mV => core clock 1152 MHz and mem 1500 (temperatures around 65-68C in Metro LL; max in-game voltage is 1.225 V)
I can go up to 1.3 V, but that seems a bit too much. I wanted to know if option (b) is relatively safe for 24/7, or should I go with option (a)?
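One way to reason about (a) vs (b) is the rough CMOS rule of thumb that dynamic power scales with frequency times voltage squared. A ballpark sketch; the 1.175 V load voltage assumed for option (a) is a guess for illustration, not a number from the post:

```python
def relative_dynamic_power(f, v, f_ref, v_ref):
    """Rough CMOS dynamic-power scaling: P ~ f * V^2.
    Ignores leakage and real VRM behavior, so treat the result
    as a ballpark only."""
    return (f / f_ref) * (v / v_ref) ** 2

# Hypothetical comparison of option (b) vs option (a), assuming
# option (a) lands around 1.175 V under load (an assumption):
ratio = relative_dynamic_power(1152, 1.225, 1100, 1.175)
print(round(ratio, 2))  # ~1.14, i.e. roughly 14% more heat to dissipate
```

So option (b) asks the cooler to shed noticeably more heat for a ~5% clock bump, which matches the 2-4C temperature difference reported above.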


----------



## Recr3ational

Quote:


> Originally Posted by *nuiproblema*
> 
> Hello everyone. I'm new here. I own a Gigabyte R9 270 (this exact model: GV-R927OC-2GD). I need some help with overclocking. After staying up one night I finally found out how to unlock the voltage, and I have two sweet spots:
> a) core voltage -19 mV => core clock 1100 MHz and mem 1500 (temperatures around 63-64C in Metro LL)
> b) core voltage +31 mV => core clock 1152 MHz and mem 1500 (temperatures around 65-68C in Metro LL; max in-game voltage is 1.225 V)
> I can go up to 1.3 V, but that seems a bit too much. I wanted to know if option (b) is relatively safe for 24/7, or should I go with option (a)?


Temps look good to me for b), so I don't see why not.
Have you tried leaving the GPU stressed for like an hour, just to see if that's the max temp it's going to reach?


----------



## nuiproblema

I really don't trust programs like FurMark or OCCT, because I know that no game will stress the card like that. I ran Metro Last Light for about two hours with option b (GPU at 99% the majority of the time in-game; SSAA 2x) and it didn't go over 68 degrees. But I will run OCCT for 5 minutes and post the results so I know the worst-case scenario.


----------



## Recr3ational

Quote:


> Originally Posted by *nuiproblema*
> 
> I really don't trust programs like FurMark or OCCT, because I know that no game will stress the card like that. I ran Metro Last Light for about two hours with option b (GPU at 99% the majority of the time in-game; SSAA 2x) and it didn't go over 68 degrees. But I will run OCCT for 5 minutes and post the results so I know the worst-case scenario.


No, you're right. I don't either. As long as it's stressed and the temps are fine, keep it like that.

I just leave Heaven on for a few hours and see what my temps are, 'cause sometimes I'll go on massive gaming sessions.


----------



## hatchet_warrior

Quote:


> Originally Posted by *neurotix*
> 
> Skimmed this thread again.
> As a bonus, with these settings, you will also get much greater fps in games, if you can live without tessellation. Give it a try. I don't know what the hell tessellation even is other than it slows AMD cards to half their potential fps while I don't notice any difference in image quality whatsoever. Also, in whatever games you play try and use FXAA if it's available. It's much faster and more modern than MSAA, which is slow because it's from like 1999.
> 
> Hope this helps some of you with just a single 270X or 280X on a single monitor.


Tessellation increases the polygon count of closer objects. Quite simply, it makes the world look more realistic by having smoother contours rather than hard, angular edges.


----------



## Bruska

Hey guys, do you know how to check VRM temps on a Sapphire R9 270X Vapor-X?

It's because I can reach 1625 MHz on the memory, but after some time it starts artifacting and I think it's because of VRM temps. The max core OC I've done is 1220 MHz stable in games and 1250 MHz in benches, and temps are around 68-71C.


----------



## buttface420

Quote:


> Originally Posted by *neurotix*
> 
> Skimmed this thread again.
> 
> I'm gonna give all you new guys some advice.
> 
> 
> 
> Make your CCC settings look like this, and then run Valley or Heaven with your card overclocked.
> 
> You *WILL* get a much better score.
> 
> Trust me, I'm a bencher and top 200 in the US as well as #6 on the OCN United States team.
> 
> As a bonus, with these settings, you will also get much greater fps in games, if you can live without tessellation. Give it a try. I don't know what the hell tessellation even is other than it slows AMD cards to half their potential fps while I don't notice any difference in image quality whatsoever. Also, in whatever games you play try and use FXAA if it's available. It's much faster and more modern than MSAA, which is slow because it's from like 1999.
> 
> Hope this helps some of you with just a single 270X or 280X on a single monitor.


thanks man good info!


----------



## nuiproblema

Hello everyone! I'm in with an update on the overclocking journey with my R9 270. I went one step further and succeeded in hitting 1200 MHz on the core with +81 mV core voltage and 1500 MHz mem. I looped Heaven with these settings and the max temp was 76 degrees, averaging 73 degrees. Do you think I am relatively safe with those clocks? In Metro LL temps never went above 70 degrees, and max in-game voltage is 1.273 V.
I'm a bit scared of pushing things further, and I think I will settle for these if you guys think the clocks and temps are still OK.
I need some opinions.


----------



## neurotix

Quote:


> Originally Posted by *nuiproblema*
> 
> Hello everyone! I'm in with an update on the overclocking journey with my R9 270. I went one step further and succeeded in hitting 1200 MHz on the core with +81 mV core voltage and 1500 MHz mem. I looped Heaven with these settings and the max temp was 76 degrees, averaging 73 degrees. Do you think I am relatively safe with those clocks? In Metro LL temps never went above 70 degrees, and max in-game voltage is 1.273 V.
> I'm a bit scared of pushing things further, and I think I will settle for these if you guys think the clocks and temps are still OK.
> I need some opinions.


How long did you play Metro for at those clocks? No crashes?

The Metro series is one of the most demanding games you could possibly play. So is Far Cry 3 or Crysis 3. If you can play any of those games without crashing you are probably stable.

Heaven/Valley are good to test for artifacts at extreme clocks, or black screen errors from too high of a memory OC. However, that's about all they're good for. Case in point: My 270X can pass Valley at 1300mhz with no artifacts. However, it crashes within 5 seconds of loading a save in Crysis 3. Obviously, it's not stable for gaming at those clocks.

So, if you can play Metro, you're probably good.


----------



## nuiproblema

I've got no crashes at those clocks. What I meant was whether I'm relatively safe for 24/7 with those clocks, voltages and temps.

PS: After some more testing I became uneasy with those volts (1.273 V seems a bit too much coming from stock), so I've settled on 1.25 V with a 1175 MHz core clock and the same memory clock. Temperatures in Valley stay within 73 degrees and in-game don't go over 70 degrees.


----------



## neurotix

Quote:


> Originally Posted by *nuiproblema*
> 
> I've got no crashes at those clocks. What I meant was whether I'm relatively safe for 24/7 with those clocks, voltages and temps.
> 
> PS: After some more testing I became uneasy with those volts (1.273 V seems a bit too much coming from stock), so I've settled on 1.25 V with a 1175 MHz core clock and the same memory clock. Temperatures in Valley stay within 73 degrees and in-game don't go over 70 degrees.


Well, in general, volts aren't what kill chips, heat is. So you can give it as much voltage as you need as long as you keep the load temps under about 80C. I'm not sure what the max heat allowed is for the 270, though. As soon as it starts downclocking to maintain thermal specs, you've reached the limit, and running it at or near that temp 24/7 for a long time *might* kill the chip. But for gaming it's probably okay. Just don't give it 1.4v and run Furmark on it or something and you shouldn't have anything to worry about.


----------



## Janac

I just want to say something. I had very not pleasure sound from my 280x, yes it was a coil whine. It is a Club 3D royalQueen edition. But then I changed the BIOS switch from 2 to 1 and coil whine disappeared. Strange huh









Hope that helps anyone having coil whine with 280x


----------



## hatchet_warrior

Quote:


> Originally Posted by *neurotix*
> 
> under about 80C. I'm not sure what the max heat allowed is for the 270, though.


I remember seeing somewhere that the max before shutdown is 100C, and throttling starts around 90C. But as always, lower is better, even if keeping it lower costs you 1 or 2 fps.


----------



## Devildog83

Quote:


> Originally Posted by *nuiproblema*
> 
> Hello everyone ! Im in with a update of the overclocking journey on my R9 270 . I went one step futher and now i succedeed to hit 1200 mhz on core with +81 core voltage and 1500 mhz mem.I looped heaven with these settings and max temp was 76 degrees but averagin 73 degrees. Do you think i am relatively safe with those clocks ? In metro LL temps never went up above 70 degrees and ingame max voltage is 1,273 volts .
> Im a bit scared on pushing things further and i think i will establish for these if you guys think clocks and temps are still ok .
> I need some opinions


Just a hint for you guys: there is a reason I have never run FurMark on any card I have ever owned. People with much more experience than me have told me that it's not a stress test but a torture test, plain and simple. It may be useful for a burn-in at stock clocks, but I would never use it for testing an overclock; you could easily fry the chip, whereas a nice long run in Heaven, Crysis 3 or BF4 will tell you your OC is stable without too much chance of brickin' the GPU. Just my thoughts.


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> Just a hint for you guys: there is a reason I have never run FurMark on any card I have ever owned. People with much more experience than me have told me that it's not a stress test but a torture test, plain and simple. It may be useful for a burn-in at stock clocks, but I would never use it for testing an overclock; you could easily fry the chip, whereas a nice long run in Heaven, Crysis 3 or BF4 will tell you your OC is stable without too much chance of brickin' the GPU. Just my thoughts.


QFT.

+rep


----------



## Danbeme32

I would like to join the club, running 2x PowerColor R9 280X: one at 1105/1500 and the other at 1080/1500.


----------



## Devildog83

Quote:


> Originally Posted by *Danbeme32*
> 
> I would like to join the club running 2x Powercolor R9 280x. one at 1105/1500 and the other 1080/1500.
> 
> 
> Spoiler: Warning: Spoiler!


You have been added, welcome !!! Those are the Turbo Duo's right?


----------



## Recr3ational

It's nice seeing Turbo Duos being used. They're so underrated.


----------



## Harrywang

Hello guys, just recently bought a HIS iceq2 r9 280x for 110$. I want to overclock this badboy

Any tips?

I haven't OC'd a GPU since my 560 Ti, and it seems like there are some new options out there?

What power limit should I have?

What are the "sweet spots"? Do I need to increase my memory voltage at all? Also should I be using a fan curve?

EDIT:

So after a day of overclocking I managed to get the following :

Core 1180
Memory 1575
Power limit +20
Voltage 1250 mV

Max temp was 71°C using a custom fan curve. I used Heaven DX11 for stability testing.

Are these considered good? My memory seems a bit low; how do people get up to the 1700 range? I've never played around with memory voltage before; do I need to do that to get higher memory clocks? If so, is it even worth it to try raising my memory clock?


----------



## Danbeme32

Quote:


> Originally Posted by *Devildog83*
> 
> [/SPOILER]
> 
> You have been added, welcome !!! Those are the Turbo Duo's right?


Yes they are Turbo Duo's..


----------



## neurotix

Quote:


> Originally Posted by *Harrywang*
> 
> Hello guys, just recently bought a HIS iceq2 r9 280x for 110$. I want to overclock this badboy
> 
> Any tips?
> 
> I haven't OC'd a GPU since my 560 TI and it seems like theres some new options out there?
> 
> What power limit should I have?
> 
> What are the "sweet spots"? Do I need to increase my memory voltage at all? Also should I be using a fan curve?
> 
> EDIT:
> 
> So after a day of overclocking I managed to get the following :
> 
> Core 1180
> Memory 1575
> Power limit +20
> Voltage 1250 mV
> 
> Max temps were 71c max using a custom fan curve. I used heaven dx11 for stability teting.
> 
> Are these considered good? My memory seems a bit low how do people get up to the 1700 range? I never played around with memory voltage before do I need to do that to get higher memory clocks? If so is it even worth it to try to raise my memory clock?


That overclock is decent. Actually, 1180mhz core is very good, especially on air. Most cards won't do much over 1200mhz, regardless of what some people say. You might be able to bench them slightly higher than that, but not game on them for very long.

Anyway, as for memory clocks... I can give you the info I use with my 7970, but I'm unsure if it will work with a 280X or not.

SapphireTrixx4.4.0b-MOD1.zip 3450k .zip file


That's a modded version of Sapphire Trixx. Install it and see if the program reads your card's clocks or not. If it does, great! That Trixx has a slider for memory voltage on Tahiti GPUs. *That* is how people get memory stable at over 1700MHz. Additionally, this modded Trixx allows the card to run at up to 1.381v. Alternatively, you could try the latest MSI Afterburner, as it should also have a memory voltage control.
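For a rough sense of what those memory clocks are actually worth: GDDR5 moves four bits per pin per clock, so peak bandwidth scales linearly with the memory clock. Here's a quick back-of-the-envelope sketch (assuming a 384-bit Tahiti bus; the function name and sample numbers are just illustrative):

```python
# Rough GDDR5 bandwidth estimate: clock (MHz) x 4 (quad data rate)
# x bus width (bits) / 8 (bits per byte) / 1000 -> GB/s.
def gddr5_bandwidth_gbps(mem_clock_mhz, bus_width_bits=384):
    """Approximate peak memory bandwidth in GB/s for a GDDR5 card."""
    return mem_clock_mhz * 4 * bus_width_bits / 8 / 1000

stock = gddr5_bandwidth_gbps(1500)  # 280X stock memory clock
oc = gddr5_bandwidth_gbps(1700)     # the 1700MHz people chase
print(f"{stock:.1f} GB/s stock vs {oc:.1f} GB/s at 1700MHz "
      f"({oc / stock - 1:.0%} more bandwidth)")
```

So the jump from 1500 to 1700MHz is roughly a 13% bandwidth bump; whether that shows up in games depends on how memory-bound they are.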



If your card supports mem overvoltage there should be an unlocked voltage slider in Afterburner that is visible after clicking the arrow.

Additionally, you will probably want to learn how to unlock the power limit slider to 50%. AFAIK this method still works on Win 7; if you're not using Win 7, well... I don't know what to tell you. You should be fine and not throttle with the power limit at 20%, but raising it to 50% doesn't hurt either and will practically eliminate throttling on Tahiti in the case of extreme overclocks. Essentially, you MUST install MSI Afterburner and run it with the flag /xcl (e*X*tend clock limit): right-click Afterburner's icon on the desktop, go to Properties, and append /xcl to the shortcut. Then run it and reboot. Next, find the registry entries mentioned in the guide link and look for an entry named 'PP_PhmSoftPowerPlayTable'. It might not show up under 0000; the last time I did this mod (about a week ago, upon installing my 7970 in a different rig) the PowerPlay table was in the same registry hive but under 0002. Then click on the PowerPlay table and scroll down a bit. Change the lone "14" to a "32". Close everything and reboot. If you did it right, your power limit on Tahiti should go to 50%.

Note: as with anything involving the Windows registry, I highly recommend you make a restore point before attempting this in case you screw up.
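For anyone who'd rather see the byte change spelled out than squint at regedit's hex view, here's a minimal sketch of the edit as it would apply to a copied binary blob, not the live registry. The sample bytes and the naive "find the first 0x14" search are my own assumptions for illustration; a real PowerPlay table should be edited at its known offset, with a backup made first:

```python
# Illustrative sketch of the PowerPlay table tweak: flip the 0x14
# (decimal 20, i.e. the +20% power-limit cap) byte to 0x32 (50).
# This edits a COPY of a blob, not the Windows registry, and the
# "first 0x14 byte" search is a naive assumption; real tables
# should be edited at a known offset.
def patch_power_limit(blob: bytes) -> bytes:
    data = bytearray(blob)
    idx = data.find(0x14)  # bytearray.find accepts an int in Python 3
    if idx == -1:
        raise ValueError("no 0x14 byte found; table layout differs")
    data[idx] = 0x32
    return bytes(data)

# Hypothetical 8-byte slice of a table containing the cap byte
sample = bytes([0x02, 0x06, 0x01, 0x00, 0x14, 0x00, 0x5A, 0x00])
patched = patch_power_limit(sample)
assert patched[4] == 0x32 and patched[:4] == sample[:4]
```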









Hope this helps.


----------



## aaronsta1

well i had to send in my MSI 270X Gaming for RMA due to a bad fan.. i just got it back today.

It is crap.









Won't even run at default speeds (1080/1400); I have black squares everywhere in 3DMark.

One of the things I'm noticing in GPU-Z is that while I'm running 3DMark the voltage is only 1.16 V, not the 1.2 V that the other card defaulted to.
So I checked out some things: the old card had BIOS 015.041.000.000.003348 and the new card has 015.041.000.000.002887.

Not sure if I should try to flash the newer BIOS onto the card?

I hate to have to mail this card in again if it's just something simple like a BIOS update.

Edit: I even lowered the clock to 975/1400 to see if it would run 3DMark. It did not.









I guess I'll be sending this one back.


----------



## Aleckazee

Looks like I've finally got a working card. I did some quick overclocking and managed 1180/1550MHz with just the +20 power limit. I got a score of 1333 in Valley on the Ultra preset at 1440p. Does that seem like a decent score for my res?
Also, Afterburner and HWMonitor both show a max temp of 73°C, but I'm guessing that's the core temp? How do I check the mem temp? Or is it the same thing?


----------



## neurotix

Quote:


> Originally Posted by *Aleckazee*
> 
> Looks like I've finally got a working card. I did some quick overclocking and managed to get 1180/1550MHz with just the +20 power limit. I got a score of 1333 in valley on ultra preset and 1440p. Does that seem like a decent score for my res?
> Also, afterburner and hwmonitor both show a max temp of 73c, but I'm guessing that's the core temp? How do I check mem temp? Or is it the same thing?


A score of 1333 means nothing on its own; post a screenshot of your FPS and score. Make sure you run it with 8X AA.

73C is probably your core temp. Use TechPowerUp GPU-Z and the sensors pane. Where it says "GPU temperature", that's your core temp. I believe on Tahiti and Pitcairn "VRM2" is memory temperature.

It would help if you told us WHAT card you have, as this club includes 4 different cards. Manufacturer and cooling would also help.


----------



## End3R

Quote:


> Originally Posted by *Aleckazee*
> 
> Looks like I've finally got a working card. I did some quick overclocking and managed to get 1180/1550MHz with just the +20 power limit. I got a score of 1333 in valley on ultra preset and 1440p. Does that seem like a decent score for my res?
> Also, afterburner and hwmonitor both show a max temp of 73c, but I'm guessing that's the core temp? How do I check mem temp? Or is it the same thing?


I run at 1080p and get this score using Valley on the ExtremeHD preset with a 270X (it just says Custom because my monitor forces the res to 1680x1050 instead of 1920x1080).










I'm just at stock speeds of 1050/1400.

If you have supersampling etc turned on in CCC it will lower your score.


----------



## Harrywang

Quote:


> Originally Posted by *neurotix*
> 
> That overclock is decent. Actually, 1180mhz core is very good, especially on air. Most cards won't do much over 1200mhz, regardless of what some people say. You might be able to bench them slightly higher than that, but not game on them for very long.
> 
> Anyway, as for memory clocks... I can give you the info I use with my 7970, but I'm unsure if it will work with a 280X or not.
> 
> SapphireTrixx4.4.0b-MOD1.zip 3450k .zip file
> 
> 
> That's a modded version of Sapphire Trixx. Install it and see if the program reads your cards clocks or not. If it does, great! That Trixx has a slider for memory voltage on Tahiti GPUs. *That* is how people get memory stable at over 1700mhz. Additionally, if you use this modded Trixx it allows the card to have up to 1.381v. Alternatively, you could try the latest MSI Afterburner as it also should have a memory voltage control.
> 
> 
> 
> If your card supports mem overvoltage there should be an unlocked voltage slider in Afterburner that is visible after clicking the arrow.
> 
> Additionally, you will probably want to learn how to unlock the power limit slider to 50%. Afaik, this method still works on Win 7. If you're not using Win 7, well... I don't know what to tell you. You should be fine and not throttle with the power limit at 20%, but raising it to 50% doesn't hurt either and will practically eliminate throttling on Tahiti in the case of extreme overclocks. Essentially, you MUST install MSI Afterburner and run it with the flag /xcl (e*X*tend clock limit). Add /xcl to Afterburner's shortcut on the desktop if you right click the icon and go to properties. Then run it, and reboot. Next you must find the registry entries mentioned in the guide link. Look for an entry 'PP_PhmSoftPowerPlayTable'. It might not show up under 0000, the last time I did this mod (about a week ago upon installing my 7970 in a different rig) the power play table was in the same registry hive, but under 0002. Then, click on the power play table and scroll down a bit. Change the only "14" to a "32". Close everything and reboot. If you did it right, your power limit on Tahiti should be 50%.
> 
> Note: as with anything involving the Windows registry, I highly recommend you make a restore point before attempting this in case you screw up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this helps.


So after more testing, I wasn't fully stable.

I changed my core to 1185 and memory to 1550 for now, at 1250 mV core voltage.

My question still stands: is it worth it to up my memory voltage so I can get higher memory clocks? Does it make a big difference?

Every review/benchmark that I've seen doesn't mention anything about memory voltages, yet they get really high memory clocks.


----------



## Aleckazee

Quote:


> Originally Posted by *neurotix*
> 
> MSI ftl,
> 1333 score means nothing, post a screenshot of your FPS and score. Make sure you run it with 8X AA.
> 
> 73C is probably your core temp. Use TechPowerUp GPU-Z and the sensors pane. Where it says "GPU temperature", that's your core temp. I believe on Tahiti and Pitcairn "VRM2" is memory temperature.
> 
> It would help if you told us WHAT card you have, as this club includes 4 different cards. Manufacturer and cooling would also help.


Thanks, I haven't checked the mem temp yet but I'll do it soon. I've got an MSI 280X.

After some more tweaking it seems to be stable at 1185/1680, vcore 1.242v.



Running stock clocks (1120/1500), I get 1204.


----------



## muhd86

Guys, I want to quadfire R9 270X MSI Twin Frozr GPUs.

4960X and Gigabyte UP4, 1500 W Cooler Master PSU.

Just for the sake of comparison: is the R9 270X a slightly overclocked 7950?

Sent from my Sony Xperia Z Ultra


----------



## rdr09

Quote:


> Originally Posted by *muhd86*
> 
> Guys I want to quad fire.r9 270x.msi twin.Frozer gpu.
> 
> 4960x and gigabyte up4 1500 watt cooler master psu.
> 
> Just forsake of comparison is. R9 270X A 7950 SLIGHTLY over clocked
> 
> Sent from my Sony Xperia Z Ultra


The 280 is equal to the 7950, not the 270X; the latter has to be OC'ed to match the 7950.

Can't recommend 2GB cards.


----------



## neurotix

For gaming on a single screen at Ultra settings with FXAA, 2GB should be more than enough. The only way you'll use it all is with a ton of post-processing and MSAA, which will slow Pitcairn to a crawl anyway. (Hint: use FXAA. It's faster and looks decent.)

I agree in general though. Since VRAM is not mirrored in Crossfire, doing quadfire 270X would be incredibly worthless. You'd be limited to 2GB VRAM, and you shouldn't be doing quadfire for 1080p anyway. At resolutions greater than 1920x1080, 2GB of VRAM is not enough to play the latest games at decent quality settings. You should quadfire a 280 or greater. And even then, each extra card has greatly diminishing returns: two cards in Crossfire have about 1.8-1.9 times the performance of a single card nowadays, but any after that add less and less FPS. Also, the more cards you add, the stronger the rest of the system needs to be; I couldn't recommend doing Crossfire with an FX CPU, for example. (And I know this from experience.)
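To put rough numbers on those diminishing returns, here's a tiny sketch. The per-card scaling factors are forum-lore guesses (about 90% for the second card, much less after), not measured data:

```python
# Illustrative CrossFire scaling: each extra card contributes a
# smaller fraction of a single card's FPS. These factors are rough
# forum-lore guesses, not measured data.
def estimated_fps(single_card_fps, num_cards, factors=(1.0, 0.9, 0.6, 0.4)):
    """Sum per-card contributions for up to len(factors) cards."""
    return single_card_fps * sum(factors[:num_cards])

for n in range(1, 5):
    print(f"{n} card(s): ~{estimated_fps(60, n):.0f} fps")
```

With these guesses, the second card nearly doubles your frames, while the fourth adds only about 15% over three cards, which is why quadfiring 270Xs is hard to justify.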

Also, depending on the 270X and your clocks, you can actually match or exceed 7970 performance with Pitcairn in certain benchmarks. The difference between my 270X, 7870 XT and 7970 in Valley for example is less than 2 FPS (They all get around 48 fps).

http://hwbot.org/submission/2599416_neurotix_3dmark11___performance_radeon_r9_270x_10048_marks
http://hwbot.org/submission/2543586_neurotix_3dmark11___performance_radeon_hd_7870_(tahiti_core)_11471_marks
http://hwbot.org/submission/2599500_neurotix_3dmark11___performance_radeon_hd_7970_11884_marks

In 3dmark11 the 7970 is more powerful, but they're still all within 2000 points of each other.


----------



## hatchet_warrior

Quote:


> Originally Posted by *Harrywang*
> 
> So after more testing I wasn't fully stable.
> 
> I changed my core to 1185 and memory at 1550 for now at 1250 core voltages.
> 
> My question still stands is it worth it to up my memory voltage so I can get higher memory clocks? Does it make a big difference?
> 
> Every review/benchmark that I seen doesn't mention anything about memory voltages but they get rly high memory clocks.


Honestly, OCing a GPU is all about E-peen. The difference in frames is barely noticeable, yet the increase in heat and noise is quite obvious.

I'll use my card as an example.

Stock:

OC'd:


----------



## End3R

Quote:


> Originally Posted by *hatchet_warrior*
> 
> Honestly, OCing a GPU is all about E-peen. The difference in frames is barely noticeable, yet the increase in heat and noise is quite obvious.
> 
> I'll use my card as an example.
> 
> Stock:
> 
> OC'd:


Not that I entirely disagree with you; benchmarking in general is about e-peen. As neurotix already mentioned, anyone gaming on a single monitor at 1080p should be fine with a 2GB card. I can speak from experience: even heavily modding Skyrim with an ENB and 2K textures, I still average 30 fps (down from 60+ without the ENB), but it's still completely playable.

But that being said, you're doing it wrong; I think you'd see a bigger increase than that otherwise. Do you still have supersampling etc. turned on in CCC? Unless maybe your CPU is bottlenecking the performance, which is why the score is barely changing.


----------



## buttface420

Games are already demanding more than 2GB for 1080p though; Watch Dogs at ultra requires at least 3GB, and even then the fps is le poo.


----------



## End3R

Quote:


> Originally Posted by *buttface420*
> 
> games are already demanding more than 2gb for 1080p tho, like watchdogs at ultra requires at least 3, and even then fps is le poo


Actually it doesn't; that was an intentional gimp left in by Ubisoft. TheWorse Mod + MaLDo's textures eliminate that, so you can play on ultra with 2GB.
Quote:


> Originally Posted by *End3R*
> 
> The final version of TheWorse mod with Maldo Textures should be able to use ultra textures without stuttering on a card with 2gb.
> http://www.guru3d.com/files-details/watch-dogs-theworse-mod-8-download.html
> 
> "The mod's texture patch comes courtesy of MaLDo, the wizard responsible for enhancing Crysis 2. *It updates Watch Dogs' 'High' texture setting to use 'Ultra' textures without mipmaps, thereby reducing the stuttering effect that can happen on GPUs with 2GB of VRAM.*"
> 
> I'd buy this game if I could pay the modders, with about a month they did what ubisoft failed to do in 2 years.


----------



## buttface420

Ahh okay, from what I understood TheWorse mod was... well... worse graphics than stock. I may give it a try.


----------



## End3R

Quote:


> Originally Posted by *buttface420*
> 
> ahh okay, from what i understood the worse mod was...well...worse graphics than stock. i may give it a try


lol nah, that's just the username of the guy who made it. It actually turns back on all the pretty visual effects they demoed at E3 years ago. Why they were disabled is still widely speculated, but it's one of these reasons:

1) Due to incompetence, they were unable to optimize these effects, so they figured it'd be easier to just turn them off
2) They intentionally did it to gimp the PC release to help their sales on console

Either way, it's pretty lame. I really wish I could pay the modders for it; they made the game worth $60. I refuse to fork that over to Ubi; maybe when the game hits below $20 I'll pick it up.


----------



## hatchet_warrior

Quote:


> Originally Posted by *End3R*
> 
> But that being said, you're doing it wrong, I think you'd see a bigger increase than that otherwise. Do you still have supersampling etc turned on in CCC? Unless maybe you're CPU is bottlenecking the performance, which is why the score is barely changing.


It is most likely my CPU holding it back, yet there really isn't much to be gained even with a better CPU. Comparing my results with newer Intel systems shows only a slight bump in fps; again, nothing most people would actually be able to see as long as the frames stay consistent and above 45.


----------



## End3R

Quote:


> Originally Posted by *hatchet_warrior*
> 
> It is most likely my CPU holding it back. Yet there really isn't too much to be gained, even with a better CPU. Comparing my results with newer Intel systems shows only a slight bump in fps. Again nothing that most people would actually be able to see as long as they stay consistent and above 45.


Very true; as I said, I wasn't entirely disagreeing with you. With my rig, an 8320 and 270X is widely considered "mid-range", and yet at stock I can run everything out there above 30 fps. The only benefit I get from overclocking is a higher benchmark score; in my games I won't notice any difference.








Maybe in 10 years, when the next "next-gen" comes out, I'll OC my stuff, assuming I haven't just bought a completely new comp by then. The only reason anyone would need more powerful gear is if they're running higher than 1080p.


----------



## Harrywang

Agreed... after a lot of testing of my OC, I can't find a reason to overclock my GPU. I really enjoy OCing; however, OCing a GPU seems to BARELY, if at all, increase my fps in games.

After having a stable OC of 1185/1550 running Heaven, it is not stable when running highly demanding games.

GW2, for example: my max overclock playing this game is 1160/1575 at 1.25 volts. I tried to see the difference in fps, but there doesn't seem to be an increase at all.

I don't see any reason to keep this OC with the increase in heat and noise. Kind of sad, really.


----------



## Aleckazee

This is ridiculous. This is the 3rd MSI 280X I've had, and now this one is getting incredibly bad coil whine; it sounds like screeching tires. I'm starting to regret buying it in the first place.









I don't know what to do


----------



## DarthBaggins

Put it under water


----------



## [CyGnus]

I solved my coil whine with another PSU. Not that the one I had was bad; some brands of PSUs don't like some card brands... weird but true.


----------



## Aleckazee

Quote:


> Originally Posted by *DarthBaggins*
> 
> Put it under water


Would that actually solve it? I thought it would still be audible.
Quote:


> Originally Posted by *[CyGnus]*
> 
> I solved my coil whine with another PSU, not that the one i had was bad, some brands of psu's dont like some card brands... weird but true.


Thanks, I've got a spare tx650 I could try but I really don't want to have to buy another psu :/


----------



## [CyGnus]

Aleckazee, let me know if it worked, and also try lowering or raising the card's voltage; that alone can change the frequency of the whine, so maybe it will help too.


----------



## Devildog83

Sadness in GPU town.







It looks like my HD 7870 Devil died. Even just opening a webpage, it went black screen on me. I turned off X-Fire and used only the 7870: no go. Tried removing the 270X: no go. Then I tried just the 270X and it works fine, so I installed the 7870 in the bottom slot, and no go then too. I guess running it at 1200 core was just too much. I never had issues running it that way for as long as I've had it, but I guess things don't last forever.

I just might sell the 270x devil and get a 290.


----------



## Dasboogieman

Quote:


> Originally Posted by *Aleckazee*
> 
> This is ridiculous, this is the 3rd msi 280x I have and now this one is getting incredibly bad coil whine, it sounds like screeching tires. I'm starting to regret buying it in the first place
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't know what to do


I padded my Hawaii Tri-X chokes with Sekisui thermal tape; it dampens the noise quite well. Hot electronics silicone glue works wonders as well; in fact, that is how PSU manufacturers eliminate coil whine commercially.


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> Sadness in GPU town.
> 
> 
> 
> 
> 
> 
> 
> It looks like my HD7870 Devil died. Even opening a webpage it went black screen on me. Turned of X-Fire and just used the 7870 and no go, tried removing the 270x and no go, then tried just the 270x and it works fine so I installed the 7870 in the bottom and no go then too. I guess running it at 1200 core was just too much. I never had issues with running that way for as long as I have had it but I guess things don't last forever.
> 
> I just might sell the 270x devil and get a 290.


I'm sorry for your loss, DD. R.I.P. the little 7870 that could.

A single 290 would essentially be an upgrade, considering that you get twice as much VRAM but the same number of ROPs, TMUs, and shaders as your current pair combined.

Finding one that's red and black might be difficult; I think the only ones like that are the ASUS DCUII and reference, and the coolers on both pretty much suck. (I read that the DCUII has only 3 of its 5 heatpipes contacting the GPU core, and VRM cooling is nonexistent.)

It should be easy to find a 290 on eBay for around $300. You can always paint it to match, too.

Good luck.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> I'm sorry for your loss, DD. R.I.P. the little 7870 that could.
> 
> A single 290 would essentially be an upgrade; considering that you get twice as much VRAM, but the same amount of ROPs, TMUs and shaders.
> 
> Finding one that's red and black might be difficult, I think the only one like that is the ASUS DCUII or reference, and the coolers on both pretty much suck. (I read that the DCUII only has 3 heatpipes out of 5 that contact the GPU core, and VRM cooling is nonexistent.)
> 
> Should be easy to find a 290 on Ebay for around $300. You can always paint it to match too.
> 
> Good luck.


Thinking about a reference card, so it will be easier to get a sweet waterblock for it. I will have to redo my loop, but that's OK because I love doing it anyway. I was getting a bit bored!









P.S. There is a red and black PowerColor 290, but it seems to be crap too.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Thinking about a reference card so it will be easier to get a sweet waterblock for it. I will have to redo my loop but that's OK cause I love doing it anyway. I was getting a bit bored!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P. S. There is red and black Powercolor 290 but it seems to be crap too.


Don't underestimate the PowerColor, mate; I mean performance-wise. The 280X was the first PowerColor card I ever got, and it seems they've actually made a good one here: not voltage locked, and it overclocks well, especially under water. The option's there if you need it.

I'm definitely looking out for them now. Usually I would buy ASUS or Sapphire or something, but the PowerColor took me by surprise.


----------



## radicalrev

Hi guys,

My 7970 just died on me, and I'm trying to replace it with a 280X that is still compatible with my XSPC waterblock.

However, I have been hearing about lots of artifact issues and such with the 280X. Can anyone shed some light on this matter?
I am thinking of getting the HIS R9 280X Turbo Boost since it is compatible. Link: http://www.hisdigital.com/un/product2-772.shtml

Do you think I should be fine with my choice? Or should I just get another HD 7970 as a replacement and not worry about artifact issues?


----------



## Recr3ational

Quote:


> Originally Posted by *radicalrev*
> 
> Hi guys,
> 
> My 7970 just died on me, and trying to replace it with a 280x that is still compatible with my XSPC waterblock.
> 
> However, I have been hearing lots of artifact issues and such with the 280x. Can anyone shed some light on this matter?
> I am thinking of getting HIS R9 280x Turbo Boost since it is compatible. Link : http://www.hisdigital.com/un/product2-772.shtml
> 
> Do you think I should be fine with my choice? Or should I just get another HD7970 to replace and not worry about artifact issues?


Upgrade to a 280X. If you get artifacts, you'll still be able to RMA it.


----------



## radicalrev

Quote:


> Originally Posted by *Recr3ational*
> 
> upgrade to a 280x. If you get artifacts you'll still be able to RMA it.


What are the chances of getting one with artifacts? Does it come from AMD's software, or is it a hardware issue from manufacturing?


----------



## End3R

Quote:


> Originally Posted by *radicalrev*
> 
> What are the chances to get one with artifacts? Does it have to do with AMD's software or is this a hardware issue during manufacturing?


I'd say you've got a 50/50 shot. The first 270X I got I had to RMA; it had the flickering/artifacting/crashing issues that Google will tell you are fairly common with these cards. That being said, the second 270X I was sent after the RMA has been performing like an absolute champ.


----------



## radicalrev

Quote:


> Originally Posted by *End3R*
> 
> I'd say you got a 50/50 shot, the first 270x I got I had to RMA, it had the flickering/artifacting/crashing issues that google will tell you is fairly common with these cards. That being said, the 2nd 270x I was sent after the RMA has been performing like an absolute champ.


Wow, that's a pretty high chance. Is there a quick fix for this issue yet, instead of an RMA?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Don't underestimate the PowerColor mate. I mean performance wise. The 280x was the first ever PowerColor I got. It seems that they've actually made a good card here. Not voltage locked, it overclocks well. Especially underwater. Your options there if you need it.
> 
> I'm definitely looking out for them now. Usually I would buy Asus or Sapphire or something but the PowerColor took me by surprise.


I like PowerColor; it's just that the one red and black 290 has had horrible reviews, so I am not likely to go with that particular card. I would not hesitate to go with a reference PowerColor 290 and dunk it under water.


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> Sadness in GPU town.
> 
> 
> 
> 
> 
> 
> 
> It looks like my HD7870 Devil died. Even opening a webpage it went black screen on me. Turned of X-Fire and just used the 7870 and no go, tried removing the 270x and no go, then tried just the 270x and it works fine so I installed the 7870 in the bottom and no go then too. I guess running it at 1200 core was just too much. I never had issues with running that way for as long as I have had it but I guess things don't last forever.
> 
> I just might sell the 270x devil and get a 290.


Might want to disassemble that card to see what killed it... usually SMD components on the PCB that fail are the likely root cause. Sorry for your loss, mate.


----------



## neurotix

Quote:


> Originally Posted by *Roaches*
> 
> Might want to disassemble that card to see what killed it...Usually SMD components on the PCB that fail are likely to be the root cause...Sorry for your loss mate.


Yeah, it couldn't hurt to take the heatsink off, look at the board, and see if you notice any burn marks or anything. When my 6870 and 7970 died I did this, but didn't find anything.


----------



## Devildog83

Quote:


> Originally Posted by *Roaches*
> 
> Might want to disassemble that card to see what killed it...Usually SMD components on the PCB that fail are likely to be the root cause...Sorry for your loss mate.


I am going to do a complete reinstall of Windows later today, and I will try to make sure it wasn't something with the drivers or anything like that, but I won't hold out much hope. Who knows; since I reinstalled the latest driver while troubleshooting, I lost sound too, so we will see.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I am going to do a complete reinstall of windows later today and I will try and make sure it wasn't something with the drivers or anything like that but I won't hold too much hope. Who knows but since I reinstalled the latest driver while troubleshooting I lost sound too so we will see.


If your second card works, I doubt it's drivers, mate.
Is it worth re-installing Windows?
Just trying to save you some time, that's all.
Good luck, dude.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> IF your second card works, i doubt its drivers mate.
> Is it worth re-installing windows?
> Just trying to save you some time that's all.
> Good luck dude.


No, but because of the audio issue I was going to; then I solved it by reinstalling the audio driver, so I cancelled that thought.


----------



## Scorpion49

So I decided to hop in here with my reference-model AMD R9 270, which I got for $65 as a brand-new pull from a Dell system. I really like the reference cooler on this card; it feels more substantial than the one on the 290, very heavy. Not sure what design they used for the heatsink, but it works well. I did have to remove the long bracket they install on the end of it; luckily I had some screws from a GTX 680 that fit the holes left over once I removed it.


----------



## orick

What's the length of the reference 270? Curious, because I'm interested in getting a 270 as an upgrade, but my case is small and I have extra fans installed on the drive cage.


----------



## Scorpion49

Quote:


> Originally Posted by *orick*
> 
> What's the length of the reference 270? Curious because I am interested in getting a 270 for upgrade but my case is small and I have extra fans installed on the drive cage.


It is exactly 9.5" long. It's longer than my PowerColor PCS+ 7870 GHz card by about an inch.


----------



## orick

Thanks. That's too long. I think the ASUS one might be 9", which just might fit. Going to have to dig around a bit.


----------



## Paopawdecarabao

Can I join?



Just bought a Sapphire R9 270X 4GB, brand new with the seal intact, for $115 locally.

So happy; it's going into my mini-ITX build.


----------



## Aleckazee

Quote:


> Originally Posted by *[CyGnus]*
> 
> let me know if it worked and also try to lower or up the voltage of the card, that alone will change its voltage frequency so maybe it can also help


Sorry for the late reply. Changing the PSU didn't make a difference; I also tried changing the core voltage and power limit in Afterburner, but that didn't make any difference either :/
Quote:


> Originally Posted by *Dasboogieman*
> 
> I padded my Hawaii Tri X chokes with Sekisui thermal tape, dampens the noise quite well. Also, hot electronics silicone glue works wonders as well, in fact, that is how PSU manufacturers eliminate coil whine commercially.


Thanks, I'll try the thermal tape and maybe the hot glue; I'm scared I'll do something wrong, though.

If it wasn't for the coil whine it would be perfect; I'm really happy with the performance otherwise.

On a side note, is it possible to game on 2 monitors at all? 2 is all I have.









EDIT: Worked out the dual-monitor thing. Far from great for gaming with the bezel in the middle, of course, but still a novelty to me, haha.


----------



## neptunus

Can you guys please post a 3DMark 11 GPU score with the X preset @ 1100MHz, without any driver modifications? Thanks


----------



## Roaches

Quote:


> Originally Posted by *Devildog83*
> 
> I am going to do a complete reinstall of windows later today and I will try and make sure it wasn't something with the drivers or anything like that but I won't hold too much hope. Who knows but since I reinstalled the latest driver while troubleshooting I lost sound too so we will see.


I would suggest putting the card in a separate system to see if it can at least get into the motherboard BIOS, since it should display using the generic VGA driver. Also check whether the fans spin, to confirm the card still has some life.

If you're stuck with one system, a full manual driver/registry wipe and reinstall could at least help.


----------



## [CyGnus]

Aleckazee, I guess you have a bad card; try to RMA it...


----------



## Jethrodood

Hey guys. Finally got an R9: a Gigabyte Windforce 280. 1172/1500 with no additional power. Solidly built and quiet.


----------



## Scorpion49

Finally got a case. I really like this cooler on the 270; very quiet. Too bad they don't sell reference models like they do with the 290/290X.


----------



## DarthBaggins

Last I checked they did, or at least manufacturers sell reference design models
sent via tapatalk on Nokia 925
-DarthBaggins


----------



## Scorpion49

Quote:


> Originally Posted by *DarthBaggins*
> 
> Last I checked they did, or at least manufacturers sell reference design models
> sent via tapatalk on Nokia 925
> -DarthBaggins


Reference PCB yes, not reference cooler. Only way you get one of those is from an OEM system, and I've only ever seen a handful for sale on ebay.


----------



## muhd86

I wanted to know how good the Sapphire Vapor-X R9 280X is; I might get 3 GPUs to do tri-fire.

Can a Vapor-X 280X compare with an R9 290, and if so, how much difference should I expect?


----------



## M1kuTheAwesome

Seems like my Crossfire plans are not gonna happen anytime soon; I'm a bit too low on money... Might even wait for the next generation of cards before upgrading.
But the other day I noticed a problem: my 280X TurboDuo won't clock as high as it used to. I tried gaming with my 1219MHz core OC and got artifacts. I tried all the same tests and benches I've always used for checking a GPU OC, and they all had artifacts. The voltage went up to 1.3V alright, so that's not the problem. I tried both VBIOSes and had the same problem with both. Upgraded drivers to 14.7, still the same issue. What else could cause my max OC to go down, and is there a fix or will I just have to live with it? It isn't really such a bad issue, but it just seems suspicious...


----------



## neurotix

Quote:


> Originally Posted by *muhd86*
> 
> I wanted to know how good the Sapphire Vapor-X R9 280X is; I might get 3 GPUs to do tri-fire.
> 
> Can a Vapor-X 280X compare with an R9 290, and if so, how much difference should I expect?


There are two different Vapor-X 280Xs.

One is the Vapor-X with dual fans, like the old 7970 Vapor-X. The other is a Vapor-X Tri-X with triple fans and a different shroud.

The older style with two fans is the same as the 7970 Vapor-X, except it has a pretty backplate. I have the 7970 version of this card and it's excellent. At 1200/1600MHz, 1.3V and 100% fan, it doesn't pass 65C in most games.

The newer one has the Tri-X Vapor-X cooler; all that means is that it has the Tri-X cooler from the 290s with a vapor chamber added. It should perform much better than the older dual fan design. It is basically Sapphire's strongest, highest-tier cooler and one of the best air coolers on the market.

A 280X cannot compare with a 290, performance-wise. In 3DMark11, for example, the 290 will score a few thousand points higher.

290: http://hwbot.org/submission/2613225_neurotix_3dmark11___performance_radeon_r9_290_17085_marks
7970 (280X): http://hwbot.org/submission/2599500_neurotix_3dmark11___performance_radeon_hd_7970_11884_marks

290 is over 5000 points higher.

*However,* if you are going to use more than one card, the 280X is still incredible value with excellent performance. 3x 280X should outperform 2x 290 in most benchmarks. (If they don't, then something is very wrong and you don't know what you're doing.)

It depends on what resolution and settings you game at. 3x 280X should max out everything at 1080p for years to come. For resolutions higher than that, it should still be good for another year or two.

If you can get each card for under $200, it might just be the cheapest way to get the most power.
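For what it's worth, the "over 5000 points" gap can be checked directly against the two hwbot submissions linked above (a quick arithmetic sanity check; the scores come straight from those links, not re-measured by me):

```python
# 3DMark11 Performance GPU scores from the linked hwbot submissions
score_290 = 17085   # R9 290
score_7970 = 11884  # HD 7970 (same Tahiti chip as the 280X)

delta = score_290 - score_7970
percent = delta / score_7970 * 100

print(f"290 leads by {delta} points ({percent:.1f}%)")
```

So the 290 comes out roughly 44% ahead in that benchmark, which lines up with the "few thousand points" wording.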


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Seems like my Crossfire plans are not gonna happen anytime soon, I'm a bit too low on money... Might even wait till the next generation cards with upgrading.
> But the other day I noticed a problem: my 280X TurboDuo won't clock as high as it used to. I tried gaming with my 1219MHz core OC and got artifacts. Tried all the same test and benches I've always used for testing GPU OC and they all had artifacts. The voltage went up to 1.3v alright so that's not the problem. Tried both VBIOSes and had the same problem with both of them. Upgraded drivers to 14.7, still same issue. What else could cause my max OC to go down and is there a fix or will I just have to live with it? It isn't really such a bad issue but just seems suspicious...


Got that one sorted out myself. Turns out it was a corrupted Windows install, and maybe drivers as well. I started getting artifacts in some games even at stock settings, and I'd also had SSD error warnings pop up randomly despite my SSD being in perfect health. Then I realized I was still using the same Windows 7 install from when I was doing suicide runs on 2 different CPUs this summer, so I'm actually surprised it was working at all. Did a clean reinstall, and both the artifacts and the SSD warnings are gone. And my card clocks to its old standards.


----------



## Alanthor

What's the average stable OC on an R9 270X 2GB? Or what's your max stable OC?
Stable meaning no artifacts in games and no BSODs.

Because I have the Sapphire Radeon R9 270X 2GB Dual-X OC, and it's factory OC'd to 1070MHz. How far would I be able to push it?


----------



## Devildog83

Quote:


> Originally Posted by *Alanthor*
> 
> What's the average OC that is stable on a R9 270X 2GB? Or what's your max stable OC?
> Stable by no artifacts in games and no BSODS..
> 
> Cuz I have the Sapphire Radeon R9 270X 2GB DUAL-X OC, and it's factory OC'ed to 1070MHz. How far would I be able to push it?..


Mine is 1235/1575, but I have a PowerColor so I don't know what the Dual-X can do. 1125 to 1150 if I had to guess, but that's just a guess.


----------



## Alanthor

Quote:


> Originally Posted by *Devildog83*
> 
> Mine is 1235/1575 but I have a Powercolor so I don't know what the Dual-X can do. 1125 to 1150 if I have to guess, but that's just a guess.


Does it matter what "brand" it is? I mean, it's still an AMD R9 270X, the same GPU and such, just with a different cooler and different clocks, isn't it?


----------



## Skye12977

Quote:


> Originally Posted by *Alanthor*
> 
> Does it matters what "brand" it is? I mean, its still AMD R9 270X, same GPU and such, just a different cooler and different clocks, isnt it?


Some GPUs use different PCBs, different coolers, fans, clocks, speeds, and warranties; some cards are locked so you can't even overclock them.
There are a lot of things that "can" be different.
Edit: also, every card is a roll of the dice. Some cards might not want to go past stock clocks no matter how much voltage you put through them; some can go up to 2,000MHz... like that one 780 I saw.


----------



## neurotix

The Dual-X is a great cooler and should easily be good for 1200MHz, assuming your chip can do it.

Don't underestimate the Dual-X cooler.


----------



## Alanthor

Yes, I know the Dual-X is an exceptional cooling system, but why would the "brand" make any difference in what clocks it can achieve? I mean, Sapphire, ASUS, PowerColor, yada yada yada; a 270X is still a 270X, and the GPU itself is made by AMD. Those brands only use different kinds of cooling, higher/lower clocks, etc. Or am I wrong?

Anyway, I'm currently at 1100MHz at default voltage. No issues so far. My Fire Strike score just keeps getting higher and higher


----------



## neurotix

Quote:


> Originally Posted by *Alanthor*
> 
> Yes, I know the Dual-X is an exceptional cooling system, but why would the "brand" make any difference in what clocks it can achieve? I mean, Sapphire, ASUS, PowerColor, yada yada yada; a 270X is still a 270X, and the GPU itself is made by AMD. Those brands only use different kinds of cooling, higher/lower clocks, etc. Or am I wrong?
> 
> Anyway, I'm currently at 1100MHz at default voltage. No issues so far. My Fire Strike score just keeps getting higher and higher


The brand doesn't matter. The chips are the same.

The cooler matters. The power delivery matters. Whether or not it's a custom PCB matters. Whether or not the chips are binned matters. And so on.

So, technically, the brand doesn't matter because they're the same chips, yet at the same time it does because certain manufacturers use high binned chips with very strong coolers that keep them cooler, and these two things affect your overclock potential.

If the chip is a crappy low-binned chip from a budget brand, it won't overclock worth a damn. Likewise, even if it's a good chip, if it's paired with a crappy or deficient cooler (e.g. no RAM cooling, no direct-touch heatpipes, no VRM cooling) it won't overclock very far because it will overheat.

My 270X has 84.6% ASIC quality (it's a Sapphire Vapor-X) and does 1250MHz at 1.3V, and it doesn't pass 55C in most games with fans at 100%. The card has a vapor chamber cooler for the GPU core, RAM cooling, and adequate VRM cooling. That's a LOT of voltage and a high frequency, yet it stays very cool under load. This is an example of a GPU "done right".

Some other cards will overheat and shut down even at stock clocks because the VRMs aren't cooled. (Powercolor TurboDuo anyone??)

Essentially, the difference in brand amounts to a difference in cooler, and also a difference in binning. (For example, if you buy a Sapphire Toxic card, the extra money pretty much ensures you a high-binned chip.) The only way to know the difference, and what's good and what isn't, is to DO RESEARCH before you buy. Read all the reviews you can of the card in question and note its flaws and problems.


----------



## Devildog83

This /\. There is a reason the Toxic, Devil, Gaming and DC II cost more: they are all custom-built cards from the PCB up. Better memory controllers and chips, better quality chokes, and a better layout for heat control, along with the heatsink and fans. All of this and more makes these cards able to clock higher as a rule. There will be exceptions, as there always are, but as a whole the high-end custom-built cards can clock higher with less heat.


----------



## DarthBaggins

The highest clocks I could get on my 270X were 1275-1300, but I had issues at 1300 when folding (the client would hang and not finalize the project).
I love my DCII 7870; it takes anything I throw at it:thumb:


----------



## Recr3ational

Devil Dog, what's the update regarding your card? Are you replacing it?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Devil Dog, what's the update regarding your card? Are you replacing it?


For now, just using the one 270X. Haven't decided what to do yet. Going to sell a bunch of stuff soon, maybe the 270X Devil along with 2 motherboards and a CPU; I just don't know which ones I want to keep. Then I will most likely get a 290 reference card, but I'm still deciding.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> For now just using the 1 270x. Haven't decided what to do yet. Going to sell a bunch of stuff soon and maybe the 270x devil along with 2 motherboards and a CPU just don't know which ones I want to keep, then I will most likely get a 290 ref card but still deciding.


290 and waterblocks are a good idea


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> 290 and waterblocks are a good idea


Yes, my system looks a bit bare without dual GPUs, and no watercooling on the GPU either.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Yes, my system looks a bit bare without dual GPU's and no watercooling to the GPU either.


I know, right? I got a secondary rig with one GPU and it looks empty.

I think one 290 with a custom loop would look good in your rig. What are you doing with the single 270X?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> I know, right? I got a secondary rig with one GPU and it looks empty.
> 
> I think one 290 with a custom loop would look good in your rig. What are you doing with the single 270X?


In the post above I pointed out that I will most likely sell it along with some other stuff to get a 290. I can get a 290 on eBay for $200 to $300.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> In the post above I pointed out that I most likely will sell it along with some other stuff to get a 290. I can get a 290 on e-bay for 200 to 300 bucks.


Sorry, I read that at work and was in a rush. The 290s in the UK are stupidly cheap too, for some reason. I mean, a secondhand 290 is the same price as a new 280X.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Sorry, I read that at work. Was in a rush. The 290 in the uk are stupidly cheap too for some reason. I mean a secondhand 290 is the same price of a new 280x


I know, at a bit over $200 this card is the deal of the century. Price to performance, nothing comes close. I just worry a bit about eBay and whether the cards have been used for mining and all. One of these under water is very nice for anything 99% of users need; 2 would be incredible.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I know, at a bit over $200 this card is the deal of the century. Price to performance, nothing comes close. I just worry a bit about e-bay and if they have been used for mining and all. One of these under water is very nice for anything 99% of user's need, 2 would be incredible.


I bought my second 280X from eBay off a miner. Still perfect to this day!


----------



## muhd86

Need some help.

Can anyone give guidance on benchmark scores for 3 or 4 Sapphire R9 280 GPUs in tri-fire or quad-fire?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> I bought my second 280x from eBay of a miner. Still perfect to this day!


Some day soon I will have 2 x 290s under water. I will need a bigger PSU, so a 1000W Platinum is in my future also.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Some day soon I will have 2 x 290's under water, I will need a bigger PSU so a 1000w platinum is in my future also.


Haha, so many plans! I'm 80% done with my second rig. After that I'm going to upgrade to a 290X on my main rig, if the price drops after the new GPUs come out. Then I'm done for a few years. Hopefully.


----------



## BruceB

Quote:


> Originally Posted by *Recr3ational*
> 
> Haha so many plans! I'm 80% done with my second rig. After that in going to upgrade to 290x on my main rig if the price drops after new gpus come out. Then I'm done for a few years. Hopefully.


That's never going to happen!

You're going to unwrap your shiny new 290X and then the next second think _...but that 390X does look good..._


----------



## Saviour

Hey guys,

Recently bought a brand new (not from a miner) Sapphire R9 280X Dual-X, but I think the card is underperforming.
I am using Catalyst 14.8. I haven't OC'd it yet (stock 1050/1500); I only set +20 Power Limit in TriXX.
Also, my temperatures during benchmarks are really good: GPU below 70°C, VRM1 and VRM2 below 80°C.
Here are my Valley and Heaven results, which seem low to me, and a GPU-Z sensors pic.

Do you think the card is faulty, or am I doing something wrong?

Thank you in advance!


----------



## BruceB

Quote:


> Originally Posted by *Saviour*
> 
> Hey guys,
> Recently bought brand new (not a miner) Sapphire R9 280X Dual-X but I think the card is underperforming.
> I am using Catalyst 14.8. I haven't OCed it yet (Stock 1050/1500). I set only +20 Power Limit in Trixx.
> Also my temperatures during benchmarks are really good: GPU below 70°C, VRM1 and VRM2 below 80°C.
> Here are my Valley and Heaven results which seems low to me. And GPU-Z Sensors pic.
> Do you think the card is faulty or I am doing something wrong?
> Thank you in advance!


With my PowerColor 280X at stock (1030/1500MHz) I get ~41 FPS in Valley and ~44 FPS in Heaven. Are you sure your card's not throttling? Check the core clock using Afterburner or a similar program.

EDIT:
The sensors pic you posted shows that your card isn't throttling. It's not immediately obvious what it could be. Did you remove all your old drivers before installing the new ones?
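As a rough sanity check, the small factory clock difference between the two cards can't explain a large FPS gap on its own. A minimal sketch, assuming (optimistically) that FPS scales linearly with core clock; the clocks are the two stock values quoted in this exchange:

```python
# Stock core clocks quoted above
sapphire_dualx_mhz = 1050   # Saviour's Dual-X
powercolor_mhz = 1030       # BruceB's PowerColor

# Upper bound on the FPS difference attributable to clock speed alone,
# assuming linear scaling (real scaling is usually worse than linear)
headroom_pct = (sapphire_dualx_mhz / powercolor_mhz - 1) * 100
print(f"clock speed alone accounts for at most ~{headroom_pct:.1f}%")
```

So if the Dual-X is scoring well below ~41/44 FPS rather than a couple of percent above it, something other than clocks (throttling, drivers, background load) is likely to blame.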


----------



## Saviour

Quote:


> Originally Posted by *BruceB*
> 
> With my PowerColor 280X at stock (1030/1500MHz) I get ~41 FPS in Valley and ~44 FPS in Heaven. Are you sure your card's not throttling? Check the core clock using Afterburner or a similar program.
> 
> EDIT:
> The sensors pic you posted shows that your card isn't throttling. It's not immediately obvious what it could be. Did you remove all your old drivers before installing the new ones?


Thank you for your reply.

I reinstalled Windows 8.1 two times, but I can't be sure that it's not a driver-related problem. What do you suggest?


----------



## BruceB

Quote:


> Originally Posted by *Saviour*
> 
> Thank you for your reply.
> I reinstalled Windows 8.1 two times but I can't be sure that it's not driver related problem. What do you suggest?


A reinstall would give a clean driver install.

Are you using the beta drivers or the supported drivers?


----------



## Saviour

I am using Catalyst 14.8 WHQL. Before that I was using 14.4, and if I remember correctly the scores were the same.

Any other suggestions?


----------



## CrazyNightOwl

Hello ladies and gentlemen,

My situation is as follows.

I'm looking to replace an MSI GTX 460 Hawk with dead fans with one of today's upper-mid-range video cards. Being on a budget, I'm of course looking for the best bang for the buck. Another requirement is decent computational power aside from just playing games.

Now I'm strongly convinced that video cards should come with blower-style coolers or be watercooled, since, in a traditional PC case, the CPU cooler, VREG and NB (if present) heatsinks are positioned directly above the video card, getting heated by the rising hot air exhausted from dual or triple fan coolers. This issue is even more important since I'm running an EVGA X58 SLI3 (a board with really hot heatsinks) with an air-cooled X5650 6-core Xeon CPU.

According to the first 2 criteria, the R9 280 / 280X is my best bet: cheap, with much better compute performance than NVIDIA cards at the same price. But the problem is that I can't find a 280 / 280X with a reference cooler. Is it even possible to buy one, or does AMD not ship reference cards to distributors, instead forcing them to come up with custom designs?

Next, if the above is impossible, I've also considered HD 7970 for the low price, but reviews say it's really hot and loud. True or not?

Finally, we have the R9 290, which is really expensive but can probably be found on ebay or just bought in retail if it's really required. Comes with a blower and is very good at GPGPU computation. But, and here's the big but, this one is said to overheat and throttle down.

In case it matters for the 7970 and 290, my case (excuse the pun) is a CM 690 II, with two 140mm fans at the top, two 140mm fans at the bottom, and one 120mm fan each at the rear and front. This is to say that I can provide adequate airflow for the GPU should it be required, but it doesn't mean that I'm going to rely on this airflow and throw a dual or triple fan video card in, no way, man.

Bottom line:
* Do retail 280 or 280X reference designs exist? If yes, where can I get one? Anyone upgrading and willing to sell me one?
* Does the 7970 really get that hot and loud? Will good airflow (see above) remedy that?
* Does the 290 really overheat and throttle down under heavy and / or regular load? Will good airflow (see above) remedy that?

Thank you for your attention,
crazynightowl


----------



## Recr3ational

To the above:
Regarding what cooler to get, just buy what you can afford. Adapt if it's too hot, or just spend a few more bucks and get more fans.

I'm using two 280Xs and I am definitely happy with them. Even with 4 monitors they handle games well at 60 FPS or above.

If you can afford it, I would strongly recommend the 290, even if you buy it secondhand. It will last longer; it might be hot and power hungry, but with the 6 fans you have in your system, I would have thought it would be fine.

Regarding reference coolers on the 280X, I think they never made one? It's basically a 7970. Might be wrong on that one though.

Go for the 290; if it's hot, get more/better fans.


----------



## CrazyNightOwl

Wow, that was fast!

Alright. So you mean to say that the 290's thermal issues can be remedied with decent airflow inside the case, correct? This is what I really wonder about. With the huge price drop on the 7970s, I might as well get one with a blower and rely on the case fans to take care of the GPU temps, if that was guaranteed to work.


----------



## Recr3ational

Quote:


> Originally Posted by *CrazyNightOwl*
> 
> Wow, that was fast!
> 
> Alright. So you mean to say that 290's thermal issues can be remedied with decent airflow inside the case, correct? This is what I really wonder about. With the huge price drop on the 7970's, I might as well get one with a blower and rely on the case fans to take care of the GPU temps, if that was guaranteed to work.


Definitely. I'm pretty sure they're not that bad, to be honest with you. Aftermarket coolers are pretty decent if you choose the right one.
Just make sure your system is getting a lot of airflow and I think you're all good, sir. 7970s are still pretty decent, but compared to the price/performance of a 290, I would pick the 290 any day. I wish I'd waited and picked up 2 x 290s instead, as I paid more for my 280Xs than 290s go for secondhand now.


----------



## CrazyNightOwl

Well, honestly, I still wouldn't like to use an open-style cooler; the air will have to travel at least to the rear fan to exit the case (hot air always goes up, remember), and that means it will already have passed the NB heatsink and touched the CPU radiator too. Not good at all.

OK, I can first experiment with an old 8800 GTS with a blower; if those fans really prove able to lower the GPU temps, then it will be clearer where to go next.


----------



## Recr3ational

Quote:


> Originally Posted by *CrazyNightOwl*
> 
> Well, honestly, I still wouldn't like to use an open-style cooler - the air will have to travel to the rear fan at least to exit the case (hot air always goes up, remember) , and that means it will already have passed by the NB heatsink, at touched the CPU radiator, too. Not good at all.
> 
> OK, I can first experiment with an old GTS 8800 with a blower, if those fans really prove to be able to lower the GPU temps, then it's going to be more clear where to go next.


I don't want to sound like a douche, but in a PC case, the air will move in the directions the fans push it. A case is too small for convection to be a factor. Let's just think about a case with no fans: if you blow from the top, the air would move towards the bottom.

If you're concerned about hot air, you could flip your fans around.

Have the top and back fans pulling air into the case, and have the two bottom and front fans as exhaust. Obviously with good filters.

Again, I must stress that I'm only trying to help, and not trying to sound like a know-it-all.


----------



## 8800GT

Anyone else find it ridiculous that it's been 5 months since a damn WHQL driver? Not to mention that 280Xs still don't work in Crossfire with Watch Dogs, and the performance in Metro Redux sucks. AMD needs to get their proverbial crap together.


----------



## [CyGnus]

8800GT: My Metro 2033 Redux works just fine. A few scenes are at 40/45 FPS (outside, big maps), but it's mostly a fixed 60 FPS with 16x AF and drivers set to High Quality...


----------



## Jethrodood

To any would-be 280/280X purchasers: the Gigabyte 280 I got is extremely well built, has three QUIET fans, and is well priced.


----------



## 8800GT

Quote:


> Originally Posted by *[CyGnus]*
> 
> 8800GT My Metro 2033 Redux works just fine a few scenes at 40/45fps (outside big maps) but mostly fixed 60FPS with AF 16X drivers set to High quality....


Maxed out, I'm lucky to get 45 FPS. 2x SSAA.


----------



## muhd86

Well, can anyone tell me, comparison-wise, which NVIDIA GPUs 4 Sapphire R9 280 cards can target and decimate?

Anyone?


----------



## CrazyNightOwl

Quote:


> Originally Posted by *Recr3ational*
> 
> I don't want to sound douche, but in a pc case, the air will move by the directions of the fans that are pushing it. A case is to small to have that as a factor. Let's just think about a case with no fans. If you blow from the top the air would move towards the bottom.
> 
> If you're concerened about hot air. You could flip your fans around.
> 
> Have top, back pulling air in the case and have the two bottom and front fans as exhaust. Obviously with good filters.
> 
> Again I must stress that im only trying to help, and not trying to sound like a know it all.


Hmm, it's going to take time for me to consider that theory, esp. the top-top-bottom airflow. Regardless, do you think ~$350 (hard to be precise given the exchange rate oscillations) is good for a new reference R9 290, or is it too much and I'm better off searching eBay for a secondhand one? Given that I want to act wisely, without throwing in cash just because I can.


----------



## BruceB

Quote:


> Originally Posted by *CrazyNightOwl*
> 
> Hmm, it's going to take time for me to consider that theory,, esp. about top-top-bottom airflow. Regardless, do you think ~ $350 (hard to be precise given the rate oscillations) is good for a new R9 290 reference, or is it too much and I'm better off searching ebay for a second-hand one? Given that I want to act wisely, without throwing in cash just cause because I can.


If I were you, I'd either hold off until the 390X lands or spend an extra $100-150 to get a GTX 980.


----------



## Devildog83

I am actually buying a used 290 today, so my 270X Devil will go on the auction block as soon as the 290 arrives.


----------



## neurotix

Quote:


> Originally Posted by *CrazyNightOwl*
> 
> Hello ladies and gentlemen,
> 
> My situation is as follows.
> 
> I'm looking to upgrade an MSI GTX 460 Hawk with dead fans with a today's upper-mid-range video card. Being on budget, I'm of course looking for the best bang for the buck. Another requirement is decent computational power aside from just playing games.
> 
> Now I'm strongly convinced that video GPU's should come with blower style or be watercooled, since, in a traditional PC case, the CPU cooler , VREG and NB (if present) heatsinks are positioned directly above the video card, getting heated up by the rising hot air exhausted from the dual or triple fan coolers. This issue is even more important since I'm running an EVGA X58 SLI3 (one board with really hot heatsinks) with an air-cooled X5650 6-core Xeon CPU.
> 
> According to the first 2 criteria, R9 280 / 280X are my best bet: cheap, have much better compute performance than NVIDIA cards of the same price. But the problem is that I can't find a 280 / 280X with a reference cooler. Is it even possible to buy one or do AMD not ship reference cards to distributors, instead forcing them to come up with custom designs?
> 
> Next, if the above is impossible, I've also considered HD 7970 for the low price, but reviews say it's really hot and loud. True or not?
> 
> snip
> 
> Thank you for your attention,
> crazynightowl


Hey, I can answer your questions.

They don't make a 280/280X reference design as far as I know.

Fortunately, they DO make a reference 7970 with a blower cooler; I know this for a fact. Just look on eBay and you'll find some. (Because the 280X is so cheap due to the mining crash, the 7970 is even cheaper, and it's literally the same card, probably with the same overclocking ability. Tahiti XT vs Tahiti XTX; not a big difference.)

The 290 is easily the best bang for the buck right now, but the reference cooler SUCKS HARD on these things. You definitely want non-reference cooling. Out of all the 290s, the Sapphire coolers (Tri-X, Vapor-X) are undoubtedly the best. Next best is probably the PowerColor PCS+, but afaik that's a 3-slot card. From what I've heard, you want to avoid the ASUS DCUII because it only has 3 of 5 heatpipes touching the core and no VRM cooling. Just read around some reviews and you'll see what I mean.

So, your dilemma is that you have a hot-ass CPU and motherboard. So you don't want a custom GPU cooler that's going to exhaust the heat upward. You have two choices:

1) Buy a reference card, that is almost guaranteed to be a space heater inside your computer. CPU might stay cooler. GPU will probably throttle and perform like crap. Will probably not be able to overclock because of heat. (Especially if it's a 290.)

2) Buy a custom card, CPU will probably run hotter.

The CPU should get used a lot less in a gaming scenario. My Haswell runs around 50C in most games, while my GPUs run much hotter. If I run x264 on my Haswell I see temps of 95C, and that's with an above average, likely excellent cooling setup (2x Corsair SP120 2100 RPM fans on an H100i, PK-3 Nano on the water block). Essentially, what I'm saying is that when I stress my CPU it runs incredibly hot no matter what. It sounds like you have a similar scenario.

So, you have a choice to make. Personally, I went for non-reference coolers because in any kind of parallel work (GPGPU) the GPUs tend to run MUCH hotter than my CPU. Additionally, the GPUs should be used OVER the CPU; that is, a GPU is many times faster than a CPU for highly threaded, parallel work. In Folding@home my GPUs are about 5 times faster than my CPU (200k PPD each vs ~40k on the CPU).
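The speedup quoted above follows directly from the PPD figures (these are the poster's numbers, not independently measured):

```python
# Folding points-per-day figures quoted in the post
gpu_ppd = 200_000  # per GPU
cpu_ppd = 40_000   # CPU

print(f"each GPU is {gpu_ppd / cpu_ppd:.0f}x the CPU")  # 5x, as stated
```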

You can probably remedy this by making sure you have POWERFUL fans in the top and rear of your case, and especially by making sure the top of the case is well ventilated. This should reduce heat lingering around the CPU socket and VRM heatsinks. The hot air blown off the non-reference cards will be pulled out of the back and top. Sorry if this sounds simple or patronizing, but you never know. Airflow is important.

What are you going to be doing? Folding@home on both CPU and GPU? Encoding or rendering on both?

EDIT: Also, going from an air cooler to a CLC (H100i, H220X, etc.) or a custom loop will probably solve all of your CPU heat problems. Additionally, you could position a twin-fan RAM cooler over your VRM heatsink to further reduce heat.


----------



## neurotix

Devildog what 290 are you getting?


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Devildog what 290 are you getting?


I got a Sapphire reference R9 290 for $245 shipped. Lightly used, and it comes with a money-back guarantee. I sold my Devil 270X and a couple other things to pay for it. I still have the 270X for a few days, but I may have to relinquish my thread-starter status here soon.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I got a Sapphire reference R9 290 for $245 shipped. Lightly used, and it comes with a money-back guarantee. I sold my Devil 270x and a couple other things to pay for it. I still have it for a few days but I may have to relinquish my thread-starter status here soon.


Does that mean you're no longer our leader? What if the Decepticons come back?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Does that mean you're no longer our leader? What if the Decepticons come back?


LOL, I will have to see if I can get the 7870 working and flash to a 270x bios.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> LOL, I will have to see if I can get the 7870 working and flash to a 270x bios.


That sounds fun. Last time I did that was with my old 7950 to unlock shaders. Now it's dead.

Worth selling 2x 280X to buy 2x 970s?
Might get 2x 290s. I have no idea what to do.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> That sounds fun. Last time I did that was with my old 7950 to unlock shaders. Now it's dead.
> 
> Worth selling 2x 280X to buy 2x 970s?
> Might get 2x 290s. I have no idea what to do.


I don't know about the 970's so much but 290's for less than $250 used is a good deal to me. Hard to beat the price to performance. I saw someone in the OCN marketplace selling 2 for $200 each.


----------



## Saviour

Hey guys,

I decided to exchange my Sapphire R9 280X Dual-X for 2x Sapphire R9 270X Vapor-X. Do you think it is worth it (if the price is approximately equal)?
Do you think CF with 2x 270X is better than a single card: 280X / 290X?
Can anyone with 270X in CF give me some information about card temperatures? (I guess they won't be hotter than a single Sapphire R9 280X Dual-X.)

Thank you in advance!


----------



## Recr3ational

Quote:


> Originally Posted by *Saviour*
> 
> Hey guys,
> 
> I decided to exchange my Sapphire R9 280X Dual-X for 2x Sapphire R9 270X Vapor-X. Do you think it is worth it (if the price is approximately equal)?
> Do you think CF with 2x 270X is better than a single card: 280X / 290X?
> Can anyone with 270X in CF give me some information about card temperatures? (I guess they won't be hotter than a single Sapphire R9 280X Dual-X.)
> 
> Thank you in advance!


Why not just buy a secondary 280x?


----------



## Saviour

I don't like the Dual-X cooler; temps will get even hotter. My PSU won't handle CF 280X.
And I can get 2x 270X for the price of one 280X.


----------



## Recr3ational

Quote:


> Originally Posted by *Saviour*
> 
> I don't like Dual-X cooler, temps will get even hotter. My PSU won't handle CF 280X.
> And I can get 2x 270X for the price of one 280X.


What PSU do you have? And keep in mind you'd get a decrease in performance...

I mean, dual 280X will last longer.


----------



## Saviour

My PSU is a Seasonic X-560 and I don't want to put 2x 280X on it. Also, I'd have to pay extra for another 280X.
So I guess it's better to get a single card like a 290 or 290X?


----------



## Recr3ational

Quote:


> Originally Posted by *Saviour*
> 
> My PSU is Seasonic X-560 and I don't want to put 2x 280X on it. Also I should give extra money for another 280X.
> So I guess it's better to get a single card like 290 or 290X?


Yeah, 290 is a good choice. Check if you have enough power first.


----------



## neurotix

Nah, he should get an increase in performance with 2x 270X vs 1 280X. Though he will have less VRAM. 2x 270X = 1 290.


----------



## ubR322

I have an XFX 280X Black Edition and my old XFX 7970 GHz; I want to run them in Crossfire.

Do you think my Corsair AX850 will be enough?

i5 4670K (stock clock)
Gigabyte Sniper Z87
16GB RAM (2x8GB)
2 SSDs
1 hard drive
XFX 280X Black Edition
XFX 7970 GHz
11 Corsair 120mm fans
Swiftech MCP655-B pump
Recon3D Pro sound card
Corsair AX850


----------



## Danbeme32

Quote:


> Originally Posted by *ubR322*
> 
> i have a xfx 280x black edition and i have my old xfx 7970 ghz, i want to run them in crossfire.
> 
> do you think my corsair ax850 watt will be enough?
> 
> i5 4670k (stock clock)
> gigabyte sniper.z87
> 16gb ram (2x8gb)
> 2 ssd
> 1 hard drive
> xfx 280x black edition
> xfx 7970 ghz
> 11 120mm corsair fans
> swifttech MCP655-B pump
> recon3d pro sound card
> corsair ax850


Yeah, you should be good. I'm running a Corsair HX750 with 2x 280X in Crossfire, folding 24/7, and no problems yet.


----------



## ubR322

I threw the cards in and I get a red screen every time I load BF4. Idk if it's a bad card or just not enough power :/

:edit: I had to put my 280X BE as card #1 and have both Crossfire bridges installed; everything works perfectly now!


----------



## ubR322

can i be added to the club? i have both a 280x black edition and 270x. currently running the 280x and 7970 ghz in crossfire!


----------



## Devildog83

Quote:


> Originally Posted by *ubR322*
> 
> 
> 
> can i be added to the club? i have both a 280x black edition and 270x. currently running the 280x and 7970 ghz in crossfire!


Welcome!!! You have been added. What clocks are you running?


----------



## DigDeep

Hi!

Is it safe and Ok if I flash my Sapphire 270 Dual x to 270x Dual x?


----------



## Recr3ational

Quote:


> Originally Posted by *DigDeep*
> 
> Hi!
> 
> Is it safe and Ok if I flash my Sapphire 270 Dual x to 270x Dual x?


Try it. Make sure you make a backup first. If any problems come up, flash it back with the old BIOS.


----------



## Scorpion49

I'd like to get added as well, I guess my post a while back got missed. Stock R9 270 reference card.


----------



## ubR322

Quote:


> Originally Posted by *Devildog83*
> 
> Welcome !!! You have been added, what clocks are you running.


im running stock clocks on everything, i dont believe in overclocking LOL


----------



## mikemykeMB

So who is gonna buy and torture a R9 285?..Seen many reviews and all, just anyone out in this forum have inputs-thoughts concerns. Don't want to go 2 far OT tho'...


----------



## BruceB

Quote:


> Originally Posted by *mikemykeMB*
> 
> So who is gonna buy and torture a R9 285?..Seen many reviews and all, just anyone out in this forum have inputs-thoughts concerns. Don't want to go 2 far OT tho'...


What is an R9 285? Is it an OC'd 270X or an underclocked 280X?


----------



## dabby91

Quote:


> Originally Posted by *BruceB*
> 
> What is an R9 285? Is it a OC'd 270X or an underclocked 280X?


It's the replacement for the standard 280, based on the "Tonga" GPU. Think of it as a newer GCN chip that has been cut down to 280 levels of performance. Its major differences are 2GB of RAM vs the 3GB of the 280, and it also consumes less power than the 280. It also supports the features that were introduced with GCN 1.1, such as TrueAudio and er... whatever else there was xD.


----------



## mutatedknutz

Hey guys, I'm having a small issue: in the AMD Gaming Evolved app it says auto driver download is not available for this video card. Why is that? And even in 3DMark I saw "graphics card not supported" or something, I don't really remember. Why is that?


----------



## DiceAir

So I was thinking about getting the Kraken G10 for my R9 280X, but then something happened: someone offered me a decent price for both my cards. Should I sell them? I'm also a bit tired of the whole multi-GPU setup; I have too many issues. I'm running 2560x1440 @ 96Hz, so what would you suggest? Do you guys think I should keep my 280X Crossfire for now? I can go Nvidia or AMD, doesn't matter.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *mutatedknutz*
> 
> 
> hey guys am having this small issue, in amd gaming evolved app it shows auto driver download not available for this video card, why is so? and even in 3d mark i saw graphic card not supported or something, dont really remember, why is that?


Are you on beta drivers?
Quote:


> Originally Posted by *DiceAir*
> 
> So I was thinking about getting the kraken g10 for my r9 280x but then something happened to me. Someone offered me a decent price for both my cards. So should I still sell both my cards as I'm also a bit tired with the whole multigpu setup. I have to many issues. I'm running 2560x1440 @ 96Hz so what would you suggest or do you guys think I should rather keep my 280x crossfire for now. I can go Nvidia or AMd doesn't matter.


If you got a good deal on them, you may want to look at buying a 290 or 290X with the money. If you look hard enough you might be able to get a 290 and still have cash left over for something else. 2x 280Xs are only ~4% (guessing) better than a 290 itself.


----------



## DiceAir

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Are you on beta drivers?
> If you got a good deal on them you may want to look at buying a 290 or 290x with the moneys if you look well enough now you might be able to get a 290 and still have cash left over for something else. 2x 280x's are only ~4% (guessing) better than a 290 itself.


I would still have to pay in a bit but I was prepared to buy a kraken g10 and a h55 so I can add that money and go for an even better gpu.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *DiceAir*
> 
> I would still have to pay in a bit but I was prepared to buy a kraken g10 and a h55 so I can add that money and go for an even better gpu.


In that case, I say do it; you'll get better gaming and rendering performance.


----------



## DiceAir

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> in that case, I say do it, youll get better gaming performance and rendering


Is it really better to go for an R9 290 than 2x R9 280X? I can get the MSI Gaming R9 290 for not much more than what I'm selling the two 280Xs for.


----------



## Recr3ational

Quote:


> Originally Posted by *DiceAir*
> 
> is it really better to go for r9 290 than 2x r9 280x. I can get the msi gaming r9 290 for not that much more than the 2x r9 280x I'm selling it for


Yes. Then when you have money you can get another one


----------



## DiceAir

Quote:


> Originally Posted by *Recr3ational*
> 
> Yes. Then when you have money you can get another one


lol then back to square 1. The main issue is with heat. I was thinking maybe just drop the hammer on r9 295x2 and get it over and done with. Less hassles with heat and i don't ever have to get a bracket like the g10. My supplier here dropped the price on the r9 295x2 here.


----------



## Recr3ational

Quote:


> Originally Posted by *DiceAir*
> 
> lol then back to square 1. The main issue is with heat. I was thinking maybe just drop the hammer on r9 295x2 and get it over and done with. Less hassles with heat and i don't ever have to get a bracket like the g10. My supplier here dropped the price on the r9 295x2 here.


If you've got the money to splash on a 295X2, why not get a single 290X and then buy a full loop for your system?


----------



## DiceAir

Quote:


> Originally Posted by *Recr3ational*
> 
> If you got the money to splash on 295x2 get a single 290x then buy a full loop for your system?


Nah, that's too much money... lol. As you can see, I don't want to spend a fortune but still get a good system. Maybe I should check reviews on the R9 290 and compare them to my system by running the exact same tests.


----------



## Wheezo

Quote:


> Originally Posted by *DiceAir*
> 
> nah that's to much money...lol as you can see i don't want to spend a fortune but get a good system. maybe I should check reviews on the r9 290 and compare to my system by doing the exact same test


Check out the new GTX 970 / 980. Might as well get the latest tech if you sell your cards. AMD is hard to consider at this moment. Plus they are really power efficient and run cool.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *DiceAir*
> 
> is it really better to go for r9 290 than 2x r9 280x. I can get the msi gaming r9 290 for not that much more than the 2x r9 280x I'm selling it for


Yes, and here is why: for what you get from your two cards, you can have just about the same power in a single card, and it's also newer, which means it will hold resale value later on, or you can add another card later. If you have the option, it's better to future-proof.


----------



## neurotix

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> yes, here is why, for what you get from your 2 cards you can have just about the same power in just a single card. and it is also newer. which will hold a resale value for later on, or if you want to add another card later on.. if you have the option, it is better to future proof.


Um... am I missing something?

2x 280X should run circles around a single 290 all day.

290 - 4GB VRAM, 2560 shaders, 64 ROPs, 160 TMUs, 512 bit memory bus

2x 280X/7970 - 3GB VRAM, 4096 shaders, 64 ROPs, 256 TMUs, 384 bit memory bus

That's 60% more shaders and texture mapping units in 280X Crossfire (4096 vs 2560, 256 vs 160).

Crossfire scaling is right around 80% (180%) from what I've seen with my setup. In some cases higher or lower. But this should be more than enough performance to outdo a single 290. If not, something is seriously wrong. Unless you guys can provide numerous benchmarks showing a 290 outscoring a CF 280X setup, this doesn't make any sense.

All you'd be getting is the bonus features that come with a Hawaii card: TrueAudio, XDMA (bridgeless Crossfire), more VRAM, better memory bandwidth. However, a single 290 should be less powerful overall, and you'd get less performance than with two discrete cards. Both setups can do Mantle.

I would gladly take 2x 280X over a single 290 or 290X. I wouldn't recommend getting a single 290. If you really want to, I would suggest selling both cards and getting two 290s instead. This should be $600 or less if you get the cards secondhand on Ebay.
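As a rough sanity check on those figures, here's a tiny back-of-the-envelope script; the 80% scaling factor is just the estimate above, and raw shader count is only a proxy (clocks, drivers and VRAM limits all matter too):

```python
# Rough comparison of 2x 280X in Crossfire vs a single 290, using the
# shader counts quoted above. The scaling factor is an assumption (~80%).
R9_290_SHADERS = 2560
R9_280X_SHADERS = 2048
CF_SCALING = 1.80  # two cards deliver roughly 180% of one card

cf_effective = R9_280X_SHADERS * CF_SCALING   # effective shader throughput
ratio = cf_effective / R9_290_SHADERS

print(f"2x 280X effective shaders: {cf_effective:.0f}")
print(f"Relative to one 290: {ratio:.2f}x")   # ~1.44x on paper
```

On paper the CF pair comes out well ahead; real-world results depend on game-by-game Crossfire support.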


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *neurotix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *F3ERS 2 ASH3S*
> 
> yes, here is why, for what you get from your 2 cards you can have just about the same power in just a single card. and it is also newer. which will hold a resale value for later on, or if you want to add another card later on.. if you have the option, it is better to future proof.
> 
> 
> 
> Um... am I missing something?
> 
> 2x 280X should run circles around a single 290 all day.
> 
> 290 - 4GB VRAM, 2560 shaders, 64 ROPs, 160 TMUs, 512 bit memory bus
> 
> 2x 280X/7970 - 3GB VRAM, 4096 shaders, 64 ROPs, 256 TMUs, 384 bit memory bus
> 
> That's 60% more shaders and texture mapping units in 280X Crossfire (4096 vs 2560, 256 vs 160).
> 
> Crossfire scaling is right around 80% (180%) from what I've seen with my setup. In some cases higher or lower. But this should be more than enough performance to outdo a single 290. If not, something is seriously wrong. Unless you guys can provide numerous benchmarks showing a 290 outscoring a CF 280X setup, this doesn't make any sense.
> 
> All you'd be getting is the bonus features that come with a Hawaii card. TrueAudio, XDMA (Bridgeless Crossfire), more VRAM, better memory bandwidth. However, the GPU should be less powerful and you'd less performance than with two discrete cards. Both setups can do Mantle.
> 
> I would gladly take 2x 280X over a single 290 or 290X. I wouldn't recommend getting a single 290. If you really want to, I would suggest selling both cards and getting two 290s instead. This should be $600 or less if you get the cards secondhand on Ebay.
Click to expand...

So yes, you are correct about a 290. As you can see below, I mentioned a 290X as well. Factor in the money he would have saved, plus the lower electricity cost: that 80% Crossfire scaling and extra raw power only work out to about 10% better in real-world use. I have played with my cards long enough and matched them against 290s.
Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DiceAir*
> 
> I would still have to pay in a bit but I was prepared to buy a kraken g10 and a h55 so I can add that money and go for an even better gpu.
> 
> 
> 
> If you got a good deal on them you may want to look at buying a 290 or 290x with the moneys if you look well enough now you might be able to get a 290 and still have cash left over for something else. 2x 280x's are only ~4% (guessing) better than a 290 itself.
Click to expand...

^This was a way to get roughly equal real-world performance while saving money. However, I do say the 290X would be better, or at least comparable.


----------



## Devildog83

Just bought a 290 for $245 and it will be about $200 more to add it to the loop including back-plate which is already on the way.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Just bought a 290 for $245 and it will be about $200 more to add it to the loop including back-plate which is already on the way.


Nice mate. Which vendor?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Nice mate. Which vendor?


Sapphire reference.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Sapphire reference.


Sweet. Hope it goes well mate


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Sweet. Hope it goes well mate


Thanks, it should be here weds.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Um... am I missing something?
> 
> 2x 280X should run circles around a single 290 all day.
> 
> 290 - 4GB VRAM, 2560 shaders, 64 ROPs, 160 TMUs, 512 bit memory bus
> 
> 2x 280X/7970 - 3GB VRAM, 4096 shaders, 64 ROPs, 256 TMUs, 384 bit memory bus
> 
> That's 60% more shaders and texture mapping units in 280X Crossfire (4096 vs 2560, 256 vs 160).
> 
> Crossfire scaling is right around 80% (180%) from what I've seen with my setup. In some cases higher or lower. But this should be more than enough performance to outdo a single 290. If not, something is seriously wrong. Unless you guys can provide numerous benchmarks showing a 290 outscoring a CF 280X setup, this doesn't make any sense.
> 
> All you'd be getting is the bonus features that come with a Hawaii card. TrueAudio, XDMA (Bridgeless Crossfire), more VRAM, better memory bandwidth. However, the GPU should be less powerful and you'd less performance than with two discrete cards. Both setups can do Mantle.
> 
> I would gladly take 2x 280X over a single 290 or 290X. I wouldn't recommend getting a single 290. If you really want to, I would suggest selling both cards and getting two 290s instead. This should be $600 or less if you get the cards secondhand on Ebay.


You are correct; my 7870/270x X-fire was equal to or in some cases better than a 290, but TrueAudio and the other features on the 290 make it a better deal. 2x 280Xs should kill a 290 in performance.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> 
> You are correct, my 7870/270x X-fire was equal to or in some case better then the 290's but the True audio and other features on the 290 make it a better deal. 280X's should kill a 290 in performance.


My dual 280X setup is better than a single 290, but the appeal of the 290 is that it's a single card, so in the future you can always upgrade. That's why I always buy 2 cards at a time.

Also in English, can you explain what true audio is?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> My dual 280x is better than a single 290 but it's the fact that it's a single card. In the future you can always upgrade. That's why I always buy 2 cards at a time.
> 
> Also in English, can you explain what true audio is?


Yep, I plan on another 290 in the near future, but I will have to get a different PSU, as I think 2x 290s will be too much for a Platinum 660 even if it is Seasonic.

In simple terms, the GPU does most of the audio work instead of the CPU handling it, which takes some load off the CPU. As far as how much better it is, this might help.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Yep, I plan on another 290 in the near future but I will have to get a different PSU as I think 2 x 290's will be too much for a Platinum 660 even if it is Seasonic.


Hmm yeah I think it would not be enough.
290s are so cheap it's unbelievable!

I'm thinking about going tri fire been thinking about it for a few months.


----------



## neurotix

Two 290s will definitely be too much for a 660W power supply. Especially when overclocking.

With my i7 at 4.5GHz and two 290s at 1100/1500MHz, the highest power draw I've seen has been 830W (running Valley). An FX 8-core chip draws about twice as much power as an i7.

Even at idle with all power saving features enabled, C6 and C7 states on, my system draws 115W.

With my FX-8350 and one 290 I was seeing power draw over 600W.

I got the Cooler Master V1000 because of Shilka's recommendation and it's great, had no issues with OCP or anything.
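For anyone sizing a PSU for a similar build, here's a rough sketch of the arithmetic. The wattages are assumptions in line with the draws mentioned above (ballpark review-style figures, not measurements), and real draw varies with overclocks and load:

```python
# Rough PSU sizing for an overclocked i7 + two R9 290s.
# All wattages below are assumed ballpark figures, not measurements.
components = {
    "i7 (overclocked)": 140,
    "R9 290 #1 (overclocked)": 300,
    "R9 290 #2 (overclocked)": 300,
    "board/RAM/drives/fans": 90,
}

total = sum(components.values())   # estimated peak system draw
recommended = total / 0.8          # keep sustained load under ~80% of rating

print(f"Estimated peak draw: {total} W")             # 830 W
print(f"Suggested PSU rating: {recommended:.0f} W or more")
```

With numbers like these, a 660W unit is clearly out for two 290s, while something in the ~1000W class leaves sensible headroom.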


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Two 290s will definitely be too much for a 660W power supply. Especially when overclocking.
> 
> With my i7 at 4.5ghz and two 290s at 1100/1500mhz, the highest power draw I've seen has been 830W (running Valley). A FX 8 core chip draws twice as much power as an i7.
> 
> Even at idle with all power saving features enabled, C6 and C7 states on, my system draws 115W.
> 
> With my FX-8350 and one 290 I was seeing power draw over 600W.
> 
> I got the Cooler Master V1000 because of Shilka's recommendation and it's great, had no issues with OCP or anything.


Yep, the 660W Platinum should be enough for one 290 if I don't load it up too much, but for 2 I need more, and why not 1000W if I have to get a new one anyhow?


----------



## Devildog83

Quote:


> Originally Posted by *Devildog83*
> 
> Yep the 660w Platinum should be enough for one 290 if I don't load it up too much but for 2 I need more and why not 1000w if I have to get a new one anyhow.


Hey I got this backplate for my 290, what do y'all think? -


----------



## neurotix

That's gonna look pretty sick if your system still has the red/black theme.

Make sure to take pics and update your build log once it's all water cooled.


----------



## BruceB

Quote:


> Originally Posted by *dabby91*
> 
> It's the replacement for the standard 280, based on the "Tonga" GPU. Think of it as an updated Hawaii card that has been cut down for 280 levels of performance. Its major differences are 2GB of RAM vs the 3GB of the 280, and also consumes less power than the 280. It also supports the features that were introduced with GCN 1.1, such as Trueaudio and er... whatever else there was xD.


Thanks!
After looking at some benchmarks, the R9 285 seems to sit between the R9 280 and R9 280X (AMD _really_ need to sort out their naming scheme...). TrueAudio sounds sweet (excuse the pun) but only a few games actually support it atm, and at its current price point (~235EUR) it's only 25EUR cheaper than the 280X. imo, if I were to buy a new GFX card I'd get the 280X.

Why did AMD bring out the 285 at that price? It doesn't make any sense to me!


----------



## neurotix

Quote:


> Originally Posted by *BruceB*
> 
> Thanks!
> After looking at some benchmarks, the R9 285 seems to sit between the R9 280 and R9 280X (AMD _really_ need to sort out their naming scheme...). TrueAudio sounds sweet (excuse the pun) but only a few games actually support it atm, and at its current price point (~235EUR) it's only 25EUR cheaper than the 280X. imo, if I were to buy a new GFX card I'd get the 280X.
> 
> Why did AMD bring out the 285 at that price? It doesn't make any sense to me!


What games support TrueAudio? Afaik there aren't any games that support it yet. I believe it was added to the drivers in 14.2 or something, but no games actually use it yet.

For that matter, are the only games that support Mantle still Thief and Battlefield 4?

I hate to say it, but I have a strong feeling that these two things are basically gonna be the next hardware PhysX; that is, maybe 1% of games released in a given year will actually support them.


----------



## sage101

Ok guys, I'm looking to replace my dying GTX 460 and I need your help deciding on the best R9 270. I will be overclocking on stock voltage; the MSI and Gigabyte models look good. What do you guys think?


----------



## Devildog83

Quote:


> Originally Posted by *sage101*
> 
> Ok guys I'm looking to replace my dying gtx 460 and I need your help in deciding the best R9 270. Will be overclocking on stock voltage, the msi and gigabyte models look good. What you guys think.


Personal opinion: you can get a 270x for as low as $155, so I would go there. As far as the best, opinions vary, but the best thing might be to look for one that has Hynix memory. Someone like Neurotix might be able to help with the vendors that sell those.

If you meant 270 or 270x please ignore the first part of this post.


----------



## Devildog83

Quote:


> Originally Posted by *Devildog83*
> 
> Personal opinion, you can get a 270x for as low as $155., I would go there. As far as the best, opinions very but the best thing might be to look for one that has Hynix memory. Someone like Neurotix might be able to help with the vendors that sell those.
> 
> If you meant 270 or 270x please ignore the first part of this post.


P.S. XFX and Giga seem to have the best warranties.


----------



## sage101

Quote:


> Originally Posted by *Devildog83*
> 
> Personal opinion, you can get a 270x for as low as $155., I would go there. As far as the best, opinions very but the best thing might be to look for one that has Hynix memory. Someone like Neurotix might be able to help with the vendors that sell those.
> 
> If you meant 270 or 270x please ignore the first part of this post.


Thanks for the reply devil. The 270X is out of my budget, I'm looking @ getting a used card on ebay for around $100.


----------



## sage101

Quote:


> Originally Posted by *Devildog83*
> 
> P.S. XFX and Giga seem to have the best warranties.


I see you got the powercolor model of the HD7870, what's your take on them in terms of heat output and overclockability?


----------



## Devildog83

Quote:


> Originally Posted by *sage101*
> 
> Thanks for the reply devil. The 270X is out of my budget, I'm looking @ getting a used card on ebay for around $100.


Hey that's OK, not all of us have tons of money to spend on this stuff. Too bad though, I just found a ref 290 for $210.


----------



## Devildog83

Quote:


> Originally Posted by *sage101*
> 
> I see you got the powercolor model of the HD7870, what's your take on them in terms of heat output and overclockability?


The Devil was an awesome overclocker and not too bad on heat with the stock cooler but if you load it way up and need the fans spinning fast it can get a touch loud. That's at over 70% fan speed though.


----------



## Devildog83

Quote:


> Originally Posted by *Devildog83*
> 
> The Devil was an awesome overclocker and not too bad on heat with the stock cooler but if you load it way up and need the fans spinning fast it can get a touch loud. That's at over 70% fan speed though.


Sadly I am still testing but I think it died.


----------



## sage101

Quote:


> Originally Posted by *Devildog83*
> 
> The Devil was an awesome overclocker and not too bad on heat with the stock cooler but if you load it way up and need the fans spinning fast it can get a touch loud. That's at over 70% fan speed though.


Nice. So would you recommend the powercolor turboduo 270 model? Sorry for asking all these questions, I've only owned nvidia cards by evga so I have no experience with AMD/ATI cards.


----------



## Devildog83

Quote:


> Originally Posted by *sage101*
> 
> Nice. So would you recommend the powercolor turboduo 270 model? Sorry for asking all these questions, I've only owned nvidia cards by evga so I have no experience with AMD/ATI cards.


That's not a bad card if you can get it for $100 and you don't want to massively overclock. I don't think you do by what you said.


----------



## hatchet_warrior

Quote:


> Originally Posted by *neurotix*
> 
> For that matter, are the only games that support Mantle still Thief and Battlefield 4?
> 
> I hate to say, but I have a strong feeling that these two things are basically gonna be the next hardware PhysX- that is, maybe 1% of games released in a given year will actually support it.


Sniper Elite 3 uses Mantle AFAIK. Let me tell you, it makes my P2 955 feel like a new CPU. Maxed out, the game looks like you're watching a movie. For comparison, I had to turn Wolfenstein down a bit to get a consistent frame rate.


----------



## hatchet_warrior

Quote:


> Originally Posted by *Devildog83*
> 
> That's not a bad card if you can get it for $100 and you don't want to massively overclock. I don't think you do by what you said.


Right now you can pick up a Gigabyte 270X OC for $100 on eBay. Gigabyte's warranty goes with the card for 3 years, so you have some nice peace of mind.


----------



## buttface420

wondering if i should get a second 280x to crossfire, or just get a gtx 970 for a year or so and give my 280x to a family member.....

does anybody know if the r9 280x will support dx12?


----------



## BruceB

Quote:


> Originally Posted by *neurotix*
> 
> What games support TrueAudio? Afaik there aren't any games that support it yet. I believe it was added to the drivers in 14.2 or something but no games have yet to use it.
> 
> For that matter, are the only games that support Mantle still Thief and Battlefield 4?
> 
> I hate to say, but I have a strong feeling that these two things are basically gonna be the next hardware PhysX- that is, maybe 1% of games released in a given year will actually support it.


TrueAudio is currently supported by (according to Wikipedia):
Murdered: Soul Suspect, Star Citizen, Thief and Lichdom: Battlemage.
So, not many games atm. I agree; I can't see this being used over DirectSound or OpenAL when they're more accessible and have a larger target audience (ie. nVidia and Intel GPU users).

I like the _idea_ of hardware-accelerated sound, but it makes more sense to offload sound processing onto the sound card; it really should be Microsoft's goal to re-introduce hardware acceleration into DirectSound so that anyone with a sound card can take advantage of it.

Quote:


> Originally Posted by *buttface420*
> 
> wondering if i should get a second 280x to crossfire, or just get a gtx 970 for a year or so and give my 280x to a family member.....
> 
> does anybody know if the r9 280x will support dx12?


Honestly, I don't think it will support DX12. The R9 380X probably will though.

If I were you I'd hold off on getting a new GPU until AMD's new line comes out; then you can make a more informed choice as to which card is best for you.


----------



## Devildog83

Quote:


> Originally Posted by *BruceB*
> 
> TrueAudio is currently supported by (according to wikkipedia):
> Murdered: Soul Suspect, Star Citizen, Thief and Lichdom: Battlemage
> So, not many games atm. I agree, I can't see this being used over DirectSound or OpenAL when they're more accessible and have a larger target audience (ie. nVidia and Intel GPU users).
> 
> 
> 
> 
> 
> 
> 
> 
> I like the _idea_ of Hardware accelerated Sound, but it makes more sense to unload Sound processing onto the Sound Card, it really should be Microsoft's Goal to re-introduce Hardware accelleration into DirectSound so that anyone with a Sound Card can take advantage of it.
> 
> 
> 
> 
> 
> 
> 
> 
> Honestly, I don't think it will Support DX12. The R9 380X probably will though
> 
> 
> 
> 
> 
> 
> 
> 
> If I were you I'd hold off on getting a new GPU until AMD's new line comes out; then you can make a more informed choice as to which card is best for you.


Also coming is FreeSync, which I guess is AMD's answer to G-Sync. It's all in the early stages, but with the sound and FPS boost from TrueAudio and Mantle I still see these being used much more in the future.


----------



## kersoz2003

I have an ASUS R9 270 DCU II but can't go over 1.220V (locked). Can I BIOS mod it? I tried every overclocking software but to no avail. Or can any of you mod my original BIOS to increase the voltage limit?


----------



## Recr3ational

Quote:


> Originally Posted by *kersoz2003*
> 
> I have an ASUS R9 270 DCU II but can't go over 1.220V (locked). Can I BIOS mod it? I tried every overclocking software but to no avail. Or can any of you mod my original BIOS to increase the voltage limit?


Flash it with a powercolor turbo duo bios. Both my cards are unlocked


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> Personal opinion, you can get a 270x for as low as $155; I would go there. As far as the best, opinions vary, but the best thing might be to look for one that has Hynix memory. Someone like Neurotix might be able to help with the vendors that sell those.
> 
> If you meant 270 or 270x please ignore the first part of this post.


I don't know about guaranteed Hynix on 270X.

That would be a good goal of this club, to try and determine what RAM comes with what brand of card for 270, 270X, 280 and 280X.

Later on after I cook dinner I will check my 270X Vapor-X and see what kind of RAM it has. I *think* it's Elpida.

The Sapphire Tri-X 290/X has guaranteed Hynix memory. I don't think Sapphire reference does though. They started putting Hynix memory on the 290s because of the huge problem with black screens, even at stock. From what I heard, this was caused by bad batches of Elpida memory that some of the first 290X shipped with.

(For that matter, Devildog, once you get your 290 check this thread: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread Your reference card might have a chance to unlock to a 290X. My Tri-X cannot.)


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> I don't know about guaranteed Hynix on 270X.
> 
> That would be a good goal of this club, to try and determine what RAM comes with what brand of card for 270, 270X, 280 and 280X.
> 
> Later on after I cook dinner I will check my 270X Vapor-X and see what kind of RAM it has. I *think* it's Elpida.
> 
> The Sapphire Tri-X 290/X has guaranteed Hynix memory. I don't think Sapphire reference does though. They started putting Hynix memory on the 290s because of the huge problem with black screens, even at stock. From what I heard, this was caused by bad batches of Elpida memory that some of the first 290X shipped with.
> 
> (For that matter, Devildog, once you get your 290 check this thread: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread Your reference card might have a chance to unlock to a 290X. My Tri-X cannot.)


I will, and my Devils both had Elpida 6 GHz chips, but the 270x would clock way higher, at almost 6.3 GHz.


----------



## neurotix

The Hynix in my 290s doesn't make a damn bit of difference. They still only do 1500mhz (6ghz). I can get away with 1525mhz, but they black screen instantly at 1550mhz. I've heard this is because the chips are rated far lower and the timings are much tighter. The RAM controller is also different from Tahiti's. I believe stock is 1250mhz, so I guess I should be happy with 1500mhz. That's 250mhz more.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> The Hynix in my 290s doesn't make a damn of difference. They still only do 1500mhz (6ghz). I can get away with 1525mhz but they black screen instantly at 1550mhz. I've heard this is because the chips are rated far less and the timings are much tighter. The RAM controller is also different from Tahiti. I believe stock is 1250mhz so I guess I should be happy with 1500mhz. That's 250mhz more.


I don't really know why the 270x memory clocked so high when the 7870 maxed out at about 1475 or so. The core on the other hand only got to 1235 stable where the 7870 would do 1250 or even more if I left the memory at 1450. The bios and memory controller I guess.

Those were pretty much the only differences between the 2 Devils.


----------



## neurotix

So, my 270X Vapor-X has Elpida like I thought.

It's still a great card, the new cooler they put on it is amazing.

I think you probably have a good chance of getting Hynix with the Toxic cards, or any of the highest-tier cards from any manufacturer.


----------



## Jorginto

Does anyone know if the PowerColor 280X TurboDuo is voltage locked? I edited my BIOS with VBE7 from 1.2 to 1.225, and Afterburner and HWiNFO show the increased voltage, but I'm not sure if that's correct.


----------



## Devildog83

Quote:


> Originally Posted by *Jorginto*
> 
> Does anyone know if the PowerColor 280X TurboDuo is voltage locked? I edited my BIOS with VBE7 from 1.2 to 1.225, and Afterburner and HWiNFO show the increased voltage, but I'm not sure if that's correct.


If HWinfo shows an increase I am sure that's right.


----------



## hatchet_warrior

Quote:


> Originally Posted by *neurotix*
> 
> So, my 270X Vapor-X has Elpida like I thought.
> 
> It's still a great card, the new cooler they put on it is amazing.
> 
> I think you probably have a good chance of getting Hynix with the Toxic cards, or any of the highest-tier cards from any manufacturer.


There are 2 versions of Hynix memory; one of them is lower quality than Elpida. I believe the Toxic cards are the only ones guaranteed to use the higher-quality version, whereas many of the other manufacturers use whatever was available. This is why you sometimes see Elpida perform better than Hynix, especially on more mid-range cards like the 270x.


----------



## Chita Gonza

Is the price point on the reference R9 270X really $199? Where can I get a reference preorder? Preferably Sapphire or ASUS; MSI would also do.
I want to surprise a very poor friend with a really nice gift.


----------



## Jorginto

Guys, how is my firestrike score compared to other 280x users?

http://www.3dmark.com/3dm/4157552


----------



## kersoz2003

Quote:


> Originally Posted by *Recr3ational*
> 
> Flash it with a powercolor turbo duo bios. Both my cards are unlocked


Is it really possible to flash an ASUS card with a PowerColor BIOS? I know how to flash. Is there any modding I need to do?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Jorginto*
> 
> Does anyone know if the PowerColor 280X TurboDuo is voltage locked? I edited my BIOS with VBE7 from 1.2 to 1.225, and Afterburner and HWiNFO show the increased voltage, but I'm not sure if that's correct.
> Quote:
> 
> 
> 
> Originally Posted by *Devildog83*
> 
> If HWinfo shows an increase I am sure that's right.

I agree. If you don't see the voltage change but the BIOS reflects a change, then it's locked; if there is a change, then it has been properly modified.


----------



## Devildog83

Well it's here -


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> Well it's here -


that looks great in your rig!


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> that looks great in your rig!


Komodo or EK waterblock? What do you think?

Thanks by the way!!!


----------



## neurotix

Oh sweet, that looks sick, even if it's not watercooled yet.


----------



## Devildog83

What am I getting myself into? I am planning another $650 worth of upgrades.







I may be divorced soon.


----------



## Recr3ational

Quote:


> Originally Posted by *Jorginto*
> 
> Does anyone know if the PowerColor 280X TurboDuo is voltage locked? I edited my BIOS with VBE7 from 1.2 to 1.225, and Afterburner and HWiNFO show the increased voltage, but I'm not sure if that's correct.


Neither of mine is locked. They go all the way to 1.3
Quote:


> Originally Posted by *kersoz2003*
> 
> Is it really possible to flash an ASUS card with a PowerColor BIOS? I know how to flash. Is there any modding I need to do?


Yeah, it's possible. I had an old MSI 7950 flashed with an ASUS 7970 BIOS.
Search around here for how to BIOS flash GPUs.
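For reference, the usual ATIFlash routine looks roughly like this. This is a sketch from memory, not card-specific instructions: the adapter index, the file names, and the need for `-f` on a cross-vendor flash are all assumptions you must verify for your own card before flashing, and the backup step is not optional.

```
:: Run from an elevated command prompt (or a DOS boot) with atiflash present.
:: List adapters and note your card's index (0 assumed below):
atiflash -i
:: Save the current BIOS first -- this is your way back:
atiflash -s 0 backup.rom
:: Program the new BIOS to adapter 0:
atiflash -p 0 new.rom
:: If the vendor/SSID check refuses a cross-vendor image, -f forces it:
atiflash -f -p 0 new.rom
```

If a cross-vendor flash goes wrong, that backup plus a second GPU (or onboard video) is what lets you reflash the card, so don't skip the `-s` step.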

Quote:


> Originally Posted by *Devildog83*
> 
> What am I getting myself into? I am planning another $650 worth of upgrades.
> 
> 
> 
> 
> 
> 
> 
> I may be divorced soon.


It's this site, mate, it makes you buy stuff you don't need! Personally I love EK blocks, having 3 of them. I highly recommend them.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> Komodo or EK waterblock? What do you think?
> 
> Thanks by the way!!!


komodo


----------



## buttface420

Quote:


> Originally Posted by *Devildog83*
> 
> Well it's here -


nice! big difference from the 270x huh?


----------



## Devildog83

Quote:


> Originally Posted by *buttface420*
> 
> nice! big difference from the 270x huh?


Size-wise the Devils are the same. Performance-wise the Devils in x-fire were very close, but it's nice to have it in one card.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> Size-wise the Devils are the same. Performance-wise the Devils in x-fire were very close, but it's nice to have it in one card.


Can i join again since i now have your devil haha


----------



## buttface420

yeah but imagine how that 290 x-fire would be!


----------



## neurotix

Quote:


> Originally Posted by *buttface420*
> 
> yeah but imagine how that 290 x-fire would be!


----------



## Jorginto

Guys, does anyone have experience keeping a 280X (reference board) @ 1.3V for more than benchmarking purposes? How does the VRM handle the additional 100mV?


----------



## Recr3ational

Quote:


> Originally Posted by *Jorginto*
> 
> Guys, does anyone have experience keeping a 280X (reference board) @ 1.3V for more than benchmarking purposes? How does the VRM handle the additional 100mV?


Mine is on 1.3v all the time. On air my PowerColor cooler sucks,
but under water it's roughly 35C idle and about 45C max on load.
I wouldn't recommend it on air.

Although I'm having some issues atm, so it's all at stock.


----------



## Jorginto

Quote:


> Originally Posted by *Recr3ational*
> 
> Mine is on 1.3v all the time. On air my powercolor cooler sucks.
> But under water its roughly at 35c idle and about 45c max on load.
> I wouldn't recommend it on air.
> 
> Although. I'm having some issues atm so its all at stock.


I've got the Watercool Heatkiller X3 universal block, and on the VRMs I kept the stock aluminum radiator, with a 200mm fan on the side of my case blowing directly on it. When I was gaming a bit @ 1.250V with v-sync off, the VRM temps were hitting 90°C.

Ohh and another thing, do you experience coil buzzing? It's a good thing that I play in headphones, because that sh...it is just driving me nuts.


----------



## Jorginto

I'm waiting for my VRM kit, because I bought only the blocks. Damn, should have ordered the whole set in the first place.
http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/17021

Anyway, that's my best so far:

http://www.3dmark.com/3dm/4165003

What do you think guys?


----------



## Devildog83

Been a bit quiet in here lately.

If anyone feels like they wish to take the club over, just send me a PM and I will hook you up with Arizonian. I was told it's OK for me to keep doing it, but I thought it only fair to give anyone who wishes a chance.

Thanks,

DD


----------



## Devildog83

Quote:


> Originally Posted by *Jorginto*
> 
> I'm waiting for my VRM kit, because I bought only the blocks. Damn, should have ordered the whole set in the first place.
> http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/17021
> 
> Anyway, that's my best so far:
> 
> http://www.3dmark.com/3dm/4165003
> 
> What do you think guys?


Firestrike score isn't bad at all. Nice job.


----------



## diggiddi

Which Vapor-X is the better version, the long one or the 7970 version? I see quite a few folks are in the 1200 MHz core clock zone and 1800 on the RAM.
Am I right in saying this is the highest-overclocking 280X? How is the reliability, though?


----------



## neurotix

Quote:


> Originally Posted by *diggiddi*
> 
> Which Vapor-X is the better version, the long one or the 7970 version? I see quite a few folks are in the 1200 MHz core clock zone and 1800 on the RAM.
> Am I right in saying this is the highest-overclocking 280X? How is the reliability, though?


The long one is newer and better. It has the Tri-X cooler with 3 fans, as well as a vapor chamber.

As far as the highest overclocking 280X, I don't know. That depends on the chip you get. But it's probably the coolest running 280X.


----------



## diggiddi

Quote:


> Originally Posted by *neurotix*
> 
> The long one is newer and better. It has the Tri-X cooler with 3 fans, as well as a vapor chamber.
> 
> As far as the highest overclocking 280X, I don't know. That depends on the chip you get. But it's probably the coolest running 280X.


Ok, that one (Tri-X) is too big for my case, so I'll have to pass on it. Plus I think it's not that attractive. Thx +rep


----------



## neurotix

Yeah, I hate the yellow and black color scheme.

Not gonna paint my cards though, cuz then I probably couldn't sell or RMA them.


----------



## muhd86

add me to the club pls

r9-280x sapphire vaporx tri fire:-

muhd86 Tri Fire Vaporx- r9 280x



http://www.3dmark.com/3dm/4245947

skydiver 41406


----------



## Devildog83

Quote:


> Originally Posted by *muhd86*
> 
> add me to the club pls
> 
> r9-280x sapphire vaporx tri fire:-
> 
> muhd86 Tri Fire Vaporx- r9 280x
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/4245947
> 
> skydiver 41406


You have been added, Welcome !!!


----------



## Spork13

Better late than never.
Can I join please?
ASUS R9 280 came first, then I Crossfired with an ASUS R9 280X
Smashes most games @ 1080.


----------



## Devildog83

Quote:


> Originally Posted by *Spork13*
> 
> Better late than never.
> Can I join please?
> ASUS R9 280 came first, then I Crossfired with an ASUS R9 280X
> Smashes most games @ 1080.
> 
> 


You have been added, WELCOME !!!


----------



## Spork13

Thanks '83.


----------



## Pursuit of OC

Does anyone know why I can change the voltage on my R9 270X XFX DD with Sapphire Trixx but not with MSI Afterburner? In Afterburner I set EULA=1, and in settings I enabled voltage control plus all the OC settings.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Pursuit of OC*
> 
> Does anyone know why I can change the voltage on my R9 270X XFX DD with Sapphire Trixx but not with MSI Afterburner? In Afterburner I set EULA=1, and in settings I enabled voltage control plus all the OC settings.


does the actual voltage change, or does Trixx just read that it changes? Bet it's the latter


----------



## crazymania88

Quote:


> Originally Posted by *Pursuit of OC*
> 
> Does anyone know why I can change the voltage on my R9 270X XFX DD with Sapphire Trixx but not with MSI Afterburner? In Afterburner I set EULA=1, and in settings I enabled voltage control plus all the OC settings.


Probably MSI Afterburner doesn't support that VRM controller; you'd need to edit some stuff in the .ini files, and I don't really know enough to help with those.


----------



## Pursuit of OC

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> does the actual voltage change or does Trixx just read that it changes? bet its the former


It actually changes. I've got 2 pieces of evidence: the first one is GPU-Z.
The second one is that I can only go up to 1100MHz core clock with MSI Afterburner, while with Sapphire Trixx I can achieve 1200+ MHz. The reason I want Afterburner is to see my FPS, since drape don't work on Windows 8.1.


----------



## Recr3ational

Guys, do you know why my cards show artifacts when in crossfire and gaming, but don't when I use the cards separately?


----------



## Spork13

Quote:


> Originally Posted by *Recr3ational*
> 
> Guys, do you know why my cards show artifacts when in crossfire and gaming, but don't when I use the cards separately?


All games or only some?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Pursuit of OC*
> 
> It actually changes I got 2 pieces of evidence the first one is gpu z
> Second one is I can only go up to 1100mhz coreclock on msi afterburner while with saphire trixx I can achieve 1200+ MHz reason I want afterburner is to see my fps since drape don't work on windows 8.1


Well I'll be... Which version of Afterburner, and did you adjust any settings?


----------



## Pursuit of OC

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Well I'll be... Which version of Afterburner, and did you adjust any settings?


I went to the config files and set EULA=1, and also clicked on allow voltage monitoring and unlock voltage control.

I think crazymania88 might be right about MSI not supporting my VRM controller; I tried the newest version too. Btw, the reason I want MSI Afterburner is to show my framerate, since Fraps doesn't work with Windows 8.1.
(I said drapes)

So any help with enabling this would be appreciated, thanks. Unless I can use MSI and Trixx at the same time: MSI for stats and Trixx for OCing.
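For what it's worth, the usual unlock isn't a literal `EULA=1`: in Afterburner builds from this era, unofficial overclocking on AMD cards is enabled by two keys in MSIAfterburner.cfg, roughly as below. The section name and the exact wording of the EULA sentence are from memory, so treat this as a sketch and check against your own .cfg rather than pasting blindly:

```
; MSIAfterburner.cfg -- edit while Afterburner is closed
[ATIADLHAL]
UnofficialOverclockingMode = 1
; The EULA key reportedly must hold the full confirmation sentence, not "1":
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
```

Even with this set, voltage control still depends on Afterburner recognizing the card's voltage controller, which would square with crazymania88's point about the VRM not being supported.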


----------



## neurotix

Quote:


> Originally Posted by *Recr3ational*
> 
> Guys, do you know why my cards show artifacts when in crossfire and gaming, but don't when I use the cards separately?


If you're really running them at 1250/1650 as per your sig, that is likely why.

The sweet spot seems to be 1200/1600mhz. (And I'm not just speaking out of my ass, I have a 7970, I've benched it for HWBOT on both AMD and Intel platforms. Though I don't have two to test with Crossfire, sorry.)

I would imagine the problem is probably due to too high of an OC. I would try lowering the memory to 1600mhz first and see if that fixes it. If it does, great. If it doesn't, reduce the core clock as well.

It might be a good idea to run them at stock and see if it happens. If it does, there's a problem.


----------



## diggiddi

Quote:


> Originally Posted by *Recr3ational*
> 
> Guys, do you know why my cards show artifacts when in crossfire and gaming, but don't when I use the cards separately?


I suspect your top card is heating up, causing it to artifact; you might have to dial down the overclock a tad.


----------



## tabascosauz

Um, so here's my Vapor-X 280X.

The fans on this little one have quite the temper when under load. Sapphire's custom fan curve is horrible when the card is under load and the fluctuation in noise is unbearable. So I have three profiles that I keep at my fingertips:

Core/Mem/V
685MHz/1250MHz/1.1V - For everyday browsing and basic tasks to make sure the GPU stays cool and quiet. Custom fan curve on this one.
870MHz/1410MHz/1.17V - For some games that aren't that demanding. Fan is set at 35% for this, which is actually pretty quiet.
1070MHz/1550MHz/1.2V - Sapphire's default settings. On certain games the default fan curve is not enough to keep the GPU under my self-imposed 75°C limit, so I keep the fan on 45% while using this one.


----------



## neurotix

Quote:


> Originally Posted by *tabascosauz*
> 
> Um, so here's my Vapor-X 280X.
> 
> The fans on this little one have quite the temper when under load. Sapphire's custom fan curve is horrible when the card is under load and the fluctuation in noise is unbearable. So I have three profiles that I keep at my fingertips:
> 
> Core/Mem/V
> 685MHz/1250MHz/1.1V - For everyday browsing and basic tasks to make sure the GPU stays cool and quiet. Custom fan curve on this one.
> 870MHz/1410MHz/1.17V - For some games that aren't that demanding. Fan is set at 35% for this, which is actually pretty quiet.
> 1070MHz/1550MHz/1.2V - Sapphire's default settings. On certain games the default fan curve is not enough to keep the GPU under my self-imposed 75°C limit, so I keep the fan on 45% while using this one.


Nice card. How long have you had it?

I have a 7970 Vapor-X that's identical, but it doesn't have a backplate. I wear full coverage headphones and turn them up loud when gaming, so I have no problem with the noise. I'd agree, they're pretty loud but with the style of headphone I have that's not a problem. I ran mine at 1200/1600mhz 1.3v with 100% fan. Now that card is in my backup rig, so sadly it doesn't get used. My two 290s @ 100% fan are much louder, but I still can't even hear them through the headphones.

How do you like the performance? What games are you playing?


----------



## Recr3ational

Quote:


> Originally Posted by *Spork13*
> 
> All games or only some?


All games


----------



## tabascosauz

Quote:


> Originally Posted by *neurotix*
> 
> Nice card. How long have you had it?
> 
> I have a 7970 Vapor-X that's identical, but it doesn't have a backplate. I wear full coverage headphones and turn them up loud when gaming, so I have no problem with the noise. I'd agree, they're pretty loud but with the style of headphone I have that's not a problem. I ran mine at 1200/1600mhz 1.3v with 100% fan. Now that card is in my backup rig, so sadly it doesn't get used. My two 290s @ 100% fan are much louder, but I still can't even hear them through the headphones.
> 
> How do you like the performance? What games are you playing?


Good heavens, 100% fan? This thing takes on the sound profile of a CFM56 at just 50%.

I don't game with headphones, so I tend to value low noise over pure performance.

The farthest I've gone is 1120MHz/1570MHz/1.25V, but I don't really want to push past 1.25V unless I go Kraken G10 or full loop.

I play BF3, a bit of BF4, Borderlands 2, CSGO, CSS, Sniper Elite 3, and Project Reality.


----------



## jason387

Has anyone tried flashing a 270X bios to a 7870?


----------



## Recr3ational

Anyone?
Artifacts in crossfire but not separately?


----------



## End3R

Quote:


> Originally Posted by *Recr3ational*
> 
> Anyone?
> Artifacts in crossfire but not separately?


My best guess is it's a heating issue when using both cards. Or one of the cards is bad.


----------



## Recr3ational

Quote:


> Originally Posted by *End3R*
> 
> My best guess is it's a heating issue when using both cards. Or one of the cards is bad.


Cards are under water and work perfectly separately. Confusing lol.


----------



## End3R

Quote:


> Originally Posted by *Recr3ational*
> 
> Cards are under water and work perfectly separately. Confusing lol.


Are they both the exact same card?


----------



## Recr3ational

Quote:


> Originally Posted by *End3R*
> 
> Are they both the exact same card?


Yes sir


----------



## End3R

Quote:


> Originally Posted by *Recr3ational*
> 
> Yes sir


----------



## muhd86

Sapphire tri-fire blades... loving 'em


----------



## Skye12977

I'm kinda curious on the typical max temps people are getting with their 270/270x cards
Assuming my ambient temp is about (70-72F) 22C, I get about 55C at max load/benchmarking.


----------



## jason387

Quote:


> Originally Posted by *Skye12977*
> 
> I'm kinda curious on the typical max temps people are getting with their 270/270x cards
> Assuming my ambient temp is about (70-72F) 22C, I get about 55C at max load/benchmarking.


That's it? It's based on the same core as the 7870 and I see temps go as high as 83c while gaming. I've redone the thermal paste as well.


----------



## Skye12977

Quote:


> Originally Posted by *jason387*
> 
> That's it? It's based on the same core as the 7870 and I see temps go as high as 83c while gaming. I've redone the thermal paste as well.


That's why I'm curious about the temps other people get.
My 7950/7970's would top out at the mid 80's


----------



## tabascosauz

Quote:


> Originally Posted by *Skye12977*
> 
> That's why I'm curious about the temps other people get.
> My 7950/7970's would top out at the mid 80's


55˚C?? How high are you running the fans???

TPU has the Hawk top out at 70˚C under load... either that or you're running some super old/easy bench. 55˚C is not normal for a non-custom air-cooled mid-high-end card.


----------



## Skye12977

Quote:


> Originally Posted by *tabascosauz*
> 
> 55˚C?? How high are you running the fans???
> 
> TPU has the Hawk top out at 70˚C under load... either that or you're running some super old/easy bench. 55˚C is not normal for a non-custom air-cooled mid-high-end card.


My sig Home PC is the computer that it is in.
Temps are with the side panel on as well.


----------



## neurotix

My Sapphire 270X Vapor-X with 1.3v and 1270mhz only goes to about 56C. With 100% fan.

(I have no problem running the fans at 100%- that's what they're there for. With high overclocks and voltage it's necessary. I don't mind the noise.)


----------



## sage101

Quote:


> Originally Posted by *neurotix*
> 
> My Sapphire 270X Vapor-X with 1.3v and 1270mhz only goes to about 56C. With 100% fan.
> 
> (I have no problem running the fans at 100%- that's what they're there for. With high overclocks and voltage it's necessary. I don't mind the noise.)


That's a really nice overclock. I can't seem to get a stable overclock even with a minimal OC. Can you tell me what your memory clock is, and also what's the most stable OC you can achieve on stock voltage? BTW I've got the same 270X as you.


----------



## neurotix

I run the memory at 1500mhz.

I don't OC on stock voltage.


----------



## sage101

Quote:


> Originally Posted by *neurotix*
> 
> I run the memory at 1500mhz.
> 
> I don't OC on stock voltage.


Seems like you got the luck of the draw. The best I can do is 1180 on the core, and with the slightest OC on the memory I get artifacts (only in USF4 though), so I left the memory at stock (1450). Btw, how did you manage to increase your voltage? I can't seem to do that in Afterburner. According to GPU-Z the memory is Hynix; I thought they were the best for OC, so I guess I got low-quality ones.


----------



## tabascosauz

Quote:


> Originally Posted by *sage101*
> 
> Seems like you got the luck of the draw. The best I can do is 1180 on the core, and with the slightest OC on the memory I get artifacts (only in USF4 though), so I left the memory at stock (1450). Btw, how did you manage to increase your voltage? I can't seem to do that in Afterburner. According to GPU-Z the memory is Hynix; I thought they were the best for OC, so I guess I got low-quality ones.


The VRAM on my 280X is Hynix, and I only have headroom up to ~25MHz above default clock. Beyond 1575MHz I get wild artifacting in everything (and yes, I mean incessantly in every single program) that will not cease until I reboot and reset the clocks.

However, the default clock on mine is already at 1550MHz which leads me to believe that the higher clocked cards simply get better binned chips.


----------



## sage101

Quote:


> Originally Posted by *tabascosauz*
> 
> The VRAM on my 280X is Hynix, and I only have headroom up to ~25MHz above default clock. Beyond 1575MHz I get wild artifacting in everything (and yes, I mean incessantly in every single program) that will not cease until I reboot and reset the clocks.
> 
> However, the default clock on mine is already at 1550MHz which leads me to believe that the higher clocked cards simply get better binned chips.


Well, I guess that explains it; I'll just have to leave the memory @ stock.


----------



## Devildog83

Quote:


> Originally Posted by *sage101*
> 
> Well I guess that explains it, guess i'll just have to leave the memory @ stock.


I have seen very few 270x's that will do over 1550. My Devil did, and maybe the Toxic, maybe.

Having said that, any memory clock over 1450 has almost zero performance gain and sometimes restricts the core overclocking. I would downclock to even 1250, get as much out of the core as you can, and then increase the memory until it becomes unstable or stops giving any performance boost, and leave it there.


----------



## tabascosauz

Quote:


> Originally Posted by *Devildog83*
> 
> I have seen very few 270x's that will do over 1550. My Devil did, and maybe the Toxic, maybe.
> 
> Having said that, any memory clock over 1450 has almost zero performance gain and sometimes restricts the core overclocking. I would downclock to even 1250, get as much out of the core as you can, and then increase the memory until it becomes unstable or stops giving any performance boost, and leave it there.


D'oh! Stupid me. R9 270Xs have 256-bit buses at 5.6GHz effective. Chips should be binned for 1470MHz.

The R9 280Xs have 384-bit buses at 6GHz effective. Chips should be binned for 1500MHz at the very least.

(not to mention 280X usually is, like, $100-150 more expensive lel)
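The arithmetic behind those effective clocks is simple enough to sketch. This uses the stock specs (256-bit @ 1400 MHz for the 270X, 384-bit @ 1500 MHz for the 280X) rather than anyone's particular card:

```python
# GDDR5 is quad-pumped: effective data rate = 4 x memory clock.
# Peak bandwidth (GB/s) = bus width in bytes x effective rate in GT/s.
def gddr5_bandwidth_gbs(bus_bits: int, mem_clock_mhz: int) -> float:
    effective_gtps = 4 * mem_clock_mhz / 1000  # 1500 MHz -> 6.0 GT/s ("6 GHz")
    return (bus_bits / 8) * effective_gtps

print(gddr5_bandwidth_gbs(256, 1400))  # R9 270X: 179.2 GB/s
print(gddr5_bandwidth_gbs(384, 1500))  # R9 280X: 288.0 GB/s
```

The 280X's wider bus is why a modest memory overclock matters less there: it already has roughly 60% more bandwidth at stock.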


----------



## sage101

Quote:


> Originally Posted by *Devildog83*
> 
> I have seen very few 270x's that will do over 1550. My Devil did, and maybe the Toxic, maybe.
> 
> Having said that, any memory clock over 1450 has almost zero performance gain and sometimes restricts the core overclocking. I would downclock to even 1250, get as much out of the core as you can, and then increase the memory until it becomes unstable or stops giving any performance boost, and leave it there.


Thanks for the input, I'll keep the memory @ stock and see what i can squeeze from the core.


----------



## neurotix

Quote:


> Originally Posted by *sage101*
> 
> Seems like you got the luck of the draw. The best I can do is 1180 on the core, and with the slightest OC on the memory I get artifacts (only in USF4 though), so I left the memory at stock (1450). Btw, how did you manage to increase your voltage? I can't seem to do that in Afterburner. According to GPU-Z the memory is Hynix; I thought they were the best for OC, so I guess I got low-quality ones.


Use Sapphire Trixx, not Afterburner. (In Afterburner you have to do a bunch of ridiculous stuff to unlock voltage control, type things in text files, enable unofficial overclocking mode, reboot the system numerous times, etc. Trixx just lets you control it straight away.) If 4.8.6 doesn't let you control your fans at all, then use 4.8.2. With my Tri-X cards, I couldn't control the fans anymore with the latest 4.8.6. They were stuck on 100% and the slider didn't work. So I reverted to 4.8.2.

It should allow overvolting if your card isn't voltage locked. And if it's the same card as mine, it shouldn't be voltage locked. Most Sapphire cards are NOT voltage locked. I've had 2 4670s from them, a 6870, a 7770, an R7 265, a 7970, and my 270X, and *NONE* have been voltage locked. This is one of the reasons I went with them initially in 2009, and why I've stuck with them since. It is also why I recommend Sapphire to people looking to purchase AMD GPUs here. Yes, they have a poor warranty period, and they also have an absolutely piss-poor policy regarding customer modding. However, I'd rather deal with that stuff than have a high-end card that I cannot overclock to its full potential.

Of course, there does exist the possibility that yours is voltage locked and they started voltage locking these cards. Try Trixx and see if you can overvolt, then let us know.


----------



## jason387

Quote:


> Originally Posted by *neurotix*
> 
> Use Sapphire Trixx, not Afterburner. (In Afterburner you have to do a bunch of ridiculous stuff to unlock voltage control, type things in text files, enable unofficial overclocking mode, reboot the system numerous times, etc. Trixx just lets you control it straight away.) If 4.8.6 doesn't let you control your fans at all, then use 4.8.2. With my Tri-X cards, I couldn't control the fans anymore with the latest 4.8.6. They were stuck on 100% and the slider didn't work. So I reverted to 4.8.2.
> 
> It should allow overvoltage if your card isn't voltage locked. And if it's the same card as mine, it shouldn't be voltage locked. Most Sapphire cards are NOT voltage locked. I've had 2 4670s from them, a 6870, a 7770, a R7 265, a 7970, and my 270X and *NONE* have been voltage locked. This is one of the reasons I went with them initially in 2009, and why I've stuck with them since. It is also why I recommend Sapphire to people looking to purchase AMD GPUs here. Yes, they have a poor warranty period, and they also have an absolutely piss poor policy regarding customer modding. However, I'd rather deal with that stuff then have a high end card that I cannot overclock to it's full potential.
> 
> Of course, there does exist the possibility that yours is voltage locked and they started voltage locking these cards. Try Trixx and see if you can overvolt, then let us know.


If it's not too much trouble, could you send me the BIOS your 270X uses? I would like to try flashing that BIOS onto my 7870 so I can actually get 1.3v under load instead of just 1.2v.


----------



## [CyGnus]

muhd86, for gaming, more than 2 cards is a waste of money because they do not scale very well
Quote:


> Originally Posted by *tabascosauz*
> 
> D'oh! Stupid me. R9 270Xs have 256-bit buses at 5.6GHz effective. Chips should be binned for 1470MHz.
> 
> The R9 280Xs have 384-bit buses at 6GHz effective. Chips should be binned for 1500MHz at the very least.
> 
> (not to mention 280X usually is, like, $100-150 more expensive lel)


My 280X can do 1825/1850 on the mem; you were just unlucky with your card's memory, that's all


----------



## sage101

Quote:


> Originally Posted by *neurotix*
> 
> Use Sapphire Trixx, not Afterburner. (In Afterburner you have to do a bunch of ridiculous stuff to unlock voltage control, type things in text files, enable unofficial overclocking mode, reboot the system numerous times, etc. Trixx just lets you control it straight away.) If 4.8.6 doesn't let you control your fans at all, then use 4.8.2. With my Tri-X cards, I couldn't control the fans anymore with the latest 4.8.6. They were stuck on 100% and the slider didn't work. So I reverted to 4.8.2.
> 
> It should allow overvoltage if your card isn't voltage locked. And if it's the same card as mine, it shouldn't be voltage locked. Most Sapphire cards are NOT voltage locked. I've had 2 4670s from them, a 6870, a 7770, a R7 265, a 7970, and my 270X and *NONE* have been voltage locked. This is one of the reasons I went with them initially in 2009, and why I've stuck with them since. It is also why I recommend Sapphire to people looking to purchase AMD GPUs here. Yes, they have a poor warranty period, and they also have an absolutely piss poor policy regarding customer modding. However, I'd rather deal with that stuff then have a high end card that I cannot overclock to it's full potential.
> 
> Of course, there does exist the possibility that yours is voltage locked and they started voltage locking these cards. Try Trixx and see if you can overvolt, then let us know.


Thanks for the recommendation, neurotix. I'll give Trixx a try when I get home. What's the safest max voltage these 270Xs can reach, hoping that my card can overvolt?


----------



## jason387

Quote:


> Originally Posted by *sage101*
> 
> Thanks for recommendation neurotix. I'll give trixx a try when I get home. What's the safest max voltage can these 270x reach hoping that my card can overvolt?


Go for 1.3v







Here's what my 7870 can do.


----------



## muhd86

muhd86 - P22161 - 4960x @4ghz - Tri-Fire r9-280x Vaporx 1180/1650mhz / 1866Mhz Corsair Rams / Windows 8.1 Pro

http://www.3dmark.com/3dm11/8826200


----------



## DiceAir

I ordered some brackets from richie that I want to use to install two Cooler Master 140XL coolers on my R9 280X crossfire setup. I'm just awaiting the brackets; they should be here this week or next. So soon I'll be able to push 1.3V, and I'm aiming for 1200MHz core or even more. I currently have the Club3D R9 280X royalKing in crossfire. Cooling on the VRM is enough; I'm just adding some fans that came with Intel stock coolers to provide additional cooling on the VRM etc. Can't wait to do this! I will post my temps and results here after it's installed


----------



## tabascosauz

Quote:


> Originally Posted by *DiceAir*
> 
> I ordered some brackets from richie that I want to use to install 2 cooler master 140xl coolers on my r9 280x crossfire setup. I'm just awaiting the brackets and it should be here by this week or next week. So soon I will be able to push 1.3V and I'm aiming for 1200mhz core or even more currently have the club3d r9 280x royalking crossfire. Cooling on VRM is enough I'm just adding some fans that came with the intel stock coolers to provide additional cooling on my VRM etc. Can't wait to do this? I will post my temps and results here after installed


Do you know if this will require a shim on the GPU? I have read that "old" Tahiti GPUs require a shim for proper contact with any kind of block; AFAIK Tahiti XT2 doesn't change anything physically so I think it still needs a shim?


----------



## DiceAir

Quote:


> Originally Posted by *tabascosauz*
> 
> Do you know if this will require a shim on the GPU? I have read that "old" Tahiti GPUs require a shim for proper contact with any kind of block; AFAIK Tahiti XT2 doesn't change anything physically so I think it still needs a shim?


Richie is sending me a shim anyway, but I still think you need one


----------



## bigaza2151

you can add me to the owners list

I got the Twin Frozr 280X


----------



## Devildog83

Quote:


> Originally Posted by *bigaza2151*
> 
> you can add me to the owners list
> 
> i got the twin frozr 280x


If you could post a pic and maybe what clocks you're running, I'll be happy to add you.


----------



## robE

A noob question: I will get my first ATI card tomorrow, an R9 270 from Sapphire. If I download Trixx and want to overclock, how is the boost clock calculated? I mean, if I overclock my clocks to, let's say, 1000MHz, how much will it boost? Ty.


----------



## Skye12977

Quote:


> Originally Posted by *robE*
> 
> A noob question, i will get my first ATI card tommorow, a r9 270 from sapphire. My question is that if i download trixx and want to overclock, how is the boost clock calculated? I mean, if i overclock my clocks to let's say 1000mhz, how much will it boost? Ty.


1000 - stock speed = amount of boost.
Your stock speed may or may not match what's stated on the box or on sites like Newegg.
It also depends on whether you're on the stock BIOS; on my stock BIOS, the reported stock clock isn't what the card actually runs.
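For anyone who wants to sanity-check the arithmetic, here's a minimal sketch. The 925MHz stock clock is just an illustrative value, not taken from any specific card; read your real stock clock from GPU-Z first.

```python
# Boost offset when overclocking a fixed-clock card like the R9 270:
# the card runs at whatever target you set, so the "boost" over stock
# is simply the difference between target and actual stock clock.

def boost_over_stock(target_mhz: int, stock_mhz: int) -> int:
    """Return how many MHz above stock the card will run."""
    return target_mhz - stock_mhz

# Example: GPU-Z reports 925MHz stock (illustrative), you set 1000MHz in Trixx.
print(boost_over_stock(1000, 925))  # -> 75
```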


----------



## neurotix

@sage101: were you able to unlock voltage control using Trixx? Did it help your overclock at all?


----------



## sage101

Quote:


> Originally Posted by *neurotix*
> 
> @sage101: were you able to unlock voltage control using Trixx? Did it help your overclock at all?


Sorry for the delayed feedback, neurotix. Yes, Trixx managed to unlock my voltage; my current overclock is 1200/1500 @ 1.3v. Thanks man.


----------



## Goooober

Hi guys

In the past I've never increased the voltage when OC'ing any graphics card I've had, but I now have an HIS 280X iPower IceQ X2 Boost and the cooler is really good, so I'd like to push the volts a bit. At the moment I'm struggling to get consistent voltage readings and I don't know why. Last night, no matter what I set the voltage to in MSI Afterburner, it would report the voltage as 1.256, as would HWMonitor. This morning though, having changed no settings, if I change the voltage in Afterburner it will then report that voltage accurately when I run a benchmark. HWMonitor on the other hand will now only report the voltage as 1.200! And for what it's worth, GPU-Z only ever reports the voltage as being 1.179 or something.

Basically, I have no idea what voltage is going through my card regardless of what I set it to. As such, I have no idea if my card is voltage locked or not. If anyone could help me out I'd really appreciate it.

Thanks


----------



## neurotix

Quote:


> Originally Posted by *sage101*
> 
> Sorry for the delayed feedback neurotix. Yes Trixx managed to unlock my voltage, my current overclock is 1200/1500 @ 1.3v thanks man.


Nice, glad it worked, and that's a decent overclock. Even though my 270X will do more for benching, it isn't entirely stable gaming for long periods and the performance difference isn't noticeable, so I usually run mine at that too.









To the guy with the 280X IceQ: try overvolting with Sapphire Trixx. If you still notice no difference in voltage, then your card is locked.


----------



## tabascosauz

Quote:


> Originally Posted by *Goooober*
> 
> Hi guys
> 
> In the past i've never increased the voltage when OC'ing any graphics card i've had but i now have an HIS 280X iPower IceQ X2 Boost and the cooler is really good so i'd like to bush the volts a bit. At the moment i'm struggling consistent voltage readings and i dont know why. Last night no matter what i set the voltage to in MSI Afterburner it would report the voltage as 1.256, as would HWMonitor. This morning though, having changed no settings, if i change the voltage in Afterburner it will then report that voltage accurately when i run a benchmark. HWMonitor on the other hand will now only report the voltage as 1.200! And for what its worth GPU-Z only ever reports the voltage as being 1.179 or something.
> 
> Basically, I have no idea what voltage is going through my card regardless of what i set it to. As such, i have no idea if my card is voltage locked or not. If anyone could help me out i'd really appreciate it.
> 
> Thanks


GPU-Z takes instantaneous readings, while Afterburner doesn't, I think. But that doesn't explain how you get different readings across the board.

Try OpenHardwareMonitor, that's what I use in conjunction with Afterburner and Trixx.


----------



## robE

Ok so I don't really understand something. I downloaded Trixx/MSI Afterburner, and it doesn't really matter which software I use to overclock my R9 270: the speeds change on GPU-Z's main tab, but when I benchmark with, let's say, Unigine Heaven, my clock and memory readings in GPU-Z still show 920 and 1400 (default). Shouldn't it say 1050MHz and 1500 memory (my actual overclock)? Even in Unigine Heaven, the top corner shows my default clocks, not the OC ones. I don't understand what I am doing wrong


----------



## neurotix

Quote:


> Originally Posted by *robE*
> 
> Ok so i don't really understand something, i downloaded trixx/msi afterburner and doesn't really mater which software i use to overclock my r9 270 that the speeds does change in gpu-z main menu but when i go and benchmark with let's say ungine heaven my clock and memory readings in gpu-z still shows 920 and 1400(default), shouldn't it say 1050mhz and 1500 memory(my actual overclock)? Even in ungine heaven in the top corner it shows my default clocks not the OC one I don't understand what i am doing wrong


Try raising the power limit (max it out) if you haven't, and disable ULPS.

Also, what card do you have? (maker and model) That would help.


----------



## Goooober

Tried Trixx and it definitely looks like my card is locked. Any way to unlock it with a BIOS flash or something?

Thanks


----------



## BruceB

Quote:


> Originally Posted by *Goooober*
> 
> Tried Trixx and it definitely looks like my card is locked. Any to unlock it with a bios flash for something?
> Thanks


Are you sure it's locked? If you haven't raised the _Power Limit_ (IIRC it's called _Power Target_ in MSI AB?), the card will undervolt itself again to stay within its power limit.


----------



## xutnubu

Quote:


> Originally Posted by *Goooober*
> 
> Hi guys
> 
> In the past i've never increased the voltage when OC'ing any graphics card i've had but i now have an HIS 280X iPower IceQ X2 Boost and the cooler is really good so i'd like to bush the volts a bit. At the moment i'm struggling consistent voltage readings and i dont know why. Last night no matter what i set the voltage to in MSI Afterburner it would report the voltage as 1.256, as would HWMonitor. This morning though, having changed no settings, if i change the voltage in Afterburner it will then report that voltage accurately when i run a benchmark. HWMonitor on the other hand will now only report the voltage as 1.200! And for what its worth GPU-Z only ever reports the voltage as being 1.179 or something.
> 
> Basically, I have no idea what voltage is going through my card regardless of what i set it to. As such, i have no idea if my card is voltage locked or not. If anyone could help me out i'd really appreciate it.
> 
> Thanks


Try HWiNFO64.

My card in GPU-Z is always locked to 1.2v, but in HWiNFO64, the value "VRM Voltage" is the one that shows the real voltage of the card, at least that's how I think it works.

There's also a value called "VDDC" that always shows 1.2v (this is what GPU-Z shows), from what I understand VDDC is what the chip *requests* and the "VRM Voltage" value is what's actually being *delivered* to the GPU.
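As a rough sketch of the requested-vs-delivered gap (vdroop), with purely illustrative numbers, not readings from any specific card:

```python
# Vdroop: the VRM delivers slightly less than the requested VID under
# load. Comparing the two tells you how much the card actually sags.

def vdroop_pct(requested_v: float, delivered_v: float) -> float:
    """Percentage the delivered voltage sags below the requested one."""
    return (requested_v - delivered_v) / requested_v * 100

# Example in the ballpark people report here: 1.300V set, ~1.250V real.
print(f"{vdroop_pct(1.300, 1.250):.1f}% droop")  # -> 3.8% droop
```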


----------



## wermad

One down, one to go


----------



## MrPerforations

hello,
I got a 280X Dual from Ebuyer for £180. I went to see about a second one and it's gone up by £50. Has your price gone up too, please?


----------



## [CyGnus]

Here in Portugal the 280Xs dropped by around €20-25


----------



## Agent Smith1984

Weirdest thing.....
I did some adjusting of the fan placement in my case, and also took advantage of my household temps being 2-4c lower (which can impact case temps under load by even more!!)

I am using 1.3v core on my 280x in Afterburner (which is giving me actual voltages between 1.24-1.26). Last night I ran a few benchmarks on my card, along with over an hour of Crysis 3, at 1290MHz core.....
Going over that started to show some artifacts









I am betting that with a better power supply I can hit an actual 1.3v core and go for 1300MHz+. Tahiti at that high a clock speed is still pretty beefy, especially with the memory bus we have!!
I blame the power supply because my 12v is dipping from 12.19 idle to 11.91 load. Even though that's within a safe range, it's pulling too many amps; I'm hitting way over its rated amperage draw....
It's an 8 year old unit and uses (4) 12v rails. One of the rails powers one of the PCIE dongles, and another 12v rail powers the second PCIE and the second 4-pin CPU power, so with the card highly overclocked, and the CPU at 1.55v, on top of the aged capacitors, it's just too much for it. It's been good to me though!
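A rough sanity check of that rail math. The wattages and the 18A per-rail limit below are illustrative assumptions (per-rail ratings vary by unit), not figures from this particular PSU:

```python
# Estimate current on one 12V rail that feeds a PCIe connector plus the
# CPU's supplementary power, using the sagged voltage measured under
# load, then compare against an assumed per-rail limit.

def rail_current(watts: float, rail_volts: float) -> float:
    """Current in amps drawn for a given load at the actual rail voltage."""
    return watts / rail_volts

GPU_SHARE = 130.0   # assumed: one PCIe plug carrying ~half of an OC'd 280X
CPU_SHARE = 110.0   # assumed: overvolted CPU's draw on the same rail
RAIL_LIMIT = 18.0   # assumed: common per-rail rating on older multi-rail PSUs

amps = rail_current(GPU_SHARE + CPU_SHARE, 11.91)  # 11.91V sag from the post
print(f"{amps:.1f}A on one rail, limit {RAIL_LIMIT:.0f}A")  # over the limit
```

Note the sagging voltage makes it worse: the same wattage at 11.91V needs more amps than it would at 12.19V.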

Anyways, back to the point....
I am really considering a cheap water block for this thing, going for 1.4v (which the card will go to in GPU Tweak) and hoping for 1350MHz core.... maybe pushing memory to 1850+.....
I mean, with 2048 shaders and a 384-bit bus at those clocks, wouldn't the card be competitive with some high end cards??


----------



## [CyGnus]

Agent Smith1984, how are you getting 1.4v in GPU Tweak? Mine only goes to 1.34v, and that gives me 1.29v real voltage... And with that my 280x only gets 1250/1850

I am using Asus GPU Tweak 2.7.1.8


----------



## Agent Smith1984

GPU Tweak when used with this Asus 280x will allow for 1.4v core.... the problem is TEMPS

The actual reported voltage when I set 1.4v is only 1.36v, but I don't know if that's droop or not


----------



## [CyGnus]

What is your 280x? The Platinum version? Matrix? Mine is the TOP and I only go to 1.35v, and with vdroop it's only 1.29v


----------



## Wirerat

What's the best program to use to OC my Asus R9 270?

Every GPU OC application seems to have something locked on this card.

MSI Afterburner (latest version) was missing the voltage adjustment. Asus GPU Tweak only lets the core go to 1100MHz, and my card does that on stock voltage.

I haven't tried OC'ing inside AMD Catalyst yet, but I bet it's locked down too.


----------



## inlandchris

Quote:


> Originally Posted by *[CyGnus]*
> 
> What is your 280x? the Platinum version? Matrix? Mine is the Top and i only go to 1.35v and with vdrop its only 1.29v


Mine is the Asus R9 280X Platinum and I don't OC it; it runs great stock.


I watercooled mine, so I cut the plastic "Matrix" off the fan housing, custom made a blank 5.25 bay plate to fit it, and wired it to the video card. Really didn't want to throw it away.


----------



## wermad

Just waiting on slow-mazon to ship my ram


----------



## aaronsta1

Quote:


> Originally Posted by *Wirerat*
> 
> Whats the best program to use to oc my asus r9 270?
> 
> Every gpu oc application seems to have something locked on this card.
> 
> Msi afterburner (latest version) was missing the voltage adjustment. Asus gpu tweak only lets core go to 1100mhz my card does that on stock voltage.
> 
> I havent tried oc inside the amd catalyst yet but I bet its locked down too.


I use VBE7 to OC.

I don't rely on software.


----------



## SRICE4904

SRICE4904 - Sapphire Toxic - R9 280X - 1150/1600 & Sapphire Vapor-X - R9 280X - 1070/1550


No, I don't mine.... and these are only the ones I have boxes for ROFL


----------



## Devildog83

Quote:


> Originally Posted by *SRICE4904*
> 
> SRICE4904 - Sapphire Toxic - R9 280X - 1150/1600 & Sapphire Vapor-X - R9 280X - 1070/1550
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> No i don't mine.... and these are only the ones I have boxes for ROFL


SRICE4904 has been added, Welcome!!!


----------



## long99x

Does anyone have a BIOS for the Asus R9 280 (non-X) that supports Elpida RAM? Please upload it for me.
Thanks.
Sorry for my English.


----------



## SRICE4904

Thanks! Did some tweaking and got a Firestrike score of 14705!
http://www.3dmark.com/fs/3081968
Max draw on my Kill-A-Watt was 971W during the combined test; still have some headroom with my 1200W unit








GPU1: Sapphire 280x Toxic 1150/1600
GPU2: Sapphire 7950 Dual-X 1070/1550
GPU3: Sapphire 280x Vapor-X 1070/1550
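Worth remembering that a Kill-A-Watt reads AC draw at the wall, while a PSU's "1200W" rating is DC output, so the real load on the unit is lower. A rough conversion, where the 88% efficiency figure is an assumption for a decent 80 Plus-era unit at this load, not a measured value:

```python
# Convert wall (AC) draw to the DC load the PSU is actually delivering,
# which is the number to compare against the PSU's wattage rating.

def dc_load(wall_watts: float, efficiency: float) -> float:
    """DC wattage supplied for a given AC draw and PSU efficiency."""
    return wall_watts * efficiency

PSU_RATING = 1200.0
load = dc_load(971.0, 0.88)  # assumed ~88% efficiency at this load point
print(f"~{load:.0f}W DC of a {PSU_RATING:.0f}W rating")  # ~854W, headroom left
```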


----------



## Spork13

Quote:


> Originally Posted by *long99x*
> 
> any one have bios asus r9 280 non x support elpida ram please upload it for me
> thanks
> sry my eng


How do we do this?


----------



## diggiddi

Quote:


> Originally Posted by *SRICE4904*
> 
> Thanks! Did some tweaking and got Firestrike score of 14705!
> http://www.3dmark.com/fs/3081968
> max draw from my Kill-A-Watt was 971 during combined test, still have some headroom with my 1200w
> 
> 
> 
> 
> 
> 
> 
> 
> GPU1: Sapphire 280x Toxic 1150/1600
> GPU2: Sapphire 7950 Dual-X 1070/1550
> GPU3: Sapphire 280x Vapor-X 1070/1550


Wow, that's a mishmash of GPUs. How are they running? You must be slaughtering that 1080p resolution


----------



## SRICE4904

Quote:


> Originally Posted by *diggiddi*
> 
> Wow thats a mishmash of gpu's How are they running? You must be slaughtering that 1080p resolution


Very good with the 14.9 drivers. All 3 Tahitis run similar; 280Xs just come OC'ed. Benchmarked performance on all 3 GPUs with the same/similar clocks is the same. Worth noting that non-reference coolers are not designed for stacked tri-fire







cards perform 10-15 degrees cooler with proper spacing, and great without 2 additional 250W TDP GPUs next to them ROFL, but with some downclocking and fan profiles the heat is manageable; the performance loss is 10-14 fps versus OC'ed, and temps stop rising to the 90°C level









And yes, all the titles I've played so far (that scale with tri-fire) @ 1080p on my 42" 60hz TV are nice, very nice. With my 2500k @ 4.9ghz and the EVGA P67 FTW running PCIe 2.0 x8, x16, x16, everything runs smooth.


----------



## long99x

Quote:


> Originally Posted by *Spork13*
> 
> How do we do this?



here
thank you


----------



## Spork13

Cheers mate.
I have the RoG GPUz, which doesn't have that button.
Will DL a different one tomorrow.


----------



## Spork13

Check your PM's long99x


----------



## long99x

Quote:


> Originally Posted by *Spork13*
> 
> Check your PM's long99x


thank you


----------



## diggiddi

Quote:


> Originally Posted by *SRICE4904*
> 
> Very good with the 14.9 drivers. All 3 Tahiti's run similar, 280x's just come OC'ed. Benchmarked performance on all 3 gpu's with same/similar clocks is the same. Worth noting that non reference coolers are not designed for stacked tri-fire
> 
> 
> 
> 
> 
> 
> 
> cards preform 10-15 degrees cooler with proper spacing and great without 2 additional 250w TDP gpu's next to it ROFL, but with some down clocking and fan profiles heat is manageable, preformance loss is 10-14 fps less then OC'ed and not rising to 90*C level
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And yes all the titles I've played so far (that scale with trifire) @ 1080p on my 42' 60hz TV is nice, very nice. With my 2500k @ 4.9ghz and the EVGA P67 FTW running PCI 2.0 x8, x16, x16 everything runs smooth.










I am seriously considering picking up either a 7970 or R9 280X to Xfire with my Sapphire 7950 Dual-X; looking at getting either a used Asus DCUII or Vapor-X


----------



## jak0lantash

Hi guys,

TLDR: R9 280X at 80-85°C under load?? Even when downclocked?

I had a Sapphire Radeon HD 7970 OC with Boost (11197-03-40G), and decided to try Crossfire it with a R9 280X, so I purchased Sapphire Radeon R9 280X TRI-X OC (11221-08-40G).
Before putting the Crossfire together, I did a few tests, benchmarking of each card, etc.
With the 7970 alone at stock clock (950MHz/1425MHz), I get 65°C-70°C under Furmark, with a max power consumption (system) of around 380W.
With the 280X alone at stock clock (1100MHz/1600MHz), I get 80°C-85°C under Furmark, with a max power consumption (system) of around 480W.
If I downclock the 280X to match the 7970's clock (950MHz/1425MHz) using TRIXX, I still get 80°C-85°C under Furmark, with the same power consumption (around 480W).
By default, the VDDC is nearly at maximum (1256mV). If I move it down, same thing.

- Isn't 85°C really hot?
- Why is the 280X so much hotter and hungrier than the 7970?
- Is the 280X faulty?

The 280X has a much bigger heatsink and three fans; the Tri-X cooling system should be more efficient than the 7970's...
Does anyone have tips on how to tune the 280X?

Thanks for your help


----------



## Recr3ational

Quote:


> Originally Posted by *jak0lantash*
> 
> Hi guys,
> 
> TLDR: R9 280X at 80-85°C under load?? Even when downclocked?
> 
> I had a Sapphire Radeon HD 7970 OC with Boost (11197-03-40G), and decided to try Crossfire it with a R9 280X, so I purchased Sapphire Radeon R9 280X TRI-X OC (11221-01-40G).
> Before putting the Crossfire together, I did a few tests, benchmarking of each card, etc.
> With the 7970 alone at stock clock (950MHz/1425MHz), I get 65°C-70°C under Furmark, with a max power consumption (system) of around 380W.
> With the 280X alone at stock clock (1100MHz/1600MHz), I get 80°C-85°C under Furmark, with a max power consumption (system) of around 480W.
> If I downclock the 280X to match the 7970's clock (950MHz/1425MHz) using TRIXX, I still get 80°C-85°C under Furmark, with the same power consumption (around 480W).
> By default, the VDDC is nearly at maximum (1256mV). If move it down, same thing.
> 
> - Isn't 85°C really hot?
> - Why is the 280X much hotter & hungry than the 7970?
> - Is the 280X faulty?
> 
> The 280X has a much bigger radiator, and three fans, the Tri-X cooling system should be more efficient than the 7970's...
> Anyone has tips on how to tune the 280X?
> 
> Thanks for your help


Thermal paste? Fan speed? Air flow? Loads of possibilities. I would try thermal paste first; I bet you the manufacturer put a little too much on..

To be fair, 85c isn't deadly, just not recommended.


----------



## jak0lantash

Quote:


> Originally Posted by *Recr3ational*
> 
> Thermal paste? Fans speed? Air flow? Loads of possibility. I would try thermal paste first. I bet you the manufacturer put a little to much on..
> 
> To be fair 85c isn't deadly, just not recommended.


Thanks for your reply.

I tried manually setting the fans to 100%, but it doesn't really change anything, like 1-2°C. On these cards, the fans never reach full speed in auto mode.
Regarding air flow, I have a well ventilated Cooler Master HAF X with a 200mm fan in front, a 200mm on the side, 2x 200mm on top, and a 140mm at the rear. I also tried removing the left panel and leaving the case open; it doesn't change anything.
It could be thermal paste, but if I remove the heatsink and it's not that, my warranty would be void, so I don't know if I should try it.

85°C isn't as bad as 95°C, but it seems to me well over what it should be.
Unfortunately, there aren't many reviews/tests of that particular version of the card. But there are tests of the Toxic version (higher clocks) and of the Vapor-X Tri-X edition (similar). The temperatures shown in those tests under Furmark are more acceptable.
Vapor-X -> 75°C: http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1397&page=17
Toxic -> 75°C: http://www.anandtech.com/show/7406/the-sapphire-r9-280x-toxic-review/4

Did anyone have similar issues with those cards? Do you think I should try replacing the thermal paste?
Cheers

(I updated my original post, my card is 11221-08-40G, not 11221-01-40G).


----------



## Agent Smith1984

Quote:


> I am seriously considering picking up either a 7970 or r9 280x to Xfire with my sapphire 7950 dual X, looking at getting either a used Asus DCu2 or Vapor X


Stay away from the Asus TOP 280x, and also the Sapphire Toxic (not nearly as many issues as Asus though)..... both have well known artifacting issues related to VRAM.... you've been warned!!


----------



## diggiddi

I guess it's the Vapor-X then. I take it your Asus is not treating you well? I thought that was one of the better 280Xs


----------



## slick2500

Stupid question: how do I fix this? I was looking at things in GPU-Z and noticed that my 2 Sapphire Radeon R9 270X Vapor-Xs were different. Am I safe to assume that I should flash the other card so it has the same BIOS as the first one?
-Edit- All a BIOS flash did was make the cards have the same BIOS. Why does one card show half the shaders of the second one if they are the same card?


----------



## aaronsta1

Quote:


> Originally Posted by *slick2500*
> 
> Stupid question. How do I fix this? I was looking at things in Gpuz and noticed that my 2 Sapphire Radeon R9 Vapor-X 270X's were different. Am I safe to assume that I should flash the other card so it has the same bios as the first one?
> -Edit- all a bios flash did was make the cards now have the same bios. Why does one card have half the shaders as the second one if they are the same card?


It's OK. It's because ULPS is active and it's not being detected right.

If you fire up a game or something it will read correctly.


----------



## slick2500

Yeah, I see that now. This card only shows 1280 shaders.


----------



## SRICE4904

Quote:


> Originally Posted by *jak0lantash*
> 
> Hi guys,
> 
> TLDR: R9 280X at 80-85°C under load?? Even when downclocked?
> 
> I had a Sapphire Radeon HD 7970 OC with Boost (11197-03-40G), and decided to try Crossfire it with a R9 280X, so I purchased Sapphire Radeon R9 280X TRI-X OC (11221-08-40G).
> Before putting the Crossfire together, I did a few tests, benchmarking of each card, etc.
> With the 7970 alone at stock clock (950MHz/1425MHz), I get 65°C-70°C under Furmark, with a max power consumption (system) of around 380W.
> With the 280X alone at stock clock (1100MHz/1600MHz), I get 80°C-85°C under Furmark, with a max power consumption (system) of around 480W.
> If I downclock the 280X to match the 7970's clock (950MHz/1425MHz) using TRIXX, I still get 80°C-85°C under Furmark, with the same power consumption (around 480W).
> By default, the VDDC is nearly at maximum (1256mV). If move it down, same thing.
> 
> - Isn't 85°C really hot?
> - Why is the 280X much hotter & hungry than the 7970?
> - Is the 280X faulty?
> 
> The 280X has a much bigger radiator, and three fans, the Tri-X cooling system should be more efficient than the 7970's...
> Anyone has tips on how to tune the 280X?
> 
> Thanks for your help


85° for crossfire - no
For a single card in a case with good/excellent airflow - no
For a single or crossfire setup in a case with mediocre airflow - yeah
Yes, much more hungry (try overclocking your 7970 to 1100/1600 and you'll see)
I don't think so. I have read about other Sapphire products shipping with bad TIM, where a proper TIM application lowered temps by as much as 8-10°; that might be it, but not knowing your airflow setup it's a toss up. I will say that if it is a good overclocker, keep it (Sapphire won't void your warranty if you reinstall TIM); if not, you can get an RMA if nothing works.

I ran into the same issue with my 280X in a system with a non-reference cooler 7950: the 280X Toxic maxed at 66°C alone, but with the heat from another GPU it ran hotter, way hotter.... The problem is you're dumping heat from the 7970 (highly overclocked, with more volts) into the case for the 280X to suck up. You can turn down PowerTune so your card throttles (but that sucks), or use AB/Trixx to undervolt the card, which might get temps in check. Use AB to make a custom fan profile. You could make your side case intake fan (assuming you have one) into an exhaust and turn the case on its side. Or pick up a 5.25 bay fan mount from FrozenCPU and delta cool that sucker  haha







Try any/all of these and temps should be manageable. Also,
I will say the older Sapphire Vapor-X 280X has a much more efficient cooler; it handles the heat much better than my 280X Toxic, though it does take up almost 3 slots. But damn is the backplate sexy... the Tri-X doesn't have one or I'd have gotten one









And another thing: is your CPU watercooled or air cooled?


----------



## SRICE4904

Quote:


> Originally Posted by *diggiddi*
> 
> 
> 
> 
> 
> 
> 
> 
> I am seriously considering picking up either a 7970 or r9 280x to Xfire with my sapphire 7950 dual X, looking at getting eithera Used Asus DCu2 or Vapor X


Pick up a reference-cooler 7970 for your crossfire setup; everything will run cooler in crossfire, and you can watercool the reference PCB in the future or sell it more easily


----------



## SRICE4904

Quote:


> Originally Posted by *diggiddi*
> 
> 
> 
> 
> 
> 
> 
> 
> I am seriously considering picking up either a 7970 or r9 280x to Xfire with my sapphire 7950 dual X, looking at getting eithera Used Asus DCu2 or Vapor X


Quote:


> Originally Posted by *Agent Smith1984*
> 
> Stay away from the Asus TOP 280x, and also the Sapphire Toxic (not nearly as many issues as Asus though)..... both have well known artifacting issues related to VRAM.... you've been warned!!


Yup! It's a gamble with any card lol, some more than others. I just had a reference Sapphire 7950 bought in January die on me; just got the replacement








http://www.tomshardware.com/answers/id-1971937/280x-toxic-black-squares-artifacts.html
http://www.techpowerup.com/forums/threads/new-batch-of-sapphire-r9-280x-toxic-is-with-changed-ram-reduce-efficiency-by-15.198526/
I was lucky to have gotten one with no problems; seems the Toxics were hit or miss. I have a good one, it's really good







My card has the lower quality RAM but it holds its massive overclock with no artifacts


----------



## SRICE4904

Quote:


> Originally Posted by *slick2500*
> 
> Stupid question. How do I fix this? I was looking at things in Gpuz and noticed that my 2 Sapphire Radeon R9 Vapor-X 270X's were different. Am I safe to assume that I should flash the other card so it has the same bios as the first one?
> -Edit- all a bios flash did was make the cards now have the same bios. Why does one card have half the shaders as the second one if they are the same card?


Same thing on my non-reference Sapphire 7950 Dual-X: it reported the same shaders as a 7970, but they were unusable and it ran the same as my other 7950s. I wouldn't mess with the BIOSes on your cards unless you're overclocking via BIOS or changing the PowerTune for a boost card. I've bricked and recovered cards before lol, always puckers you up HAHA.
I have read all the horror stories in the past: "I just flashed my new bios update and my card bricked!!"
If it ain't broke, don't fix it.....
but nobody told me don't improve it


----------



## diggiddi

Quote:


> Originally Posted by *SRICE4904*
> 
> pick up a 7970 reference cooler for your crossfire setup, everything will run cooler in crossfire and you can watercool the reference PCB in the future or sell it easier


Cool idea. I doubt I'll ever watercool it though. The side of my current case is off, so heat should not be an issue, right???


----------



## jak0lantash

Quote:


> Originally Posted by *SRICE4904*
> 
> 85* for crossfire - no
> For a single card in a case with good/excellent airflow - no
> For a single or crossfire setup in a case with mediocre airflow - yeah
> Yes, much more hungry (try overclocking your 7970 to 1100/1600 and you'll see)
> I don't think so. I have read about issues with other Sapphire products with bad TIM where a proper TIM application lowered temps by as much as 8-10°C; that might be it, but not knowing your airflow setup it's a toss-up. I will say that if it is a good overclocker, keep it (Sapphire won't void your warranty if you reinstall TIM); if not, you can get an RMA if nothing works.
> 
> I ran into the same issue with my 280X in a system with a non-reference-cooler 7950: the 280X Toxic maxed at 66°C alone, and with the heat from another GPU it ran hotter, way hotter.... The problem is you're dumping heat from the 7970 (highly overclocked, with more volts) into the case for the 280X to suck up. You can turn down PowerTune so your card throttles (but that sucks), use AB/TriXX to undervolt the card, which might get temps in check, or use AB to make a custom fan profile. You could make your side case intake fan (assuming you have one) into an exhaust and turn the case on its side. Or pick up a 5.25" bay fan mount from FrozenCPU and Delta-cool that sucker, haha. Try any/all of these and temps should be manageable.
> Although I will say the older Vapor-X 280X Sapphire card has a much more efficient cooler; it handles the heat much better than my 280X Toxic. It does take up almost 3 slots though. But damn is the backplate sexy... the Tri-X doesn't have one or I'd have gotten one.
> 
> And another thing: are you watercooled or air-cooled on the CPU?


Thank you so much for your reply!

I do reach 85°C on the 280X when running alone...
Anyway, I put the Crossfire together, downclocked and downvolted both cards. I still get high temperatures on the 280X but lower.
I'm pretty sure that replacing the thermal paste will void the warranty, but I don't want to bother with RMA anyway, so I'll try to do that. Maybe on both cards.

Nice trick with the case fan behind the cards ^^

The Sapphire 280X Tri-X OC does have a backplate

--> http://oi58.tinypic.com/8zijva.jpg
Regarding my cooling system, it's air-flowed --> http://oi58.tinypic.com/2eo8e0z.jpg
4x 200mm + 1x 140mm
Case CM HAF XM
CPU Cooler NH-U12P, on which I replaced both NF-P12s with NF-P12 PWMs
I didn't put the PSU cable management door back yet
(SSD behind the MOBO)


----------



## SRICE4904

I'm watercooled, so I have less heat than the Noctua produces. You should turn your CPU cooler 90° clockwise so it blows up and out, not out the side. Should help. I like Noctua fans for being quiet, but in my opinion the CFM sucks, and I don't mind noise, so I run at least 90 CFM fans on my air-cooled systems.


----------



## SRICE4904

In the real-world testing I've done, with my side intake fans and the case closed it runs cooler and the cards get less dusty. If I have two non-reference-cooler GPUs in crossfire and I lay the case on its side and make the side intake an exhaust, they run cooler under load.


----------



## muhd86

*Quad 280X / all stock

Everything maxed out, and I mean everything... except TressFX. Will post another picture of that as well.*


----------



## muhd86

*http://www.3dmark.com/3dm/4566164

Sky Diver 47022 with quad 280X*


----------



## SRICE4904

Quote:


> Originally Posted by *muhd86*
> 
> *http://www.3dmark.com/3dm/4566164
> 
> Sky diver 47022 with quad 280x
> 
> *


I'm more interested in the firestrike score


----------



## robE

So I returned my Sapphire 270 because of artifacts and flickering, and now I'm torn between the Gigabyte R9 270X and the Asus R9 270 (non-X). Some say the Gigabyte is loud under load, some say it isn't.

About the Asus... I saw this picture and I'm curious: is this a good cooling design? I mean, with all these artifacts being blamed on VRAM cooling, this DCU2 seems a poor implementation because it basically has no VRAM cooling besides the main cooler, but maybe I'm wrong?



Any ideas on whether I should go for the Gigabyte or the Asus? Thanks!


----------



## DigDeep

Today I replaced the TIM on my CPU and added some fans. I put my Sapphire 270 Dual-X back in, and after one hour I checked temps and saw that my GPU was at 67°C while doing nothing.
The reason for the high temps was non-working fans on the graphics card. So I reseated the card, and the fans still would not turn, so I manually rotated them, and then they started to spin.

I hope there won't be any more problems with it. What could have happened?


----------



## jordanecmusic

Xfx 280x + xfx 7970


----------



## Spork13

Quote:


> Originally Posted by *robE*
> 
> So i returned my 270 sapphire cause of artifacts and flickering and now i'm torn between R9 270X gigabyte or Asus R9 270(non-x). Some say that gigabyte is loud under load, some say it isn't.
> 
> About Asus...i saw this picture and i'm curious, is this a good cooling design?: I mean, with all these artifacts because of vram cooling, this DCU2 seems a poor implementation because basically it has no vram cooling beside the cooler, but maybe i'm wrong?
> 
> 
> 
> Any ideeas if i should go for gigabyte or Asus? Thanks!


I had a Gigabyte GPU (a 6950) back in the day. The entire HSF fell off it for no reason; the whole lot was glued, not screwed, onto the PCB.
I replaced it with an ASUS 6950 DCU II, which I still have in my spare PC today.
Since then I have only used ASUS GPUs, but I'm sure there are also plenty of people who swear by Gigabyte and won't touch ASUS products.

Currently using an ASUS R9 280X and R9 280. They do run hot, or loud if I change the fan profile, but at least that keeps temps < 70°C. I have the 280X set to 1150 core and 6500 memory, could probably go higher, and no problems at all.


----------



## tsm106

Lmao. I don't call them GigaFail for no reason!


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *jordanecmusic*
> 
> 
> Xfx 280x + xfx 7970


Warning: offtopic ahead!
A CM 690 II case I believe? That reminds me I gotta update my sig rig info, I bought one of those cases almost 2 months ago!
Back on topic: a little while ago I sort of upgraded my 280X TurboDuo's cooling:

2 fans that were only getting in my way have been put to good use. It now runs a few degrees cooler compared to stock fans with a custom profile (stock profile was frying the VRMs) but more importantly, makes only half the noise under heavy gaming. It does take up about 3.1 expansion slots, but it's a single card on an ATX board so that isn't really a problem.


----------



## muhd86

http://www.3dmark.com/3dm/4565162

Rav-Anhilator U7385 - 4960X @ 4.4GHz - Quad R9 280X stock - 16GB 1866MHz Vengeance - Win 8.1 Pro

fire strike ultra


----------



## muhd86

Quote:


> Originally Posted by *SRICE4904*
> 
> I'm more interested in the firestrike score


posted fire strike ultra score


----------



## DigDeep

Has anyone flashed a 270 to a 270X with ATIWinflash?

Is it safe?


----------



## jason387

Quote:


> Originally Posted by *DigDeep*
> 
> Did anyone flashed 270 to 270x with ATiwinflash?
> 
> Is it safe?


It's of no use. Why do you want to try that?


----------



## Wirerat

Quote:


> Originally Posted by *robE*
> 
> So i returned my 270 sapphire cause of artifacts and flickering and now i'm torn between R9 270X gigabyte or Asus R9 270(non-x). Some say that gigabyte is loud under load, some say it isn't.
> 
> About Asus...i saw this picture and i'm curious, is this a good cooling design?: I mean, with all these artifacts because of vram cooling, this DCU2 seems a poor implementation because basically it has no vram cooling beside the cooler, but maybe i'm wrong?
> 
> 
> 
> Any ideeas if i should go for gigabyte or Asus? Thanks!


I have the Asus R9 270 DirectCU OC. It hits 1100MHz on the core without even adding voltage.

That's 50MHz above a reference 270X.

No reason to flash it. Just OC it.


----------



## robE

I was more concerned about the build quality, but thanks.

Probably will go for the Asus!


----------



## inlandchris

Quote:


> Originally Posted by *muhd86*
> 
> http://www.3dmark.com/3dm/4565162
> 
> Rav-Anhilator U7385 - 4960x @ 4.4ghz - Quad r9-280x Stock - 16gb 1866mhz Vegnance - Win 8.1 Pro
> 
> fire strike ultra


I hope nobody minds that I do a compare. Muhd86 more than doubled my score except for the combined score; mine was higher. Why?

I have a (1) Asus R9 280X Matrix, stock, with 5 screens, and the computer is an R4BE with a 4930K CPU at stock speed (3.9GHz) and 32GB of 1866MHz RipjawsZ. Go figure. With 4 video cards, shouldn't it be 4x more than mine?

http://www.3dmark.com/3dm11/8915768

I have the 3DMARK 11 but not sure what version Muhd86 is using.

Edit: downloaded a trial of Firestrike but not Ultra; I am using HD not 4k

http://www.3dmark.com/3dm/4603962?

http://www.3dmark.com/3dm/4604372?

It actually shows my score beats Muhd86's; really confused now. Maybe 4K is too much even for 4 GPUs.


----------



## DigDeep

Quote:


> Originally Posted by *jason387*
> 
> It's of no use. Why do you want to try that?


Well, I don't like having to run programs for my card to stay overclocked. That's the only reason.


----------



## Spork13

Quote:


> Originally Posted by *inlandchris*
> 
> ... snip ...
> 
> It actually shows my score beats Muhd86, really confused now. Maybe the 4k is too much even for 4 GPU's


4k is very demanding, and my understanding is that you need a LOT of VRAM.
4 x 280x's gives you 4 lots of GB VRAM, but not 12GB.
2 x 290x's (with 4GB each) may do better.
The new 290X's coming out with 8GB (?) VRAM should be the duck's guts for 4K, probably even better than the newer 980's, which only have 4GB.

*I think. Someone will correct me if I'm wrong.


----------



## jason387

Quote:


> Originally Posted by *DigDeep*
> 
> Well, I dont like to run programs for my card to stay overclocked. Thats the only reason.


Then increase the clock speeds by editing your own bios rather than flash to another.


----------



## aaronsta1

It's best to use ATIFlash on a USB stick running DOS,

and to use VBE7 to edit the BIOS.

----------



## muhd86

Quote:


> Originally Posted by *inlandchris*
> 
> I hope nobody minds that I do a compare. Muhd86 more than doubled my score except for the combined score, mine was more, why?
> I have a (1) Asus R9-280X Matrix stock with 5 screens and the computer is R4BE 4930k cpu stock speed (3.9mhz), 32GB Ram 1866mhz RipJawsZ, go figure. With 4 video cards, shouldn't it be 4X more than mine?
> http://www.3dmark.com/3dm11/8915768
> I have the 3DMARK 11 but not sure what version Muhd86 is using.
> Edit: downloaded a trial of Firestrike but not Ultra; I am using HD not 4k
> http://www.3dmark.com/3dm/4603962?
> http://www.3dmark.com/3dm/4604372?
> It actually shows my score beats Muhd86, really confused now. Maybe the 4k is too much even for 4 GPU's


Emmm, well, if you look at the hall of fame... that's about it, I guess, for four 7970s/280Xs. But still a decent feat.


----------



## neurotix

Quote:


> Originally Posted by *Spork13*
> 
> 4k is very demanding, and my understanding is that you need a LOT of VRAM.
> 4 x 280x's gives you 4 lots of GB VRAM, but not 12GB.
> 2 x 290x's (with 4GB each) may do better.
> The new 290x's coming out with ?8GB VRAM should be the ducks guts for 4k, probably even better than the newer 980's, which only have 4GB.
> 
> *I think. Someone will correct me if I'm wrong.


Not sure what you're trying to say.

To make it clear, 4x 280X would have a 3GB framebuffer. That is, of course, unless you got 6GB versions (not sure if they make them; they made 7970s with 6GB).

To be even more clear: video RAM does not stack in Crossfire. Each card has its own individual buffer, and the textures and other data are mirrored between the cards. The driver treats Crossfired cards as a single card and *the RAM does not add up.* It is not additive.

So, sorry to let anyone down who didn't know this. If you have more than one card, then you only have essentially the RAM of one of those cards to work with.
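As a toy illustration of the mirroring point (nothing here is an actual driver API; the function and numbers are made up for the example):

```python
# Sketch of why VRAM doesn't stack in CrossFire: each card keeps its own
# mirrored copy of the textures, so the usable framebuffer is the smallest
# card's VRAM, not the sum of all the cards' chips.

def usable_vram_mb(cards_mb):
    """cards_mb: per-card VRAM in MB for a CrossFire group."""
    total_board_ram = sum(cards_mb)  # what the box stickers add up to
    usable = min(cards_mb)           # what games can actually address
    return total_board_ram, usable

# Four 3GB 280Xs: 12GB of chips on the boards, but only 3GB usable.
print(usable_vram_mb([3072, 3072, 3072, 3072]))  # -> (12288, 3072)
```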

It's not a big deal really, as long as you're willing to make compromises in quality. I personally see no reason to use anything besides FXAA, because FXAA is fast and looks good, and I can't even tell the difference between FXAA and 8xMSAA. Tessellation, I notice no difference with it on except the performance hit and lower fps. Stuff like heavy post-processing in game engines also kills your framerate and eats up your VRAM; there are a bunch of advanced settings in Crysis 3, for example, and when I turned them from "Very High" to "Medium" I gained like 20 fps and noticed no difference in quality. All these things combined will not only give you MORE fps but also eat less VRAM, and you'd be hard-pressed to tell the difference anyway. (Trying to "max out" every setting in a game is nearly impossible, and is just a stupid e-peen bragging-right thing anyway. Always has been.)

I don't game at 4K, but I game at 5760x1080, which has almost as many pixels: 6,220,800 vs 4K's 8,294,400. I have yet to see a game run out of VRAM on my 290s. The most I've seen used is maybe 2500MB out of 4096MB; I believe that was Crysis 3. Lightly modded Skyrim in Eyefinity only uses about 1000MB with my settings. It still looks damn good and maintains a constant 60fps. If what they say is true and newer games are going to be locked to 30fps and require 4GB of VRAM for 1080p, then things don't look good. Either that or I'll just skip those games; they will probably suck anyway.
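The resolution comparison above is easy to sanity-check with plain arithmetic:

```python
# Pixel counts for triple-1080p Eyefinity vs 4K UHD (pure arithmetic).
eyefinity = 5760 * 1080  # three 1080p panels side by side
uhd_4k = 3840 * 2160     # 4K UHD

print(eyefinity)  # 6220800
print(uhd_4k)     # 8294400
print(round(uhd_4k / eyefinity, 2))  # 4K pushes about 1.33x the pixels
```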

8GB cards = expensive. Most 4GB cards are already fairly expensive. I don't think we'll see cards with a LOT of VRAM (10GB+) until they roll out that 3D stacked VRAM tech. Hynix is working on it, I believe.

On another note, I'm using my R9 270X Vapor-X in my system now, because I RMA'ed both my Tri-X. I've tried a few games with it, in Eyefinity, at 1200/1500mhz. They are more than playable but I had to reduce some settings in Tomb Raider for example, to get the framerate up.


----------



## Spork13

That's pretty much what I was trying to say, but I see I missed a "3" in this line: "4 x 280x's gives you 4 lots of GB VRAM, but not 12GB" - it should be "4 x 280x's gives you 4 lots of 3GB VRAM, but not 12GB."
I was implying that 290's with their 4GB would be better than 280's with 3GB, and the new 290's with 8GB will possibly be the best for lots-of-pixels resolutions. Of course, the higher core speeds don't hurt the higher-end cards either!


----------



## neurotix

Yeah, for sure, a lot of people get it mixed up or mistakenly believe otherwise, just wanted to be clear.


----------



## Kraxed

I have a Gigabyte Windforce 3X R9 280 with 3GB GDDR5.

http://geizhals.at/p/1032430.jpg

-Kraxed


----------



## muhd86

Quote:


> Originally Posted by *inlandchris*
> 
> I hope nobody minds that I do a compare. Muhd86 more than doubled my score except for the combined score, mine was more, why?
> I have a (1) Asus R9-280X Matrix stock with 5 screens and the computer is R4BE 4930k cpu stock speed (3.9mhz), 32GB Ram 1866mhz RipJawsZ, go figure. With 4 video cards, shouldn't it be 4X more than mine?
> http://www.3dmark.com/3dm11/8915768
> I have the 3DMARK 11 but not sure what version Muhd86 is using.
> Edit: downloaded a trial of Firestrike but not Ultra; I am using HD not 4k
> http://www.3dmark.com/3dm/4603962?
> http://www.3dmark.com/3dm/4604372?
> It actually shows my score beats Muhd86, really confused now. Maybe the 4k is too much even for 4 GPU's


You are getting that score with 1 GPU -- on the Ultra Fire Strike benchmark?

How is this even possible?


----------



## inlandchris

Quote:


> Originally Posted by *muhd86*
> 
> u are geting that score with 1 gpu ---ULTRA FIRE STRIKE SCORE BENCHMARK
> 
> how is this even possible .


Oh no, that is Firestrike, not Ultra. I have 3DMark 11 but not Firestrike, so I downloaded a trial version, and since I don't have 4K I just used the standard version.
Must be a big difference between the two.


----------



## DigDeep

Do I need to change the TDP too?

This is the 270X BIOS.

And one more thing: I had problems with the fan on this card, but somehow fixed it with more force; when the card went a little deeper into the slot, the fans started working. It's weird, because the card works OK, it's only that the fans won't spin unless I push it in a little bit further.

https://www.youtube.com/watch?v=WYsG1kF2Kys


----------



## aaronsta1

Quote:


> Originally Posted by *DigDeep*
> 
> do i need to change TDP too?
> 
> this is 270x bios
> 
> And one more thing. I had problems with fan on this card, but somehow fixed it with more force, so when card went a little deeper into socket, fan started working. Its wierd because card works ok, only fans wont spin, unless I push it a little bit further.
> 
> https://www.youtube.com/watch?v=WYsG1kF2Kys


On my 270, to make it run 1150 I had to put the BIOS TDP to 150W, but really that's the same as going into CCC and setting the power limit to +20%.

If you are just trying for the stock OC of 1070, you might not have to increase the TDP.

I would lower the volts to 1.188 and give that a try; my 270 worked just fine at 1070 with 1.188V.

Watch out for the 270's VRMs. The card I was using got so hot it started turning the board brown. The 270 I was overclocking had only one power plug; I noticed in your vid that 270 has two, so you might not run into that problem.

Do they have a cooler on the VRMs on that card?

If you increase TDP to 150 with stock volts of 1.225, I bet you can hit 1150/1500.

One thing: go into the fan speed part and set 100% to 70°C or so; it's probably factory-set to over 100°C.
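The TDP-vs-percentage equivalence mentioned above works out if the 270's stock PowerTune limit is 125 W; that stock figure is an assumption for this sketch, so verify it against your own BIOS dump:

```python
# Hedged sketch: relate an absolute BIOS power limit to CCC's PowerTune
# percentage slider, ASSUMING a 125 W stock limit for an R9 270 (check
# your own BIOS; this number is not from the thread).

STOCK_TDP_W = 125  # assumed stock PowerTune limit

def ccc_percent_for(target_w, stock_w=STOCK_TDP_W):
    """PowerTune slider setting (in %) needed to reach target_w."""
    return round((target_w / stock_w - 1) * 100)

print(ccc_percent_for(150))  # -> 20, i.e. a 150 W BIOS limit ~ CCC +20%
```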


----------



## marine88

Hey man, I have my Asus R9 270X DCU II at 1200/1450 and looking to go further, if someone can help me.


----------



## DigDeep

Quote:


> Originally Posted by *aaronsta1*
> 
> on my 270 to make it run 1150 i had to put it to 150 tdp..
> but really thats the same as going in CCC and setting tdp to +20%
> 
> if you are just trying for the stock OC of 1070 you might not have to increase the tdp.
> 
> i would lower the volts to 1188 and give that a try.. my 270 worked just fine at 1070 with 1188.
> 
> watch out for the 270s VRMs.. the card i was using got so hot it started turning the board brown..
> the 270 i was overlocking had only 1 power plug, i noticed in your vid that 270 has 2.. so you might not run into that problem.
> 
> do they have a cooler on the vrms on that card?
> 
> if you increase TDP to 150 and with stock volts 1.225 i bet you can hit 1150/1500
> 
> one thing, go into the fan speed part and set 100% to 70c or so.. its prob factory set to over 100c


Actually that's not my card; I was just trying to say that some other guys have the same problem with the fans not spinning.

Mine has one plug; the only 270 that has two 6-pins is the Gigabyte Windforce. I think the card in the video is a 270X Dual-X 4GB.

The cooling on the VRMs looks like this:

http://www.insidehardware.it/hardware/schede-video/3369-sapphire-radeon-r9-270-dual-x-silenzio-e-prestazioni-al-giusto-prezzo?start=4#.VFy_zckqW71

so it's probably not very good. And I have no VRM readings on that card.

I will try to hit 1070 with 1.188V.


----------



## SRICE4904

Quote:


> Originally Posted by *muhd86*
> 
> posted fire strike ultra score


That's awesome that you have and can run Ultra, but I don't think most of us have that ability. How about a regular Fire Strike bench to satisfy my curiosity, please ;-)


----------



## inlandchris

Here is mine, and I think Muhd86 should beat this by double, minimum. I overclocked my rig a little, so the score improved minimally.

http://www.3dmark.com/3dm/4633681?


----------



## Agent Smith1984

Quote:


> Originally Posted by *inlandchris*
> 
> Here is mine and I think Muhd86 should beat this by double; minimum. I overclocked my rig a little so the score improved minimal.
> http://www.3dmark.com/3dm/4633681?


http://www.3dmark.com/fs/2915459
Keep pushin'.... keep pushin'


----------



## inlandchris

Quote:


> Originally Posted by *Agent Smith1984*
> 
> http://www.3dmark.com/fs/2915459
> Keep pushin'.... keep pushin'


Will do. I would ask how you did that but your AMD CPU must be performing better than Intel.

Great score.

Mr. Anderson.


----------



## Agent Smith1984

Quote:


> Originally Posted by *inlandchris*
> 
> Will do. I would ask how you did that but your AMD CPU must be performing better than Intel.
> Great score.
> Mr. Anderson.


Thanks, but it's definitely not the CPU.... lol
You have a beastly CPU compared to mine; look at your physics score compared to mine, then look at the graphics scores and my clock speeds....

Juice it and push it my friend!

That's on 1.3v stock Asus Directcu2 cooler, hynix memory at 1.6v

Edit: I have benched this card at 1270/1825 with no artifacts before, but she was hot hot hot and got a few flickers, so I dialed it back a little for daily use...
Some people say it's a golden card, but I don't think so... I see a lot of the Asus TOP/Matrix and Sapphire Toxic cards break 1220 in reviews, and that's with 1.25V...
I'm assuming you have the "TOP" card, right?


----------



## aaronsta1

Quote:


> Originally Posted by *DigDeep*
> 
> Actually thats not my card, I was just trying to say that some other guys have the same problem with the fans not spinning
> 
> Mine have one pin, only 270 that has two 6 pins is gigabyte windforce, I think that card in video is 270x dual x 4gb
> 
> The cooling on VRMs looks like this
> 
> http://www.insidehardware.it/hardware/schede-video/3369-sapphire-radeon-r9-270-dual-x-silenzio-e-prestazioni-al-giusto-prezzo?start=4#.VFy_zckqW71
> 
> so its probably not very good. ANd I have no vrm readings on that card.
> 
> I will try to hit 1070 with 1.188v


Yeah, the main difference between the X and non-X models is the VRMs, but looking at the pic you sent me, and at my Sapphire 270X Dual-X, they might have just removed the extra plug; I don't see any empty VRM slots.

It's the same chip. It might be binned slightly lower, but I think production is so good that it will do X speeds with X volts. Stock on the Sapphire 270X Dual-X is 1070/1400 @ 1.238V with 150W TDP.

1070 shouldn't stress it much, though; you might not even have to bump the TDP to make it stable. You might not get away with 1.188V, but that's where I'd start. Play a game like BF4 and look for flashing textures; if you see any, first bump the TDP in CCC to +20% and run again, then bump the volts up one step at a time until they go away.

Don't run programs like Furmark; those are bad for GPUs.

Oh, and make sure you set the fan to be 100% at like 70°C or so; it's set to like 90 or 100 from the factory.
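The fan advice above (100% by 70°C) is just a piecewise-linear curve; here is a hedged sketch of the interpolation that tools like Afterburner or TriXX perform internally, with made-up curve points apart from the 70°C/100% one:

```python
# Illustrative fan curve: interpolate fan duty between (temp, fan%) points.
# Only the 70°C -> 100% point comes from the advice above; the other points
# are invented for the example.

CURVE = [(40, 30), (55, 60), (70, 100)]  # (temp in °C, fan duty in %)

def fan_percent(temp_c, curve=CURVE):
    """Linear interpolation along the curve, clamped at both ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))    # 100 -- pinned to full speed at 70°C
print(fan_percent(62.5))  # 80.0 -- halfway between the 60% and 100% points
```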


----------



## Agent Smith1984

Ever since I got this 280X and saw how much the performance was impacted by overclocking, I have been on a quest to break 8500 in Firestrike... it just seemed like a nice, round, achievable number. Well, I eventually got it stable to the point where I could break 8500 with GPU clocks alone (1260/1800). Tonight I decided, before I pull this thing out for good and get it cleaned up for whoever buys it on Craigslist, that I would shoot for 8600!!

I knew it would take some CPU clocking, so I went to work to find some stable/semi-stable settings for this old Thuban.
With a few bumps in base clock and another half multi, I was able to get a solid-running 4.25GHz on 1.6V (too much voltage for a standard water kit; the temps in Prime/IBT/etc. hit 62°C), but it runs games/benches rock solid....

So, for anyone who cares, here is a single 280X @ 1270/1825 (it does not report correct clocks in Futuremark).
Artifact-free runs, 1.3 vcore, stock Asus cooler with 100% fan speed, running on an X6 @ 4.25GHz CPU, 2.8 NB, DDR3-1840.

http://www.3dmark.com/3dm/4643616

Remember, Tahiti dropped in late '11... pretty impressive for a 3-year-old GPU core and a 5-year-old CPU, eh?


For comparison, here is a 3570k at 4.3GHz with higher GPU clocks....
http://www.3dmark.com/fs/1490804

Anyways, that's my goodbye rant to my 280X... I spent so much time tweaking this thing, I have a little sentiment for it, I guess.


Edit: For anyone questioning the invalid result, it's just my driver. I'm using beta. Plenty of screen shots to prove.


----------



## [CyGnus]

Agent Smith1984, congrats on the score. I have to run mine and see what it does, though mine only does 1250 core.


----------



## Agent Smith1984

Glad to see somebody getting 1250.....
Mine will do up to 1270 without artifacts, but it does get a few really fast flickers..... These seem to occur when the vcore strangely drops to sub-1.22 actual voltage. I'm guessing it's my 8-year-old PSU, lol.

I keep it at 1250 to be safe, which is probably closer to my actual stable clock speed with the voltage fluctuations I'm having. It definitely doesn't throttle clock speed though.

Would love to see your FireStrike.... how does your memory do?


----------



## [CyGnus]

The 280X is a great card; if it can do at least 1200 for 24/7 use without any issues, it will max everything at 60fps 1080p.

Maybe you should replace that PSU. Take a look at the XFX Pro 550W, or if you can spend a little more, the XFX TS550 or even the Cooler Master V550S. For one card, 550W is enough; for CFX of these 280Xs you should look at 750W.


----------



## Agent Smith1984

Well, I'm gonna be selling the 280X ASAP 'cause I'm picking up a Tri-X 290 this weekend for $200....

I'll upgrade the PSU by Christmas.

I'll still drop in here to help when I can though!!!


----------



## SRICE4904

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I'm gonna be selling the 280x ASAP cause I'm picking up a tri-x 290 this weekend for $200....
> 
> I'm upgrade PSU by Christmas.
> 
> I'll still drop in here to help when I can though!!!


Nice! I picked up 2 Sapphire 290Xs on eBay for about $300 a piece several months ago. I went with the reference design so I could get some full-coverage blocks down the road for my 4K gaming build. The performance difference over my Tahitis was MASSIVE!
Enjoy!
(I know the non-reference 290s with good coolers are just as fast as the 290Xs.)


----------



## JackCY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Ever since I got this 280x, and saw how much the performance was impacted by overclocking, I have been on a quest to break 8500 on FIrestrike.... just seemed like a nice round achievable number..... well, I eventually got it stable to the point when I could break 8500 with GPU clocks alone (1260/1800)....... tonight I decided, before I pull this thing out for good, and get it cleaned up for whomever on craigslist buys it., that I would shoot for 8600!!
> 
> I knew it would take some CPU clocking so I went to work to find some stable/semi-stable settings for this old thuban.
> With a few bumps in base clock, and another half multi, I was able to get a solid running 4.25GHz on 1.6v (too much voltage for standard water kit, the temps in prime/ITB/etc hit 62c) but it runs games/benchs rock solid....
> 
> So, for anyone who cares, here is a single 280x @ 1270/1825 (does not report correct clocks in futuremark)
> Artifact free runs, 1.3vcore, stock asus cooler with 100% fan speed, running on an x6 @ 4.25 CPU, 2.8 NB, DDR3 1840
> 
> http://www.3dmark.com/3dm/4643616
> 
> Remember, tahiti dropped in late '11....... pretty impressive for a 3 year old GPU core, and a 5 year old CPU eh?
> 
> For comparison, here is a 3570k at 4.3GHz with higher GPU clocks....
> http://www.3dmark.com/fs/1490804
> 
> Anyways, that's my goodbye rant to my 280x..... I spent so much time tweaking this thing, I have a little sentiment for it I guess
> 
> Edit: For anyone questioning the invalid result, it's just my driver. I'm using beta. Plenty of screen shots to prove.


ASUS 280x DC2T?

Well I just came to ask this since I'm having some issues clocking mine:

What's the best tool to use to overclock the card?
ASUS 280x DC2T (015.041, latest BIOS, updated via Tweak, driver 14.301.1001.0 [14.9]), Win8.1 64bit

I have found ASUS GPU Tweak to work best, but it's a pain to install; one has to enable the Administrator account and install it there.
Afterburner refused to change the voltage and was reporting nonsense voltage, not changing with any of the 3 settings it has for it.
TriXX, I'm not sure the voltage worked either; I'd have to check again. Well, if I try to go to settings, TriXX freezes, so a no-go.

*My trouble is to find a tool that will apply the clocks on start.* As in the settings get used every time I start the PC.
No big deal right? Well not really :/

If I use CCC, which allows me to get 1140/1800 without being able to touch voltage, the settings don't always get applied, and when that happens the "turbo" acts as if it is off and the card runs 970/1600 - not even the default 1070/1600 - ignoring the set overclock of 1140/1800. A restart solves it; it happens quite randomly. It's not just a reporting issue: the clocks are truly 970/1600, judging from benchmark results.
If I use ASUS GPU Tweak with "keep setting for next start on close application", it does remember my setting, but it does not get applied to the GPU automatically.


(OverDrive is disabled in CCC)

So what is the trick for these GPUs to make the clock and at best also voltage be applied on start automatically?

Could it be that CCC doesn't like ASUS GPUTweak? Or AfterBurner or something? I have CCC, ASUS GPUTweak and AB installed now, only CCC and Tweak running, unless AB has some service running in background.

Also any way to change fan speed below the default? Especially on idle to go below 20% = 1000rpm?

---

I guess CCC and Tweak don't like each other... I just had to restart only to return to standard clocks; restarted again, saw OC clocks, and then they dropped to standard once CCC probably started (Tweak started sooner).
Uninstalling CCC and AB doesn't resolve a thing.

Right now I finished Firestrike, and at the end, when I close it, the GPU goes down to the default clock of 970MHz and from then on ignores any OC I set until I close 3DMark.
Why does it keep dropping to 970MHz when it doesn't need to? I would understand if it lowered the clock under load, but not when it's not under load, and then it keeps ignoring turbo altogether.
OK, it does that with this driver and BIOS even at stock clocks; I just hadn't noticed yet. CCC uninstalled, Tweak disabled and not running (only the service is running in the background, I bet). Again, after Firestrike ends there is a short switch to 500MHz, then a switch to 970MHz for a short while, then it goes to 500MHz, and from then on until I close 3DMark everything runs at 970/1600MHz while the idle clock stays at 500/1600MHz... WTH.

Maybe there is some cheat-code/optimization hidden in the drivers for 3DMark, but it fails miserably?
Saw the same behavior even with Sky Diver.
3DMark v1.4.780.


----------



## inlandchris

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I'm gonna be selling the 280x ASAP cause I'm picking up a tri-x 290 this weekend for $200....
> 
> I'm upgrade PSU by Christmas.
> 
> I'll still drop in here to help when I can though!!!


Too bad its not a Matrix P, I would be interested


----------



## inlandchris

OK, I overclocked as much as I could safely. I did have to reboot when I went too high. Now: 1.3 volts at a 1245MHz clock.

I can't beat 8500, but I'm close.

http://www.3dmark.com/3dm/4661366


----------



## JackCY

Quote:


> Originally Posted by *inlandchris*
> 
> Ok, I overclocked as much as I could safely. I did have to reboot when going too high. Now, 1.3 volts at 1245 clock.
> 
> I can't beat 8500, but close
> http://www.3dmark.com/3dm/4661366



8374 with AMD Radeon R9 280X(1x) and Intel Core i5-4690K

Graphics Score
9695

Physics Score
9231

Combined Score
3876

You're getting a big lift from the CPU on the physics score, but you're lacking a memory OC.



It's pointless for me to push the core clock higher though; the card doesn't like it past 1.2GHz and starts needing a lot more voltage past 1.14GHz.
RAM is a tricky one; it could probably survive a run at 7.4GHz effective, but for the sake of stability I settled on 7.2GHz. Enough for me.









Still can't get rid of the 3DMark bug: after every run I have to restart to get back to OC clocks, or even normal turbo clocks; it's like turbo turns off after 3DMark is done. Anything else, even FurMark, is no issue.


----------



## axist

Hi,
I need some help! I bought a Sapphire R9 270X Dual-X, I've had it a week now, and I get a lot of weird pixel distortions, if you could call them that, and I don't know what to do. I googled around but can't find an answer. Where can I get some help? I have pics to show.
Thank you


----------



## aaronsta1

Quote:


> Originally Posted by *axist*
> 
> Hi
> I need some help! I bought a Sapphire R9 270X Dual-X, I've had it a week now, and I get a lot of weird pixel distortions, if you could call them that, and I don't know what to do. I googled around but can't find an answer. Where can I get some help? I have pics to show.
> thank you


Did you buy it new or used?


----------



## axist

Brand new.
It didn't show up at first; I ran the Catzilla benchmark and all was great, and I played some games. I think it was on the 3rd day that the weird stuff started. I removed the card and reinserted it, and I also moved it to a different PCIe slot.


----------



## aaronsta1

Quote:


> Originally Posted by *axist*
> 
> Brand new.
> It didn't show up at first; I ran the Catzilla benchmark and all was great, and I played some games. I think it was on the 3rd day that the weird stuff started. I removed the card and reinserted it, and I also moved it to a different PCIe slot.


I'd take it back to the store you got it from.


----------



## inlandchris

Quote:


> Originally Posted by *JackCY*
> 
> 8374 with AMD Radeon R9 280X(1x) and Intel Core i5-4690K
> 
> Graphics Score
> 9695
> 
> Physics Score
> 9231
> 
> Combined Score
> 3876
> 
> You are pulling it a lot with the CPU on physics and lack it on memory OC.
> 
> 
> 
> Pointless for me to push the core clock higher though, it doesn't like it past 1.2GHz, starts to need more voltage past 1.14GHz.
> RAM, that's a tricky one, could probably run a shot of 7.4GHz but for the sake of stability I settled on 7.2GHz, enough for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still can't get rid of the 3DMark bug, after every run of it I have to restart to get back to OC clocks or even normal Turbo clocks, it's like the turbo turns off after 3DMark is done. Anything else even Furmark, is no issue.


I did push it, but I don't like to do that; this was the last Matrix card in Southeast Asia.

Score:8764

Graphics: 9528

Physics: 16450

Combined: 3807

I couldn't come up with Graphics/Combined scores like you did, so congrats on your score; top for a single GPU.

my score:

http://www.3dmark.com/3dm/4665883





I think a single GPU is about it. I wish I could find another Matrix; then I would CrossFire it, but it's a max of 2 GPUs for me: too much amperage/power for my UPS.


----------



## M1kuTheAwesome

Too bad my 280X can only run 1219 stable (1225 for benching), but I'll give Fire Strike a shot later tonight. I've never run Fire Strike before; let's hope I score well enough.


----------



## JackCY

Quote:


> Originally Posted by *axist*
> 
> Hi
> I need some help! I bought a Sapphire R9 270X Dual-X, I've had it a week now, and I get a lot of weird pixel distortions, if you could call them that, and I don't know what to do. I googled around but can't find an answer. Where can I get some help? I have pics to show.
> thank you


Pictures please.

Where do you get it? In everything, or only in certain things?
If it's new and it errors only in certain things, and that's not common for the card, then return it. Well, even if it's common...
The ASUS 280X DC2T had some issues from the get-go; I've encountered them in one game only, and they seem to be gone after the BIOS update that is supposed to fix them.
Quote:


> Originally Posted by *inlandchris*
> 
> I did push it but don't like to do that; last Matrix card in SouthEast Asia.
> Score:8764
> Graphics: 9528
> Physics: 16450
> Combined: 3807
> I couldn't come up with the Graphics/Combined like you did so congrats on your score, top for a single GPU
> my score:
> http://www.3dmark.com/3dm/4665883
> 
> I think a single gpu is about it. I wish I could find another Matrix, then I would crossfire it but max of 2 gpu's for me; too much amperage/power for my UPS.


8764









See, memory boosts 3DMark score.

Yeah, I was looking at the individual scores as well, trying to find them for other GPUs and cards, but I didn't have much luck; reviews mostly post the overall score, which is some kind of magic number: it's not an average of the three sub-scores. And browsing the 3DMark site for single-GPU scores is a nightmare; that search is broken and useless. Ah, here are some scores.
I only played with Fire Strike for now because it seemed to give the GPU the hardest time in terms of stability.
Heaven is easier. FurMark is useless; it eats power, that's it. Games don't care.
Fire Strike tends to freeze when the card is unstable.
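For what it's worth, that "magic number" overall score can be reproduced from the sub-scores: it appears to be a weighted harmonic mean rather than an average. A minimal sketch, assuming weights of 0.75/0.15/0.10 (an inference; they reproduce the scores posted in this thread to within rounding):

```python
# Sketch: Fire Strike's overall score looks like a weighted harmonic
# mean of the three sub-scores, not an arithmetic average.
# The weights (0.75 graphics, 0.15 physics, 0.10 combined) are an
# assumption inferred from the runs posted in this thread.

def fire_strike_overall(graphics, physics, combined,
                        weights=(0.75, 0.15, 0.10)):
    wg, wp, wc = weights
    return (wg + wp + wc) / (wg / graphics + wp / physics + wc / combined)

# JackCY's run: 9695 / 9231 / 3876, reported overall 8374
print(round(fire_strike_overall(9695, 9231, 3876)))   # ~8375
# inlandchris's run: 9528 / 16450 / 3807, reported overall 8764
print(round(fire_strike_overall(9528, 16450, 3807)))  # ~8764
```

The harmonic mean explains why a huge physics score (a fast CPU) moves the total far less than the graphics score does.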

What's up with these Matrix and Matrix-P cards? The P cards are supposed to be the cream of the crop, eh? Unlocked memory voltage too.

I think other tests are better for judging overall performance: ones where you get avg/min/max FPS and stability.
I run the 4690K at 4.6/4.3GHz and the RAM at 2.4GHz/CL11.

How do you know you have the last Matrix or Matrix-P card in SEA?


----------



## axist

Well, this is what my desktop looks like at some points, but as soon as I move my mouse it disappears. In game, it's like some of the textures become pixels like the ones on my desktop, but not all of them, and I can play normally with no issues; it almost looks like they fade away.


I'm going to the store where I bought it to see if I can get them to change it for a new one. I was thinking about getting the MSI R9 270 Gaming.


----------



## inlandchris

Quote:


> Originally Posted by *JackCY*
> 
> Pictures please.
> 
> In what do you get it? Everything or only something?
> If it's new and it does errors in only something and it's not common for the card, then return it. Well even if it's common...
> ASUS 280x DC2T had some issues from the get go, I've encountered them in one game only, seems to be gone after BIOS update that is supposed to fix it.
> 8764
> 
> 
> 
> 
> 
> 
> 
> 
> 
> See, memory boosts 3DMark score.
> 
> Yeah I was looking at the individual scores as well, trying to find those scores for other GPUs and cards but didn't have much luck, reviews mostly post the overall score that is some kind of a magic number, it's not an average of the three sub-scores. And browsing 3DMark site for single GPU scores is a nightmare, that search is broken and useless. Ah here are some scores.
> I only played with FireStrike now because it seemed to give the GPU the hardest time in terms of stability.
> Heaven is easier. FurMark is useless, eats power that's it. Games don't care.
> FireStrike tends to freeze when it's unstable.
> 
> What's up with these Matrix and Matrix-P cards? The P cards are supposed to be the cream on the top hey? Unlocked memory voltage too.
> 
> I think other tests are better to tell overall performance. Where you get avg/min/max FPS and stability.
> I run the 4690K at 4.6/4.3GHz and RAM at 2.4GHz/CL11.
> 
> How do you know you have the last Matrix or Matrix-P card in SEA?


I don't know too much about Fire Strike except that it stresses out the GPU. I didn't increase the voltage at first; I took the GPU clock to 1250 and the GPU almost crapped out, so I increased the voltage and now everything works. Too chicken to go higher in voltage; I don't know the limits and I don't want to experiment to find out.


----------



## JackCY

Quote:


> Originally Posted by *axist*
> 
> Well, this is what my desktop looks like at some points, but as soon as I move my mouse it disappears. In game, it's like some of the textures become pixels like the ones on my desktop, but not all of them, and I can play normally with no issues; it almost looks like they fade away.
> 
> 
> I'm going to the store where I bought it to see if I can get them to change it for a new one. I was thinking about getting the MSI R9 270 Gaming.


Yeah, that card is probably done. Return it and get a different one, or get your money back and buy a different GPU altogether.
I'm not sure a 270 is going to be able to push much of anything these days, nor a 270X for that matter. But it's OK if you play at low resolution and medium details.

I bought my barely used 280X for the price of a new 270/270X. Prices of some new cards have dropped, 290s mainly, but the rest stays put, including used cards, at least here on this market :/


----------



## marine88

Hi people, I'm trying to reach the maximum on my ASUS 270X DirectCU II. I'm using MSI Afterburner and it gets unstable when I reach 1500MHz on the memory. Do you think that's normal? Temps are good!


----------



## axist

If I get store credit, should I get an MSI 270X or a Sapphire 280? They cost about the same here.


----------



## [CyGnus]

The 280: it has 3GB of VRAM vs 2GB, and a 384-bit bus vs 256-bit, so the R9 280 with no doubts.
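The bus-width gap translates directly into memory bandwidth. A quick sketch of the theoretical peak numbers, assuming stock effective memory rates of 5.6Gbps for the 270X and 5.0Gbps for the 280:

```python
# Sketch: theoretical peak memory bandwidth = (bus width / 8) * effective rate.
# Stock effective memory rates assumed: 5.6 Gbps (R9 270X), 5.0 Gbps (R9 280).

def bandwidth_gbs(bus_bits, effective_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_gbps

r9_270x = bandwidth_gbs(256, 5.6)  # 179.2 GB/s
r9_280 = bandwidth_gbs(384, 5.0)   # 240.0 GB/s
print(f"R9 270X: {r9_270x:.1f} GB/s, R9 280: {r9_280:.1f} GB/s")
```

So even at a lower memory clock, the 280's wider bus gives it roughly a third more bandwidth.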


----------



## Wirerat

Quote:


> Originally Posted by *[CyGnus]*
> 
> The 280: it has 3GB of VRAM vs 2GB, and a 384-bit bus vs 256-bit, so the R9 280 with no doubts.


The 280 will be much better at 1080p or above.

I used an R9 270 with a G3258 rig, but only because it's only at 1440x900. With an OC, the 270 has performed way above my expectations at that res anyway.


----------



## Agent Smith1984

*Just to address a few things I have seen over the last few pages....*

Fire Strike not reporting proper clocks is not a bug. It's based on what the clocks are set to when the bench is done.....

It will say 1070/1600 because those are your default 3d clocks. The card runs at the overdrive speeds (which is considered your boost clock) during a benchmark, then reverts to default 3d mode after it runs, and will not go back into 2d mode while a benchmark is open (always the case in my experience, did a lot of testing on this). Same holds true for CCC and GPUTweak overclocking.

When overclocking these cards, you will get the very best results with MSI Afterburner IF you have a few key settings correct.

The first things you want to verify are that you have voltage control, voltage monitoring, and force constant voltage turned on. The latter is basically the software equivalent of LLC for CPU voltage, and it will help quite a bit with overclocking.

You also want to turn on extended overclocking limits, and make sure that CCC Overdrive IS TURNED ON but does not get used to OC or have a set OC in place. Also, make sure in Afterburner that the GPU type is set to "standard" versus "GHz edition," because remember that all the GHz Edition 7970s and all the OC 280X cards are considered "boost" cards. If you use the GHz edition setting, it can sometimes apply the max OC only as a boost setting, not as your standard 3D setting.

Once you find stable max clock speeds in Afterburner, you can save your 3d profile, go to the profile settings, and in the bottom section, select your profile for the 3d mode and afterburner will automatically toggle your clocks to the appropriate setting when a 3d app is launched, and it will always report correct info in 3dmark as well. Once you have all that configured, just select to apply overclock at startup, start afterburner with windows, and start minimized, and you will have hassle free max OC clocks for gaming and benching.... auto profile switching, correct clock speeds reported, and constant voltage force (especially helpful after 1.2v on 280x.)









*Also*,
Nice FireStrike scores so far guys....
I see a lot of good scores, especially on those higher end CPU's..... wew wee.....
But try to keep the focus on the graphics score only, and here is why:

My CPU only hits 9k or so points on the physics score tops, but my graphics score is consistently over 10,100+ points. I say this not to brag, but to make you aware of roughly how much headroom you have.

Also, because the VRAM of the Asus cards runs at 1.6v, this can be a gift and a curse. I say this because for the most part, this memory will run at really high clocks, but it doesn't always mean that it "likes it."

I was able to bench my VRAM artifact-free at up to 1860MHz, but I noticed performance hits using anything over 1810MHz, even though it was completely stable up to 1840MHz. In some cases the hit was hundreds of points, so don't just find the max stable memory setting; try it a little lower to verify the clock speed is actually giving you the best performance.
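That tuning approach (benchmark each step rather than trusting "stable") can be sketched as a simple sweep. `run_benchmark` here is a hypothetical hook, not a real API; it stands in for launching a Fire Strike graphics test and parsing the score:

```python
# Sketch of the tuning loop described above: GDDR5 can "run" at a clock
# while silently retrying errors and losing performance, so keep the
# clock that *scores* best, not merely the highest one that survives.
# run_benchmark is a hypothetical callback: returns a score, or None
# if the run crashed or artifacted (unstable).

def best_memory_clock(run_benchmark, clocks_mhz):
    """Return (clock, score) for the best-scoring stable memory clock."""
    results = {}
    for clock in clocks_mhz:
        score = run_benchmark(memory_clock=clock)
        if score is None:          # crashed / artifacted: unstable
            break                  # higher clocks won't be stable either
        results[clock] = score
    return max(results.items(), key=lambda kv: kv[1])

# Toy numbers mimicking the behaviour described in the post:
# stable up to 1840 MHz, but throughput peaks around 1810 MHz.
fake_scores = {1600: 9800, 1700: 10050, 1810: 10150, 1840: 9900}
best = best_memory_clock(lambda memory_clock: fake_scores.get(memory_clock),
                         [1600, 1700, 1810, 1840, 1900])
print(best)  # (1810, 10150)
```

The point of the sketch is the `max` over measured scores: the highest stable clock (1840 in the toy data) is not the one you keep.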

These cards have a ton of memory bandwidth, especially when overclocked, but to really put it to good use, the core needs to be as high as possible.

1.25-1.3v seems to be the sweet spot for these cards. It's very common to see core clocks between 1200 and 1280MHz in that voltage range. The increase in voltage is perfectly fine for these cards if you are comfortable with 80c+ temps. It really doesn't bother me, since that's well within the recommended temp range.

And lastly...

*ARTIFACTS*

Yes, the ASUS 280X DCU2 TOP and the Matrix Platinum, as well as many Sapphire Toxic models and some other non-reference overclocked versions, suffer from some of the strangest artifacts you will ever encounter....

These are usually small sections of checkerboard artifacts, sometimes just one small spot on the screen, sometimes larger, or in several places.
It's not just in games, but sometimes on the desktop, and in browsers.

I have eliminated these for myself by using a few methods....

First: try different drivers.... my best experience has been with the 14.7 drivers. Some people say it's the BIOS, some say drivers, some say hardware.... me? I say it's a little bit of everything wrong with these cards....

Asus set the VRAM voltage to 1.6v. It is hard-locked at that amount and cannot be reduced manually. They did release a BIOS revision that reduces the VRAM to 1.5v, but honestly, even after flashing to this version, MSI AB still reports 1.6v for me, and the GPU-Z sensor shows the 1.6 is correct. Most people I've seen ended up with 1.5v, which usually fixed, or at least helped with, the artifacting, but also reduced the overclocking headroom they originally had (not always).

Also, I have found that if I pop the VRAM's cherry (read on), the artifacts will come much quicker, and stay much longer....
What I mean by that, is recall earlier in my post when I had mentioned the VRAM being able to run at a much higher clock than what it performs best at?? I believe this is due to the default VRAM voltage, and possibly due to heat since the TOP model doesn't cool VRAM at all, and the Matrix only cools 8 of 12 chips.

It seems that I can run a benchmark or game at a given speed, let's say 1840MHz....
I can play the game for 45 minutes with no issues, then all of a sudden I will see either strange shimmers, or that stupid little checkerboard, on some random part of the screen!!!

I can Ctrl+Alt+Del and immediately reduce the clocks back down, even to stock or lower, and these artifacts do not go away. I can even close the game and restart it at stock clocks, and none of that works. It seems, also, that the card takes a 15-20% performance hit once this happens; even benchmarks artifact really badly in this situation..... At this point, the cherry is popped....
Through a lot of trial and error, I have discovered that the only way to fix it is to restart windows. Initially I would do full restarts assuming that the power cycling was the fix, however through more recent testing, I discovered that it's as simple as logging out of windows and logging back in.
It literally fixes the issues instantly, and does NOT return as long as I do not exceed 1810MHz of VRAM clocks during prolonged gaming.

That tells me that simply reloading the driver is what seems to be fixing the issue. Why is that? I honestly have no clue... that's why I assume the issues with these cards seems to be a little bit of everything....

Another good thing to do is use Afterburner (during your OC sessions) to create not only your max OC setting, but also a new 2D profile that keeps the memory a little higher than the default value of 150MHz. Some have theorized the large jump in clock speed from 150MHz to 1600MHz could be the culprit... I'm not really sure how true that is, but I will close out on this last observation.... and it may be stupid logic, but it's what I experienced.

I know many people still had/have the artifacts with these cards, even at default speeds. Again, I suggest the BIOS update and driver experimenting first... I initially had the most artifacting issues between 1625 and 17**+/- MHz memory with this card.... I saw more of the artifacts through those clock speeds than I did with the memory set past its current max stable OC.....

How could that be??? And then I thought to myself (again, this probably sounds ridiculous):
If you were to take an idling engine, and introduce too much fuel into the cylinders without hitting the accelerator to open the butterfly and introduce more air to mix with it, then you would bog or flood the engine and it would run horribly rich, or not run at all. However, once you hit the pedal, the air and fuel are both increased together and the engine properly revs to a higher RPM....

So... I said hmmmm, what if the VRAM, with its factory overvolting (which, BTW, is NOT listed as a voltage for these chips on the Hynix info sheet), just has too much fuel going to it and is running too rich? Maybe I need to hit the pedal and use some of that fuel??

So that is when I started bumping the memory up even further (crazy, considering the card was already showing intermittent artifacts, but I was desperate), and oddly enough, after 1740MHz or so, things got better and better, and the card had fewer issues the closer I got to 1800MHz.










Anyways, that has been my experience with this card.... It works perfectly for me 99% of the time now, even with the fairly high overclocks I am running on it (24/7), but it did take lots of trial and error and experimenting to get the card to this level of performance and still be hassle-free.

Good luck!


----------



## JackCY

Quote:


> Originally Posted by *axist*
> 
> If I get store credit, should I get an MSI 270X or a Sapphire 280? They cost about the same here.


280, simple choice
Quote:


> Originally Posted by *Agent Smith1984*
> 
> The card runs at the overdrive speeds (which is considered your boost clock) during a benchmark, then reverts to default 3d mode after it runs, and will not go back into 2d mode while a benchmark is open (always the case in my experience, did a lot of testing on this). Same holds true for CCC and GPUTweak overclocking.


I find 3DMark messier than it used to be. Now, when I close the 3DMark window after a test, the card does go back to boost clocks when running other tests/games. But when 3DMark is kept open, it won't; it will do the 970/1600 base clocks and that's it, even if I run a 3DMark test again. It's just weird.
Quote:


> When overclocking these cards, you will get the very best results with MSI Afterburner IF, you have a few certain settings correct.
> 
> The first things you want to verify is that you have voltage cotrol, voltage monitoring, and force constant voltage turned on. The latter is basically the software based equivalent of LLC for CPU voltage, and will help quite a bit with overclocking.
> 
> You also want to turn on extended limits overclocking mode, and make sure that CCC overdrive IS TURNED ON, but does not get used to OC or have a set OC in place. Also, make sure in Afterburner, that the GPU type is set to "standard" versus "GHz edition," because remember that all the GHz edition 7970 and all the OC 280x cards are considered to be "boost" cards. If you use the GHz edition, it can sometimes only use the max OC as a boost setting, not your standard 3d setting.
> 
> Once you find stable max clock speeds in Afterburner, you can save your 3d profile, go to the profile settings, and in the bottom section, select your profile for the 3d mode and afterburner will automatically toggle your clocks to the appropriate setting when a 3d app is launched, and it will always report correct info in 3dmark as well. Once you have all that configured, just select to apply overclock at startup, start afterburner with windows, and start minimized, and you will have hassle free max OC clocks for gaming and benching.... auto profile switching, correct clock speeds reported, and constant voltage force (especially helpful after 1.2v on 280x.)


Hmm, I might give it a try again and see if it truly works. I've tried all the options, but maybe I missed turning those on all at once. I uninstalled that silly CCC now, though; I use only Tweak and so far it works fine, as it should.
MSI AB voltage control on the ASUS was nonsense and never got applied. The clocks worked, I think, but the voltage didn't.
Quote:


> 1.25-1.3v seems to be the sweet spot for these cards. It's very common to see core clocks between 1200 and 1280MHz in that voltage range. The increase in voltage is perfectly fine for these cards if you are comfortable with 80c+ temps. It really doesn't bother me, since that's well within the recommended temp range.
> 
> And lastly...


For me it always reports a drop of 30-50mV, no matter what voltage is set. In Tweak the voltage is reported a little higher than in GPU-Z.
Past 1170MHz it takes a lot of voltage to get stable. 1200-1280MHz? Not on my card.









Quote:


> *ARTIFACTS*Yes, the Asus 280x DCU2 TOP and the Matrix platinum, as well as many Sapphire Toxic models, and some other non reference overclocked version suffer from some of the strangest artifacts you will ever encounter....
> 
> These are usually small sections of checkerboard artifacts, sometimes just one small spot on the screen, sometimes larger, or in several places.
> It's not just in games, but sometimes on the desktop, and in browsers.
> 
> I have eliminated these for myself by using a few methods....
> 
> First; try different drivers.... my best experience has been with 14.7 drivers. Some people say it's the BIOS, some people say drivers, some people say hardware.... me? I say it's a little bit of everything wrong with these cards....


I've disabled Firefox's hardware acceleration; that stopped it from crashing. But later, in one game, I had a vertex on an object extended somewhere to infinity (reproducible every time), which was causing large moving planes in the scene; I haven't seen it since the VBIOS update.
Quote:


> Asus set the VRAM voltage to 1.6v. It is hard locked at that amount, and can not be reduced manually. They did release a BIOS revision that reduces the VRAM to 1.5v, but honestly, even after flashing to this version, MSI AB still reports 1.6v for me. GPU-Z sensor shows the 1.6 is correct. Most people I've seen ended up with 1.5v, which usually fixed, or helped with the artifacting, but also reduced the overclocking headroom they had originally (not always).


I haven't seen the memory voltage reported anywhere yet with the 280X DC2T: not in Tweak, not in AB, not in GPU-Z. Maybe AB reported a memory voltage with one of the options, but the values were nonsensical.
Quote:


> Also, I have found that if I pop the VRAM's cherry (read on
> 
> 
> 
> 
> 
> 
> 
> ), the artifacts will come much quicker, and stay much longer....
> What I mean by that, is recall earlier in my post when I had mentioned the VRAM being able to run at a much higher clock than what it performs best at?? I believe this is due to the default VRAM voltage, and possibly due to heat since the TOP model doesn't cool VRAM at all, and the Matrix only cools 8 of 12 chips.


I have seen the same with the DDR3 RAM I have: past a certain point (the stock speed, 2.4GHz) it performs worse, no matter what voltage.
Quote:


> It seems that, I can run a benchmark or game at a given speed, let's say 1840MHz....
> I can play the game for 45 minutes with no issues, then all of the sudden, I will see either strange shimmers, or that stupid little checkerboard. on some random part of the screen!!!


With memory I usually get colored stripes all over the screen.
With core I usually get a random blink of a couple of red horizontal stripes, or a black-screen blink, or some other image instability, or the screen freezes.
Quote:


> I can alt/ctrl/del and immediately reduce the clocks back down, even to stock or lower, and these artifacts do not go away. I can even close the game etc, and restart it at stock clocks, and none of that works. It seems also, the card takes a 15-20% performance hit once this happens, even benchmarks artifact really badly during this situation..... At this point, the cherry is popped....
> Through a lot of trial and error, I have discovered that the only way to fix it is to restart windows. Initially I would do full restarts assuming that the power cycling was the fix, however through more recent testing, I discovered that it's as simple as logging out of windows and logging back in.
> It literally fixes the issues instantly, and does NOT return as long as I do not exceed 1810MHz of VRAM clocks during prolonged gaming.










Quote:


> That tells me that simply reloading the driver is what seems to be fixing the issue. Why is that? I honestly have no clue... that's why I assume the issues with these cards seems to be a little bit of everything....


If you just log out, the driver stays loaded. But who knows what other hidden magic is there and what gets flushed during logout.
All cards have something; I haven't seen one card, nor one brand, that is magically immune.

---

1030/7200MHz, 1100mV, 100% - Fire Strike 7496 overall / 8505 graphics
1140/7200MHz, 1200mV, 120% - 7988 / 9182
1170/7200MHz, 1244mV, 120% - 8210 / 9464
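Taking the second number of each pair above as the graphics sub-score (an interpretation, consistent with how scores were listed earlier in the thread), those runs scale sublinearly with core clock; a quick check:

```python
# Sketch: how much of the relative core-clock gain actually shows up
# as graphics score, using the first and last runs listed above
# (1030 MHz -> 1170 MHz, graphics 8505 -> 9464).

def scaling_efficiency(clock0, score0, clock1, score1):
    """Fraction of the relative clock gain realized as score gain."""
    clock_gain = (clock1 - clock0) / clock0
    score_gain = (score1 - score0) / score0
    return score_gain / clock_gain

eff = scaling_efficiency(1030, 8505, 1170, 9464)
print(f"{eff:.0%}")  # roughly 83%: GPU-bound, but not perfectly clock-bound
```

Roughly five sixths of each percent of core clock comes back as score, which is why memory OC still matters here.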

Trying AB now again: OK, if I turn ON all the settings listed above, it works. But that means using constant voltage, which sets the voltage for all modes, even 2D mode; and if constant voltage is not used, AB will not control voltage at all, or at least not when the card is in 2D or boost/turbo mode.
When I touched the memory voltage slider and just dragged it, without even applying it, I got my typical memory error: vertical stripes all over the screen. That thing is certainly not very friendly when accessing the GPU's features.
Clocks work, but the voltage has to be forced, which makes AB good for benching but "useless" for regular use with cards other than MSI, I guess, because it doesn't set the boost/turbo voltage, not up, not down, at all.
With forced voltage it works, but it suffers from the same droop of 30-50mV.

The voltage in the monitor is reported as 1.2V for core and 1.6V for memory, but in the settings it originally reported 1144mV and 1500mV. This is the 015.041 VBIOS and the 14.9 driver. AB 4.0.0.

I have disabled Tweak along with its service, and didn't install CCC again.

And when AB starts and sets 1144mV as the core voltage, it happily applies it and undervolts the core when forced voltage is enabled. AB should really have 3 profiles to set: 2D, 3D, and 3D boost/turbo. It has one, and it behaves differently for each mode.

For benching, AB ok.
For normal use, Tweak all the way.


----------



## Devildog83

Quote:


> Originally Posted by *marine88*
> 
> Hi people, I'm trying to reach the maximum on my ASUS 270X DirectCU II. I'm using MSI Afterburner and it gets unstable when I reach 1500MHz on the memory. Do you think that's normal? Temps are good!


1500MHz memory is very high for a 270X; I would run it lower, like at 1450, and concentrate on the core clock. You will not notice much performance change from 1450 to 1500 anyhow.
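A one-line sanity check of how little bandwidth that last 50MHz buys:

```python
# Sketch: bandwidth scales linearly with memory clock, so the relative
# gain from 1450 -> 1500 MHz is just the clock ratio.
gain = (1500 - 1450) / 1450
print(f"{gain:.1%}")  # about 3.4% more bandwidth, usually within run-to-run noise
```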


----------



## Devildog83

I have been thinking about all of the issues folks are having with clocks reporting correctly in overclocking software and benchmarks. It seems that every GPU thread has the same thing: post after post about clocks not reporting correctly, or changing to stock, plus many other issues along that line. Every monitoring tool like GPU-Z or HWiNFO64, and every benchmark like the Futuremark ones or Unigine's, auto-detects your hardware and settings with a system-info module of its own. Could it be that when running multiple such programs and CCC at the same time, even if they are not in use, they conflict with each other and cause the clock and clock-reporting issues? I am no genius, but it seems to me the fewer programs I have installed and/or running, the fewer issues I have. Just wondering.


----------



## marine88

Quote:


> Originally Posted by *Devildog83*
> 
> 1500MHz memory is very high for a 270X; I would run it lower, like at 1450, and concentrate on the core clock. You will not notice much performance change from 1450 to 1500 anyhow.


What program do you recommend? Afterburner or the ASUS tweaking tool?


----------



## tsm106

Quote:


> Originally Posted by *Devildog83*
> 
> I have been thinking about all of the issues folks are having with clocks reporting correctly on overclocking software and benchmarks. It seems that every GPU thread has the same thing, post after post about clocks not reporting correctly or changing to stock plus many other issues along that line. Every monitoring software like GPU-Z or HWinfo64 and every benchmark like Futruemark ones or Unigen auto detect your hardware and settings with a system info file of there own. Could it be that when running multiple programs and CCC at the same time, even if they are not in use that these could be conflicting with each other and causing the clock and clock reporting issues? I am no genius but it seems to me the less programs I have installed and or running the less issues I have. Just wondering.


Yah, do not run with Overdrive enabled; that's basic AMD overclocking 101. And yeah, I concur: the fewer diagnostic apps running while benching, the lower the overhead, the better. Also, Futuremark's SystemInfo will display base clocks if it cannot get a clock reading, and if there are any instabilities at play it will display base clocks as well. One can also run AB in no-profile mode and it will hold the clocks, so it will show your clocks.

http://www.3dmark.com/fs/2563952


----------



## Devildog83

Quote:


> Originally Posted by *marine88*
> 
> What program do you recommend? Afterburner or the ASUS tweaking tool?


I run Afterburner, but if you have an Asus card the Asus version may be better; I don't know for sure.


----------



## Agent Smith1984

Quote:


> Originally Posted by *JackCY*
> 
> 280, simple choice
> I find the 3DMark more messy then it used to be. Now when I close the 3Dmark window after a test, it does go back to boost clocks when running other tests/games. But when kept open, it won't it will do 970/1600 base clock and that's it even if I run 3DMark test again. It's just weird.
> Hmm, might give it a try again, see if it truly works, I've tried all the options, but maybe I missed to turn those on at once. I uninstalled that silly CCC now though. Use only Tweak and works fine so far as it should.
> MSI AB voltage control on ASUS was nonsense and never got applied. Clocks worked I think but the voltage didn't.
> For me it always reports a drop of 30-50mV no matter what voltage is set. In Tweak the voltage is reported a little higher than in GPU-Z.
> Past 1170MHz it takes a lot of voltage to get stable. 1200-1280, not on my card
> 
> 
> 
> 
> 
> 
> 
> 
> I've disabled Firefox hardware acceleration; that stopped it from crashing. But later, in one game, I had a vertex on an object extended somewhere toward infinity (reproducible every time), which was causing large moving planes in the scene; I haven't seen it since the VBIOS update.
> I haven't seen the memory voltage reported anywhere yet with the 280X DC2T: not in Tweak, not in AB, not in GPU-Z. Maybe AB reported a memory voltage with one of the options, but the values were nonsensical.
> I have seen the same with the DDR3 RAM I have: past a certain point (the stock speed, 2.4GHz) it performs worse, no matter the voltage.
> With memory I usually get colored stripes all over the screen.
> With core I usually get a random blink of a couple of red horizontal stripes, a black screen blink, or some other image instability, or the screen freezes.
> 
> 
> 
> 
> 
> 
> 
> 
> If you just log out, the driver stays loaded. But who knows what other hidden magic is there and what gets flushed out during logout.
> All cards have something; I haven't seen one card, or one brand, that is magically immune.
> 
> ---
> 
> 1030/7200MHz, 1100mV, 100% - Firestrike 7496, 8505
> 1140/7200MHz, 1200mV, 120% - 7988, 9182
> 1170/7200MHz, 1244mV, 120% - 8210, 9464


A few things in response to all that....

I have always found it better to leave CCC installed so that Overdrive is available; I had problems getting extended overclocking limits without it.
I will say, though, that it is never good to run any of the other OC tools simultaneously...

Also... VRAM voltage is only reported, and accessible, in MSI AB. There is an arrow tab beside the core voltage; hitting it drops down the other voltage settings: core, VRAM and auxiliary.
Not sure if the TOP models running 1.5V have locked memory voltage, but my card is definitely locked at 1.6V.

Voltage droop will occur no matter what, but especially with CCC/TriXX/GPUTweak, because they have no active voltage stabilization like AB's "Force constant voltage".
I have tested with and without this option and it definitely made a difference; vdroop was usually reduced by 0.03-0.05V during heavy loads. Again, I suggest you not run any of the other tools alongside AB.

It's true that the driver stays loaded even during logout, but I believe it gets reloaded when you log back in, because I hear my fan speed change drastically from the login screen to the desktop, and then change again when the custom profile is loaded from AB. I have tested this several times, and logging out fixes any artifact issues every time.

I haven't tried disabling hardware acceleration, though I have heard it does wonders for browser artifacts. Where I'm at with the card now, I don't really need to.

I guess I have been somewhat luckier than most with this card, given that the core clocks as well as it does, and I was pretty much able to get rid of my artifacts.

In closing, my key points for anyone with the 280X (especially the ASUS) would be:

- Overclocking at stock voltage is fine in GPU Tweak, but if you really want to push these cards, try overclocking with AB using my recommended settings. Experimenting is always good too; what works for me may not work for you.
- Test the performance of core and RAM, individually and together. Don't just find the highest stable speeds, because they may not always be the fastest.
- Use driver 14.4 or newer for the best performance and the lowest chance of artifacts. Be careful with 14.9, 14.9.1 and 14.9.2 though! With all of those I had bad driver crashes, black screens, and random clock-speed changes between the card's default settings and the overclock settings.
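On the "highest stable isn't always best" point: one way to keep yourself honest is to record a benchmark score at each candidate clock and pick the best scorer rather than the highest clock. A minimal sketch in Python (the clock/score numbers below are made-up illustrations, not measurements):

```python
# Pick the best-performing clock from benchmark results rather than
# simply the highest stable one: GDDR5 error handling can silently
# retry past a point, so a higher memory clock may actually score lower.

def best_clock(results):
    """results: list of (clock_mhz, benchmark_score) tuples."""
    return max(results, key=lambda r: r[1])

# Hypothetical sweep of a 280X memory clock (illustrative numbers only):
sweep = [(1600, 9400), (1700, 9700), (1800, 10080), (1850, 9850)]
print(best_clock(sweep))  # (1800, 10080): 1850 was "stable" but slower
```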


And, again, regarding the reported clock speeds:
I can make 3DMark show my correct speeds...
http://www.3dmark.com/fs/2860668

Or NOT show my correct speeds...








http://www.3dmark.com/fs/3176147


----------



## JackCY

Quote:


> Originally Posted by *Devildog83*
> 
> I have been thinking about all of the issues folks are having with clocks reporting correctly in overclocking software and benchmarks. It seems that every GPU thread has the same thing: post after post about clocks not reporting correctly or changing back to stock, plus many other issues along those lines. Every monitoring tool like GPU-Z or HWiNFO64, and every benchmark like the Futuremark ones or Unigine, auto-detects your hardware and settings with a system info module of their own. Could it be that running multiple programs and CCC at the same time, even if they are not in use, causes them to conflict with each other and create the clock and clock-reporting issues? I am no genius, but it seems to me the fewer programs I have installed and/or running, the fewer issues I have. Just wondering.


They all detect it somewhere, but mostly not live. So 3DMark might just read the "default clock" (GPU-Z's name for it) and that's it, which is 1070/1600 for mine. Other benches may read the "GPU clock" (again GPU-Z's name) while the card is in fact running at lower clocks... Heaven, Furmark, etc. do this.
Quote:


> Originally Posted by *marine88*
> 
> what program do u recomend? afterburner or asus tweaking tool?


MSI: AB
ASUS: Tweak
...
That is, for everyday use, and if your card is not 100% compatible with AB for setting voltages without forcing one voltage for all modes.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> A few things in response to all that....
> 
> I have always found it better to leave CCC installed so that overdrive is turned on. I had problems getting extended overclocking limits without it.


For me, CCC seemed to override the third-party OC tools when it was installed with Overdrive enabled. The settings got pushed from the tools to CCC, but after a restart things would get messed up; the voltage would not get applied as set in Tweak...
It behaves fine now that CCC is gone.
Quote:


> I will say though, that it is never good to run any of the other OC tools simultaneously...


I would add: don't even have them installed simultaneously. Tweak has a service; not sure about AB. That makes it a bit of a pain to try them all. It might be best to go one by one and uninstall the previous one, making sure everything it installed is gone, or at least disabled and not running: the hidden services and such.
Quote:


> Also..... VRAM voltage is only reported, and accessible in MSI AB. There is an arrow tab beside the core voltage, hitting this will drop down other voltage settings, core, VRAM and auxillary.
> Not sure if the TOP models running 1.5v have locked memory voltage, but my card is definitely locked at 1.6.


I know, but it says 1500mV there for memory yet reports 1600mV in the monitor. It shows 1144mV for core but reports 1200mV in the monitor. When forced voltage is used, it shows what is set, not what is really there with droop, like GPU-Z does. And with forced voltage it now drops 0.1V for 2D mode, hmm.
Quote:


> Voltage drops will occur no matter what, but especially with CCC/Trixx/GPUTweak, because they have no active voltage stabilization like AB does (Force constant voltage).
> I have tested with and without this option and it definitely made a difference. Usually vdroops were reduced by .03-05v during heavy loads. Again, I suggest you not run any of the other tools with AB.


I usually run only GPU-Z alongside, to see more truthfully what is going on with these different tools.
I'll try forcing the voltage with higher clocks and see if it helps.

Nope, it doesn't help on my card; same artifacts when trying 1220MHz as with Tweak setting the voltage.

---

In AB I've set both the 2D and 3D profiles and they get applied etc., but 3DMark still shows the default clocks. I don't care that it doesn't show properly, but I'm curious how it's done.
AB's monitoring does have a performance impact, and not exactly a tiny one like the other tools'.


----------



## Agent Smith1984

I agree that AB's monitor seems to be its own biggest fan!! lol

I never use it to monitor the actual vcore... GPU-Z only for that. If I look in AB, it never shows any fluctuations at all.









It's weird though: you can set voltage and clocks in AB and then close it, and they stick, but if you run something and monitor the voltage in GPU-Z it will fluctuate A LOT!!

I have the voltage set to 1.3V.

It actually reports 1.292V in GPU-Z.
Under full load in Crysis 3, with AB closed, it averages between 1.235 and 1.265, but will drop as low as 1.21!! And I get a driver crash, artifacts, or a game freeze depending on how close it gets to 1.21.








Mind you, that could partly be my 8-year-old power supply's fault too...

But on the other hand, if I leave AB open and force constant voltage, it stays a pretty consistent 1.24-1.26 under load, and only drops to 1.225 worst case. It never crashes, locks, or artifacts at all.

That is with a 1250 core!!! It will run artifact-free at 1270, but gets these really fast flickers when the voltage dips below 1.235.

I've tested this several times.
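As a rough way to put numbers on that behavior, droop can be computed from the set voltage and the minimum logged reading, e.g. from a GPU-Z sensor log. A small sketch using the figures from this post (1.30V set, dipping to 1.21V under load):

```python
# Compute voltage droop (vdroop) from a set voltage and a series of
# readings logged under load, e.g. exported from GPU-Z's sensor log.

def vdroop(v_set, v_readings):
    """Return (worst droop in volts, worst droop as % of the set voltage)."""
    v_min = min(v_readings)
    droop = v_set - v_min
    return droop, 100.0 * droop / v_set

# Readings described above: set 1.30V, load dips down to 1.21V.
droop_v, droop_pct = vdroop(1.30, [1.292, 1.265, 1.235, 1.21])
print(round(droop_v, 3), round(droop_pct, 1))  # 0.09 6.9
```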


----------



## JackCY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's weird though, you can set voltage and clocks in AB and then close it, and they stick, but if you run something and monitor the voltage in GPU-Z they will fluctuate A LOT!!
> 
> I have voltage set to 1.3v
> 
> Actual reports at 1.292 in GPU-Z
> Under full load in Crysis 3, with AB closed, it averages between 1.235 and 1.265, but will drop down as low as 1.21!! and I get a driver crash, artifacts, or game freeze depending on how close it gets to the 1.21
> 
> 
> 
> 
> 
> 
> 
> 
> Mind you, that could be my 8 year old power supply's fault some too....
> 
> But, on the other hand, if I leave AB open and force constant voltage, it seems to stay a pretty consistent 1.24-1.26 under load, and only max drops to 1.225 worst case. Never crashes, locks, or artifacts at all.


WEIRD









I'm back to Tweak; uninstalled AB and its RivaTuner component, re-enabled the Tweak service, and I'm back in business with something that does what I expect it to do.
AB didn't help me get higher clocks even for benching; I had to use the forced voltage to get it applied at all.
I get similar dips in voltage reported by GPU-Z, a little less from the Tweak monitor. My PSU doesn't even spin its fan, I think, and I've had Furmark running too. 12.125V reported at idle, 11.968V in Furmark, but it jumps in steps, either one step above 12 or one step below; neither Tweak nor GPU-Z can report it precisely. I don't think the EVGA 850 G2 has an issue with the 12V line and a single GPU.
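For what it's worth, the ATX specification allows ±5% on the +12V rail (roughly 11.40-12.60V), so both of those readings are comfortably in spec. A quick sketch of the check:

```python
# Check PSU rail readings against the ATX +/-5% tolerance window.

def in_atx_spec(nominal, reading, tolerance=0.05):
    """True if the reading is within +/-tolerance of the nominal voltage."""
    return abs(reading - nominal) <= nominal * tolerance

# The +12V readings from the post above:
for v in (12.125, 11.968):
    print(v, in_atx_spec(12.0, v))  # both True: inside 11.4-12.6V
```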


----------



## Agent Smith1984

Not as weird as your physics score and mine being almost the same!!!!


----------



## Agent Smith1984

I think I'm going to do some 1.35-1.4V testing with TriXX later


----------



## inlandchris

Well, I did some OC on the CPU and GPU and had a near-death experience, but I got the computer up and running again by loading the 160Blk profile in the BIOS (R4BE).

I followed someone's BIOS screens and got into big trouble:

http://www.overclock.net/t/1444356/official-asus-rampage-iv-black-edition-owners-club/10530_30#post_23123522

Anyway, I put the GPU back to default and I'm calling it a night... it's night over here.

Thanks all for the tips and tricks, I will study them tomorrow.


----------



## Agent Smith1984

Quote:


> Originally Posted by *inlandchris*
> 
> Well, I did some OC on the CPU and GPU and had a near-death experience; but got the computer up and running again by loading the 160Blk profile on the bios (R4BE).
> I followed someones bios screens and got big trouble;
> http://www.overclock.net/t/1444356/official-asus-rampage-iv-black-edition-owners-club/10530_30#post_23123522
> Anyway, I put the GPU back to default and calling it a night...its night over here.
> Thanks all for the tips and tricks, I will study them tomorrow.


What kind of issues did you have with the GPU? What clocks/voltages??
I've gone pretty nuts on mine before with no ongoing issues.... CPU overclocking is another story though


----------



## JackCY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What kind of issues did you have with the GPU? What clocks/voltages??
> I've gone pretty nuts on mine before with no ongoing issues.... CPU overclocking is another story though


The GPUs do seem to have more safety limits: the voltage is either not adjustable or adjustable only within a "safe" range. The same can't be said for CPUs.









Anyway, ran 3DMark Vantage:
Score *P36308* with AMD Radeon R9 280X(1x) and Intel Core i5-4690K
Graphics Score *41224* CPU Score 26741

I'm using Tweak, and Vantage shows the clocks, lol. The newer 3DMark won't. The CPU's OC clock is also shown correctly; the RAM is wrong though: it runs at 1.2GHz, which times two equals 2.4GHz effective.
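The doubling mentioned here is just the data-rate multiplier: DDR3 transfers twice per clock and GDDR5 four times, which is also why the same GDDR5 memory shows up as both 1800MHz and 7200MHz "effective" in different posts. A quick sketch of the conversion:

```python
# Convert an actual memory clock to its "effective" (data) rate.
# Tools disagree on which of the two they report, hence 1800 vs 7200MHz.

MULTIPLIER = {"DDR3": 2, "GDDR5": 4}  # data transfers per clock cycle

def effective_mhz(actual_mhz, mem_type):
    return actual_mhz * MULTIPLIER[mem_type]

print(effective_mhz(1200, "DDR3"))   # 2400 (the 2.4GHz system RAM above)
print(effective_mhz(1800, "GDDR5"))  # 7200 (the 280X's memory)
```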

3DMark 11
Score *P10839* with AMD Radeon R9 280X(1x) and Intel Core i5-4690K
Graphics Score *11672* Physics Score 8883 Combined Score 9001
^This one is with the power limit at 100%; I noticed that I don't need to increase it, as the card doesn't seem to hit the limit even with Furmark.
But I've found that the OCCT test is even harsher than Furmark: it runs OK at stock, but overclocked it likes to freeze because the current draw and power are crazy. It's like running a Prime95/AVX stress test on a CPU, except it stresses the GPU like crazy. Furmark is OK, good enough for me, and every other benchmark is fine too.

1170/1800MHz @ 1244mV is sweet to me. Voltage droop takes it to 1160mV at its lowest with the craziest tests; the max doesn't seem to go north of 1230mV, as reported by GPU-Z. Temperature is a couple of °C higher compared to stock, but even with a quieter fan profile it didn't go past 80°C at 40% fan in Heaven (77°C with the default auto fan; other fan settings gave 75°C and 72°C).


----------



## inlandchris

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What kind of issues did you have with the GPU? What clocks/voltages??
> I've gone pretty nuts on mine before with no ongoing issues.... CPU overclocking is another story though


GPU clock: 1240

GPU Volt: 1.3

VDDC:mV 0875

Mem Clock: 7400

Mem Volt: 1.700

Pwr Target: 120%

Load Line: 75%

VRM Clock: 0750

And I had some overclocking on the CPU. The result: when I loaded Fire Strike, the Matrix LED went dark blue and all 5 screens went black. I waited, but it was dead. I powered down (losing some data) and put the CPU back to stock; when it booted into Win7 the GPU was still overclocked, so I set it back to the defaults. I didn't flash the BIOS, but on boot ASUS GPU Tweak remembers the old settings, and there you are, clocked up.

I had also set MSI Afterburner (yes, it was up too) because GPU Tweak didn't expose the voltage, but I can't remember what it was set to, sorry.


----------



## Agent Smith1984

Is that 1.7v memory a typo???

That WILL cause major issues. These things already hate the 1.6v going to them.


----------



## inlandchris

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Is that 1.7v memory a typo???
> 
> That WILL cause major issues. These things already hate the 1.6v going to them.


Yep, I did that. I put it up just now to get the readings and then put it back to default. I now have small artifacts on one screen; requires a reboot, I guess.

Actually, the default is 1.6V.


----------



## M1kuTheAwesome

Can any kind sir help me find out why my 280X doesn't like to clock even up to 1200MHz core / stock mem anymore? I tried running Fire Strike on my previous (but unused for a while) 1219MHz @ 1.3V AB profile, and it crashed. I tried 1200MHz @ 1.3V (power limit always at +20) and still got artifacts leading to a crash. Why am I not even getting near my previous sweet spot? Is AB acting up or, even worse, have I broken my card? I tried switching to the other BIOS, but that changed nothing at all. My current AB settings:

I hope it's a simple fix... I want my 1219 back, without artifacts...


----------



## diggiddi

BTW, is Fire Strike free?


----------



## Agent Smith1984

The standard version, which makes you watch the AWESOME but way-too-long demo, is....

lol

I guess the good thing is that the demo lets your card get good and hot before benching, and it will normally reveal any artifacting issues before you even get to the actual benchmark part.


----------



## JackCY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The standard version that makes you watch the AWESOME, but way too long of a demo is....
> 
> lol
> 
> I guess the good thing... the demo let's your card get good and hot before benching, and will normally reveal any artifacting issues before you even get to the actual benchmark part.


Exactly. The demo uses a lot of stuff, whereas the tests that follow each specialize in one thing only, plus the last is a combined CPU+GPU test. The demo is kind of harsher than what comes after.









M1ku: the card has aged, like CPUs do after running at high voltage for a long time. I would redo the search for the highest OC and settle on something less on the edge.


----------



## StangMan04

How are my overclock and 3DMark 11 score? This is my first time overclocking a GPU.

R9 280 overclocked to 1175MHz core / 1375MHz memory.

www.3dmark.com/3dm11/8955570


----------



## buttface420

Okay, so this just happened in BF4: is my card dying, is it just a minor glitch, or is the new 14.11.1 beta driver being stupid?


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *JackCY*
> 
> Exactly, the demo uses a lot of stuff where as the tests that are afterwards are specialized on one thing only, plus the last is combined CPU+GPU. The demo is kind of worse than what goes afterwards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> M1ku: card has aged, like CPUs do after running high voltage for a long time. I would rather find out again highest OC and settle on something less on the edge.


So simply getting old after 8 months of wear? Too bad.


----------



## JackCY

Quote:


> Originally Posted by *buttface420*
> 
> 
> okay so this just happened in BF4, is my card dying, just a minor glitch, or new driver 14.11.1 beta being stupid?


It's called you are dead








From what I've heard, BF (any version) has so many bugs that it's pretty much pointless to judge anything from those games.

Does it do that on stock clocks?
Quote:


> Originally Posted by *StangMan04*
> 
> How is my overclock and 3dmark11 score? First time overclocking a GPU.
> 
> R9 280 overclocked to 1175MHz/1375MHz memory clock.
> 
> www.3dmark.com/3dm11/8955570


Pretty neat; it only shows that the 280 is barely a chopped-down 280X. Apart from a few stream processors and lower frequencies they are equal.
Otherwise there are many scores to compare against; just look at the top of the page you posted, for example, or in reviews, etc.

But it's more about FPS and stability than bench scores, and there the 280 loses a little more to the 280X.
Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> So simply getting old after 8 months of wear? Too bad.


Yeah, it can wear out. Or the OC voltage isn't being applied, but then I'd expect it to freeze instantly, or you'd notice it wasn't applied.


----------



## inlandchris

Alright, I am starting over. Here is the stock R4BE with no clocking, and the GPU under-clocked in power-saving mode: Mem Volt = 1.6, GPU = 1.188 and GPU Clk = 1050. These are the lowest settings from ASUS GPU Tweak; Pwr Target gets set to 90% and Load Line to 75%.

Tomorrow I will go from there. No damage to the GPU, and I installed a new BIOS on the mobo; it's running cooler.

http://www.3dmark.com/3dm/4689282


----------



## Agent Smith1984

Quote:


> Originally Posted by *inlandchris*
> 
> Alright, I am starting over. Here is stock R4BE no clocking and the GPU is under-clocked; power saving mode. Mem Volt=1.6, GPU=1.188 and GPU Clk=1050, these are the lowest settings from Asus GPU Tweak, Pwr Target get set as 90% and Load Line 75%.
> Tomorrow, I will go from there. No damage to the GPU and installed new bios on the Mobo, running cooler.
> http://www.3dmark.com/3dm/4689282


What are you trying to accomplish with the underclocking/volting?


----------



## JackCY

Quote:


> Originally Posted by *inlandchris*
> 
> Alright, I am starting over. Here is stock R4BE no clocking and the GPU is under-clocked; power saving mode. Mem Volt=1.6, GPU=1.188 and GPU Clk=1050, these are the lowest settings from Asus GPU Tweak, Pwr Target get set as 90% and Load Line 75%.
> Tomorrow, I will go from there. No damage to the GPU and installed new bios on the Mobo, running cooler.
> http://www.3dmark.com/3dm/4689282


What are you doing?
Matrix P has memory voltage unlocked and modifiable with Tweak?
Power target only limits maximum power draw and reduces clocks and hopefully voltage too when the limit is reached. Mostly only protects cards from getting fried with stuff like Furmark.
You can also change load line on Matrix P with Tweak?


----------



## Blinky7

After having played with 4 Tahiti cores in the past year (3x 7950 and a 7970), I can say they were a joy to overclock. VBE7 totally took care of the power limits, all my GPUs could go to 1.3V, and my last reference 7970 could even go to 1.38V with the modded TriXX, and 1.7V for memory. The Hynix chips were good up to 1850MHz... everything was great.

Unfortunately, having to sell it in order to sell the full-cover block as a package deal, I needed a card to tide me over until after Christmas and the new R9 390X series... Searching the classifieds in my country for a good deal, I ended up with an Asus R9 270X TOP DCII...

Now I need some help overclocking it, since I have no experience with Pitcairn. First of all, does VBE7 work well with Pitcairn BIOSes? Are there any limitations on what you can change (for example, on Tahiti you could not change the 2D clocks)? Because this Pitcairn card has no dual BIOS, and I get the "this is a UEFI BIOS not supported by VBE7" warning when I load the stock BIOS into VBE, I would like to know beforehand whether it will work and what I can change, so as not to brick the card...

Also, I can see the card going up to 1.3V in GPU Tweak, but there is no control for memory voltage. Is there any other software that can add voltage to the memory, or that can supply more than 1.3V? What is the suggested software to use here?

Also, should I be worried about the power limiter? Should I disable it through the BIOS (if that works), or do these cards, with their limited overclocking, generally not approach the power limiter when it's set to +20%?

Unfortunately my card has Elpida RAM that is so crappy it doesn't even run at its rated 1500MHz... (hoping for some way to add voltage here!).

Any advice would be welcome!


----------



## ZorracK

Hey guys, I just bought a Sapphire R9 280 Dual-X OC.
Out of the box I took it to 1070/1450, scoring 7150 in Fire Strike.
The temperature in my case (a modded Chieftec Dragon full tower) never goes over 64°C.
Is this good? Can I try for more?


----------



## inlandchris

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are you trying to accomplish with the underclocking/volting?


Establishing a baseline. These are the lowest clocks that still give an "OK" score.

Quote:


> Originally Posted by *JackCY*
> 
> What are you doing?
> Matrix P has memory voltage unlocked and modifiable with Tweak?
> Power target only limits maximum power draw and reduces clocks and hopefully voltage too when the limit is reached. Mostly only protects cards from getting fried with stuff like Furmark.
> You can also change load line on Matrix P with Tweak?


Baseline, baseline; starting over.

Now I overclocked both the CPU and GPU.

CPU = 4.2 with no overvolt, but using X.M.P.

GPU settings are on the website in the description box.

I cannot go over a 1250 GPU clock; I get artifacts.

GPU voltage: I didn't dare go to 1.3, kept it at 1.282V.

Mem voltage: kept it at 1.620V; that is the standard Gamers preset from ASUS.

Before, I had 1.7V and it caused too many problems.

So going over 1.6V is OK with the Matrix.

http://www.3dmark.com/3dm/4703602

I don't know why the website only posts stock clocks, but I guess that is the bug everyone is talking about.

Still, I can't beat AgentSmith's graphics score; good one!

Some graphics cards are better than others, like CPUs.


----------



## Agent Smith1984

Quote:


> Originally Posted by *inlandchris*
> 
> Establish a baseline. This is the lowest clocks and still with "OK" score.
> 
> Baseline, baseline; starting over.
> Now, I overclocked both cpu and gpu.
> cpu=4.2 with no over volt but using X.M.P.
> Gpu settings is on the website in the description box.
> Cannot go over 1250 GPU clock, getting artifacts.
> GPU voltage didn't dare go to 1.3, kept it at 1.282 v.
> Mem voltage kept it at 1.620; that is the standard games preset set by ASUS.
> Before I had 1.7 and did cause too much problems.
> So, going over 1.6 v. is ok with Matrix.
> 
> http://www.3dmark.com/3dm/4703602
> 
> I don't know why the website only posts stock clocks but guess that is a bug everyone is talking about.
> 
> Still, can't beat AgentSmith's graphics score; good one!
> Some graphics cards are better than others; like CPU's.


Okay, so in your firestrike link, there are these notes:

GPU CLK=1250 GPU V. = 1.282 VDDCI 0.875 V. MEM CLK=7400 MEM V. =1.620 PWR TARGET= 120% Load Line = 75% VRM CLK=0705 Khz

If those are the clocks you are running at, then something isn't right... your graphics score should be about 10,100+

What driver are you using??

Have you tried memory at 1800 instead of 1850?

Mine would do 1860 artifact-free but took a performance hit after 1820.
Try 1800-1825 and see if the score improves. My graphics score at 1250/1800 was about 10,080.

Still, you are looking a lot better; these cards scale really nicely with overclocking.


----------



## inlandchris

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, so in your firestrike link, there are these notes:
> 
> GPU CLK=1250 GPU V. = 1.282 VDDCI 0.875 V. MEM CLK=7400 MEM V. =1.620 PWR TARGET= 120% Load Line = 75% VRM CLK=0705 Khz
> 
> If those are the clocks you are running at, then something isn't right... your graphics score should be about 10,100+
> 
> What driver are you using??
> 
> Have you tried memory at 1800 instead of 1850?
> 
> Mine would do 1860 artifact free, but took a performance hit after 1820......
> Try 1800-1825 and see if the score improves..... My graphics score at 1250/1800 was 10,080+/-
> 
> Still, you are looking a lot better, these cards scale really nice performance wise with overclocking.....


Nope, that is the max. I am using the 14.10.0 video drivers. The only way to get faster is to put in a second GPU, and since there are no more around, I am happy with these scores. Thanks for pushing, but these scores are good enough.


----------



## Agent Smith1984

What I'm saying is: running the memory at 1800 may score better; it did on mine.
Also, try Catalyst 14.7; that was the most stable and artifact-free on my card.


----------



## JackCY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, so in your firestrike link, there are these notes:
> 
> GPU CLK=1250 GPU V. = 1.282 VDDCI 0.875 V. MEM CLK=7400 MEM V. =1.620 PWR TARGET= 120% Load Line = 75% VRM CLK=0705 Khz
> 
> If those are the clocks you are running at, then something isn't right... your graphics score should be about 10,100+
> 
> What driver are you using??
> 
> Have you tried memory at 1800 instead of 1850?
> 
> Mine would do 1860 artifact free, but took a performance hit after 1820......
> Try 1800-1825 and see if the score improves..... My graphics score at 1250/1800 was 10,080+/-
> 
> Still, you are looking a lot better, these cards scale really nice performance wise with overclocking.....


I agree; my puny 280X DC2T only does a stable 1170/1800MHz @ 1250mV, and its graphics score is higher and its FPS also a tiny bit higher.
At 1250/1850MHz the card should be way above that.

So despite the Matrix P having all the goodies unlocked to manipulate voltages and LLC and all that, it doesn't seem to run as fast as it should :|

---

On a different note, I've noticed that Heaven with specific settings likes to freeze with an OC, even a smaller one than what I use. That is when I set anti-aliasing to 4x; only 4x. Anything else seems to work fine: 8x, off, and other programs and games even using 4x AA.
The image freezes. Once I also saw a black screen, and then the whole thing resumed before I got to the PC to reset it. Weird. It doesn't seem to do it at stock though, so it's something due to the OC. No artifacts anywhere in anything else, even in Heaven at settings other than 4x AA.








Sometimes Heaven would flop and die/crash, returning to the desktop instead of freezing.

Another thing: if I have Flash running in Firefox, or that 3DMark window open, both of which make the GPU run at 500/1600MHz, then unless I close those apps so the GPU can return to 300/600MHz, opening a game or anything else means the boost/turbo doesn't get applied and it runs at only 970/1600MHz. There is something wrong with how the boost/turbo and the different modes are handled. I'll have to see what it does after a fresh install, whether it's caused by Tweak or CCC or happens on a clean install as well.

Can't say I like the ATI/AMD drivers; not that Nvidia had better ones. Sometimes they never made a driver that worked 100% everywhere for a card; it was like choosing which game you wanted artifacts in based on which driver was installed.


----------



## Agent Smith1984

1170-1200 at 1.25 is about right... You can get 1220-1260 at 1.3.

Something about the 1.6V memory lets it clock super high stably, but it doesn't seem to like being run past 1800...

My graphics score was 10k, so he isn't far off; mine was about the same as his when I tried to run the memory too high. That's why I think he should try 1800.

Do you have a firestrike for yours at 1170/1800?


----------



## Pibbz

Figured I'd make an appearance in this thread since I bought a PowerColor 280 not long ago. It has that coil-whine noise, but it runs great regardless.


----------



## JackCY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 1170-1200 at the 1.25 is about right...... You can get 1220-1260 at 1.3
> 
> Something about the 1.6v memory lets it clock super high stable, but it doesn't seem to like being ran past 1800.....
> My graphics score was 10k, so he isn't very off, mine was about the same as his when I tried to run the memory too high, that's why I think he should try 1800.
> 
> Do you have a firestrike for yours at 1170/1800?


I can run up to 1220, but not artifact-free or stable at 1.3V; my card doesn't scale, or doesn't like to go above 1170 no matter the voltage. It could probably do 1180 at 1.3, but that's pointless. It didn't seem to run 1200 at 1.3V artifact-free.

I've tried 1180 and 1175 at 1.25V, but that didn't seem stable on some occasions, so it's 1170, which seems to run fine except for Heaven with 4x AA, which crashes at any setting I run close to the limit, lol.
It definitely doesn't seem to scale memory above 1800MHz; I tried, but it either freezes or artifacts with vertical lines all over the screen.

I have Firestrike runs on the previous pages/posts, I save the results but let me see if I can dig out a link from history.
I think this was the 1210/1800.
This should be 1070/1600 stock.
1170/1800 this is what I use,
1175/1800


----------



## Agent Smith1984

Your clocks are somewhat on the low end of typical results... are you forcing constant voltage in Afterburner?

Have you checked what the actual voltage reports at? Have you checked temps? In my experience, my 280X didn't want anything over 90°C with an overclock before it would artifact.

Also, I find it crazy how close my X6 is to your 4690K in Fire Strike... at 4.2GHz I can hit 8900 points in Physics!


----------



## inlandchris

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What I'm saying is.... Running the memory at 1800 may score better, it did on mine....
> Also, try cat 14.7, that was the most stable and artifact free on my card


You are right! I installed version 14.9 (non-beta) and my score is much better than with the 14.4 I was using. Same voltages used: ~239 points better, and 300 better on the Gamers preset from ASUS GPU Tweak. Below is the score with my last voltages/frequencies unchanged, no higher, just newer drivers.

Details in the description box: 1850MHz.

http://www.3dmark.com/3dm/4715013


----------



## Agent Smith1984

I still think you should try 1800MHz memory, my score was over 10k (graphics) at 1250/1800

I could run at 1850 too, but the memory didn't "like" it as much as it did 1820 or less, and ran better at a lower frequency.....

I think you could be looking at 9k (total) points if you get the graphics score where it should be (around 10k)!!! Very nice so far....

Edit: Score clarifications in parenthesis.


----------



## inlandchris

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I still think you should try 1800MHz memory, my score was over 10k (graphics) at 1250/1800
> 
> I could run at 1850 too, but the memory didn't "like" it as much as it did 1820 or less, and ran better at a lower frequency.....
> 
> I think you could be looking at 9k (total) points if you get the graphics score where it should be (around 10k)!!! Very nice so far....
> 
> Edit: Score clarifications in parenthesis.


I did try 1800 and the total score went down by 111 points (96 points lower on graphics).

http://www.3dmark.com/3dm/4716191

Calling it a night, going to bed, thanks for the help


----------



## Agent Smith1984

Good deal bud, either way 1250/1850 are great clocks if they are stable for daily use....

Your total firestrike score is great for a 280x system, of course that sweet ass CPU helps quite a bit









Enjoy the gaming, cause a few hundred points don't matter down the stretch, enjoying the game time DOES.


----------



## JackCY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Your clocks are somewhat on the low end of typical results.... are you forcing constant voltage in afterburner?
> 
> Have you checked what the actual voltage is reporting at? Have you checked temps? In my experience, my 280x didn't want anything over 90c with an overclock before it would artifact.
> 
> Also, I find it so crazy how close my x6 is to your 4690k in performance on FireStrike...... at 4.2GHz I can hit 8900 points on Physics!


AB and forcing voltage did not help; forcing is the only way to get the voltage applied by AB, and I tried it, but the voltage was reported at the same levels as always. It didn't help with higher clocks at all.
Voltage likes to drop by as much as 0.06V, and the 12V line drops to a reported 11.94V at the lowest.
Temps never go over 80C.
The 1090T has 6 cores, though, and an 8350/8370 is better now.








I still chose Intel because it's newer and overall faster (CPUs, I/O on mobo, ...) with lower power consumption.

I would not say the 280x clocks easily over 1200MHz; maybe if you cherry-pick the card or are lucky. They often didn't get past 1200MHz in reviews either.
Second-hand card bought for about 50% of what a 970 costs now. Can't beat the bang for buck.
I've seen user results for the 970 in Heaven, and while yes, it is faster on average, it is not by much (just like the 290), and the minimum fps, instead of being in the 20-30s as with the 280x, dips below 10fps on the 970 for some reason...
Quote:


> Originally Posted by *inlandchris*
> 
> You are right! I installed version 14.9 (non-beta) and my score is much better than with the 14.4 I was using: same voltages, ~239 points better, and 300 better on the Gamer preset from ASUS GPU Tweak. Below is the score with my last voltages/frequencies unchanged, no higher, just newer drivers.
> Details in the description box: 1850 MHz
> 
> http://www.3dmark.com/3dm/4715013


Why would you keep using old drivers? :S
Sure, if they helped your performance, but they clearly didn't.








Now it looks in order.


----------



## Agent Smith1984

Quote:


> Originally Posted by *JackCY*
> 
> I would not say the 280x clocks easily over 1200MHz; maybe if you cherry-pick the card or are lucky. They often didn't get past 1200MHz in reviews either.
> Second-hand card bought for about 50% of what a 970 costs now. Can't beat the bang for buck.
> I've seen user results for the 970 in Heaven, and while yes, it is faster on average, it is not by much (just like the 290), and the minimum fps, instead of being in the 20-30s as with the 280x, dips below 10fps on the 970 for some reason...
> Why would you keep using old drivers? :S
> Sure, if they helped your performance, but they clearly didn't.
> 
> 
> 
> 
> 
> 
> 
> 
> Now it looks in order.


Well, for the 280x in general, 1200+ is not common, but on the Asus TOP and Matrix it is...

All of the Asus DC2T reviews I saw before getting my card showed core clocks of 1200-1260MHz....

That's not to say that they all will, though; I did see a few where 1180 was the max..... That card probably had more reviews done on it than any other 280x offered.....


----------



## Recr3ational

My 280x can go above 1200. Both my PowerColors were at 1250/1650, if I recall. Currently upgrading radiators; will bench when it's up and running.


----------



## mutatedknutz

Hey, can I be added?
ASUS AMD R9 280


----------



## Devildog83

Quote:


> Originally Posted by *mutatedknutz*
> 
> Hey can i be added?
> asus amd r9 280
> 
> 
> 
> 
> 
> 
> 
> 
> 


I will add you, but could you post a picture and let me know whether it's a DC II or not, please? Welcome !!!


----------



## JackCY

3DMark sucks. Made a fresh install, and guess what...
Start Firestrike: the demo starts loading and soon closes during loading, only to proceed to test 1, ... which works.
So I reinstalled 3DMark, this time to the default directory just to be sure. Guess what: the demo works, but the scores don't, because the SystemInfo service nonsense did not work, so it told me to install the latest version of it and gave a link. Got it, ran the install, and it tells me I already have the latest version of that service installed. Duh.
Sick of these 3DMark bugs; I uninstalled it all to keep the system clean.

Setting clocks in CCC now works fine, and they are kept between restarts, power-offs, etc.
Also, the 2D minimum voltage is 0.85V instead of the 0.95V I get when I use Tweak. I think the 2D voltage is even higher when forcing it via AB.
1140 vs 1170 doesn't seem like a huge difference; I might actually keep the default voltage with a 1140 clock. A little cooler too, which means less fan noise.

GPU-Z shows ASIC quality 59.0%.

ASUS GPU Tweak, lol. OK, this time I didn't need an administrator account to install it; I just ran it as admin via the right-click menu and it installed fine, BUT here are two differences:
- the fan profile setup is gone, which was wonky anyway BTW
- setting 2D clocks is gone; clicking the 3D up top doesn't bring out the 2D/3D selection. Let's hope the 2D voltage stays low, unlike last time when it bumped to 0.95V and raised consumption a little

Yep, there seems to be no conflict this time between CCC and GPU Tweak: with both enabled, Tweak sets the OC, and the voltage still drops nicely back to 0.85V in 2D mode.
There must have been some unlucky mess last time. A fresh install helped.

I'm done with 3DMark; maybe I'll come back if they decide to fix the bugs and release newer demos.

So here are Heaven 4.0 results:
4690K stock, RAM 1333 CL9, 280x DC2T stock
FPS: 42.9 Score: 1081 Min FPS: 8.1 Max FPS: 78.5

4690K 46/43, RAM 2400 CL11, 280x DC2T stock
FPS: 42.4 Score: 1069 Min FPS: 23.3 Max FPS: 76.5
Worse scores, consistently, for who knows what reason, but the minimum FPS is up a lot. Maybe it was a lucky run at the stock CPU and RAM clocks, or it liked the low memory latency.

4690K 46/43, RAM 2400 CL11, 280x DC2T 1140/1800 1.20V 120%
FPS: 45.9 Score: 1156 Min FPS: 24.7 Max FPS: 82.1

4690K 46/43, RAM 2400 CL11, 280x DC2T 1170/1800 1.25V 120%
FPS: 46.6 Score: 1174 Min FPS: 24.9 Max FPS: 83.0

Settings
Render: Direct3D11
Mode: 1920x1200 8xAA fullscreen
Preset Custom
Quality Ultra
Tessellation: Normal


----------



## choLOL

Hi, can I join the club?








PowerColor TurboDuo R9 280x







Link to GPU-Z validation


----------



## JackCY

Add me too to the list while you're at it









*JackCY - ASUS R9280X-DC2T-3GD5 - 1170/1800*


----------



## jrizzz

Quote:


> Originally Posted by *JackCY*
> 
> Add me too to the list while you're at it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *JackCY - ASUS R9280X-DC2T-3GD5 - 1170/1800*


Is that on stock memory voltage, or were you able to adjust it? Because I know you can't adjust memory voltage on the DC2T, except maybe with a BIOS flash.


----------



## craige

Guys,

My 7870 GHz is dead and Sapphire is going to give me a replacement.... so I assume it should be an R9 270, right?

Now the confusion is, if it's indeed an R9 270: do they have to give a plain 270 OR a 270X? Which one is closer to the spec of the 7870....

Please let me know.

Please give me links that clearly show they are identical, as I have to give the dumb service center a printout to prove it.

PS: The core clock of the 270 is lower than the 7870's, but the rest of the spec is higher, while the 270X's core clock is higher than the 7870's.

Snapshot of my GPU specs:


----------



## wermad

My guess is 270, but cross your fingers for a 270X (or a 280).


----------



## jason387

I'm in the same position as you, as my Sapphire 7870 GHz Edition started giving me black and grey lines and then died. I expect an R9 270X. The R9 270 comes with a lower core clock than the 7870, while the 270X comes with a higher core clock.


----------



## craige

Quote:


> Originally Posted by *jason387*
> 
> I'm in the same position as you as my Sapphire 7870 Ghz Edition started giving me black and grey lines and then died. I expect a R9 270X. The R9 270 comes with a lower core clock than the 7870 while the 270X comes with a higher core clock.


Well was the replacement New ? OR, Refurbished in a Box ?


----------



## jason387

Quote:


> Originally Posted by *craige*
> 
> Well was the replacement New ? OR, Refurbished in a Box ?


Haven't received it yet. A friend of mine got his GPU back from RMA; he gave in a Sapphire 7950 and got back a refurb 7950. We might even get refurbished 7870's.


----------



## mutatedknutz

Quote:


> Originally Posted by *Devildog83*
> 
> 
> I will add you but could you post a picture and let me know if it's a DC II or not please. Welcome !!!


Hey man, here are the pics.
Hope it's good to go.


----------



## buttface420

On my 280x, I sometimes have a random black screen flash... it only lasts half a second. Sometimes during the flash it'll have a green bar/bars run through it. It does it randomly, maybe 3-4 times during an 8-hour day, whether gaming, web surfing, watching videos, or doing nothing at all. No other artifacts or crashing ever; it plays games excellently.

It's done this ever since I got the card, and since it's so minor I never really cared to look into it, but I'm wondering if anyone has had the same issue and if they found a solution.


----------



## BruceB

Quote:


> Originally Posted by *buttface420*
> 
> On my 280x, I sometimes have a random black screen flash... it only lasts half a second. Sometimes during the flash it'll have a green bar/bars run through it. It does it randomly, maybe 3-4 times during an 8-hour day, whether gaming, web surfing, watching videos, or doing nothing at all. No other artifacts or crashing ever; it plays games excellently.
> 
> It's done this ever since I got the card, and since it's so minor I never really cared to look into it, but I'm wondering if anyone has had the same issue and if they found a solution.


That's not what I'd call a minor problem!








First step is to properly clean out all your graphics drivers and do a re-install. If that doesn't work, then try it in a different machine.


----------



## Agent Smith1984

Quote:


> Originally Posted by *buttface420*
> 
> On my 280x, I sometimes have a random black screen flash... it only lasts half a second. Sometimes during the flash it'll have a green bar/bars run through it. It does it randomly, maybe 3-4 times during an 8-hour day, whether gaming, web surfing, watching videos, or doing nothing at all. No other artifacts or crashing ever; it plays games excellently.
> 
> It's done this ever since I got the card, and since it's so minor I never really cared to look into it, but I'm wondering if anyone has had the same issue and if they found a solution.


I've said this before.....

Welcome to the hell that is owning a 280x :/

You aren't the first person to have issues like this.....

It could be driver, could be hardware..... best thing to do is try different drivers....
This isn't by chance an Asus TOP/Matrix or Sapphire Toxic 280x is it??


----------



## End3R

Yeah, I had a similar issue with my 270x, and it progressively got worse; I ended up having to RMA it. That being said, my replacement has been an absolute beast.

Still, try the normal update-drivers/reseat-the-card steps that have been suggested, though.


----------



## M1kuTheAwesome

Not sure if this is in any way a similar issue, but my Windows Aero occasionally closes for a few seconds, then starts up again, causing the screen to flicker. I never saw that as too much of an issue, and I hope I'm right, because if it gets worse and I need a new card I'll probably have to buy one: after I put my ghetto cooling on, I lost one of the original fan screws, and they will probably notice that if I ever try to RMA.


----------



## buttface420

It's a Sapphire Dual-X 280x.

Funny thing is, my previous card (a 7790) also did this. I have since completely changed my PC (new motherboard, CPU, power supply, RAM, even the case) and I still have this problem. I always thought it was beta drivers, but a tech at Sapphire told me it's a signal dump I'm experiencing and to check my HDMI cable, and guess what? That's the only thing I haven't replaced, and it's bent really badly at the connector, so I'm waiting to try another one to see if that's actually it. Also, the wall socket I've been using is faulty, so I'm wondering if it's causing some type of dump.


----------



## aaronsta1

Quote:


> Originally Posted by *buttface420*
> 
> It's a Sapphire Dual-X 280x.
> 
> Funny thing is, my previous card (a 7790) also did this. I have since completely changed my PC (new motherboard, CPU, power supply, RAM, even the case) and I still have this problem. I always thought it was beta drivers, but a tech at Sapphire told me it's a signal dump I'm experiencing and to check my HDMI cable, and guess what? That's the only thing I haven't replaced, and it's bent really badly at the connector, so I'm waiting to try another one to see if that's actually it. Also, the wall socket I've been using is faulty, so I'm wondering if it's causing some type of dump.


It might be your monitor reacting to the change in performance modes?

Does it mainly do it when you are viewing in Chrome?

I've had something similar on my 7870 XT: every once in a while my screen jumps down and back up, like the V-hold is out on an old TV.

I noticed that it does it when the card switches from the 300/300 profile to the 500/1250 profile.

I'm guessing the monitor is to blame... it's not fast enough to compensate. I have it hooked up via DVI; I haven't really tested whether HDMI will help.


----------



## JackCY

280x DC2T owners: are any of the memory modules cooled?








I'm trying to figure out what sometimes causes artifacts, whether it's the core or the memory. I guess it's the memory, since it repeats until I restart the OS; restarting the game doesn't help, so it must remain in memory, or some code loaded on the GPU gets corrupted.
Maybe those 1800MHz are still a tiny bit too much. I'm trying to find out by playing at different settings. It takes a long time to occur, say once per gaming day.
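For reference, the memory clocks being discussed translate to bandwidth roughly as follows. This is a back-of-the-envelope sketch assuming the 280X's 384-bit bus and GDDR5's 4x effective data rate per reported memory-clock MHz; those constants are general GDDR5 facts, not figures from this thread:

```python
# Back-of-the-envelope GDDR5 bandwidth for a 384-bit card like the 280X.
# GDDR5 transfers 4 bits per pin per memory clock, so effective rate = clock * 4.
def gddr5_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int = 384) -> float:
    mb_per_s = mem_clock_mhz * 4 * bus_width_bits / 8  # MT/s * bytes per transfer
    return mb_per_s / 1000  # MB/s -> GB/s

print(gddr5_bandwidth_gb_s(1600))  # 307.2 GB/s at the DC2T's 1600 MHz
print(gddr5_bandwidth_gb_s(1800))  # 345.6 GB/s at the 1800 MHz overclock
```

So the 1600 to 1800 step is only a ~12.5% bandwidth bump, which fits how modest the score gains from memory overclocking were in the runs discussed earlier.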


----------



## Devildog83

Quote:


> Originally Posted by *craige*
> 
> Guyz,
> 
> My 7870 Ghz Is dead and Sapphire is gona give me replacement.... So I assume it should Be R9 270, Right?
> 
> Now the confusion is if its indeed R9 270 - Do they have to give plain 270 OR 270X ? Which one is closer to the spec of 7870....
> 
> Please let me know.
> 
> Pease give me links tht clearly shows: That they are identical, as I have to give the dumb service centers printout to prove that.
> 
> PS: The core clock of 270 is lower than 7870 but rest of the spec is higher, but 270x core clock is higher than 7870.
> 
> Snapshot of my GPU Specs:


I am a bit late on this, but the 270 is about the same as a 7850, and the 270x is a rebranded 7870. I would guess a 270x; there are slight differences, but you get the picture.


----------



## Devildog83

Quote:


> Originally Posted by *mutatedknutz*
> 
> hey man here are the pics
> hope its good to go


Thanks !! Nice card.


----------



## Devildog83

*choLOL and JackCY* have been added. Welcome to the club !!! Could you possibly post pics of your cards for the rest of the members to see?

Thanks,
DD


----------



## Paradox3713

Forgive me for troubling you guys, but can anyone point me to a waterblock that fits the Sapphire 280X Toxic? I've got three soon to be four cards that I need watercooled for Star Citizen but trying to find a waterblock for these cards is pretty difficult. Any assistance would be much appreciated. Thanks!

- 'Paradox'


----------



## wermad

Quote:


> Originally Posted by *Paradox3713*
> 
> Forgive me for troubling you guys, but can anyone point me to a waterblock that fits the Sapphire 280X Toxic? I've got three soon to be four cards that I need watercooled for Star Citizen but trying to find a waterblock for these cards is pretty difficult. Any assistance would be much appreciated. Thanks!
> 
> - 'Paradox'


They're not "reference", and EK is one of the few that makes blocks for non-reference cards, so there are no "full cover" blocks for your Toxic models. Your only option, if you keep these cards, is to go with universal blocks; there are quite a few of them. Other than that, you can always buy cards that are compatible. Use this link to help:

http://coolingconfigurator.com/step1_complist

I'm in the same boat as you. My MSI 280X's are slightly different than the first version and are no longer compatible with the EK blocks. I don't plan to keep these, as I want to go full-cover blocks on my GPUs eventually.


----------



## craige

Quote:


> Originally Posted by *Devildog83*
> 
> I am a bit late on this but the 270 is about the same as a 7850 and a 270x is a rebranded 7870. I would guess 270x, there are slight differences but you get the picture.


Yes, your input is appreciated. I have clearly told the service center I'm not accepting a non-"X" version, and they were like, let us see...


----------



## mutatedknutz

Quote:


> Originally Posted by *Devildog83*
> 
> Thanks !! Nice card.


Thanks man.
By the way, you got my name wrong on the list; it's mutatedknutz







just to be safe


----------



## buttface420

Quote:


> Originally Posted by *mutatedknutz*
> 
> Thanks man
> By the way you got my name wrong on the list, its mutatedknutz
> 
> 
> 
> 
> 
> 
> 
> just to be safe


lol, they also got me down as buttface240; it's buttface420!!! How do I put the official tag in my signature?


----------



## JackCY

Quote:


> Originally Posted by *Devildog83*
> 
> *choLOL and JackCY* have been added. Welcome to the club !!! Could you possibly post pics of your cards for the rest of the members to see?
> 
> Thanks,
> DD


This ghetto setup will have to do:


Testing it with and without the cover:


It's better with the cover for the power-delivery temperatures, and it only impacts core temperature by 1C.


----------



## wermad

Quote:


> Originally Posted by *JackCY*
> 
> This ghetto will have to do:
> 
> 
> Testing it with and without the cover:
> 
> 
> 
> It's better with the cover for power temperatures and only impacts core temperature by 1C.


Box testing FTW. I do it every time I get a new board.









Btw, you can embed your images via OCN's image feature: just click the image symbol, browse your computer or paste the image URL, done. It will also save them on OCN so you can re-attach them and view them in your gallery.


----------



## JackCY

Yeah, I did that; I forgot when digging up the links to the pics.
It's handy to test outside the case with new parts. I didn't have the case because it was held up in the order by an out-of-stock mouse; I returned the mouse later anyway (>_<)
Now it's all in a case that's been modded: grills cut out, a bigger intake cut for the PSU, higher feet, all cages gone, anti-vibration material added, an HDD LED added, and the power LED changed. I don't use the PSU fan or the LEDs, lol. I should have converted the bottom filter for front extraction instead; this way it's never going to get cleaned.


----------



## mutatedknutz

Quote:


> Originally Posted by *buttface420*
> 
> lol they also got me down as buttface240 , its buttface420!!! how do i put the official tag in my signature?


Click on your name at the top right, click "edit signature", and paste the tag, which is on the first page of the thread.


----------



## wermad

Quote:


> Originally Posted by *JackCY*
> 
> Yeah I did that, forgot when digging links to the pics.
> *It's handy to test outside the case with new parts*, I didn't have the case because it was stuck with a mouse out of stock in the order, returned the mouse later anyway (>_<)
> Now it's all in a case that's been moded. Grills cut out and cut a bigger intake for PSU, higher feet, all cages gone, added anti vibration material, added HDD LED and changed power LED. Don't use PSU FAN nor the LEDs lol. Should have rather converted the bottom filter for front extraction. This way it's never gonna get cleaned.












Alrighty guys, I just ordered a Sapphire 290 TriX oc card, so my lovely MSI 280X will be up for sale soon.


----------



## Falconx50

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Good deal bud, either way 1250/1850 are great clocks if they are stable for daily use....
> 
> Your total firestrike score is great for a 280x system, of course that sweet ass CPU helps quite a bit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Enjoy the gaming, cause a few hundred points don't matter down the stretch, enjoying the game time DOES.


Just seeing these Firestrike results makes me want to change my CPU, so which should I ask Santa for? The FX 8320 or another type, and why? My graphics score is fantastic, I think, but my FX 6100 is slowing me down.

http://www.3dmark.com/3dm/4776881


----------



## [CyGnus]

Get a 4690K and a Gigabyte Z97 Gaming 5; problem solved.

The Intel will be faster than the 8320 and will give a much better framerate, though it is a bit expensive. The smarter buy would in fact be the 8320, but I think that is more like a sidegrade than an upgrade.


----------



## aaronsta1

Quote:


> Originally Posted by *Falconx50*
> 
> just seeing these Firestrike results makes me want to change my CPU, so which should I ask Santa for? The FX 8320 or another type, and why? My Graphics score is fantastic I think, but my FX 6100 is slowing me down.
> 
> http://www.3dmark.com/3dm/4776881


Yeah, but not by much.

http://www.3dmark.com/fs/2719933

http://www.3dmark.com/fs/3009222

It sucks, because my "old" i7 is still faster than my new 8350..

Here is my 8350 with the 7870XT cards:
http://www.3dmark.com/fs/3270444

And the i7 with the 7870XT cards:
http://www.3dmark.com/fs/2241648


----------



## Agent Smith1984

Quote:


> Originally Posted by *Falconx50*
> 
> just seeing these Firestrike results makes me want to change my CPU, so which should I ask Santa for? The FX 8320 or another type, and why? My Graphics score is fantastic I think, but my FX 6100 is slowing me down.
> 
> http://www.3dmark.com/3dm/4776881


I gotta be honest, that score looks REALLY low..... my x6 at 4GHz is getting 8600 points, and at 4.2, it gets 9k.....

Are you able to overclock to somewhere in the 4.5-4.6 range?

The 4690k is definitely a good suggestion.....


----------



## JackCY

Quote:


> Originally Posted by *Falconx50*
> 
> just seeing these Firestrike results makes me want to change my CPU, so which should I ask Santa for? The FX 8320 or another type, and why? My Graphics score is fantastic I think, but my FX 6100 is slowing me down.
> 
> http://www.3dmark.com/3dm/4776881


What's your (or Santa's) budget?

A 4690K at 4.6GHz + 280x gives a 9250-9300 physics score and a 7500-8500 total score, depending on GPU clocks.

I'm not sure what the latest AMD CPUs are, but I'm guessing 8370, 8320, 8350. Usually, though, for the same cash you can get an Intel i5 like the 4590. The 4690K only makes sense if you clock it hard; the 4690 usually carries way too big a premium over the 4590.
The FX6100 sounds old and weak.

The 8350/8370 is somewhat comparable to the 4590/4690, obviously gaining somewhere and losing elsewhere due to the different core counts. For benchmarks like Firestrike, AMD doesn't do well; nothing will help there but switching to Intel.
AMD CPUs are also quite old, honestly; Intel is the more up-to-date platform overall, including the motherboards.


----------



## Agent Smith1984

Quote:


> Originally Posted by *JackCY*
> 
> What's your (or Santa's) budget?
> 
> A 4690K at 4.6GHz + 280x gives a 9250-9300 physics score and a 7500-8500 total score, depending on GPU clocks.
> 
> I'm not sure what the latest AMD CPUs are, but I'm guessing 8370, 8320, 8350. Usually, though, for the same cash you can get an Intel i5 like the 4590. The 4690K only makes sense if you clock it hard; the 4690 usually carries way too big a premium over the 4590.
> The FX6100 sounds old and weak.
> 
> The 8350/8370 is somewhat comparable to the 4590/4690, obviously gaining somewhere and losing elsewhere due to the different core counts. For benchmarks like Firestrike, AMD doesn't do well; nothing will help there but switching to Intel.
> AMD CPUs are also quite old, honestly; Intel is the more up-to-date platform overall, including the motherboards.


Well, don't settle on a physics score in Firestrike either.... be careful, because in that benchmark the 8350 and 4690k, both overclocked, will produce similar results, but in real-world gaming situations the 4690k will stomp it.....

Hopefully Firestrike gives us a little taste of how well games COULD be optimized in the future??? Doubt it.....


----------



## REDEFINE

Hello all. First of all, I just wanted to introduce myself: I am new to the forums, but I have been lurking for several years, and I want to thank all of the members for providing an entertaining and informational place to hang out.

Now for my problem: I have noticed that my R9 280X Toxic edition will not run at its default speed of 1150MHz while playing Battlefield 4; for some reason it downclocks to 1100MHz. I know that it used to run at 1150MHz just last week. But in other games, like Call of Duty or Company of Heroes 2, it will run at 1150MHz or even higher. I know that my GPU isn't being bottlenecked in Battlefield 4, because I've seen 95-99% GPU usage while I'm in game. Does anyone have any idea what would cause my graphics card to only run at 1100MHz in BF4? Sapphire says it must be hardware related and I should just RMA the card, but I don't want to do that, because I don't understand why it can run at its full potential in other games. Any help would be great.

My system:
FX-6300 @ 4.5GHz
Sapphire R9 280X Toxic 1150/1600
Samsung 850 Pro 256GB
Samsung 840 Pro 120GB
MSI 970A-G46 motherboard
Corsair CX750M PSU
Kingston DDR3 1333MHz


----------



## BruceB

Quote:


> Originally Posted by *REDEFINE*
> 
> Hello all. First of all, I just wanted to introduce myself: I am new to the forums, but I have been lurking for several years, and I want to thank all of the members for providing an entertaining and informational place to hang out.
> 
> Now for my problem: I have noticed that my R9 280X Toxic edition will not run at its default speed of 1150MHz while playing Battlefield 4; for some reason it downclocks to 1100MHz. I know that it used to run at 1150MHz just last week. But in other games, like Call of Duty or Company of Heroes 2, it will run at 1150MHz or even higher. I know that my GPU isn't being bottlenecked in Battlefield 4, because I've seen 95-99% GPU usage while I'm in game. Does anyone have any idea what would cause my graphics card to only run at 1100MHz in BF4? Sapphire says it must be hardware related and I should just RMA the card, but I don't want to do that, because I don't understand why it can run at its full potential in other games. Any help would be great.
> 
> My system:
> FX-6300 @ 4.5GHz
> Sapphire R9 280X Toxic 1150/1600
> Samsung 850 Pro 256GB
> Samsung 840 Pro 120GB
> MSI 970A-G46 motherboard
> Corsair CX750M PSU
> Kingston DDR3 1333MHz


Welcome to the Forums!








First things first: check the temps while playing BF4 using SpeedFan or whatever you have, then report back!


----------



## REDEFINE

GPU-Z is showing my average temp at 73C, with the high being 74C. It does not show the VRM temps, though; I think Sapphire has hidden them on my card.


----------



## BruceB

Quote:


> Originally Posted by *REDEFINE*
> 
> Gpu z is showing my average temp at 73c with the high being 74c it does not show the vrm temps though I Think sapphire has hidden them on my card


That should be ok, the max temp for a 280X is 85°C. Next step: Drivers. Do a clean install of the latest Drivers and see if that fixes it.


----------



## REDEFINE

Quote:


> Originally Posted by *BruceB*
> 
> That should be ok, the max temp for a 280X is 85°C. Next step: Drivers. Do a clean install of the latest Drivers and see if that fixes it.


I've done the driver installation a couple of times, trying different drivers, and each time I use DDU ("Display Driver Uninstaller"), and that doesn't make a difference. I'm currently using CCC 14.9. I have also tried MSI AB and Sapphire TriXX to try to run at my default clocks, and they don't work either, so I have them uninstalled while I am still troubleshooting.


----------



## Spork13

Grab a second-hand 280X for about $160.
RMA the existing card.
Get the second card back from RMA.
Crossfire.
Goodness.


----------



## REDEFINE

Quote:


> Originally Posted by *Spork13*
> 
> Grab a second hand 280X for about $160
> RMA existing card.
> Get 2nd card back from RMA
> Crossfire.
> Goodness.


I actually considered that, but I would have to shell out more money for a new PSU.


----------



## Spork13

What PSU are you using now?

I went with an 850, but only (very) occasionally draw 680W, mostly only around 650. Mind you, only one of my cards is a 280x; the other is a plain 280.
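PSU sizing for a crossfire setup like this can be ballparked by summing per-part power figures. A quick sketch; all wattages below are assumptions for illustration (typical board-power numbers, not measurements from this thread):

```python
# Ballpark peak system draw for a 280X + 280 crossfire rig.
# Every wattage here is an assumed round figure, not a measurement.
parts = {
    "R9 280X": 250,
    "R9 280": 200,
    "FX CPU (overclocked)": 125,
    "board, drives, fans": 75,
}

total_w = sum(parts.values())
headroom = 850 - total_w  # against an 850 W unit
print(total_w, headroom)  # 650 200
```

With these assumed figures the estimate lands in the same ~650 W ballpark as the draw reported above, leaving roughly 200 W of headroom on an 850 W unit.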


----------



## Falconx50

Quote:


> Originally Posted by *JackCY*
> 
> What's yours or santa's budget?


Well JackCY,

My budget isn't that big, $200.00 or less I'd say. And your suggestion would cost at least: i5-4690 = $219.00 + mobo, minimum $90.00 for a Black Friday deal; total = $300.00+... too much.

It would be nice though.


----------



## aaronsta1

Quote:


> Originally Posted by *Falconx50*
> 
> Well JackCY,
> 
> My budget isn't that big. $200.00 or less I'd say. And your suggestion would cost at least... i5-4690 = $219.00 + Mobo minimum $90.00 for a Black Friday deal, Total = $300.00 +... too much.
> 
> 
> 
> 
> 
> 
> 
> 
> It would be nice though.


To be honest, I don't have any problems playing any games on my 8350. The only game that has given me problems out of the last few I've played is Watch Dogs; for some reason I can only play that on high settings on my laptop. (I actually think Ubisoft has put code in their games that makes AMD GPUs run like crap.)

Instead of paying all that money for a few points in Firestrike (which won't make a great deal of improvement in games unless you've got a 4K monitor or something), maybe just get a better CPU?

Are you having any issues, or do you just want a higher score?


----------



## JackCY

Quote:


> Originally Posted by *Falconx50*
> 
> My budget isn't that big. $200.00 or less I'd say. And your suggestion would cost at least... i5-4690 = $219.00 + Mobo minimum $90.00 for a Black Friday deal, Total = $300.00 +... too much.
> 
> 
> 
> 
> 
> 
> 
> 
> It would be nice though.


Then you have no other choice. In the US it's dirt cheap; in the EU you're screwed, because everything is marked up by 25%









PCPartPicker part list / Price breakdown by merchant

*CPU:* AMD FX-8320 3.5GHz 8-Core Processor ($119.99 @ Newegg)
*Motherboard:* MSI 970A-G46 ATX AM3+ Motherboard ($79.99 @ Newegg)
*Total:* $199.98
_Prices include shipping, taxes, and discounts when available_
_Generated by PCPartPicker 2014-11-21 04:02 EST-0500_


----------



## aaronsta1

Quote:


> Originally Posted by *JackCY*
> 
> Then you have no other choice. In US it's dirt cheap. In EU you're screwed because everything is marked up by 25%
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PCPartPicker part list / Price breakdown by merchant
> 
> *CPU:* AMD FX-8320 3.5GHz 8-Core Processor ($119.99 @ Newegg)
> *Motherboard:* MSI 970A-G46 ATX AM3+ Motherboard ($79.99 @ Newegg)
> *Total:* $199.98
> _Prices include shipping, taxes, and discounts when available_
> _Generated by PCPartPicker 2014-11-21 04:02 EST-0500_


He doesn't need a new board for that CPU;

his profile says he's using a 990XA-UD3.


----------



## marine88

Hey guys, I have a 270X. Can I grab a 280X and run CrossFire with the 270X and 280X together?


----------



## [CyGnus]

You can only CrossFire similar GPUs. The 270X and 280X are different chips, so it's not possible.


----------



## Falconx50

Quote:


> Originally Posted by *aaronsta1*
> 
> to be honest.. i dont have any problems playing any games on my 8350. the only game that has given me problems in the last few ive played was watch dogs.. for some reason i can only play that on high settings on my laptop.. (i actually think ubisoft has put code in their games that make amd gpus run like crap)
> 
> instead of paying all that money for a few pts in firestrike (which wont make a great deal of improvements in games unless you got a 4k monitor or something), maybe just get a better cpu?
> 
> are you having any issues? or just want a higher score?


Well, World of Tanks (WoT) has always been a little slow in the FPS department, I believe between 30 and 90, mostly around 45, but the big item for me is the choppy frames. I mean, I'll play a game like "The Evil Within" where the FPS is V-synced at 30, but I can see the picture stopping ever so slightly; it's just a little jerky, but I notice it. I also see it more when I'm using benchmarking tools such as 3DMark, Heaven, or MSI Afterburner. It's almost like a fluorescent lamp flickering. If I'm doing everyday stuff, watching a movie or browsing the internet, I don't see it happen. I assume it might be the CPU, because that's where the physics part of gaming and benchmarking happens on AMD hardware.


----------



## aaronsta1

Quote:


> Originally Posted by *Falconx50*
> 
> Well, World of Tanks (WoT) has always been a little slow in the FPS department I believe, between 30 - 90; mostly around 45, but the big item for me is the choppy frames. I mean I'll play a game like "The Evil Within" and the FPS is V Synced at 30, but I can see the picture stopping ever so slightly, just a little jerky, but I'll notice. I also see it more when I am using Benchmarking tools, such as 3DMark or Heaven or MSI Afterburner. It is almost like a florescent lamp. If I am doing every day stuff, watching a movie or using the internet I don't see it happen. I assume it might be the CPU because that is where the Physics part of gaming and Benchmarking is happening with AMD equipment.


You might have something else going on. Two 270X GPUs at 1195/1400 in CrossFire with a six-core 4.0GHz CPU should be able to run any game at near-max settings at 1080p.

Scanned for viruses or adware lately?


----------



## Spork13

I'm pretty sure WoT doesn't support Xfire.
I get no better frame rates with Xfire enabled on WoT, and 2nd GPU shows low temps and low usage after a game.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Spork13*
> 
> I'm pretty sure WoT doesn't support Xfire.
> I get no better frame rates with Xfire enabled on WoT, and 2nd GPU shows low temps and low usage after a game.


That means CrossFire isn't actually being used.


----------



## M1kuTheAwesome

Isn't WoT single-threaded? Last I checked it was, and that might explain the low FPS on AMD systems. I've noticed the same problem myself. I'm pretty sure that if it used all your cores it would be much smoother.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Isn't WoT single threaded? Last I checked it was and that might explain low FPS on AMD systems. I've noticed that same problem myself. Pretty sure if it used all your cores it would be much smoother.


Most MMOs are.


----------



## NexusRed

About to go pick up two BNIB MSI R9 280Xs for $400! Selling one to a friend for $300 and using the other to CrossFire with my current card!

I'll be switching out my current CPU and motherboard for a 4690K and a MSI Z97 Gaming 7.


----------



## choLOL

Hey guys, is it normal for the 280X to run hot? My 280X core reaches 75°C when gaming, and the VRMs reach 91°C. I'm used to my old 7850's low temps, so I'm pretty worried about these numbers. I want to put it _under water_, but my student budget prevents me from buying the parts.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *choLOL*
> 
> Hey guys, is it normal for the 280x to have high temps? My 280x core reaches 75 degrees C when gaming, and VRM's reach 91 deg.C. I'm used to my old 7850's low temps, so I'm pretty worried about these temps. I want to put it _under water_ but my student budget prevents me from buying the parts.


Those are about the temps I get with my XFX 280X.


----------



## aaronsta1

Quote:


> Originally Posted by *marine88*
> 
> Hey guys i have a 270x can i grab 1 280x and make crossfire? with both 1 270x and 1 280x ??


No.

You can only CrossFire a 270X with a 270, 270X, 7850, or 7870 (non-XT), and a 280X with a 280, 280X, 7950, or 7970.
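The pairing rule above boils down to GPU family: the Pitcairn-based cards pair with each other, and the Tahiti-based cards pair with each other. A minimal sketch of that lookup, purely for illustration (the family labels are my assumption based on the list above, not an official AMD compatibility tool):

```python
# Illustrative CrossFire pairing check: cards can only pair within a GPU family.
# Groupings taken from the post above; family names are assumptions.
FAMILY = {
    # Pitcairn family
    "270": "pitcairn", "270X": "pitcairn", "7850": "pitcairn", "7870": "pitcairn",
    # Tahiti family
    "280": "tahiti", "280X": "tahiti", "7950": "tahiti", "7970": "tahiti",
}

def can_crossfire(card_a: str, card_b: str) -> bool:
    """Two cards can pair only when both belong to the same (known) GPU family."""
    return card_a in FAMILY and FAMILY.get(card_a) == FAMILY.get(card_b)

print(can_crossfire("270X", "280X"))  # False: different families
print(can_crossfire("280X", "7970"))  # True: both Tahiti
```

So marine88's 270X + 280X combo fails the family check, which is exactly what was said above.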


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *choLOL*
> 
> Hey guys, is it normal for the 280x to have high temps? My 280x core reaches 75 degrees C when gaming, and VRM's reach 91 deg.C. I'm used to my old 7850's low temps, so I'm pretty worried about these temps. I want to put it _under water_ but my student budget prevents me from buying the parts.


VRM temps do get hot on 280X cards. The record for mine was 106°C, and that did get my attention.

A more aggressive fan profile fixed it, though; it went down to 85°C. I've heard that 90°C is generally the temp limit for components on graphics cards, but I'm sure that going 1°C over isn't going to kill it anytime soon. You can try tweaking the fan profile in AB.
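For anyone curious what "a more aggressive fan profile" means under the hood: Afterburner's custom fan profile is just a set of (temperature, fan speed) points that the software interpolates between. A rough sketch of the idea; the points below are made-up examples, not recommended values:

```python
# Hypothetical fan curve: (temp °C, fan %) points, linearly interpolated,
# similar in spirit to Afterburner's custom fan profile. Example values only.
CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_speed(temp_c: float) -> float:
    """Return the fan duty cycle (%) for a given temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at max above the last point

print(fan_speed(67.5))  # halfway between 60°C/45% and 75°C/70% -> 57.5
```

Making the profile "more aggressive" just means shifting those points so the fan ramps up at lower temperatures.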


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> VRM temps do get hot on 280x cards. Tge record for mine was 106C and that did get my attention.
> 
> 
> 
> 
> 
> 
> 
> 
> More aggressive fan profile fixed it though, went down to 85C. I've heard that 90C is in general the temp limit for all component on graphics cards but i'm sure that 1C is not gonna kill it anytime soon. You can try tweaking the fan profile in AB though.


I agree with this; it's quite normal, depending on in-case ambient temps. The card is working at normal temperatures.


----------



## aaronsta1

Quote:


> Originally Posted by *Spork13*
> 
> I'm pretty sure WoT doesn't support Xfire.
> I get no better frame rates with Xfire enabled on WoT, and 2nd GPU shows low temps and low usage after a game.


I've never played WoT, but for some games to use CrossFire you have to go into the settings and specify full screen instead of windowed maximized.


----------



## JackCY

Quote:


> Originally Posted by *choLOL*
> 
> Hey guys, is it normal for the 280x to have high temps? My 280x core reaches 75 degrees C when gaming, and VRM's reach 91 deg.C. I'm used to my old 7850's low temps, so I'm pretty worried about these temps. I want to put it _under water_ but my student budget prevents me from buying the parts.


Normal. Up to 80°C on the GPU and VRM is normal. It all depends on what cooling your card has and what voltage it runs. If the fans spin slowly it will climb a little higher; if the heatsink is small or inefficient it will go over 80°C.
On stock they run around 75°C in a closed case.

---

Agent Smith1984: What bench works best for discovering errors, apart from endlessly changing settings while gaming and watching for them? It would make it easier to tweak the clocks and be more confident they're stable. I think mine doesn't like the clocks I settled on with Fire Strike and Heaven. Something keeps restarting in the driver when playing SC2: the game pauses, the screen goes black, then the game cursor appears and the rest resumes. Sometimes there's a driver-crash notification in the tray when this happens, haha. VRAM usage always decreases afterwards as unused data gets flushed, but it never goes over 50% of available VRAM anyway, so it shouldn't be a low-VRAM issue. It's hard to tell whether it's the GPU clock or the memory clock, since it doesn't happen regularly, but it feels like a GPU clock thing to me. Or maybe it's the drivers; the only other thing crashing/freezing is Heaven with 4xAA, no idea why either. I'd guess GPU clocks, but it's weird that it's only with 4xAA.


----------



## Spork13

Quote:


> Originally Posted by *aaronsta1*
> 
> ive never played wot but some games to use crossfire you have to go into the settings and specify full screen instead of windowed maximized.


Cheers mate, will check that next session.
Get 55-60 fps on single card anyway, but just out of curiosity.

Edit:
Enabled Xfire, started the game, made sure "Full screen" was ticked.
No change in FPS.

Tabbed out to the desktop; still only one GPU was being used. I also had Resource Monitor running, and it showed that only one physical core had done any significant work during the short session.

Considering the quality of the graphics in this free game, and that all the game mechanics only require ONE core and a single GPU, the devs have done a pretty bloody good job IMO.

Imagine how good the $60-$80 AAA titles would look if their developers could be bothered optimising them for mid/high-end PCs!


----------



## REDEFINE

Hey, I just wanted to let you guys know I found out why my card wouldn't run at its default speed in BF4: leaving the Internet Explorer window open after joining a game makes the GPU reserve some of its clocks for browser acceleration. As we all know, BF4 launches from a browser, so just close it after you're done.


----------



## Falconx50

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> That means that xFire is not used


Well, I know that WoT doesn't support CrossFire or SLI, but I figure I should get better frames than I do.

Also, I know I don't have any viruses, since I just completed a fresh install on a new SSD.


----------



## Paradox3713

Okay, so it looks like there is not, and never will be, a decent waterblock for the Sapphire 280X Toxic. Thanks to the few who replied. So, does anyone know of any aftermarket heatsinks and fans to replace the ones that come with this card? Or at least a way to replace that one stupid fan that always fails? Can you order one from the company?


----------



## DavidTiger

Looking for more power to keep games maxed out at a steady 60fps.
Currently running a single ASUS 280X TOP @ 1200 core / 1625 mem and seeing 30-60 fps in games, with unstable dips below 60fps.

Do I:
Find another 280x TOP and run it in xfire and hope I can clock them around the same OC?
or
Find a 7970 GHz and xfire that with the 280x?
or
Sell the 280x and get a better single card? 290x or back to nvidia with 970 or 980?

I'm only running at 1080p, but at some point in the hopefully near future I'd like to get another two monitors and go 5760x1080.

Also, would a 750W Bronze-rated PSU be enough to power two cards (with or without OC) plus my CPU @ 4GHz?
I know my MB isn't the best for OC or XFire, though I do plan on getting a 990FX board at some point, at least until I can afford the jump to an i5-4690K or maybe the i7-4790K. I record and stream gameplay at the same time, and the i7 would do that with ease vs the i5, and especially vs my Phenom X6.


----------



## JackCY

Don't use the CPU for recording anymore; there are HW encoders/acceleration on GPUs now, and Shadowplay and GVR use them.
If you want to switch, it's best not to put money into old stuff like 990FX.
I'm not sure how well CrossFire works nowadays, but until they change the rendering to SFR I don't want it; plus it's a power eater.
Better to get a 970 instead.


----------



## aaronsta1

Quote:


> Originally Posted by *DavidTiger*
> 
> Looking to get more power for keeping games maxed out at a steady 60fps.
> Currently running a single ASUS 280x TOP @ 1200 core 1625 mem and see 30-60 fps in games but unstable jumps below 60fps..
> 
> Do I:
> Find another 280x TOP and run it in xfire and hope I can clock them around the same OC?
> or
> Find a 7970 GHz and xfire that with the 280x?
> or
> Sell the 280x and get a better single card? 290x or back to nvidia with 970 or 980?
> 
> I'm only running at 1080p but at somepoint in the hopefully near future I'd like to get another 2 monitors and go 5760x1080
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also would 750W bronze rated be enough to power 2 cards (with or without OC) plus my CPU @ 4GHz..
> I know my MB isn't the best for OC or XFire though I do plan on getting a 990FX board at some point. At least until I can afford the jump to an i5-4690k or maybe the i7-4790k as I do record and stream gameplay at the same time and the i7 would do that with ease vs the i5 and especially my Phenom x6


I'd save up and put the money into a new CPU/board combo. Your CPU is probably bottlenecking the 280X you have now; throw a 2nd 280X on and it will hold things back for sure.
To be honest, I don't think there's a CPU out there that won't bottleneck two 280Xs.

By the time you get the cash to upgrade to that i7, 280s will be cheap.


----------



## DavidTiger

I doubt the Phenom X6 at 4GHz bottlenecks the 280X, even overclocked. I've recently been playing Far Cry 4 and can see 99% GPU usage with only 35-40% CPU, and I've seen other games get similarly high GPU utilization.
Maybe if I tried 1440p or higher at ultra detail I'd start to see a bottleneck, but at 1080p with my current CPU/GPU overclocks I certainly don't see the 280X being held back.

I'm always saving, and hopefully soon after Christmas I'll be able to go ahead with an i5/i7. Buying a second 280X fairly cheap off eBay or local selling sites now would hopefully give enough of a performance boost, and it could still be used later when I upgrade; maybe by then the 970/980 will have come down a little in price, and I can sell the 280s to go towards one.


----------



## rdr09

Quote:


> Originally Posted by *DavidTiger*
> 
> I doubt the Phenom x6 at 4GHz bottlenecks the 280x even oc'd. Recently been playing Far Cry 4 and can see 99% GPU usage and only 35-40% CPU, and I've seen other games get high utilization of the gpu.
> Maybe if I tried to get 1440p or higher in ultra detail I'd start to see a bottleneck but certainly at 1080p and my current cpu/gpu overclock I don't see the 280x being held back.
> 
> I'm always saving and hopefully soon after Christmas I'll be able to go ahead with an i5/i7. Buying a second 280x fairly cheap off eBay or local selling sites just now would hopefully give enough of a performance boost and can still be used later when I upgrade and maybe by that time the 970/980 will have come down a little in price and can sell on the 280s to go towards it.


Two 280Xs, even at stock, will definitely put a damper on the Thuban. You might be better off with a single 290...

http://www.3dmark.com/3dm/4479341?


----------



## BruceB

Quote:


> Originally Posted by *rdr09*
> 
> 2 280X even at stock will definitely put a damper on the thuban. you might be better off with a slower single 290 . . .
> http://www.3dmark.com/3dm/4479341?


Could you do a normal Fire Strike run on that Card so I can compare, please?


----------



## rdr09

Quote:


> Originally Posted by *BruceB*
> 
> Could you do a normal Fire Strike run on that Card so I can compare, please?


I only have one run with the GPU at stock...

http://www.3dmark.com/3dm/4488113?

I have since moved the 290 to my Intel sig rig.


----------



## DavidTiger

Quote:


> Originally Posted by *rdr09*
> 
> 2 280X even at stock will definitely put a damper on the thuban. you might be better off with a slower single 290 . . .
> 
> http://www.3dmark.com/3dm/4479341?


Yeah, I think two cards will start to see a bottleneck. I can push the CPU to 4.5GHz stable to try and reduce it.
Though I don't think the cards will be pushed hard enough together at 1080p to produce a bottleneck; definitely going 1440p and beyond, though.
I'll probably end up making it do me until early next year, then upgrade to Intel and a 970 or 980, unless AMD do something impressive for a good price with their next card.


----------



## rdr09

Quote:


> Originally Posted by *DavidTiger*
> 
> Yea I think 2 cards will start to see a bottleneck. I can push the cpu to 4.5GHz stable to try and reduce it.
> Though I don't think at 1080p the cards will run high enough together to produce a bottleneck but definitely going 1440p and beyond.
> I'll probably end up making it do me until early next year and upgrade to Intel and a 970 or 980. Unless AMD do something impressive for a good price with their next card


4.5? Wow. I only did 4.5GHz on my Thuban for benching; it gets too hot past 4.3GHz for 24/7 use. It becomes snappier than my i7 at 4.4GHz.

If you ever upgrade from a Thuban, go for an i7.


----------



## DavidTiger

Using a Corsair H100 with four SP120 fans in push/pull. At 4GHz, running the lowest fan profile on the pump, it gets around 62°C under 100% Prime95 load; normal gaming temps only reach the low 50s at most.
Cranking it to 4.5 and going to the highest profile, Prime95 gives around 68°C and gaming will reach the 60s.
It's pushing 1.6V at that, so not ideal, but I guess as long as I can keep it and the VRMs cool enough it can run stable. RAM has to drop to 1333, but I've never noticed any significant difference from 1600.


----------



## rdr09

Quote:


> Originally Posted by *DavidTiger*
> 
> Using Corsair H100 with 4 SP120 fans in pushpull. At 4ghz Running on the lowest fan profile on the pump it gets around 62c on 100% prime95 load. Normal gaming temps only reach the low 50s at most.
> Cranking it to 4.5 and going to the highest profile, prime95 gives around 68c and gaming will reach the 60s.
> It is pushing 1.6v at that so not ideal but I guess as long as I can keep it cool enough and keep the VRMs cool it can run stable. RAM has to drop to 1333 but I've never noticed any significant difference from 1600.


I think 4.3GHz is enough to handle one 970 or even one 980. Two 280Xs, in some games, would not be pushed hard enough by a Thuban, I suspect.


----------



## aaronsta1

Quote:


> Originally Posted by *rdr09*
> 
> i think 4.3GHz is enuf to handle one 970 or even one 980. two 280Xs, in some games, will not be pushed hard enuf by a thuban. i doubt it.


I wouldn't say it wouldn't work; it would leave him some upgrade room in the future if he wants a faster CPU. But I'd keep the 280X and just upgrade the CPU/board if it were me, then the GPU later.

My old i7 920 with two 270Xs still plays games with a steadier frame rate than my newer 8350 with two 7870 XTs.
An example, from the Dragon Age: Inquisition benchmark:
i7 frame rate is 90s max, 80s average;
8350 frame rate is 70s max and 50s average.
This is on DX11 and automatic settings.

A few posts up I posted some Fire Strike results from my computers; you can tell the difference.

Here is a quick example:
i7
http://www.3dmark.com/fs/3301424
8350
http://www.3dmark.com/fs/3270444

I'm almost 100% positive those GPUs are being starved by the CPU, though; I bet if I got a new i7 I'd be in the 10k range.

I think most game developers are working with Intel and NVIDIA to optimize their games and are leaving AMD to figure it out with drivers.
If I were building a new system today, it would be Intel with NVIDIA, and I wouldn't get anything with less than 4GB of VRAM.


----------



## rdr09

Quote:


> Originally Posted by *aaronsta1*
> 
> i wouldnt say it wouldnt work.. it will leave him some upgrade room in the future if he wants to get a faster cpu.. but id keep the 280x and just upgrade cpu/board if it was me, then later the gpu.
> 
> my old i7 920 with 2 270x's is still playing games with a more steady frame rate then my new 8350 with 2 7870xt's..
> an example.. got dragon age inquisition..
> i7 frame rate from the benchamark is 90s max 80s avg.
> 8350 frame rate from the benchmark is 70s max and 50s avg.
> this is on dx11 and automatic settings.
> 
> up a few i posted some firestrike results of my computers.. you can tell the difference.
> 
> here is a quick example.
> i7
> http://www.3dmark.com/fs/3301424
> 8350
> http://www.3dmark.com/fs/3270444
> 
> im almost 100% positive those gpus are being starved by the cpu tho.. i bet if i got a new i7 id be in the 10k


I agree. If budget is a concern, keeping the 280X and the Thuban is really a good option. If not, upgrading the GPU is best, but not by adding another 280X.

Here is a 290 @ 1200 with an i7 @ 4.5GHz:

http://www.3dmark.com/3dm/4079609?


----------



## diggiddi

A Phenom II X6 at 4.5 is beast mode; if you can do that, you don't need to upgrade the CPU IMO.


----------



## Roboyto

Quote:


> Originally Posted by *DavidTiger*
> 
> Looking to get more power for keeping games maxed out at a steady 60fps.
> 
> *Sell the 280x* and get a better single card? 290x or back to nvidia with 970 or 980?
> 
> Also would *750W* bronze rated be enough to power 2 cards (with or without OC) plus my CPU @ 4GHz..
> I know my MB isn't the best for OC or XFire though I do plan on getting a 990FX board at some point. At least until I can afford the *jump to an i5-4690k or maybe the i7-4790k* as I do record and stream gameplay at the same time and the i7 would do that with ease vs the i5 and especially my Phenom x6

Quote:


> Originally Posted by *DavidTiger*
> 
> Using Corsair H100 with 4 SP120 fans in pushpull. At 4ghz Running on the lowest fan profile on the pump it gets around 62c on 100% prime95 load. Normal gaming temps only reach the low 50s at most.
> Cranking it to 4.5 and going to the highest profile, prime95 gives around *68c and gaming will reach the 60s.*
> It is pushing 1.6v at that so not ideal but I guess as long as I can keep it cool enough and keep the VRMs cool it can run stable. *RAM has to drop to 1333* but I've never noticed any significant difference from 1600.

A single card is always the better bet IMO, because there's much less likelihood of complications than with Xfire or SLI. Sure, plenty of people have no issues running multiple cards, but plenty of people have problems with them too. If you're going to change something in the GPU department, then upgrade to something bigger and badder. With the prices on 290s right now it would be silly not to take advantage; the 290X typically isn't stunningly faster than the 290, so it's usually not worth the price premium.

If you're pushing a Thuban to 4.5GHz, I doubt a 750W PSU would be enough if you threw a 2nd 280X into the mix. I'd imagine your CPU at that voltage/speed is in the 300W ballpark under load; an X6 at stock pulls ~140W. *Edit* Just noticed you said you're running at 4GHz... I think it would still be a stretch. On average your card probably pulls ~200W gaming, so double that, plus your power-hungry CPU, and you need a bigger PSU.
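The back-of-the-envelope math above can be written out. All the wattage figures are the rough ballpark numbers from this thread (~200W per 280X while gaming, ~300W for a heavily overvolted Thuban), plus a guessed ~75W for drives, fans, and the board; none of these are measurements:

```python
# Rough PSU headroom estimate using the ballpark figures discussed above.
# All numbers are assumptions for illustration, not measurements.
def estimated_load(gpu_watts: float, num_gpus: int, cpu_watts: float,
                   rest_of_system: float = 75.0) -> float:
    """Sum of estimated component draws, plus drives/fans/board."""
    return gpu_watts * num_gpus + cpu_watts + rest_of_system

def psu_ok(psu_watts: float, load: float, headroom: float = 0.8) -> bool:
    """Keep sustained load under ~80% of the PSU rating for safety margin."""
    return load <= psu_watts * headroom

load = estimated_load(gpu_watts=200, num_gpus=2, cpu_watts=300)
print(load)               # 775.0 estimated watts
print(psu_ok(750, load))  # False: over 80% of a 750W unit
```

With a single 280X and a stock-ish X6 (~140W), the same sketch comes out comfortably inside a 750W unit, which matches the advice here.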

I would be careful running your Thuban at those temperatures regularly. If I remember correctly from my days with Phenom II chips, ~60C was the max you wanted to hit even for benching. Running the chip at those temps/volts all the time isn't a good idea; at the least, performance will degrade much more rapidly than normal, if the chip doesn't go kaput entirely from the high voltage and temps.

I was in your shoes once upon a time, running an ASRock 890FX board with an unlocked 960T @ 4.2GHz and an Asus 7870 DC2 @ 1250/1500 under full watercooling. I made the switch to an ASRock Z77 Extreme6 with a 3570K. The CPU/mobo swap alone, with the i5 at stock speeds, gave me a solid 20% gaming performance jump, let alone the PC as a whole running better. I wouldn't rack your brain over which GPU to get; make the switch to an i5 first, or better yet an i7, and never look back.

Modern games are starting to take advantage of RAM speeds, especially when you consider that DDR4 starts at 2400MHz. If you make the jump to Intel, the best blend of price/performance is 2133MHz. You can typically snatch an 8GB kit for under $100, and if you want to help keep Intel CPU temps down you can spend a couple extra dollars on some 1.5V 2133 sticks.


----------



## JackCY

And now Heaven with 4xAA doesn't crash/freeze anymore with my OC. Maybe uninstalling CCC helped, since it was again messing with Tweak; the OC would only get properly applied if Tweak started later, which is very difficult since Tweak starts fast, faster than the sluggish CCC.
These cards and drivers can sure act up a little sometimes.

---
Quote:


> Phenom II X6 at 4.5 is beast mode, if you can do that you don't need to upgrade cpu imo


If that's a beast, what are the latest Intel CPUs at the same clocks? i5, i7, -E, Xeons...

Right now Intel has gotten more expensive here and AMD is considerably cheaper, which makes those 8320s an attractive option for low-end value buyers, especially since it seems they can run almost the same OC as the 8350s.
Unfortunately, AMD CPUs don't seem to do well in games, especially with fast GPUs.


----------



## statyksyn

How can I join??


----------



## jason387

Quote:


> Originally Posted by *marine88*
> 
> Hey guys i have a 270x can i grab 1 280x and make crossfire? with both 1 270x and 1 280x ??


The 270X should Crossfire with another 270X. Eve
Quote:


> Originally Posted by *statyksyn*
> 
> How can i join??


Now GPU-Z pic


----------



## statyksyn

Quote:


> Originally Posted by *jason387*
> 
> The 270X should Crossfire with another 270X. Eve
> Now GPU-Z pic


I don't have one at work, but I will post it as soon as I get home! Awesome!


----------



## marine88

What is the maximum stable OC you think I can get on my R9 270X Asus DC2? I have a cool temperature of 70°C, but my screen freezes.


----------



## DavidTiger

Thinking about it, I'll probably just keep the current setup and save up for a better single GPU and an i5/i7 platform.
I don't want to risk burning out the PSU by running close to its maximum, or even going over at full load, by adding another card that would pull an extra ~250W with the overclock, on top of the 250-300W the CPU pulls under load.

Quote:


> Originally Posted by *marine88*
> 
> What is the maximum stable u think i can get on my r920x asus dc2 i have a cool temperature 70c and my screen freezes


On my 280X I'm running 1.2V @ 1200/1625 with +20% power.
The highest temp I've seen is 73°C playing Far Cry 4 at 99% usage. I'm not sure what the fans are running at, as it's just the default Afterburner profile; I could probably reduce those temps a little with a custom fan profile.


----------



## statyksyn

Quote:


> Originally Posted by *jason387*
> 
> The 270X should Crossfire with another 270X. Eve
> Now GPU-Z pic


Here it is!


----------



## DavidTiger

I guess I'll submit my entry as well








The card I have is the *ASUS R9 280X DirectCU II TOP 3GB*, overclocked to 1200/1650 @ 1.2V.

Too lazy to open the case to take a pic just now, but here's one I sent to my brother showing the size difference between my old GTX 560 Ti and the 280X:










GPU-Z Image:

and validation link:
http://www.techpowerup.com/gpuz/details.php?id=4bceu


----------



## marine88

Quote:


> Originally Posted by *DavidTiger*
> 
> Thinking about it I'll probably just keep the current setup and save up for a better single gpu and an i5/i7 setup.
> I don't want to risk burning out the psu by being close to maximum or even going over at full load by adding another card that's going to pull an extra ~250w with overclock on top of 250-300w the cpu will be pulling under load..
> On my 280x I'm running 1.2v @ 1200/1625 +20% power.
> Highest I've seen temps is 73c playing FarCry4 with 99% usage. Not sure what the fans are running it as it's just the default Afterburner profile. I could probably reduce those temps a little by setting up a custom fan profile.


How can you see the voltage?


----------



## JackCY

HWiNFO or GPU-Z, but GPU-Z sometimes likes to freeze the system when you open it; they still have bugs in it. I used to get that a lot at one point, though not really anymore. HWiNFO has much more useful information and better graphs, and I've noticed it consumes less CPU.


----------



## DavidTiger

Quote:


> Originally Posted by *marine88*
> 
> how can u see the voltage ?


I use Afterburner, which has the RivaTuner Statistics Server OSD built in. Then I use HWiNFO to add extra info for temps, CPU and RAM usage, etc.
So in-game I see this in the top right corner; it looks big, but it's just big enough to read at 1080p.


----------



## Spork13

I have ASUS cards and am using the ASUS GPU Tweak software, but I'm not entirely happy with it.
Will Afterburner or other GPU overclocking software work for my cards?


----------



## statyksyn

Quote:


> Originally Posted by *Spork13*
> 
> I have ASUS cards, and am using the ASUS GPU tweak software - but not entirely happy with it.
> Will afterburner or other GPU overclocking software work for my cards?


Have you tried AMD Catalyst? But yes, Afterburner will work.


----------



## statyksyn

Maybe somebody can help me decide if I should spend another chunk of change here... I love my 270Xs, but I'm thinking about jumping to 280Xs. How big of a difference in performance is it?


----------



## aaronsta1

Quote:


> Originally Posted by *statyksyn*
> 
> Maybe sobody can help me decide if i should spend another chunk of change here....I love my 270x's but thinking about jumping to 280x's....how big of a difference in performance is it?


It's an OK jump.

Going from two 270Xs to one 290 is a bigger jump, though.

Although, to be honest, it's not needed at this time; two 270Xs in CrossFire will max out every game on the market, unless you're running some crazy 3-monitor gaming setup or something.


----------



## statyksyn

Quote:


> Originally Posted by *aaronsta1*
> 
> its an ok jump..
> 
> going from 2 270x to 1 290 is a bigger jump tho..
> 
> altho to be honest its not needed at this time.. 2 270x crossifre will max out every game on the market. unless you are running some crazy 3 monitor gaming setup or something.


That's exactly what I needed to hear to cool my jets, lol. Everything looks amazing; I just kept thinking I could go even further. I'll save my money till the new series of cards comes out from AMD.


----------



## jrizzz

Quote:


> Originally Posted by *Spork13*
> 
> I have ASUS cards, and am using the ASUS GPU tweak software - but not entirely happy with it.
> Will afterburner or other GPU overclocking software work for my cards?


Afterburner can only change the power limit, core, and memory clocks. Core voltage is only adjustable if you select "force constant voltage" in the settings, and memory voltage is not adjustable.

You can use CCC, but it does not have access to voltages.


----------



## jumpy2219

R9 270, reference card, clocked at reference speeds.

925MHz stock / 1100MHz OC

I got the reference card from AMD, I will be doing a video on it soon.

Just in case any of you are interested here is my YouTube: https://www.youtube.com/user/adrtechreviews


----------



## jumpy2219

BTW, hope you don't mind, I added the 270 to the sig. I noticed it wasn't there before, lol.


----------



## giantgjohnson

Sapphire R9 280X Tri-X OC Edition (11221-22 / 100363-4L)

OC'd to 1200 Core / 1700 Memory (1.25v)


----------



## JackCY

Quote:


> Originally Posted by *Spork13*
> 
> I have ASUS cards, and am using the ASUS GPU tweak software - but not entirely happy with it.
> Will afterburner or other GPU overclocking software work for my cards?


AB will work, but you have to force the voltage for it to work. TriXX crashed when I accessed settings, so I didn't use it for OCing.
What's the issue with Tweak? For me it's the only thing that actually works as expected.
If you're having an issue getting voltage applied, it's because CCC is overriding Tweak: leave CCC Overdrive enabled and just uninstall CCC, or don't install CCC at all.
Not sure if forcing constant voltage in AB solves this CCC issue, but they could end up fighting each other constantly, which is not good.
I don't think having Overdrive disabled in CCC helped me. I tried it twice, and both times I ended up uninstalling CCC so it wouldn't override the Tweak OC: Tweak starts fast, as a service, whereas CCC is very lazy and starts late, after Tweak, and overrides the settings Tweak set before CCC started.
You could set up a delay so Tweak (or the Tweak service) starts after CCC, but I found uninstalling CCC the easier option.

Tweak does have a custom fan profile and 2D clocks as well, but I only got access to that the first time, when I was forced to install Tweak as the Administrator account (the account, deactivated by default, not "run as administrator" — yes, that account has even higher privileges), because otherwise it wouldn't install.
The fan profile can be buggy, though, and can sometimes reset the whole profile to minimum; I'm not sure under what conditions, but it happened to me and it's an old bug. Then the fan would not ramp up, which is not good.


----------



## Paradox3713

Quote:


> Originally Posted by *giantgjohnson*
> 
> 
> 
> Sapphire R9 280X Tri-X OC Edition (11221-22 / 100363-4L)
> 
> OC'd to 1200 Core / 1700 Memory (1.25v)


Whoa you're getting those clocks on air? Did you do anything additional to it?


----------



## Johnny Utah

Running the 14.11 beta and Afterburner 4.0 with my 290Xs. I have voltage set to +100. GPU 1 is reporting ~1.24v and GPU 2 is reporting the (expected) ~1.31v. I've set force constant voltage and disabled ULPS in AB and the registry.

Not sure if this is the cause or a separate issue, but in DA: Inq my core clocks fluctuate regardless of GPU load, it seems. I have both set to 1100MHz at the moment, but GPU 2 will drop to just under 1000MHz at some points.

Any ideas?


----------



## Spork13

Having several issues.
The fan profile sometimes works as expected, sometimes doesn't.
Everything else... sometimes I can edit and save profiles, sometimes I can't, and it seems to confuse which of the two cards I am trying to apply the profile to (I have a 280X and a 280 non-X, and can't push the latter as far as the former).

I "need" CCC, as it's the only way I know of enabling/disabling CrossFire; some games work better with it, some run like crap with it, and some just don't need it. Is there another way to turn the 2nd card on and off?

Downloaded Afterburner today. Haven't had time to try using it to OC, but I do like the OSD for individual GPU temps and usage in real time. Shame it doesn't log max temps - or does it?


----------



## Agent Smith1984

Quote:


> Originally Posted by *JackCY*
> 
> AB will work, but you have to force the voltage for it to work. TriXX crashed when I accessed settings, so I didn't use it for OCing.
> What's the issue with Tweak? For me it's the only thing that actually works as expected.
> If you're having an issue getting voltage applied, it's because CCC is overriding Tweak: leave CCC Overdrive enabled and just uninstall CCC, or don't install CCC at all.
> Not sure if forcing constant voltage in AB solves this CCC issue, but they could end up fighting each other constantly, which is not good.
> I don't think having Overdrive disabled in CCC helped me. I tried it twice, and both times I ended up uninstalling CCC so it wouldn't override the Tweak OC: Tweak starts fast, as a service, whereas CCC is very lazy and starts late, after Tweak, and overrides the settings Tweak set before CCC started.
> You could set up a delay so Tweak (or the Tweak service) starts after CCC, but I found uninstalling CCC the easier option.
> 
> Tweak does have a custom fan profile and 2D clocks as well, but I only got access to that the first time, when I was forced to install Tweak as the Administrator account (the account, deactivated by default, not "run as administrator" — yes, that account has even higher privileges), because otherwise it wouldn't install.
> The fan profile can be buggy, though, and can sometimes reset the whole profile to minimum; I'm not sure under what conditions, but it happened to me and it's an old bug. Then the fan would not ramp up, which is not good.


When I had my Asus 280X, I found that Afterburner was the only reliable way to go, and it also allowed the absolute highest stable clocks with forced voltage.
Ran 1.3v at 1250/1800 pretty solid.


----------



## choLOL

Can anyone confirm whether this is correct: is the PowerColor R9 280X Turbo Duo really based on the reference 7970/280X PCB? According to _this review_ it is, but that's the only site that says the card is based on the reference design; no other site mentions anything about it.

The reason I'm asking is that I want to put a full-cover GPU block on it because of the high temps, but I can't tell for sure whether it *is* based on the reference PCB. Thanks! +rep


----------



## JackCY

Is it over 80°C? Isn't a full-cover water block actually going to cost more than the card is worth?


----------



## choLOL

Well, not really over 80. The core reaches 78°C under moderate gaming, and the VRMs reach 90°C.
I bought the card here for around $240, and there's an XSPC block that costs $105 here. I'm gonna get a loop soon, and if I can squeeze the video card in, why not?









XSPC replied via email saying that the block will fit the card.


----------



## aaronsta1

Quote:


> Originally Posted by *choLOL*
> 
> Well, not really over 80. The core reaches 78°C under moderate gaming, and the VRMs reach 90°C.
> I bought the card here for around $240, and there's an XSPC block that costs $105 here. I'm gonna get a loop soon, and if I can squeeze the video card in, why not?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> XSPC replied via email saying that the block will fit the card.


do you have a side fan on that case?


----------



## Mr-Dark

Anyone have a Sapphire 280 non-X here? I need the BIOS for this card.


----------



## choLOL

Quote:


> Originally Posted by *aaronsta1*
> 
> do you have a side fan on that case?


Yeah, but the whole grill area is filtered with pantyhose, as is the bottom intake fan. It's just that the ambient temps are relatively high here.

edit: the overdue picture of my card, as per the requirement (I think) to join the club.













This is the card installed. I was cleaning the computer when I took this picture, that's why there's a loose cable.

And now the lovely box it came in.


----------



## aaronsta1

It seems my MSI R9 270X Gaming OC card has given up the ghost... again... for the 3rd time.

I was playing DA:I and the screen went black, then static from the speakers.

Now every time I run a game, the screen remains black and I hear static from the speakers.

Everything is good in Windows; when I remove that video card, the games work.

1st MSI card: fan went bad.
2nd MSI card: huge black squares on the screen when playing games.
3rd MSI card: black screen and static noise from the speakers.

I wasn't even overclocking this card; it was running at 1120/1400.

I guess it's a good thing I have a spare card, but it's just a 270, not an X.









EDIT:

So it seems that when the game crashed, it messed up something in the BIOS. I flipped the switch to BIOS B and now the card is working again.

Strange. I dunno if I should return it or not.


----------



## dxzdxz1

Hi guys, I came here hoping that someone has an R9 280 Dual-X (11230-00-20G) and could send me the BIOS. This card is apparently the same as the 7950 Dual-X, and has UEFI GOP support.

Thanks,
Deivid.


----------



## Mr-Dark

Quote:


> Originally Posted by *dxzdxz1*
> 
> Hi guys, I came here hoping that someone has an R9 280 Dual-X (11230-00-20G) and could send me the BIOS. This card is apparently the same as the 7950 Dual-X, and has UEFI GOP support.
> 
> Thanks,
> Deivid.


+1 I need this too


----------



## marine88

What is the maximum voltage you can give a 270X? I always have 1.20v; I can't give more. I think this is the default!


----------



## Mr-Dark

Quote:


> Originally Posted by *marine88*
> 
> What is the maximum voltage you can give a 270X? I always have 1.20v; I can't give more. I think this is the default!


For HD 7000 cards (the 270X = 7870): 1.25v on air for 24/7 use; on water, 1.3v, not more.


----------



## BlaXey

Hello, I have an R9 270X Dual-X, max temp 62°C with a GPU clock of 1165MHz and memory clock of 1580, vcore = 1.3v, which is the highest vcore it lets me set. My question is: is there any way to increase the voltage further?


----------



## jason387

Quote:


> Originally Posted by *BlaXey*
> 
> Hello, I have an R9 270X Dual-X, max temp 62°C with a GPU clock of 1165MHz and memory clock of 1580, vcore = 1.3v, which is the highest vcore it lets me set. My question is: is there any way to increase the voltage further?


Hard mod.


----------



## jrizzz

Quote:


> Originally Posted by *dxzdxz1*
> 
> Hi guys, I came here in hoping that someone have a R9 280 Dual-X (11230-00-20G) and could send me the BIOS. This card apparently is the same 7950 Dual-X, and have UEFI GOP support.
> 
> Thanks,
> Deivid.


Just go to the TPU database.
Edit: oops, I guess they don't have it, sorry.


----------



## dxzdxz1

Quote:


> Originally Posted by *jrizzz*
> 
> Just go to the TPU database.
> Edit: oops, I guess they don't have it, sorry.


Yep, it was the first place I went looking.
Thanks for trying to help anyway.


----------



## DiceAir

So I'm running 2x Club3D R9 280Xs. I also have a QNIX QX2710 [email protected]. I really do like the monitor, but I have one major complaint about it: it takes a beast system to run games at decent settings at that resolution and refresh rate. At 60Hz the monitor also doesn't feel smooth, and frame pacing doesn't feel up to par either, so I was thinking of either selling this monitor or taking it to work and getting a 1080p 144Hz monitor instead, as it's easier to drive; and since I wouldn't be overclocking the monitor and it uses DisplayPort, I feel the smoothness would be much better, giving me a better experience. I'm just not sure ATM. I feel that on my GTX 570 SLI setup I almost had a better experience, although I had to lower the settings.

I tried running the QNIX unoverclocked, with both the CRU settings reset and the ATI patcher set to default, and had the same issue. Even at 60Hz my frames feel jerky in most games, and I can say it's not because my brain/eyes got used to 96Hz; even the 1080p panel at work with an R9 270X installed feels smoother in some games at 60Hz. So can anyone enlighten me?

Oh, before I forget: yes, I did install the two CrossFire bridges that are required for monitor overclocking.


----------



## diggiddi

What driver are you using?


----------



## aaronsta1

What are your voltages on the 270X cards?
I seem to have hit a wall on mine around 1150 core @ 1.25v.

Even if I go up to 1.3v, it won't go any higher without the driver crashing, or artifacts.

It's not due to heat, because they never initialize.

Any ideas?

One is an MSI 270X Gaming 2G, stock clock 1080, factory OC 1120 @ 1.2v, 71% ASIC,
and one is a Sapphire 270X Dual-X, stock clock 1070, no factory OC, @ 1.238v, 77% ASIC, although this card will also run 1120 @ 1.2v.

Both cards will run 1120 @ 1.2v.
Not sure it's worth the extra 30MHz for 0.05v.


----------



## DiceAir

Quote:


> Originally Posted by *diggiddi*
> 
> What driver are you using?


I tried many drivers, from the latest 14.11.2 back to as early as 14.2, and even 13.9, but still the same issue. So it's not a driver issue, or maybe it's a driver issue that has never been fixed, because only a handful of people have it.


----------



## inlandchris

Quote:


> Originally Posted by *DiceAir*
> 
> I tried many drivers, from the latest 14.11.2 back to as early as 14.2, and even 13.9, but still the same issue. So it's not a driver issue, or maybe it's a driver issue that has never been fixed, because only a handful of people have it.


I believe the latest driver is 14.301, and I use it.

I use (5) 1080p monitors on only one R9 280X Matrix, with no jitter.

I don't know what Club3D is; is it the DP-to-HDMI adapter, or the maker of the VGA card?

Here are my monitors and a GPU-Z screenshot.





I use an ACCELL adapter to go from DP to HDMI.

No special tricks, and stock overclocking from Asus.

If you have 6 ports, you can put up 3 or 5 monitors.

The computer is an Asus R4BE with an i7-4930K at standard clocks.

If you are using 2 cards, then you may have a sync problem with CrossFire; just a thought.


----------



## DiceAir

Quote:


> Originally Posted by *inlandchris*
> 
> I believe the latest driver is 14.301, and I use it.
> I use (5) 1080p monitors on only one R9 280X Matrix, with no jitter.
> I don't know what Club3D is; is it the DP-to-HDMI adapter, or the maker of the VGA card?
> Here are my monitors and a GPU-Z screenshot.
> 
> 
> 
> 
> 
> 
> 
> I use an ACCELL adapter to go from DP to HDMI.
> No special tricks, and stock overclocking from Asus.
> If you have 6 ports, you can put up 3 or 5 monitors.
> The computer is an Asus R4BE with an i7-4930K at standard clocks.
> If you are using 2 cards, then you may have a sync problem with CrossFire; just a thought


Club3D is the maker of my video cards. I have the QNIX QX2710.

Here are some pictures and a video. See my vsync at 100FPS when it's actually at 96Hz, and at 60Hz check the weird high GPU latency.








And yes, I know the lower the graph, the better.


----------



## inlandchris

Quote:


> Originally Posted by *DiceAir*
> 
> Club3D is the maker of my video cards. I have the QNIX QX2710.
> 
> Here are some pictures and a video. See my vsync at 100FPS when it's actually at 96Hz, and at 60Hz check the weird high GPU latency.
> 
> 
> 
> 
> 
> 
> 
> 
> And yes, I know the lower the graph, the better.


Those look like artifacts, usually caused by the video card being overclocked too high or running too hot; best guess.
If the video card is not damaged, try only one card and run the same test; it might help isolate where the problem is.


----------



## DiceAir

Quote:


> Originally Posted by *inlandchris*
> 
> Those look like artifacts, usually caused by the video card being overclocked too high or running too hot; best guess.
> If the video card is not damaged, try only one card and run the same test; it might help isolate where the problem is.


OK, please explain where you see artifacts. I don't see any... lol. You're confusing artifacts with smoothness. Give me the exact time and area where you see it.


----------



## Mr-Dark

Quote:


> Originally Posted by *DiceAir*
> 
> Club3d is the make of my video cards. I have the qnix q82710.
> 
> Here is some pictures and a video. See my vsync at 100FPS when actually on 96hz and on 60hz check the weird high gpu latency
> 
> 
> 
> 
> 
> 
> 
> 
> and yes I know the lower the graph the better


Where is the problem?

How is your vsync capping the fps at 96 and not 60? Isn't your screen 60Hz?


----------



## DiceAir

Quote:


> Originally Posted by *Mr-Dark*
> 
> Where is the problem?
> 
> How is your vsync capping the fps at 96 and not 60? Isn't your screen 60Hz?


OK, I forgot to set the game to 60Hz in the settings, but when you first fire up the game it uses the desktop refresh rate, and that's 60Hz. My refresh rate is 96Hz, but it tops out at 100fps. Just trust me, the refresh rate is 96Hz, not 60Hz.

Even when I set the game to 96Hz, it still tops out at 100fps.

In the first picture my refresh rate is set to 60Hz on the desktop, so 60Hz in-game too, but check the high frametime. It should be lower, so I feel something is wrong. My CPU is not the limit here, otherwise my fps would drop; although my frametime is high there, my fps is still 60.

----------



## JackCY

Quote:


> Originally Posted by *DiceAir*
> 
> OK, I forgot to set the game to 60Hz in the settings, but when you first fire up the game it uses the desktop refresh rate, and that's 60Hz. My refresh rate is 96Hz, but it tops out at 100fps. Just trust me, the refresh rate is 96Hz, not 60Hz.
> 
> Even when I set the game to 96Hz, it still tops out at 100fps.
> 
> In the first picture my refresh rate is set to 60Hz on the desktop, so 60Hz in-game too, but check the high frametime. It should be lower, so I feel something is wrong. My CPU is not the limit here, otherwise my fps would drop; although my frametime is high there, my fps is still 60.


I don't know what the issue is; we really can't see a thing from the screenshots or a recording via Fraps or the AMD utility. We aren't holding the mouse and moving it to notice any delays.

There are some things I would have done differently on that setup. You went all the way to get the monitor from Korea, so I would just clock it as high as I can.
As a gamer, never use VSYNC, ever. It causes jerkiness; seriously, it's just complete rubbish, and some games get totally screwed up and lag like hell once VSYNC is turned on in them. What I use instead to limit very high fps is an fps limiter; say, for 60Hz I use a 120-125fps limit.
Yes, driving [email protected] will take some firepower








And for serious online gaming I set the minimum details possible to remove the visual clutter and make the games easier and quicker to orient in; fewer distractions and less crap like blur and bloom. Quake: picmip etc.; CoD4 can be cut down a crazy amount as well.

Until you turn VSYNC off, there is really no point in investigating what's going on with your setup.
CF is less smooth than SLI, but it shouldn't cause sudden jerkiness. It's more that SLI delays frames to space them equally, whereas CF sends them ASAP, so the frames arrive more like in pairs: a pair of frames, a slight delay, then another pair.
If you can in any game (there are very few now), try using SFR instead of AFR. I don't get why they implemented AFR in the first place; it's easier, that's why... SFR is better from my POV.


----------



## DiceAir

Quote:


> Originally Posted by *JackCY*
> 
> I don't know what the issue is; we really can't see a thing from the screenshots or a recording via Fraps or the AMD utility. We aren't holding the mouse and moving it to notice any delays.
> 
> There are some things I would have done differently on that setup. You went all the way to get the monitor from Korea, so I would just clock it as high as I can.
> As a gamer, never use VSYNC, ever. It causes jerkiness; seriously, it's just complete rubbish, and some games get totally screwed up and lag like hell once VSYNC is turned on in them. What I use instead to limit very high fps is an fps limiter; say, for 60Hz I use a 120-125fps limit.
> Yes, driving [email protected] will take some firepower
> 
> 
> 
> 
> 
> 
> 
> 
> And for serious online gaming I set the minimum details possible to remove the visual clutter and make the games easier and quicker to orient in; fewer distractions and less crap like blur and bloom. Quake: picmip etc.; CoD4 can be cut down a crazy amount as well.
> 
> Until you turn VSYNC off, there is really no point in investigating what's going on with your setup.
> CF is less smooth than SLI, but it shouldn't cause sudden jerkiness. It's more that SLI delays frames to space them equally, whereas CF sends them ASAP, so the frames arrive more like in pairs: a pair of frames, a slight delay, then another pair.
> If you can in any game (there are very few now), try using SFR instead of AFR. I don't get why they implemented AFR in the first place; it's easier, that's why... SFR is better from my POV.


So you're saying I should rather stick with screen tearing than have vsync without tearing. You know, the screen tearing here is so bad even with fps limited or unlimited. I tried limiting it to anywhere between 93-97fps and got major tearing. Even on a single GPU I get this issue, so it doesn't look like a CrossFire issue. I also tried with frame pacing on and off: same issue. I don't see why I have to use the 125fps limit at 60Hz, as it makes this setup extremely expensive, or I'd have to turn down the graphics so much that it would look worse than 1080p 144Hz at better settings.

In the screenshot I get dips in frame latency; it's like a miscommunication between my system and GPU(s), and I get it more when turning around, even on the lowest graphics with my resolution set to 1080p. Now, in the video, look at the fps: my refresh rate is 96Hz (never mind the game settings; the game uses the desktop refresh rate when first booted up, and it's the same when I force both the game and desktop to 96Hz), but my fps goes to 100 vsynced. So I don't know what is going on. Like I said, my old 60Hz 1080p panel feels a lot smoother than my current panel does at 60Hz; I don't get this issue there. Maybe it's something to do with the monitor's EDID. I don't know how to explain this better, but if you guys were here I could compare it with my old monitor and explain in person, and then you'd understand 100%. I want to be able to run 96Hz and get 96fps vsynced; regardless of whether the input lag is bad or not, that's what should happen. If I can get this solved, then we might even solve some other issues I'm having.

There is one game where everything works out fine, and that is Alien: Isolation. I can't remember if it was 100% fine; I'll have to test again tonight and report back, but as far as I remember it was. BTW, I even changed RenderAheadLimit in BF4: either the same or worse.


----------



## DiceAir

sorry double post


----------



## inlandchris

Quote:


> Originally Posted by *DiceAir*
> 
> OK, please explain where you see artifacts. I don't see any... lol. You're confusing artifacts with smoothness. Give me the exact time and area where you see it.


Those "cell towers" really looked odd in your high-res video; in the low-res one the towers were not that visible, but still there. So, no artifacts; I thought the towers were, never mind.


----------



## DiceAir

Quote:


> Originally Posted by *inlandchris*
> 
> Those "cell towers" really looked odd in your high-res video; in the low-res one the towers were not that visible, but still there. So, no artifacts; I thought the towers were, never mind.


OK, I will try setting the GPU to a lower clock, heck, even stock clocks for an R9 280X, and test, but I'm telling you there are no artifacts; that's just how the game looks. My GPUs aren't even reaching 90°C and the VRMs run 60-70°C max, so no, it's not an unstable card. My ASIC quality is between 50-60%, so really bad, but yeah, I will try stock clocks tonight.

Here you can see the towers look the same.


----------



## inlandchris

Quote:


> Originally Posted by *DiceAir*
> 
> OK, I will try setting the GPU to a lower clock, heck, even stock clocks for an R9 280X, and test, but I'm telling you there are no artifacts; that's just how the game looks. My GPUs aren't even reaching 90°C and the VRMs run 60-70°C max, so no, it's not an unstable card. My ASIC quality is between 50-60%, so really bad, but yeah, I will try stock clocks tonight.
> 
> Here you can see the towers look the same


Yep, you are right. I got the video to pause right on the cell phone towers, and it looks like it's supposed to be a reflection of the sun on them. When it's going full speed, your eyes tend to focus on what's going on with the towers... and... maybe you miss your shot, ha ha.
Looks good to me


----------



## DiceAir

Quote:


> Originally Posted by *inlandchris*
> 
> Yep, you are right. I got the video to pause right on the cell phone towers, and it looks like it's supposed to be a reflection of the sun on them. When it's going full speed, your eyes tend to focus on what's going on with the towers... and... maybe you miss your shot, ha ha.
> Looks good to me


Yes, I tried lower clocks; nothing is wrong, the towers look the same.


----------



## JackCY

Quote:


> Originally Posted by *DiceAir*
> 
> So you're saying I should rather stick with screen tearing than have vsync without tearing. You know, the screen tearing here is so bad even with fps limited or unlimited. I tried limiting it to anywhere between 93-97fps and got major tearing. Even on a single GPU I get this issue, so it doesn't look like a CrossFire issue. I also tried with frame pacing on and off: same issue. I don't see why I have to use the 125fps limit at 60Hz, as it makes this setup extremely expensive, or I'd have to turn down the graphics so much that it would look worse than 1080p 144Hz at better settings.


Definitely. Vsync is a little slower; without it, the frame gets to the screen faster. If your fps is high, the tear is small; sure, at low fps the tears can be big when turning fast, but honestly, I played on a laptop for almost a decade and the tearing never bothered me. It was more the low fps that did.

What's the issue on your setup in BF4? What about other games/movies/...? BF is one buggy game.

I'm for speed and the latest picture as fast as possible; I don't care about tearing, since at high fps the tears are smaller. At 120fps the tears are half the size they are at 60fps. Plus some games mess up input handling based on fps; Quake-based games run best/exploit movement at 125fps and some other numbers, I think 333 and 76 as well.
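The "tears are half the size" point can be sketched with a toy model (a simplification I'm adding for illustration; the visible discontinuity also depends on where the tear line lands on screen):

```python
def tear_offset_px(pan_speed_px_per_s: float, render_fps: float) -> float:
    """Pixel discontinuity at a tear line while panning horizontally.

    The two frames stitched together at the tear differ by the distance
    the scene moved during one frame time (1 / render_fps seconds).
    """
    return pan_speed_px_per_s / render_fps

# Panning the camera at 1200 px/s:
print(tear_offset_px(1200, 60))   # 20.0 px
print(tear_offset_px(1200, 120))  # 10.0 px: double the fps, half the tear
```

So the same pan produces a tear offset inversely proportional to fps, which is why high framerates make tearing far less noticeable.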
Quote:


> In the screenshot I get dips in frame latency; it's like a miscommunication between my system and GPU(s), and I get it more when turning around, even on the lowest graphics with my resolution set to 1080p.


Yeah, then it could be vsync-related, if it happens with vsync on and doesn't when it's off.
That's just the damn old silly vsync. I don't know why it took them 30 years to finally start implementing a vsync that's more sensible, like FreeSync/G-Sync or whatever it's called. This old, easy-to-implement vsync is complete rubbish, for fast and interactive scenes at least; it can be for movies too, and for desktop work it often doesn't really matter.
Quote:


> Now, in the video, look at the fps: my refresh rate is 96Hz (never mind the game settings; the game uses the desktop refresh rate when first booted up, and it's the same when I force both the game and desktop to 96Hz), but my fps goes to 100 vsynced. So I don't know what is going on. Like I said, my old 60Hz 1080p panel feels a lot smoother than my current panel does at 60Hz; I don't get this issue there. Maybe it's something to do with the monitor's EDID. I don't know how to explain this better, but if you guys were here I could compare it with my old monitor and explain in person, and then you'd understand 100%. I want to be able to run 96Hz and get 96fps vsynced; regardless of whether the input lag is bad or not, that's what should happen. If I can get this solved, then we might even solve some other issues I'm having.


What other issues?
On a different monitor, [email protected] vsynced, it runs normally as you expect, and on the Korean one with [email protected]+96Hz it runs weird and stutters/pauses, and frames sometimes seem to arrive late or early, even on a single GPU?
Yeah, it could be that the monitor's electronics are not very fast, but from what I have seen when researching these Korean babies, they seem to have minimal input lag and can run 120Hz; that is, unless you got the pimped-up version that has an OSD etc., with different electronics that are probably slow and also don't run 120Hz.
I'm not sure they all run 120Hz. Sometimes it can be better to settle at 96Hz, as you may have done, because 96 = 4*24, and movies are often 24fps (~23.976fps).
Quote:


> There is one game where everything works out fine, and that is Alien: Isolation. I can't remember if it was 100% fine; I'll have to test again tonight and report back, but as far as I remember it was. BTW, I even changed RenderAheadLimit in BF4: either the same or worse.


I wouldn't judge by BF4. I don't have it, and from what I've heard, both BF3 and BF4 are buggy.
Some games will just lag, and lag input, once you turn on vsync. Some may seem to "skip" or "stutter" frames as they try to sync. Also, unless you can run above 96fps all the time, it's pointless to even turn vsync on at 96Hz: when the game can't stay above 96, vsync halves the framerate to 48fps to stay in sync, and that switch will for sure look like stutter and all kinds of weird stuff.

I don't mind vsync, but only the more sensible implementation that doesn't screw up games, which we may hopefully get in the coming years.
Traditional vsync = latency + stuttering.
With the new approach we will need new monitors, because current ones don't have the firmware and electronics needed: the monitor will sync with the GPU, not the GPU with the monitor.
Sure, something like triple buffering works too, but it adds lag, which is not good either.
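The framerate-halving behavior described above can be sketched as a toy model of plain double-buffered vsync (my simplification; real drivers and render-ahead queues complicate this):

```python
def vsync_fps(render_fps: float, refresh_hz: float) -> float:
    """Effective fps under plain double-buffered vsync.

    A new frame is shown only on a refresh boundary, so the displayed rate
    snaps down to refresh_hz / n for the smallest integer n such that the
    renderer can keep up (refresh_hz / n <= render_fps).
    """
    n = 1
    while refresh_hz / n > render_fps:
        n += 1
    return refresh_hz / n

print(vsync_fps(100, 96))  # 96.0: rendering faster than refresh, capped at 96
print(vsync_fps(90, 96))   # 48.0: can't sustain 96, so it halves to 48
print(vsync_fps(40, 96))   # 32.0: next divisor down, 96/3
```

The jump between 96 and 48 whenever the render rate dips below the refresh rate is exactly the "stutter" being described.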

*Link.*
Standard double buffering, no vsync: the fastest possible image.
Triple buffering: no tearing, but it will have a lag.
Double buffering with vsync: all the hell you can imagine.









If you really, really want no tearing, try triple buffering or the new FreeSync/G-Sync.
Or, at worst, use adaptive vsync, which doesn't halve the framerate but only caps the maximum fps; you will get tearing when running below your monitor's refresh rate, but not when running above it. It's still going to be nearly as messy as standard vsync, though.

Sure, I notice tearing, but now that my fps in games is higher, it's definitely better than the 20-40fps I was used to








Having 120Hz, and having it synced with the monitor without delay or tears at 120fps; yeah, that's what I would consider "normal", but manufacturers still do not.
Quote:


> Originally Posted by *inlandchris*
> 
> Yep, you are right. I got the video to pause right on the cell phone towers, and it looks like it's supposed to be a reflection of the sun on them. When it's going full speed, your eyes tend to focus on what's going on with the towers... and... maybe you miss your shot, ha ha.
> Looks good to me


Ah, I see what you mean. Nope, that is normal: the tower structure is thin, and when it's rendered, each pixel has to decide between the shadow side and the sunny side of that very thin structure, so it flips between gray and light over and over. Filtering can help with that, like all those different types of AA. I don't like AA, though; it blurs the picture.


----------



## Depauville Kid

Asus r9 280 TOP
Stock clocks are 980 core, 5200 memory.
I run mine at 1180 core, 5200 memory, with voltage bumped to 1.2v and power to 120%. Custom fan profile using GPU Tweak. Temps stay under 74°C at load.


----------



## M1kuTheAwesome

I just bought a new toy, guys. It's one of those power measuring tools for a wall socket. Giving it a test run, I found out that with my graphics card at stock voltage I only pull about 584 watts from the socket under Furmark + Prime95 load. Looking at several reviews of my PSU, it seems its efficiency never exceeds 87%. Some quick math shows that the PSU is outputting barely over 500 watts out of the possible 750 under this insane load, so I should have more than plenty of room for a second 280X, right? Or does anyone see anything fishy? If that is so, I will regret buying a mechanical keyboard instead of saving up a few extra pennies and grabbing a second hand 280X for CrossFire. Looks like once again I must postpone saving up for driving school to buy PC stuff... Oh well, my feet can move me about, no need for driving while I still have those.


----------



## Mr-Dark

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> I just bought a new toy, guys. It's one of those power measuring tools for a wall socket. Giving it a test run, I found out that with my graphics card at stock voltage I only pull about 584 watts from the socket under Furmark + Prime95 load. Looking at several reviews of my PSU, it seems its efficiency never exceeds 87%. Some quick math shows that the PSU is outputting barely over 500 watts out of the possible 750 under this insane load, so I should have more than plenty of room for a second 280X, right? Or does anyone see anything fishy? If that is so, I will regret buying a mechanical keyboard instead of saving up a few extra pennies and grabbing a second hand 280X for CrossFire. Looks like once again I must postpone saving up for driving school to buy PC stuff... Oh well, my feet can move me about, no need for driving while I still have those.


548 W with a single 280X?! Something's wrong with your build, mate!

What brand of PSU do you have, and how much did that power-measuring tool cost?


----------



## M1kuTheAwesome

The measuring tool was only 17.50eur, but it was bought from a respectable electronics store. The PSU is a Seasonic M12 II EVO 750.


----------



## Mr-Dark

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> The measuring tool was only 17.50eur, but it was bought from a respectable electronics store. The PSU is a Seasonic M12 II EVO 750.


So if your PSU is at 80% efficiency and you pull 584 W from the wall, your PC is only drawing about 584 × 0.8 ≈ 467 W from the PSU.

You can take the full 750 W from the PSU, but then it would pull around 750 / 0.8 ≈ 940 W from the wall.

Can you retest with the CPU at stock? That FX-8350 takes a lot of power at high clocks.
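For anyone who wants to redo this math with their own meter readings, here's a minimal Python sketch of the wall-to-DC conversion, using the 584 W reading and the 87% worst-case efficiency quoted earlier in the thread (the function names are just for illustration, and whether ~240 W of headroom is really enough for a second 280X still depends on the 12 V rail):

```python
# Rough headroom check for adding a second card, based on the thread's numbers.
# Assumption: PSU efficiency is roughly constant across this load range.

def dc_load_from_wall(wall_watts, efficiency):
    """DC power the PSU is actually delivering, given wall draw and efficiency."""
    return wall_watts * efficiency

def wall_draw_for_dc(dc_watts, efficiency):
    """Wall draw needed for the PSU to deliver a given DC load."""
    return dc_watts / efficiency

wall = 584       # measured at the socket under Furmark + Prime95
eff = 0.87       # worst-case efficiency from PSU reviews
rating = 750     # PSU's rated DC output

dc_now = dc_load_from_wall(wall, eff)
print(f"Current DC load: ~{dc_now:.0f} W")                                  # ~508 W
print(f"Headroom: ~{rating - dc_now:.0f} W")                                # ~242 W
print(f"Wall draw at full rating: ~{wall_draw_for_dc(rating, eff):.0f} W")  # ~862 W
```

Note the direction of each conversion: multiply by efficiency going wall → DC, divide going DC → wall.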


----------



## HitJacker

Hi !

I have a VTX3D 280X and I'm trying to overclock it, but I can't change the voltage. I bought a waterblock, so I want to unlock the voltage.









If I change the value in the BIOS and flash it, will that work? Or can the voltage never be unlocked?

Thanks


----------



## BruceB

Quote:


> Originally Posted by *HitJacker*
> 
> Hi !
> 
> I have a VTX3D 280X and I'm trying to overclock it, but I can't change the voltage. I bought a waterblock, so I want to unlock the voltage.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I change the value in the BIOS and flash it, will that work? Or can the voltage never be unlocked?
> 
> Thanks


What software are you using to OC?
For some brands the voltage is locked and for others it is not. Flashing the BIOS shouldn't be done unless needed; check whether your card is voltage locked before flashing.


----------



## HitJacker

I use MSI Afterburner 4. I checked "unlock voltage control" and I can move the slider, but it doesn't apply to the core, while the memory voltage does change.

I checked the voltage with Afterburner and GPU-Z: I can set the core to 1.3 or 0.8 and nothing changes under GPU stress; only the memory voltage changes.

I'm not sure that changing the core voltage in the BIOS would apply either.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Mr-Dark*
> 
> So if your PSU is at 80% efficiency and you pull 584 W from the wall, your PC is only drawing about 584 × 0.8 ≈ 467 W from the PSU.
> 
> You can take the full 750 W from the PSU, but then it would pull around 750 / 0.8 ≈ 940 W from the wall.
> 
> Can you retest with the CPU at stock? That FX-8350 takes a lot of power at high clocks.


Whoa! I ran with BIOS fully stock and only pulled 439 watts from the wall. That's a pretty crazy difference.


----------



## Mr-Dark

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> Whoa! I ran with BIOS fully stock and only pulled 439 watts from the wall. That's a pretty crazy difference.


Yeah, 145 W is a big number for a CPU overclock alone.


----------



## dayen666

Does anyone have a 280X Toxic (1150/1600 stock) with an overclock?
I don't know what Power % is safe to use in CCC.


----------



## Agent Smith1984

Toxic is safe all the way to max from what I've seen.....

The board will not draw more power than what it needs to operate.
Even at 50% limiter, if the overclock you are using only requires a 30% increase in power, then that's what it will use.

Someone please correct me if I am wrong??


----------



## The Pook

Ordered my R9 280 a few minutes ago!








Was gonna wait on the GTX 960, but this GT 635 is getting irritating to game on. Paid $170 out of pocket with a $15 MIR...


----------



## aaronsta1

Quote:


> Originally Posted by *Mr-Dark*
> 
> Yeah, 145 W is a big number for a CPU overclock alone.


I don't believe those numbers.

145 W is the max this CPU will draw within its TDP; it should hit that at stock clocks with Turbo.

On mine, with the TDP limit turned off and an OC, HWiNFO shows 176 W on the VRMs under full load, so only around 30 W more than stock.

Now, I know HWiNFO isn't as accurate as a hardware device, but there is no way this CPU draws an EXTRA 145 W from the board overclocked versus stock; that would put the CPU at almost 300 W.


----------



## Mr-Dark

Quote:


> Originally Posted by *aaronsta1*
> 
> I don't believe those numbers.
> 
> 145 W is the max this CPU will draw within its TDP; it should hit that at stock clocks with Turbo.
> 
> On mine, with the TDP limit turned off and an OC, HWiNFO shows 176 W on the VRMs under full load, so only around 30 W more than stock.
> 
> Now, I know HWiNFO isn't as accurate as a hardware device, but there is no way this CPU draws an EXTRA 145 W from the board overclocked versus stock; that would put the CPU at almost 300 W.


An FX-8350 at stock versus 4.8 GHz at 1.51 V can do that; it's an eight-core, so it's power hungry.

A Kill A Watt is the best way to measure power usage while overclocking.


----------



## M1kuTheAwesome

It's possible I remembered it wrong and it was 548, not 584. But yeah, those chips pull a lot of juice when overclocked; over 200 W is pretty common. My system idles at around 230 W with all power saving options off.
Anyway, I'm not sure if I should grab a second 280X early next year or wait for the next-gen cards that are not that far away. The first option is cheaper to buy but consumes more power; the second is more expensive but more efficient and more up to date. Decisions, decisions...


----------



## Mr-Dark

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> It's possible I remembered it wrong and it was 548, not 584. But yeah, those chips pull a lot of juice when overclocked; over 200 W is pretty common. My system idles at around 230 W with all power saving options off.
> Anyway, I'm not sure if I should grab a second 280X early next year or wait for the next-gen cards that are not that far away. The first option is cheaper to buy but consumes more power; the second is more expensive but more efficient and more up to date. Decisions, decisions...


Grab a 7950 or 280X at a low price, around $130; that's very good price/performance.

For power consumption, I think 4500 MHz is a good spot for the FX-8350, and the power draw will be about average.


----------



## ricko99

Can anyone suggest good VRAM heatsinks for an Asus R9 280X DC2TOP?


----------



## TRusselo

Well... my rad sprang a leak... right onto the HIS R9 280X I picked up a few months ago on Kijiji for $140...
Soldered the rad and bought another HIS 280X off Kijiji for $200...
At least my R9 protected my 7950 from getting wet!

If anyone needs some HIS IceQ heatsinks, I have 3! They fit reference boards.


----------



## Mr-Dark

Quote:


> Originally Posted by *TRusselo*
> 
> Well... my rad sprang a leak... right onto the HIS R9 280X I picked up a few months ago on Kijiji for $140...
> Soldered the rad and bought another HIS 280X off Kijiji for $200...
> At least my R9 protected my 7950 from getting wet!
> 
> If anyone needs some HIS IceQ heatsinks, I have 3! They fit reference boards.


Nice! So now you're running 2x 280X watercooled, all on a 750 W power supply?

You have a message.


----------



## jason387

Just got a brand new R9 270 back from Sapphire after sending my 7870 in for RMA. I was a little disappointed, as I was hoping for an R9 270X since it comes with two 6-pin connectors. So, any guys here having luck overclocking an R9 270?


----------



## aaronsta1

Quote:


> Originally Posted by *jason387*
> 
> Just got a brand new R9 270 back from Sapphire after sending my 7870 in for RMA. I was a little disappointed, as I was hoping for an R9 270X since it comes with two 6-pin connectors. So, any guys here having luck overclocking an R9 270?


I think you got a good deal.
The 270 has better memory; it comes clocked at 1400 instead of the 1200 the 7870 had.
Also, the 270 should clock to 1080, which is the speed of the 270X.

Having an MSI 270 Gaming and an MSI 270X Gaming, they bench the same at the same clocks.

Also, don't assume two power plugs help you in any way; the MSI 270X I got won't go above 1150, but the non-X goes to 1180.


----------



## jason387

Quote:


> Originally Posted by *aaronsta1*
> 
> I think you got a good deal.
> The 270 has better memory; it comes clocked at 1400 instead of the 1200 the 7870 had.
> Also, the 270 should clock to 1080, which is the speed of the 270X.
> 
> Having an MSI 270 Gaming and an MSI 270X Gaming, they bench the same at the same clocks.
> 
> Also, don't assume two power plugs help you in any way; the MSI 270X I got won't go above 1150, but the non-X goes to 1180.


I've only managed 1160/1600 MHz at 1.3 V. But with Vdroop the actual voltage is 1.25 V. Any way to go beyond 1.3 V to compensate for Vdroop?


----------



## aaronsta1

Quote:


> Originally Posted by *jason387*
> 
> I've only managed 1160/1600 MHz at 1.3 V. But with Vdroop the actual voltage is 1.25 V. Any way to go beyond 1.3 V to compensate for Vdroop?


No idea; probably with a hardware mod.

1160 isn't bad though.


----------



## jason387

The R9 270 runs a lot cooler than my previous 7870, by almost 20°C.


----------



## long99x

Does anyone have a Gigabyte R9 280 BIOS that supports Elpida RAM? Please send it my way.


----------



## dayen666

Quote:


> Originally Posted by *long99x*
> 
> Does anyone have a Gigabyte R9 280 BIOS that supports Elpida RAM? Please send it my way.


You can open a ticket with Gigabyte; they can send you a new BIOS!


----------



## limitlessenergy

I am an owner of an XFX Double D R9 280X BE.










Sadly, it looks like I got screwed, because it won't even stay stable at its factory clocks.

I had to underclock to 1020 MHz and 1500 MHz.

Going to try giving it more voltage soon.


----------



## Spork13

Did you buy it new?
Screw upping the voltage just to (try to) get it to work like it's meant to - it will run hotter/louder and may void the warranty.
RMA that sucker!


----------



## BruceB

Quote:


> Originally Posted by *limitlessenergy*
> 
> I am an owner of XFX Double D R9 280X BE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadly, it looks like I got screwed, because it won't even stay stable at its factory clocks.
> 
> I had to underclock to 1020 MHz and 1500 MHz.
> 
> Going to try giving it more voltage soon.


Quote:


> Originally Posted by *Spork13*
> 
> Did you buy it new?
> Screw upping voltage just to (try to) get it to work like it's meant to - it will run hotter / louder and void any warranty (?).
> RMA that sucker!


@Spork13's right, don't go upping the voltage and voiding your warranty; RMA it instead!


----------



## limitlessenergy

Quote:


> Originally Posted by *BruceB*
> 
> @Spork13's right, don't go upping the voltage and voiding your warranty; RMA it instead!


Already sent the RMA in; the problem is that I wasn't able to open my new gift until after the 30-day product registration period had ended...

Maybe they will make one holiday exception for me.

Otherwise I guess I can accuse them of false advertising. http://imgur.com/a/ZSiM5


----------



## BruceB

Quote:


> Originally Posted by *limitlessenergy*
> 
> Already sent the RMA in; the problem is that I wasn't able to open my new gift until after the 30-day product registration period had ended...
> 
> Maybe they will make one holiday exception for me.
> 
> Otherwise I guess I can accuse them of false advertising. http://imgur.com/a/ZSiM5


I doubt it's false advertising; much more likely a faulty product. Nothing an RMA won't fix.








Are you sure it's not throttling due to heat? What temps does it run at?

Which country are you in?
If you're in the EU, the _seller_ is required to replace any faulty products, regardless of what the manufacturer says. AFAIK there's something similar in the US too. Check with your seller before going through the manufacturer; in my experience it's much easier.


----------



## Depauville Kid

I would check to make sure your power supply is adequate and that you have good airflow. The 280x is one of the most power hungry single GPUs around. If you're set in those areas and the problem persists, RMA it for sure. The card should at least run at rated specs on stock voltage.

What are you using for your CPU? I know when I used an APU with a 7870, it wasn't powerful enough to feed the GPU and the 7870 kept throttling back to 80%. Soon as I switched the APU out for a higher end CPU, the 7870 stopped throttling and ran at full speed.


----------



## BruceB

Quote:


> Originally Posted by *Depauville Kid*
> 
> I would check to make sure your power supply is adequate and that you have good airflow. The 280x is one of the most power hungry single GPUs around. If you're set in those areas and the problem persists, RMA it for sure. The card should at least run at rated specs on stock voltage.
> 
> What are you using for your CPU? I know when I used an APU with a 7870, it wasn't powerful enough to feed the GPU and the 7870 kept throttling back to 80%. Soon as I switched the APU out for a higher end CPU, the 7870 stopped throttling and ran at full speed.


I've never heard of a too-weak PSU causing anything other than a total crash or a failure to boot at all; still worth checking though.
What wattage and amperage can your PSU push, @limitlessenergy?


----------



## Depauville Kid

Quote:


> Originally Posted by *BruceB*
> 
> I've never heard of a too-weak PSU causing anything other than a total crash or a failure to boot at all; still worth checking though.
> What wattage and amperage can your PSU push, @limitlessenergy?


You could be right; I don't know how a PSU reacts when it's underpowered. I just threw out the PSU as a possibility. I do know some units have poor voltage regulation, especially when pushed hard; for example, the 12 V rail drops to 11.7 V. Again, I'm not a PSU expert, so I don't know what issues that would or could cause. You could be 100% accurate in your statement. I truly don't know.


----------



## BruceB

Quote:


> Originally Posted by *Depauville Kid*
> 
> You could be right, I don't know how a PSU would react if its under powered. I just threw out the PSU as a possibility. I do know some units have poor voltage regulation, especially when pushed hard. For example, the 12v rail drops to 11.7v for example. Again, I'm not a PSU expert, so I don't know what issue this would/could cause. You could be 100% accurate in your statement. I truly don't know.


The 12 V dropping to 11.7 V is quite extreme; PC components are very sensitive to voltage changes (example: for a Phenom II, stock voltage is 1.475 V and the processor dies at 1.550 V, a difference of 0.075 V, way less than the 0.300 V change you saw!), so it's probably for the best you threw it out before it killed everything in your PC!


----------



## Depauville Kid

Quote:


> Originally Posted by *BruceB*
> 
> The 12 V dropping to 11.7 V is quite extreme; PC components are very sensitive to voltage changes (example: for a Phenom II, stock voltage is 1.475 V and the processor dies at 1.550 V, a difference of 0.075 V, way less than the 0.300 V change you saw!), so it's probably for the best you threw it out before it killed everything in your PC!


I have read many PSU reviews where the different rails fluctuate more than 3% when stressed; those units typically get poor scores. That's the extent of my knowledge on the topic, so as not to derail this thread any further I will stop here. Thanks for your input - you sound more knowledgeable on the topic than I am.


----------



## DiceAir

So I'm trying to overclock my Club3D R9 280X royalKing. Stock clocks are 1100 core and 1500 MHz (5000 MHz effective) memory, but whenever I try to overclock, my games crash after a while. I can run 3DMark Fire Strike, but that's about it. I tried even 1125 and it still crashes. My voltage is 1.265 at stock and ASIC quality is 58-60%, which is a bit low. Temps are 80-83°C max for the top card and VRM is 65°C max, so what should I do to get my overclock working?

I tried increasing the power limit to 20%. I don't know why I can't overclock either card; I find it highly unlikely that I have two cards that are already maxed out.


----------



## Depauville Kid

Quote:


> Originally Posted by *DiceAir*
> 
> So I'm trying to overclock my Club3D R9 280X royalKing. Stock clocks are 1100 core and 1500 MHz (5000 MHz effective) memory, but whenever I try to overclock, my games crash after a while. I can run 3DMark Fire Strike, but that's about it. I tried even 1125 and it still crashes. My voltage is 1.265 at stock and ASIC quality is 58-60%, which is a bit low. Temps are 80-83°C max for the top card and VRM is 65°C max, so what should I do to get my overclock working?
> 
> I tried increasing the power limit to 20%. I don't know why I can't overclock either card; I find it highly unlikely that I have two cards that are already maxed out.


1100 is already a pretty good factory overclock. Usually you would increase the voltage to make it stable. There's a chance you were unlucky with the cards you received and they are already at their limits; they call it the silicon lottery. I had an XFX 7870 that could barely add 75 MHz, and I also have a 280 that could add 200 MHz. It just depends on the luck of the draw.


----------



## DiceAir

So I tried 1120 again with no other changes and so far so good; I could play BF4 for about a round and a half, so I'll just keep it like this and see, but anyone else care to comment? Anyway, I must say the 1100 MHz factory core clock is amazing compared to other companies like MSI, ASUS, etc.; Club3D offers some of the best factory clocks.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *BruceB*
> 
> I doubt it's false Advertising, much more likely a faulty product. Nothing an RMA won't fix.
> 
> 
> 
> 
> 
> 
> 
> 
> Are you sure it's not throttling due to heat? What temps does it run at?
> 
> Which Country are you in?
> If you're in the EU, the _seller_ is required to replace any faulty products, regardless of what the manufacturer says. AFAIK there's something similar in the US too. Check with your seller before going through your manufacturer, in my experience it's much easier.


The US (North American) office is actually pretty nice. When I got my 2nd XFX 280X I called about 3-4 times asking whether I could pull the heatsink (the warranty-void stickers). Apparently in the US market it's fine, as they understand the replacement of pastes and such. I also called in because I found a difference between my first and second cards: it seems my first one was a rebranded BE version where the other one wasn't, so I have one that is voltage unlocked and one that is not.

All in all, TBH, I would say call up XFX; it may just have been a poor application of thermal paste.

Oh and here is a link to my review on the first card http://www.overclock.net/users/313915/reviews


----------



## limitlessenergy

Quote:


> Originally Posted by *Depauville Kid*
> 
> I would check to make sure your power supply is adequate and that you have good airflow. The 280x is one of the most power hungry single GPUs around. If you're set in those areas and the problem persists, RMA it for sure. The card should at least run at rated specs on stock voltage.
> 
> What are you using for your CPU? I know when I used an APU with a 7870, it wasn't powerful enough to feed the GPU and the 7870 kept throttling back to 80%. Soon as I switched the APU out for a higher end CPU, the 7870 stopped throttling and ran at full speed.


I have a Cooler Master 750 W; it is more than enough for a single-card setup with only a dual core. I am overclocking the motherboard/CPU/RAM at the moment, and I seriously doubt that is affecting it. I don't even get Vdroop, unless you count power saving. My cooling: I build Peltiers - an HSF with a TEC in between, with a separate 125 W PSU powering the TEC.

I will run some heat testing, but again... doubtful.

Thanks for the notice on the warranty info though. I didn't know I could legally replace the paste. We honestly should be able to, because that would help for sure! Now the question is whether to use nanosilver or diamond paste









This is why I miss TankGuys. They would hand-pick and test all their products; they charged $25 extra for the work, but it was ALWAYS worth it. I would ALWAYS get the best ICs and hand-picked CPUs. I am still using the hand-picked E6600 @ 4.5 GHz, and it has been overclocked for YEARS straight...


----------



## PunkX 1

Quote:


> Originally Posted by *BruceB*
> 
> The 12v dropping to 11.7v is quite extreme, PC components are very sensitive to voltage changes (example: for a phenom II stock volts are 1.475v, the processor dies at 1.550v, a difference of 0.075v, way less than the 0.300v change you had!
> 
> 
> 
> 
> 
> 
> 
> ), it's probably for the best you threw it out before it killed everything in your PC!


11.7v is fine. ATX specifications call for a minimum of 11.4v.

Also, 1.55v will not kill your Phenom, if it is cooled properly. It's on the high side, sure...but not enough to kill your chip.
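A quick sketch of that tolerance check (the ±5% figure is the ATX 12 V regulation limit that gives the 11.40 V minimum cited above; the helper name is made up for illustration):

```python
# ATX allows roughly +/-5% regulation on the 12 V rail, i.e. 11.40 V to 12.60 V.
ATX_TOLERANCE_PCT = 5.0

def rail_deviation_pct(nominal, measured):
    """Percent deviation of a measured rail voltage from its nominal value."""
    return abs(measured - nominal) / nominal * 100

dev = rail_deviation_pct(12.0, 11.7)
print(f"Deviation: {dev:.1f}%")  # 2.5%
print("Within ATX spec" if dev <= ATX_TOLERANCE_PCT else "Out of spec")
```

So a droop to 11.7 V is only a 2.5% deviation - well inside spec, as PunkX 1 says.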


----------



## mindchap

ASUS R9 270x 4GB Overclocked Edition 1120


----------



## jason387

Quote:


> Originally Posted by *mindchap*
> 
> ASUS R9 270x 4GB Overclocked Edition 1120


Have you tried overclocking it?


----------



## Depauville Kid

Quote:


> Originally Posted by *limitlessenergy*
> 
> I have a Cooler Master 750 W; it is more than enough for a single-card setup with only a dual core. I am overclocking the motherboard/CPU/RAM at the moment, and I seriously doubt that is affecting it. I don't even get Vdroop, unless you count power saving. My cooling: I build Peltiers - an HSF with a TEC in between, with a separate 125 W PSU powering the TEC.
> 
> I will run some heat testing, but again... doubtful.
> 
> Thanks for the notice on the warranty info though. I didn't know I could legally replace the paste. We honestly should be able to, because that would help for sure! Now the question is whether to use nanosilver or diamond paste
> 
> This is why I miss TankGuys. They would hand-pick and test all their products; they charged $25 extra for the work, but it was ALWAYS worth it. I would ALWAYS get the best ICs and hand-picked CPUs. I am still using the hand-picked E6600 @ 4.5 GHz, and it has been overclocked for YEARS straight...


There is a pretty good chance that E6600 dual core is the source of your GPU troubles. It's pretty underpowered to be matched with a 280X. I had a system that began life with an A10-5800K APU; later I added a 7870, a full generation below the 280X. That four-core APU bottlenecked the 7870 pretty badly - I could not get the card to run above 80% usage, and it usually throttled right back to 70%. As soon as I replaced the CPU, the card began running at full speed.

Your 280X is way faster than a 7870, and your CPU is on par with or below the APU in my scenario. To put it another way, even a Haswell dual-core Pentium is about 80% faster than the E6600; step up to any current-generation quad core and the difference is even greater. You have that E6600 running like a champ, but it is likely hindering the performance of your 280X.

Edit:
I just read an article testing a Core 2 Duo with a GTX 660, and it showed a 12% difference in games between the legacy dual-core and quad-core CPUs. So a 12% bottleneck on a GTX 660 even compared with a legacy quad core. Your 280X is significantly more powerful than a GTX 660, and I would venture the bottleneck would be much more significant than the results shown in that article.


----------



## Michalko

Why can't I get past 1075 MHz with my R9 280X Dual-X? My default voltage is 1.225 V and I don't know how to increase it; even Sapphire TriXX can't increase my voltage. So please help me with my card; I want at least 1150 MHz.


----------



## Depauville Kid

Quote:


> Originally Posted by *Michalko*
> 
> Why can't I get past 1075 MHz with my R9 280X Dual-X? My default voltage is 1.225 V and I don't know how to increase it; even Sapphire TriXX can't increase my voltage. So please help me with my card; I want at least 1150 MHz.


If the card is voltage locked, you may be stuck; short of flashing a custom BIOS, there's not much you can do. I had an XFX card that was voltage locked before. It came with a lifetime warranty, so I assume they locked the voltage to cut down on the number of people pushing the card too hard and shortening its life. I don't know if that's what's happening in your case, but it could be. In my experience, if the manufacturer locks the voltage, all you can do is get the highest overclock on stock voltage. I was super bummed when that happened to me; now I research whether a card has a locked voltage before I buy it, so I don't end up in the same situation again.

Perhaps someone else on the forum can help or knows how to unlock a voltage?


----------



## JackCY

Mine is unlocked (ASUS DC2T), yet I run 1110 MHz on the stock 1.2 V anyway, because the maximum reasonably stable clock seems to be 1150 MHz at 1.25 V. Raising the voltage past that really doesn't bring much more clock, but it definitely brings more heat and noise. This is stable from long experience: from endless 3DMark and Heaven tests it seemed stable at 1140 MHz at 1.2 V, but after running into hard-to-trace artifacts in games I kept bumping it down by 10 MHz until I never got any. Some chips are good, some are poor; mine is rather not so great, I think. At least the RAM runs a solid 7.2 GHz (effective), which in turn puts more load on the core, so if I lower it I can run the core a little higher.

If your card is locked on 1.225V that's pretty decent anyway. Maybe some chips scale well with voltage but not all do, it's a lottery like with CPUs.

I even run mine sometimes on 1.1V and 1000MHz because it makes the card silent when games don't need the extra power.

I think that if 280x doesn't cut it anymore even with the slight OC, get a 970. Or use CF for even more noise, power hunger, issues but for a lower price.


----------



## Devildog83

Quote:


> Originally Posted by *Depauville Kid*
> 
> There is a pretty good chance that E6600 dual core is the source of your GPU troubles. It's pretty underpowered to be matched with a 280X. I had a system that began life with an A10-5800K APU; later I added a 7870, a full generation below the 280X. That four-core APU bottlenecked the 7870 pretty badly - I could not get the card to run above 80% usage, and it usually throttled right back to 70%. As soon as I replaced the CPU, the card began running at full speed.
> 
> Your 280X is way faster than a 7870, and your CPU is on par with or below the APU in my scenario. To put it another way, even a Haswell dual-core Pentium is about 80% faster than the E6600; step up to any current-generation quad core and the difference is even greater. You have that E6600 running like a champ, but it is likely hindering the performance of your 280X.
> 
> Edit:
> I just read an article testing a Core 2 Duo with a GTX 660, and it showed a 12% difference in games between the legacy dual-core and quad-core CPUs. So a 12% bottleneck on a GTX 660 even compared with a legacy quad core. Your 280X is significantly more powerful than a GTX 660, and I would venture the bottleneck would be much more significant than the results shown in that article.


This is very true. I have a G3258 Pentium @ 4.3 Ghz with a R9 290 and I can't see that it's hindering the GPU performance at all. I don't clock the 290 very much because I see no need right now.


----------



## mindchap

Just the basics with GPUTweak, I'm still trying to get a better cooling setup, but even without overclocking it can run every new game that I have played so far, in ultra, with no problems.


----------



## FernTeixe

Well... just to think a little more about bottlenecks: they tested a GTX 970 with an Intel i7-5960X and a Pentium G3258: https://www.youtube.com/watch?v=nJACZ5iStWw


----------



## long99x

Quote:


> Originally Posted by *dayen666*
> 
> you
> you can open a ticket with Gigabyte, they can send you new bios!


I emailed them 2 days ago, but they haven't replied.


----------



## Depauville Kid

Quote:


> Originally Posted by *long99x*
> 
> I emailed them 2 days ago, but they haven't replied.


Call them if you can. Much faster than waiting for an email.


----------



## ricko99

Quote:


> Originally Posted by *JackCY*
> 
> Mine is unlocked, ASUS DC2T, yet I run 1110MHz on stock 1.2V anyway because the max reasonable stable it can do seems to be 1150MHz at 1.25V. Raising past that really doesn't bring much more clock but it definitely brings more heat and noise. This is stable from long experience, from endless tests of 3DMark and Heaven it seemed stable it is at 1140MHz at 1.2V but after running into hard to trace artifacts in games I just kept bumping it down by 10MHz until I don't ever get any. Some chips are good some are poor, mine is rather not so great I think. At least the RAM runs solid 7.2GHz it seems, which in turn puts more load on the core so if I lower that I can run higher core speed a little.
> 
> If your card is locked on 1.225V that's pretty decent anyway. Maybe some chips scale well with voltage but not all do, it's a lottery like with CPUs.
> 
> I even run mine sometimes on 1.1V and 1000MHz because it makes the card silent when games don't need the extra power.
> 
> I think that if 280x doesn't cut it anymore even with the slight OC, get a 970. Or use CF for even more noise, power hunger, issues but for a lower price.


Can I have your Asus R9 280X BIOS? And is your card's memory voltage locked at 1.5 V or 1.6 V?


----------



## JackCY

I use the latest updated VBIOS, which you can get via GPU Tweak. I think I have the old and new versions saved, but no tools to upload them; I only know there are *tools around, but none are "official" I think*, so it's use-at-your-own-risk and hope it works without bricking the card - especially since ASUS has only a single VBIOS.
I think the memory voltage is locked at 1.6 V, at least from what I have noticed in the various monitoring tools I tried; it's hard to find, but some do show it. And I doubt it would run up to 7.4 GHz (effective) memory on 1.5 V. I'm not even sure there are any DC2Ts with 1.5 V memory - maybe the versions with Elpida chips, but mine has Hynix.
ASIC quality is pretty low, but for a $204 second-hand card when a new one cost $350+, I can't complain.

The memory clock does boost fps quite nicely. I'm surprised, since the card doesn't seem bandwidth-starved at stock, yet it does help.


----------



## DiceAir

Quote:


> Originally Posted by *FernTeixe*
> 
> well ... it's just to think a little more about bottlenecks... they got a gtx970 + Intel i7 5960X and Pentium G3258 :https://www.youtube.com/watch?v=nJACZ5iStWw


Wow, I never knew a 5960X was that good. You do see a performance increase, and people thought a 4670K was enough. Maybe it is, but I thought a 5960X and a 4670K would get the exact same fps because the CPU isn't the bottleneck. Cool to see new games making proper use of the CPU.


----------



## JackCY

4690K vs Haswell-E in games is all about core frequency now: whichever you clock higher wins. Games are made more and more poorly, and some are converted from consoles...
A game will still render on one core, but it can use other cores for audio (pretty much nonexistent complexity in games even these days), AI (some games try to make a better AI), physics calculations (probably what most games load the CPU with), ...

For heavy multitasking, sure, more cores with more total power wins, because you can't clock a quad core twice as high as an octacore to even them out: 4 × 4 GHz = 16, whereas 8 × 3.5 GHz = 28, and there is no way to clock a quad core at 7 GHz to close that gap. But the octacore costs four times more than the quad core *faint* while delivering less than double the total computing power.
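That core-count arithmetic as a throwaway sketch, assuming perfect scaling across cores (which no real workload achieves - it's an upper bound, not a benchmark):

```python
# Naive aggregate-throughput comparison: cores x clock, in "core-GHz".
# Assumes every core contributes fully, i.e. perfect parallel scaling.
quad = 4 * 4.0   # 4 cores at 4.0 GHz -> 16 core-GHz
octa = 8 * 3.5   # 8 cores at 3.5 GHz -> 28 core-GHz

print(f"Octacore advantage: {octa / quad:.2f}x")  # 1.75x
# Less than 2x the theoretical throughput, for roughly 4x the price
# at the time (Haswell-E octacore vs a mainstream quad).
```

For single-threaded games, of course, only the per-core clock matters, which is the point of the post above.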


----------



## FernTeixe

Quote:


> Originally Posted by *DiceAir*
> 
> Wow, I never knew a 5960X was that good. You do see a performance increase, and people thought a 4670K was enough. Maybe it is, but I thought a 5960X and a 4670K would get the exact same fps because the CPU isn't the bottleneck. Cool to see new games making proper use of the CPU.


Well, Far Cry 3 and Crysis run 10fps faster on the 4760k, and Tomb Raider ran 7fps faster on both the G3258 and the 4760k (only looking at averages).
So: 16 threads, $1000, and worse for games. We have two problems now. Reviews use X editions to avoid bottlenecks, but those actually work worse than an i5 when it comes to GAMES... and a G3258 did great work with a GTX 970. So when we see people around here saying "you must overclock your CPU above 4.5GHz", or "for 280X crossfire / 770 SLI you must have an i7 at 4.5GHz or there will be a huge bottleneck"... well, it's all lies.

Linus, The Tek, 3DOC and some others have plenty of CPU benchmarks, and what I always take from them is that in some cases an i5 does better than an i7, and a 3570K equals a 4770K for gaming. Also, for almost all games the fps increase from a CPU OC is 1 to 3 fps; you'll only see a 10+ fps increase when the game is already running above 100fps.

I love to overclock, and it's always fun to have more fps... but I hate it when people talk as if you must have a freaking X edition to run SLI, or else some i5 will bottleneck their GPU.


----------



## Depauville Kid

Quote:


> Originally Posted by *FernTeixe*
> 
> Well far cry 3 and crysis runs 10fps faster on 4760k, Tomb Raider was running 7fps faster on both G3258 and 4760k . (only taking avg )
> So 16 threads , 1000$ and worse for games. So we have 2 problems now. Reviews use X editions to not have bottlenecks, but it actualy works worse than i5 WHEN it's for GAMES... and a G3258 did a great work with a gtx970... So when we see people around here saying "you must overclock your cpu above 4.5ghz" , "for 280x xfire /770sli you must i7 at 4.5ghz or it will have a huge bottleneck..." Well... it's all about lies.
> 
> Linus, The Tek, 3dOC and some others have many cpu benchmarks and what I always get from it is some cases i5 do better than i7... and 3570k = 4770k for gaming. Also for almost all games fps increase by CPU OC is almost always 1 to 3 fps, You'll only see 10 fps+ increase when the game is already running above 100fps
> 
> I love to overclock and it's always fun have more fps....but I hate when I see people talking like you must have a freaking Xedition to run sli or some i5 will bottleneck their gpu


It is interesting to see that the Pentium can be quite capable in many situations. Myself, I've always been a proponent of a balanced system. The Pentium would be a great match for a GTX 750, for example. An i5 is a wonderful choice for most mainstream and enthusiast cards, as well as most SLI configurations. However, if someone is putting $1000 worth of GPU in their system in SLI, such as two 980s, there is no reason to pair that with a $169 i5. That would be too disproportionate, in my opinion. It might be fine in some games and bottleneck in others. If you are putting $1000 or more of GPU in a rig, you should at least allocate $300 for an i7.

My pairings were just examples off the top of my head, but my point is to plan a balanced system when possible. Ideally, that will give you the best experience in the most situations.


----------



## rdr09

Quote:


> Originally Posted by *FernTeixe*
> 
> Well far cry 3 and crysis runs 10fps faster on 4760k, Tomb Raider was running 7fps faster on both G3258 and 4760k . (only taking avg )
> So 16 threads , 1000$ and worse for games. So we have 2 problems now. Reviews use X editions to not have bottlenecks, but it actualy works worse than i5 WHEN it's for GAMES... and a G3258 did a great work with a gtx970... So when we see people around here saying "you must overclock your cpu above 4.5ghz" , "for 280x xfire /770sli you must i7 at 4.5ghz or it will have a huge bottleneck..." Well... it's all about lies.
> 
> Linus, The Tek, 3dOC and some others have many cpu benchmarks and what I always get from it is some cases i5 do better than i7... and 3570k = 4770k for gaming. Also for almost all games fps increase by CPU OC is almost always 1 to 3 fps, You'll only see 10 fps+ increase when the game is already running above 100fps
> 
> I love to overclock and it's always fun have more fps....but I hate when I see people talking like you must have a freaking Xedition to run sli or some i5 will bottleneck their gpu


The G3258 is indeed ideal in some games that do not take advantage of more threads . . . BF4 included. Why? There is the single player, which you play for a week or two. Now, if the gamer plans to go multiplayer after single player, then I recommend a CPU with more threads. And yes, with 2 7900-series cards, OC'ing the CPU helps.

+rep to the member right above this post.


----------



## FernTeixe

Well, the G3258 is just to show that you don't need to upgrade your CPU to get a new GPU... also, everybody does more than just play, so for me and many others two cores is not enough at all.

I was just trying to show that people think everything will bottleneck everything... and they tell others that they NEED some freakishly high-end part or they won't get to use their SLI, and it's a lie. Not everybody has the money, or wants to spend it on premium high-end stuff for just 5 more fps... and it probably wouldn't matter even at 20 fps when the game is already running above 100fps. When it's about gaming.

If you go back through this thread, or any other on this forum or any tech forum, you'll see tons of guys asking "will it bottleneck" and people saying "yes" as if they will lose their money... and most of the time it's just not true. But someone will read that, say a bottleneck is a bottleneck, and call even 5 fps a bottleneck.

There's a chart for quad 760 SLI here: http://www.eurogamer.net/articles/digitalfoundry-2014-pentium-g3258-review and the i5 4690K does almost the same fps as the 4790K... so..............


----------



## rdr09

Quote:


> Originally Posted by *FernTeixe*
> 
> Well G3258 is just to show that you don't need to upgrade CPU to get a new gpu...also everybody do more stuff than just play...so for me and many other 2 cores is not enough at all.
> 
> I was just trying to show that people think that everything will bottleneck everything... and they tell the others like they NEED a freak high-end oh god stuff or they won't use ther SLI and it's a lie... Not everybody have money or want to spend it on premium high-end stuff for just 5 more fps.... and probably would not matter if it's 20 fps when the game is already running above 100fps. When it's about gaming.
> 
> if you go back reading this thread... or any other in this forum or in any tech forum you'll see tons of guys asking "will it bottleneck" and people saying "yes" like they will lose their money ....and most of the time it's just not true.... but one will read it and say that bottleneck is bottleneck... and will say even 5 fps is bottleneck....
> 
> there's a chart for Quad 760 sli : http://www.eurogamer.net/articles/digitalfoundry-2014-pentium-g3258-review and i5 4690k do almost same fps as 4790K .... so ..............


I think you are mistaken; can you even run quad SLI with a 760? What I gathered from that review is that the G3258 cannot even feed the 760 in some scenarios, and the author recommended an i5. And that's just a 760. Also, I think all the games they tested were single player, and the CPU still bottlenecked.


----------



## tsm106

Quote:


> Originally Posted by *FernTeixe*
> 
> Well G3258 is just to show that you don't need to upgrade CPU to get a new gpu...also everybody do more stuff than just play...so for me and many other 2 cores is not enough at all.
> 
> I was just trying to show that people think that everything will bottleneck everything... and they tell the others like they NEED a freak high-end oh god stuff or they won't use ther SLI and it's a lie... Not everybody have money or want to spend it on premium high-end stuff for just 5 more fps.... and probably would not matter if it's 20 fps when the game is already running above 100fps. When it's about gaming.
> 
> if you go back reading this thread... or any other in this forum or in any tech forum you'll see tons of guys asking "will it bottleneck" and people saying "yes" like they will lose their money ....and most of the time it's just not true.... but one will read it and say that bottleneck is bottleneck... and will say even 5 fps is bottleneck....
> 
> there's a chart for Quad 760 sli : http://www.eurogamer.net/articles/digitalfoundry-2014-pentium-g3258-review and i5 4690k do almost same fps as 4790K .... so ..............


I have an OC'd G3258 at 4.5GHz. It's a fun CPU, and I got it for 100 bucks with a motherboard, lol. But dude, it is a serious bottleneck in multi-GPU. Actually it's a bottleneck in general, but OC'd you can definitely get by with it in most games, because most games are terribly threaded and because it has crazy IPC.

Look at your own link. Crazy high IPC. So on a budget, and given the right scenario, you can take advantage of that stellar IPC with only two cores. The rest of the article tries to justify its point by using limited GPU power to balance the load against the available IPC. Talk about self-fulfilling prophecies, lol.
Quote:


> CineBench Single-Core 1.30 *1.92* 1.11 1.71 1.73 1.97


----------



## FernTeixe

Dear god... I've said many times that I'm not telling anyone to get a G3258. I said people usually tell others to get freakishly high-end parts that aren't needed, when an i5 does the job and sometimes does it better...

Is it so hard to read the text and not only the names?

Never, in not even ONE of my posts, did I say to get a G3258, and I didn't say it was enough for SLI.

I was showing that a real bottleneck is harder to hit than people think...


----------



## rdr09

Quote:


> Originally Posted by *FernTeixe*
> 
> dear god... I said many times I'm not saying to get a G3258... I said people usually tell the other to get a freak high end that's not needed and i5 do the work and some times do it better...
> 
> is it so hard to read the text and not only the names?
> 
> never in not even ONE of my posts I said to get a G3258 , and didn't said it was enough for SLI
> 
> I was showing it's harder to have a real bottleneck than people think...


We understand that. An i5 is capable, but for high-end multi-GPU it is not enough in some games. If someone is about to build a system with 2 980s, for example, then it makes sense to pair them with an i7. Now, it may not have to be an X99 system; a Haswell, maybe. Like I said, the tests in that review were done in single player.


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> we understand that. i5 is capable but for highend multi-gpu in some games it is not enough. if someone is it to build a system with 2 980s, for example, then it will make sense to pair them with an i7. now, it may not have to be a X99 system. A haswell maybe. like i said, those tests done in that review were done in single player.


Yep, the key here is PCI-E lanes. Cores, threads, and clocks too, but the number of PCI-E lanes on a CPU has a serious effect on a multi-GPU setup. The 4790K, for example, I think has 28, and of course a 3258 isn't near that, but an i5 like the 4690K only has 16. The highest-end Haswell-E has 40, so there are far more lanes available.

I will say that I have a 3258 paired with a 290, and I run BF4 on ultra settings with Mantle on and get 65 to 80 fps (1080p single player), so if you are not running multi-GPU or 4K you do not have to spend a lot on a CPU.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Yep, the key here is PCI-E lanes. Cores, threads and clocks too but the amount of PCI-E lanes on a CPU has a serious affect on a multi-GPU set-up. The 4790K for example I think has 28 and of course a 3258 isn't near that but an I5 like the 4690k only has 16. The highest end Haswell-e has 40 so there is way more available lanes.
> 
> I will say that I have a 3258 paired with a 290 and I run BF4 on ultra settings and with Mantle on I get 65 to 80 FPS (1080p single player) so if you are not running multi-GPU's or 4k you do not have to spend a lot on a CPU.


It's just my observation, and my experience from being able to enable and disable HT on my i7 SB. In that review, the Pentium, even at 4.5GHz, was not able to feed the GTX 760 in some cases. It was certainly fast, but I think it lacked the cores, so a bottleneck occurred.

Also, I think GPUs are getting faster (then add multi-GPU) while CPUs are falling behind, thus the need to OC. It will be interesting to see how DX12 changes all this, like how Mantle is making an impact in the games that support it.


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> it is just my observation and my experience having the ability to enable and disable HT on my i7 SB. in that review, the pentium even at 4.5GHz was not able to feed the GTX 760 in some case. it was certainly fast but i think it lacked the cores, so a bottleneck occurred.
> 
> also, i think gpus are getting faster (then add multi-gpu) that cpus are falling behind, thus the need to oc. it would be interesting how DX12 will change all these like how mantle is making an impact in the games it supports.


DX12 might be interesting. With DX11 in BF4 it was freezing and skipping badly, but as soon as I switched to Mantle it was smooth as butter. I don't recommend a G3258 for high-end gaming, but it's nice to know you can spend $70 and get a chip that will handle at least mid-range gaming at 1080p, for those without the money for high-end stuff. Mine sits at 4.3GHz @ 1.2V and consumes almost no power. Even with a lower-end GPU like a GTX 660 it would be great for an HTPC. That said, I've had my fun with it and will be upgrading to a 4790K next week, I think.


----------



## Devildog83

Hey guys, I haven't been around much and I'm sorry for that. I haven't gone back through the thread to check whether I missed any folks who wanted to join, so if you do, please post a pic, the card type, and your current clocks and I will add you. You can also post a link to a post if you have already done this.

Thanks,
DD


----------



## FernTeixe

My point was really simple. Most of the time I see guys who already have an i5 or FX-8350 and a 280X, and now have money for one more 280X. They come here to ask if it will bottleneck, and most people just say "it will", or "a little", or even "it won't be worth it with your CPU". So I think those reviews are good for clarifying how much they will actually lose... and as they show, in most cases it will be fine. The problem is always when you don't have money. When you have it, you can go for the best.


----------



## rdr09

Quote:


> Originally Posted by *FernTeixe*
> 
> my point was really simple... most of the time I see guys who already have a i5 or fx8350 and a 280x but they have money for one more 280x now...they come here to ask if it will bottleneck and most of the people just say "it will.. " or "a little" or even "it will not be worth with your cpu" .... so I think those reviews are good to clarify how much they will actually lose.... and as it show.. most of the cases it will be fine. Since the problem is always when you don't have money. When you have... you can go for the best.


Here is BF3 (less demanding than BF4), MP 64, with crossfire 7950/7970 on an i7 at 4.5GHz, HT on and off . . .





See the usage on the cores? See, I don't have experience with the FX, but if a member already has an i5, then I would not recommend going i7 unless shooting for high-end. It should handle most games, even with 2 280Xs, like you said. Now, if someone is building new and plans to play multiplayer games, then it's a different story.


----------



## Mr-Dark

I did some testing of HT off vs on, and actually there is a big difference.

The test PC is an i7 4790K + 16GB RAM + 7970/7950 crossfire at 1080p. I tested BF3 and BF4, both with the same result, on the most demanding 64-player maps.

Disabling HT and overclocking to 4.7GHz gives me around 95 fps on average, dropping to 60fps on demanding maps, but it is not smooth; it stutters like hell, with 95%+ CPU usage all the time.

Enabling HT and underclocking to 4.0GHz gives me smooth gameplay at over 105fps average, never dropping below 85, with CPU usage around 65% to 75% only.


----------



## JackCY

1150: an i5 with 2x PCI-E 3.0 x8 is enough for CF/SLI. The i7 4790K has the same number of PCI-E lanes, again 16 at most.
The only way to get more is to go to the more expensive "enthusiast" 2011 platform with i7s and DDR4, for 28 or 40 lanes total.
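For context on why x8/x8 generally holds up, the raw link bandwidth works out as below (8 GT/s per lane and 128b/130b encoding are the standard PCIe 3.0 figures; the function name is mine):

```python
# Approximate usable bandwidth of a PCIe 3.0 link per width.
GT_PER_S = 8.0            # PCIe 3.0 transfer rate per lane (GT/s)
ENCODING = 128.0 / 130.0  # 128b/130b line-encoding efficiency

def pcie3_gbytes_per_s(lanes):
    """Usable GB/s, each direction, for a PCIe 3.0 link of this width."""
    gbits = GT_PER_S * ENCODING * lanes
    return gbits / 8.0    # bits -> bytes

for width in (8, 16):
    print(f"x{width}: ~{pcie3_gbytes_per_s(width):.1f} GB/s each way")
# x8 comes to roughly 7.9 GB/s, which single GPUs of this era
# rarely saturate, hence x8/x8 CF/SLI being fine on 1150.
```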

Even on a single GPU, the i7 4790 was rubbish (worse in games, benches, ...) compared to an OC'd i5 4690K anywhere outside purely parallel applications, and even there an OC'd 4690K closes the gap a lot for much less money. Sure, HT is nice, but it only lets you feed the CPU more so it can process a little more work; the raw power is the same. It's like eating with two hands: it keeps the mouth busier than eating with one, but you still have only one mouth, and if you are good and fast with one hand, two won't help you.

*HT on vs off*


----------



## Mr-Dark

Quote:


> Originally Posted by *JackCY*
> 
> 1150: i5 with 2x PCI-E 3.0 8x is enough for CF/SLI. i7 4790K has the same number of PCI-E lanes, again 16 at max.
> Only way to get more is to go into "enthusiast" more expensive 2011 with i7s and DDR4, 28 or 40 lanes total.
> 
> Even on single GPU, i7 4790 was rubbish (worse in games, benches, ...) compared to OCed i5 4690K anywhere outside pure parallel applications but even there 4690K with OC can close the gap a lot for much less money. Sure HT is nice but it only allows to feed the CPU more and can it process a little more work, the raw power though is the same it's just like eating with two hands makes the mouth more busy than eating with only one, you still have one mouth though and if you are good and fast with one hand two won't help ya.
> 
> *HT on vs off*


Loooooooool 4690k faster than 4790 ?


----------



## rdr09

Quote:


> Originally Posted by *JackCY*
> 
> 1150: i5 with 2x PCI-E 3.0 8x is enough for CF/SLI. i7 4790K has the same number of PCI-E lanes, again 16 at max.
> Only way to get more is to go into "enthusiast" more expensive 2011 with i7s and DDR4, 28 or 40 lanes total.
> 
> Even on single GPU, i7 4790 was rubbish (worse in games, benches, ...) compared to OCed i5 4690K anywhere outside pure parallel applications but even there 4690K with OC can close the gap a lot for much less money. Sure HT is nice but it only allows to feed the CPU more and can it process a little more work, the raw power though is the same it's just like eating with two hands makes the mouth more busy than eating with only one, you still have one mouth though and if you are good and fast with one hand two won't help ya.
> 
> *HT on vs off*


The minimum fps is the more important gauge. Even with 337.50 it was higher with HT on. Like I said: multiplayer.


----------



## buttface420

nevermind


----------



## FernTeixe

Well guys, it's funny how an i5 just can't be allowed to be enough... there's always something more to it, lol.


----------



## Devildog83

Quote:


> Originally Posted by *Devildog83*
> 
> Yep, the key here is PCI-E lanes. Cores, threads and clocks too but the amount of PCI-E lanes on a CPU has a serious affect on a multi-GPU set-up. The 4790K for example I think has 28 and of course a 3258 isn't near that but an I5 like the 4690k only has 16. The highest end Haswell-e has 40 so there is way more available lanes.
> 
> I will say that I have a 3258 paired with a 290 and I run BF4 on ultra settings and with Mantle on I get 65 to 80 FPS (1080p single player) so if you are not running multi-GPU's or 4k you do not have to spend a lot on a CPU.


I looked at the wrong CPU; the 4790K only has 16 lanes.


----------



## JackCY

Quote:


> Originally Posted by *Mr-Dark*
> 
> Loooooooool 4690k faster than 4790 ?


Yes, notice the lack of a K on the i7. A 4690K usually does 4.4GHz+; mine does 4.6GHz. The 4790 drops to 3.6GHz at full load, and under a single-core load runs around 3.7GHz, because the other cores are loaded with some crap from the OS etc., so the single-core turbo almost never kicks in; usually the 3-core or 4-core turbo speed gets applied instead.
So in my case the 4690K clocks 1GHz faster than the 4790.
I don't have many results saved from the 4790, since I had it for less than two weeks before returning it once DC got released.

With the 280X DC2T still at stock clocks:


Spoiler: Warning: Spoiler!



A direct swap; I reinstalled the Intel chipset drivers, just for fun I guess, and that was it.
































Of course, in CPU-only multicore tasks the 4790 with HT does a little better.
Quote:


> COMMENT=MAXON CINEBENCH is based on the high performance animation and rendering software MAXON CINEMA 4D.
> COMMENT=These are your MAXON CINEBENCH R15 results.
> COMMENT=
> COMMENT=Results Disclaimer - CINEBENCH results are indicative of overall system performance when using CINEMA 4D,
> COMMENT=and do not necessarily reflect the performance of the tested hardware with other applications.
> COMMENT=Performance of each component (processor, graphics card) does rely somewhat on other components in the system.
> COMMENT=Results provided are typical, although not derived from specific testing procedures.
> COMMENT=
> *CORES=4
> LOGICALCORES=1
> MHZ=4600.000000*
> PROCESSOR=Intel Core *i5-4690K* CPU 2400MHz/CL10
> OPENGLVENDOR=Advanced_Micro_Devices
> OPENGLCARD=AMD Radeon R9 200 Series
> OPENGLVERSION=3.2.12874 Compatibility Profile Context 14.100.0.0
> DRIVERVERSION=
> CBTYPE=64 Bit
> OSVERSION=Windows 8, 64 Bit
> *CBCPU1=181.694299
> CBCPUX=705.019741*
> CBOPENGL=157.195485
> CBOPENGLQUALITY=98.017265
> C4DINFO=
> C4DVERSION=15.037
> C4DBUILDID=RC83328demo


Quote:


> COMMENT=MAXON CINEBENCH is based on the high performance animation and rendering software MAXON CINEMA 4D.
> COMMENT=These are your MAXON CINEBENCH R15 results.
> COMMENT=
> COMMENT=Results Disclaimer - CINEBENCH results are indicative of overall system performance when using CINEMA 4D,
> COMMENT=and do not necessarily reflect the performance of the tested hardware with other applications.
> COMMENT=Performance of each component (processor, graphics card) does rely somewhat on other components in the system.
> COMMENT=Results provided are typical, although not derived from specific testing procedures.
> COMMENT=
> *CORES=4
> LOGICALCORES=2
> MHZ=3600.000000*
> PROCESSOR=Intel Core *i7-4790* CPU 2400MHz/CL10
> OPENGLVENDOR=Advanced_Micro_Devices
> OPENGLCARD=AMD Radeon R9 200 Series
> OPENGLVERSION=3.2.12874 Compatibility Profile Context 14.100.0.0
> DRIVERVERSION=
> CBTYPE=64 Bit
> OSVERSION=Windows 8, 64 Bit
> *CBCPU1=159.148239
> CBCPUX=770.831727*
> CBOPENGL=144.491247
> CBOPENGLQUALITY=98.017265
> C4DINFO=
> C4DVERSION=15.037
> C4DBUILDID=RC83328demo


In games, yes, faster. In CPU-heavy multithreaded processing: close enough for me, for significantly less money.
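The two quoted Cinebench logs reduce to a simple percentage comparison (scores copied straight from the result files above):

```python
# Cinebench R15 scores from the two quoted result files.
i5_4690k = {"single": 181.694299, "multi": 705.019741}  # 4c/4t @ 4.6 GHz
i7_4790  = {"single": 159.148239, "multi": 770.831727}  # 4c/8t @ 3.6 GHz

single_gain = i5_4690k["single"] / i7_4790["single"] - 1
multi_gap   = i7_4790["multi"] / i5_4690k["multi"] - 1

print(f"OC'd i5 single-core lead: {single_gain:+.1%}")  # ~+14%
print(f"i7 (HT) multi-core lead:  {multi_gap:+.1%}")    # ~+9%
```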



Is the extra you have to pay for HT with the 4790K worth it? Nope. Intel has a 44% markup on the 4790K compared to the 4690K here, when the only useful difference is HT, lol. It was the same with the 4670K and 4770K. Sure, Intel does give you higher stock clocks on the 4790K, but that confuses people; they often think "wow, I could easily clock this to 5GHz", haha. Nope, Intel just exploits more of the headroom on the 4790K out of the box.
The only non-K i7 I would consider keeping is the 1231v3, but it clocks only 3.4-3.8GHz, is still more expensive than the 4690K, and lacks an iGPU, so I skipped it since I needed a backup GPU in case my second-hand 280X didn't work well.


----------



## Mr-Dark

Quote:


> Originally Posted by *JackCY*
> 
> Yes notice the lack of K on i7. 4690K usually 4.4GHz+, mine does 4.6GHz. 4790 at full blast drops to 3.6GHz, and at single core load runs around 3.7GHz because other cores are loaded with some crap from OS etc. and the single core turbo almost never gets used, often the 3 core turbo speed gets applied and the 4 core turbo speed.
> So in my case 4690K clocks 1GHz faster than 4790.
> Don't have that many results saved from 4790 since I had it not even two weeks before returning it once DC got released.
> 
> With 280x DC2T still on stock clocks:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Direct swap, reinstalled Intel chipset drivers, for fun I guess, and that was it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Of course in CPU multicore only task the 4790 with HT does better a little.
> 
> In games, yes faster. In CPU heavy mutlithreaded processing: close enough for me for significantly less money.
> 
> 
> 
> Is the extra that you have to pay just for HT with 4790K worth it? Nope. Intel here has 44% mark up on 4790K compared to 4690K, when the whole useful difference is only HT lol, was the same with 4670K and 4770K, now Intel sure does try to give you at least higher clocks on 4790K from the get go but it confuses people as they often think wow I could clock this to 5GHz easily haha, nope, Intel only exploits the headroom more on 4790K is all.
> The only non K i7 I would consider keeping would be 1231v3 but it clocks only 3.4-3.8GHz, is still more expensive than 4690K, lacks iGPU so I skipped it since I needed a back up GPU in case me second hand 280x didn't work well.


With a single GPU I think both perform well, but with multi-GPU the i5 will be crap in a demanding game like BF4 multiplayer.


----------



## JackCY

Is it worth spending the extra, let's say, $100 difference between the 4690K and the 4790K on the CPU rather than the GPU, if we are talking gaming only?
I would rather save the $100 and buy a 970 than get a 4790K with a weaker, older GPU.

BF4 is just one of many, many games, and it has its own issues.

Sure, for multi-GPU money obviously isn't much of a limit, so why limit the CPU.


----------



## Mr-Dark

Quote:


> Originally Posted by *JackCY*
> 
> Is it worth it to spend the extra lets say $100 difference between 4690K and 4790K on the CPU or GPU if we are talking gaming only?
> I would say rather save the $100 and buy a 970 instead of getting 4790K with a weaker older GPU.
> 
> BF4 is BF4 just one of many many games, has it's own issues.
> 
> Sure for multi GPU, money obviously is not much of a limit, so why limit the CPU.


For me, 95% of what I play on my PC is BF3 + BF4 multiplayer. I had an i5 before, and it just performed like crap when recording gameplay or opening any program in the background while playing.

And the most annoying thing is the CPU usage; I don't want to see my $2000 PC hit 95% CPU usage while gaming, that's it.


----------



## JackCY

Can you even hit 100% CPU usage when you have HT, or only 50%?








Where I've seen a non-HT CPU hit 90%, a CPU with HT showed half of that, but there isn't magically twice the power, lol. It's just that CPU usage gets reported per thread instead of per core. I have no idea how Intel reports usage, whether a busy core raises the usage of both of its threads or not, since one busy thread obviously steals resources available to the other.
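The reporting effect described here can be illustrated with a toy model (this is just arithmetic about how the average is taken, not how any OS actually measures load):

```python
# Toy model: the same physical load looks "halved" when usage is
# averaged over hardware threads instead of physical cores.
def usage_per_core(busy_cores, total_cores):
    """Usage (%) averaged over physical cores."""
    return 100.0 * busy_cores / total_cores

def usage_per_thread(busy_cores, total_cores, threads_per_core=2):
    """Usage (%) averaged over logical threads, assuming each busy
    core runs only one busy thread."""
    busy_threads = busy_cores
    return 100.0 * busy_threads / (total_cores * threads_per_core)

# 4 physical cores, all busy with one thread each:
print(usage_per_core(4, 4))    # reported 100% on a non-HT CPU
print(usage_per_thread(4, 4))  # reported 50% on an HT CPU
```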
If more cores are needed, there is always this









Spoiler: Warning: Spoiler!




8 CPUs * 10 cores * 2 HT threads = 160 logical processors


----------



## Mr-Dark

Quote:


> Originally Posted by *JackCY*
> 
> Can you even hit 100% CPU usage when you have HT or only 50%?
> 
> 
> 
> 
> 
> 
> 
> 
> Where I have seen non HT hit 90% a CPU with HT was showing half of that but there isn't magically twice the power lol. It's just that the CPU usage got reported per thread instead of per core. No idea how Intel reports the usage if it raises the usage of both threads on the core if one gets busy or not since it will obviously steal resources available for the other.
> If more cores is needed there is always this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 8 CPU * 10 cores * 2 HT = 160


I understand you 100%. With a single card my CPU usage doesn't go over 60% in any game, but with two cards my CPU sometimes hits 90%.

With BF3, BF4 or Crysis 3, this means the games benefit from HT.

Like I told you: just disable HT on my CPU and wow, I can't play BF4 at 120fps any more; it lags and stutters like hell with 95%+ CPU usage.


----------



## jamponget9

Sorry if this has been asked before but...

I had this card for almost a year now its a R9 270x Toxic and I never even bothered to look under the hood while it is on... Not until now... For all I know the 3 fans are supposed to be spinning, but just this morning i noticed 1 fan not spinning while at boot up and idle and while gaming... I tried to fiddle with MSI Fan control and not only until the fan speed is at 80% that the 3 fans are fully spinning... here are the pics...

I tried firing up Archeage game until the card runs hot at 60 degrees but the fan wont spin...

Now for the question.. is my card broken? or is my fan malfunctioning?

VCARD at Idle and Gaming... Note that 1 fan is not spinning


VCARD when Fan Speed is Forced at 80%


It wud be so nice of someone enlighten me on this matter.... TIA!!!!


----------



## aaronsta1

Quote:


> Originally Posted by *jamponget9*
> 
> Sorry if this has been asked before but...
> 
> I had this card for almost a year now its a R9 270x Toxic and I never even bothered to look under the hood while it is on... Not until now... For all I know the 3 fans are supposed to be spinning, but just this morning i noticed 1 fan not spinning while at boot up and idle and while gaming... I tried to fiddle with MSI Fan control and not only until the fan speed is at 80% that the 3 fans are fully spinning... here are the pics...
> 
> I tried firing up Archeage game until the card runs hot at 60 degrees but the fan wont spin...
> 
> Now for the question.. is my card broken? or is my fan malfunctioning?
> 
> VCARD at Idle and Gaming... Note that 1 fan is not spinning
> 
> 
> VCARD when Fan Speed is Forced at 80%
> 
> 
> It wud be so nice of someone enlighten me on this matter.... TIA!!!!


The bearing could be seizing up.

Try to spin it with the power off and see if it turns as freely as the other ones.

It could also be that the 3rd fan doesn't turn on until the card gets hot... not sure, but I'd check the bearing first.


----------



## Mr-Dark

Quote:


> Originally Posted by *aaronsta1*
> 
> the bearing could be seizing up.
> 
> try to spin it with the power off and see if it turns as freely as the other ones..
> 
> it could be that the 3rd fan doesnt turn on until it gets hot.. not sure.. but id check that first.


I think it's normal, because you say it works when you set the fan speed to 80%.

My friend has a Sapphire 7970 Vapor-X, and that card runs on 1 fan until the temp hits 65C.


----------



## JackCY

I think it's normal, but you'd have to find an owner or a review of that specific card. If it has two or more separate fan channels, then not all fans will spin all the time, and the 3rd fan only starts spinning when needed.

Mr-Dark: yeah, it sucks if you can't run SLI/CF glitch-free without HT. Having total CPU load in a game above 80% on these new CPUs is just ridiculous; I don't know what BF3/BF4 and Crysis 3 are doing to sack the CPU so much.


----------



## Mr-Dark

Quote:


> Originally Posted by *JackCY*
> 
> I think it's normal, you have to search for someone or a review of that specific card. If it has separate fan channels two or more than not all fans might be spinning all the time and only when needed the 3rd fan will start spinning.
> 
> Mr-Dark: yeah that sucks if without HT you can't run SLI/CF without glitches. Having total CPU load in a game above 80% on these new CPUs is just ridiculous, dunno what they are doing in BF3,4 and C3 to sack the CPU so much.


That's true, BF4 is super CPU demanding. I just use Mantle and cap my fps at 120: rock solid, and my CPU usage is only 60% now.









But it uses a lot of RAM, around 7GB; DX uses 5GB or less.


----------



## jamponget9

Wow thanks for the replies and true enough.... And yes, having been tweaking the fan speed curve left at default will not spin the other remaining fan not until it is really needed... Thank goodness it was not something else! BTW, Thank you so much!


----------



## JackCY

Quote:


> Originally Posted by *Mr-Dark*
> 
> Thats true bf4 super cpu demanding i just use mantle and cap my fps @ 120 rock solid and my cpu usage 60% not any more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but they use alot of ram around 7gb the dx use 5gb or less


Are you sure they are not secretly cracking military codes in the background? That's a crazy amount of CPU and RAM usage for an FPS.

Now if it were 7GB of VRAM I could understand it, as it might be using 8K textures and heaps of effects, but RAM? Nope.

jamponget9: no problem, I've seen it before in some review for sure; could have been Sapphire too. Until recently most cards ran all fans; only now can some 970/980s from ASUS and MSI run a hybrid fan mode, I think.


----------



## Mr-Dark

Quote:


> Originally Posted by *JackCY*
> 
> Are you sure they are not cracking some military codes in the background secretly? That's crazy tons of CPU and RAM usage for an FPS shooter
> 
> 
> 
> 
> 
> 
> 
> 
> Now if it was 7GB VRAM I could understand that it might be using 8K textures and heaps of effects, but RAM, nope.
> 
> jamponget9: no problem, I've seen it before in some review for sure, could have been Saphire too. Until recently most cards run all fans, only now some 970/980s from ASUS and MSI can run hybrid I think.


That's RAM usage, not VRAM; VRAM is about 2300MB per card.


----------



## jason387

Can I join the Club? I'm using the Sapphire R9 270 Dual X Boost 2GB GPU. Here's a GPU-Z validation- http://www.techpowerup.com/gpuz/details.php?id=vaz6m


----------



## Overclockerrr

Hello guys! I have an R9 270X Vapor-X. I suddenly noticed my temps were higher than before, so I checked the fans' RPM at 100% and it is 2.7k-2.9k. Could anyone with a similar card tell me if that is a normal speed, or are the fans going bad?

Merry Christmas to you guys btw.


----------



## jason387

Quote:


> Originally Posted by *Overclockerrr*
> 
> Hello guys! I have an R9 270X Vapor-X, suddenly I noticed my temps were higher than before so I checked the fan's RPM at 100% and it is 2.7k-2.9k. Anyone with similar cards could tell me if it is normal speed or the fans are going bad?
> 
> Merry Christmas to you guys btw.


The thermal paste could be dried out, or the card could be dusty. BTW, have you tried overclocking it? If so, what's the highest frequency it can run at?


----------



## Overclockerrr

Quote:


> Originally Posted by *jason387*
> 
> Thermal paste could be burnt out. The card could be dusty. Btw, have you tried overclocking it?? If so, what's the highest frequency that it can run at??


I'm not sure about the thermal paste, but the card isn't dusty. Yep, 1210/1450 at 1240mV VDDC (Catalyst 14.4), but after I updated to the Omega drivers I had to downclock to 1190/1450 and raise the voltage to 1250mV VDDC to keep it stable.

I really want to overclock this card using MSI Afterburner, but the voltage slider is grayed out no matter what I do. I'm also having problems in games where it downclocks at the loading screen, causing the display drivers to crash.


----------



## jason387

Quote:


> Originally Posted by *Overclockerrr*
> 
> I'm not sure about the thermal paste but the card isn't dusty. Yep, 1210/1450 on 1240 VDDC (Catalyst 14.4) but after I updated to the Omega drivers had to downclock to 1190/1450 and up the voltage to 1250 VDDC to keep it stable.
> 
> I really want to overclock this card using MSI Afterburner but the voltage slider is grayed out no matter what I do. I'm having problems in games where it would downclock when the game is at the loading screen causing the display drivers to crash.


I guess your power limit needs to be increased, and you can increase your voltage from the BIOS itself. Use VBE7 to edit the BIOS. I could do it for you if you like.


----------



## Overclockerrr

Quote:


> Originally Posted by *jason387*
> 
> Thermal paste could be burnt out. The card could be dusty. Btw, have you tried overclocking it?? If so, what's the highest frequency that it can run at??


Quote:


> Originally Posted by *jason387*
> 
> I guess your power limit needs to be increased and you can increase your voltage from the bios itself. Use VBE7 to edit the bios. I could do it for you if you like.


Ah yes, I forgot about that. My power limit was at 30%. I tried using VBE7, but I can only change the voltage on the 3D clocks; the others were grayed out. (Or am I doing something wrong?) It's like the GPU jumps to 2D/UVD clocks on loading screens and then back to 3D clocks, causing crashes. Thanks for the offer, man!


----------



## jason387

That shouldn't happen. But if you want, you can disable the power-down to 2D clocks in MSI Afterburner. Try increasing the power limit to 50%. Can you post a screenshot of your Fire Strike score with the GPU overclocked?


----------



## tsm106

Quote:


> Originally Posted by *Overclockerrr*
> 
> Ah yes, I forgot about that. My power limit was on 30%. I tried using VBE7 but I can only change the voltage on 3d clocks others were grayed out. (Or am I doing something wrong?) It's like the GPU jumps to 2D/UVD Clocks on loading screens then back to 3D clocks causing crashes. Thanks for the offer man!


Try following the steps in this thread.

http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996


----------



## Praston

Sadly, my ASUS 270X TOP card died yesterday during CS:GO, and it wasn't even overclocked. GPU-Z reports 0 for the clock speed, and I had to remove it from the computer, otherwise it would not work.


----------



## Overclockerrr

Quote:


> Originally Posted by *jason387*
> 
> That shouldn't happen. But if you want you can disable the power down mode to 2D clocks in Msi Afterburner. Try increasing the power limit to 50%. Can you post a screenshot of your firestrike score when the GPU is overclocked?


There is no effect when I do that in Afterburner. I used ASUS GPU Tweak, but the voltage limit on 2D clocks is 1.175V. I also increased the power limit to 50%; it didn't do anything. Okay, will do after I download it.
Quote:


> Originally Posted by *tsm106*
> 
> Try following the steps in this thread.
> 
> http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996


But no matter what I do or which options I check/uncheck, I can't adjust the voltage. I will try the newest Afterburner, and previous versions too, to see if the voltage slider works.


----------



## frostbitten

I have an ASUS 280X DCU2T and I'm looking at how to raise the TDP limit while keeping UEFI working.

The stock TDP is set to 204W, which is too low and can throttle clocks. I've used VBE7 to raise the limit and then the HD7xxx Series UEFI Patch Tool BETA to fix the UEFI checksum. Unfortunately, UEFI GOP isn't working.

Any ideas?


----------



## JackCY

An ASUS 280X DC2T throttling?
What VBIOS did you manage to load into the poor thing?
On the stock VBIOS the card doesn't throttle easily, even at 1.3V in Furmark.
I have tested the power limit, and with my OC it would only kick in if I set it down to 90 or 95%, and only when torturing with Furmark or OCCT insanity tests.
OCed, it won't throttle at the 100% power limit during any gaming. Just to be sure I set it to 120%, and that limit is IMHO ridiculously high; so high I don't think the card could survive that much power, or not for long.

Since ASUS cards have only a single VBIOS, flashing and playing with VBIOS images, especially custom ones, seems rather risky.
Why did you need to flash something custom or modified? Do the custom fan profile and the 2D (150/300), 2D (500/1600), 3D (970/1600), and 3D Turbo (your OC) clock states all work?
I think only the 3D Turbo clocks can be set, no?
No idea if a custom fan profile works on the 280X DC2T either.


----------



## frostbitten

I did nothing; the BIOS is stock (15.41), PowerTune is at +20%, and temperatures are absolutely fine.

However, the voltage is what is bothering me: the vdroop is insane. I set it to 1.26V for 1150MHz; the card reads 1.21V and the 12V reading falls to 11.7V (CM Silent Pro Gold 600W, not on the cheap side, however, 4 years old). What happens next is that the clocks lock down to 970 and won't return to 1150 unless I reboot.


----------



## JackCY

Quote:


> Originally Posted by *frostbitten*
> 
> I've did nothing - BIOS is stock (15.41), powertune +20%, temperatures absolutely fine.
> 
> However, voltages is what is what is bothering me - vdroop is insane. I set it to 1.26v for 1150 - card reads 1.21v and 12v reading falls to 11.7v (CM Silent Pro Gold 600w, not on cheap side, however, 4 years old) and what happens next is that clocks lock down to 970 and they ain't returning to 1150 unless I reboot.


Man!
I just replied to someone asking about the same PSU, but the 800W version. Read here.
It's not good; it seems to have almost 100mV of ripple, which is a lot, and you seem to have vdroop too.
It is normal for the GPU to drop its vcore under load, since that's how they designed it. It doesn't have "LLC" like Intel CPUs, which boosts the vcore to keep it stable under load (though it can overshoot if set too aggressively); GPUs don't seem to have that yet, so you get vcore drops.

My DC2T drops the voltage by, what was it, 0.05V I think, or even 0.07V under load. When I set 1250mV it runs at 1200mV or so; at the default 1200mV it runs at 1130-1140mV or so. This is normal.
The power limit has nothing to do with it; the power limit will only downclock the card if the amperage and the resulting power draw get really, really high, say above 160A Iout. During normal gaming and full load without a limit, mine runs at 80-130A, I think.
A 90-95% power limit moves the max amperage limit down to around 100A; it actually jumps in quite large steps when it comes to the allowed output power and amperage.
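Those amperage figures can be sanity-checked with simple math. A minimal sketch, assuming the reported Iout is the VRM output current at the core voltage (the ~1.2 V and 80-160 A figures below are the rough numbers from this discussion, not measurements, and VRM conversion losses are ignored):

```python
def vrm_power_w(vcore_v: float, iout_a: float) -> float:
    """Approximate VRM output power in watts: P = V * I."""
    return vcore_v * iout_a

# Rough figures from the discussion: ~1.2 V core, 80-160 A under load
light_load = vrm_power_w(1.2, 80)    # typical gaming load
heavy_load = vrm_power_w(1.2, 160)   # near the described amperage limit
print(light_load, heavy_load)
```

So an Iout limit around 160 A at ~1.2 V corresponds to roughly 190 W of core power, which lines up with why only torture tests like Furmark trip the limit.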

Does VBE7 work with the 280X? I've looked at it before and steered clear because of no apparent support for the 280X. Now I read the DC2T might have dual BIOS? A hidden button near the CF connectors; I had to check. No, no switch anywhere: single BIOS, so not worth risking reflashing with something custom. The pic posted as a dual-BIOS 280X DC2T looks like the older 7800 series. I've checked the board before; there is nothing but voltage measurement points and a ROG connection or some such, no pins though, just solder points or vias and labels.

VBE7: open the VBIOS and it immediately says "UEFI... only supports legacy, UEFI disabled". Well, a "worthless" tool then. Who still uses a legacy BIOS these days?


----------



## Arsh31536

Can someone come up with a rig which is under $1000 (SGD) and has either an R9 280 or R9 270X? I've been looking at some websites for days but just can't make up my mind on which products to use.
If possible, can you use this website: https://www.bizgram.com.sg/
That way I can order all the parts straight away, for convenience's sake.

I can stretch my budget with another $200 if it is really impossible to get a rig under $1000.


----------



## Maximus Knight

I'm afraid you might have misspelt
Quote:


> Originally Posted by *Arsh31536*
> 
> Can someone come up with a rig which is under $1000 (SGD) and has either R9 280 or R9 270x. I've been looking at some websites for days but just can't make up my mind on which products to use.
> If possible can use this website: https://www.bizgram.com.sg/
> So that I can straight away order all the parts for convenience sake.
> 
> I can stretch my budget with another $200 if it is really impossible to get a rig under $1000.


I'll sell you my 3-month-old PowerColor R9 280 Turbo Duo for 250 bucks.

Bedok MRT


----------



## Roboyto

Quote:


> Originally Posted by *Arsh31536*
> 
> Can someone come up with a rig which is under $1000 (SGD) and has either R9 280 or R9 270x. I've been looking at some websites for days but just can't make up my mind on which products to use.
> If possible can use this website: https://www.bizgram.com.sg/
> So that I can straight away order all the parts for convenience sake.
> 
> I can stretch my budget with another $200 if it is really impossible to get a rig under $1000.


http://www.newegg.com/global/sg

I would shop at Newegg: a much larger selection, seemingly better prices, much easier to navigate, and they always have great customer service.

This is $1000 Singapore currency I presume? If it's USD then you have about another 20% additional room to work with.

The price difference between the 270X and 280 isn't much, so I would go with the 280 since it has 40% more GPU cores (1280 vs. 1792) and an extra GB of VRAM (2GB vs. 3GB).
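That 40% figure checks out; a quick sanity check, taking the shader counts quoted in this post as given:

```python
# Shader (stream processor) counts as listed in the post
cores_270x = 1280
cores_280 = 1792

extra_cores = (cores_280 - cores_270x) / cores_270x
print(f"{extra_cores:.0%} more cores")  # prints "40% more cores"
```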

I didn't know if you needed a case, drives, etc., so this is a complete system. I had to go with AMD to keep the price within your budget, but for gaming purposes the FX 8-core is more than capable. If you want an i5 4690K and a nice Gigabyte Z87X motherboard, you must add about $220. If you don't need a few of these items, then you can afford an Intel board/CPU.

(sg)$1,168 + Shipping

This is at the peak of your budget, but it will perform quite nicely and play many games at high-to-max settings.

FX-8320E CPU, Zalman 120mm AIO Watercooler, ASUS 990FX rboard, 8GB 1866 G.Skill RAM, Gigabyte R9 280, 650W Gold PSU, Crucial 128GB SSD, 1TB Seagate HDD, Corsair Carbide 200R

FX-8320E $198
http://www.newegg.com/global/sg/Product/Product.aspx?Item=N82E16819113376&cm_re=8320-_-19-113-376-_-Product
Zalman LQ-310 AIO Cooler - $89
http://www.newegg.com/global/sg/Product/Product.aspx?Item=N82E16835118134
ASUS M5A99FX PRO R2.0 AM3+ $179
http://www.newegg.com/global/sg/Product/Product.aspx?Item=N82E16813128514&cm_re=990fx-_-13-128-514-_-Product
G.Skill Sniper 8GB 1866 - $106
http://www.newegg.com/global/sg/Product/Product.aspx?Item=N82E16820231460&cm_re=8gb_ddr3-_-20-231-460-_-Product
Gigabyte R9 280 - $258
http://www.newegg.com/global/sg/Product/Product.aspx?Item=N82E16814125515&cm_re=r9_280-_-14-125-515-_-Product
Rosewill Capstone 650W - $100
http://www.newegg.com/global/sg/Product/Product.aspx?Item=N82E16817182071&cm_re=rosewill_capstone-_-17-182-071-_-Product
Crucial MX100 128GB - $85
http://www.newegg.com/global/sg/Product/Product.aspx?Item=N82E16820148819&cm_re=crucial_mx100-_-20-148-819-_-Product
Corsair Carbide 200R - $80
http://www.newegg.com/global/sg/Product/Product.aspx?Item=N82E16811139018&cm_re=corsair_case-_-11-139-018-_-Product
Seagate 1TB Drive - $73
http://www.newegg.com/global/sg/Product/Product.aspx?Item=N82E16822148840&cm_re=1tb_hdd-_-22-148-840-_-Product


----------



## JackCY

Quote:


> Originally Posted by *Arsh31536*
> 
> Can someone come up with a rig which is under $1000 (SGD) and has either R9 280 or R9 270x. I've been looking at some websites for days but just can't make up my mind on which products to use.
> If possible can use this website: https://www.bizgram.com.sg/
> So that I can straight away order all the parts for convenience sake.
> 
> I can stretch my budget with another $200 if it is really impossible to get a rig under $1000.


Sorry, but that site is awfully outdated. I can't even find Maxwell cards there.

So: PCPartPicker, and scout for the parts in your country.
Case only, no peripherals.
1000 SGD = 800 USD
1200 SGD = 960 USD

That's a tough limit, but not bad for a 280.

I know you can fit a 970 in that budget though:
PCPartPicker part list / Price breakdown by merchant

*CPU:* Intel Core i5-4590 3.3GHz Quad-Core Processor ($159.99 @ Micro Center)
*CPU Cooler:* Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler ($29.99 @ Newegg)
*Motherboard:* ASRock H97M PRO4 Micro ATX LGA1150 Motherboard ($69.99 @ Newegg)
*Memory:* G.Skill Ripjaws X Series 8GB (2 x 4GB) DDR3-1600 Memory ($64.99 @ Newegg)
*Storage:* Crucial MX100 128GB 2.5" Solid State Drive ($62.99 @ Amazon)
*Video Card:* Gigabyte GeForce GTX 970 4GB WINDFORCE 3X Video Card ($329.98 @ NCIX US)
*Case:* Fractal Design Core 1300 MicroATX Mini Tower Case ($47.99 @ Directron)
*Power Supply:* Rosewill Capstone 450W 80+ Gold Certified ATX Power Supply ($54.99 @ Newegg)
*Total:* $812.91
_Prices include shipping, taxes, and discounts when available_
_Generated by PCPartPicker 2014-12-31 13:06 EST-0500_



Spoiler: Preferred merchants list



PCPartPicker part list / Price breakdown by merchant

*CPU:* Intel Core i5-4590 3.3GHz Quad-Core Processor ($187.99 @ SuperBiiz)
*CPU Cooler:* Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler ($29.99 @ Newegg)
*Motherboard:* ASRock H97M PRO4 Micro ATX LGA1150 Motherboard ($69.99 @ Newegg)
*Memory:* G.Skill Ripjaws X Series 8GB (2 x 4GB) DDR3-1600 Memory ($64.99 @ Newegg)
*Storage:* Crucial MX100 128GB 2.5" Solid State Drive ($62.99 @ Amazon)
*Video Card:* Gigabyte GeForce GTX 970 4GB WINDFORCE 3X Video Card ($329.98 @ NCIX US)
*Case:* Fractal Design Core 1300 MicroATX Mini Tower Case ($47.99 @ Directron)
*Power Supply:* Rosewill Capstone 450W 80+ Gold Certified ATX Power Supply ($54.99 @ Newegg)
*Total:* $840.91
_Prices include shipping, taxes, and discounts when available_
_Generated by PCPartPicker 2014-12-31 13:07 EST-0500_



It shouldn't get above 900 USD unless SG has big taxes like the EU does. In that case, say goodbye to having a 970 in a rig in that price range.


----------



## Arsh31536

Ahh... Newegg shipping costs are bloody expensive, like $200+. And the graphics card recommended by Roboyto cannot be shipped to SG. It's not only that card: I also tried other R9 280 and R9 270X cards and they cannot be shipped to SG either, so Newegg is out for me. I really appreciate the rigs from Roboyto and JackCY, thanks a lot guys. Anyway, if anybody else has another suggestion for a rig under $1000 SGD, you can use this website, http://sgpcmart.com/, so that I can get free shipping, unlike Newegg, and it's way more up to date than the previous website. (If it really can't be done, I can stretch my budget by another $200-$250.)
Thanks and Happy New Year!


----------



## buttface420

Would a 750W PSU be enough for two R9 280Xs and an FX-8350?

How would two 280Xs compare to a GTX 980?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *buttface420*
> 
> would a 750w psu be enough for 2 r9 280x's and a fx8350?
> 
> how would two 280x's compare to a gtx-980?


Look at my sig, so yes.

And umm... my two 280Xs are a little better than a 290X, so that should help. I'm thinking they're about the same as a 980, with more power consumption but Mantle support.


----------



## JackCY

Quote:


> Originally Posted by *Arsh31536*
> 
> Ahh... Newegg shipment costs are bloody expensive. They are like $200+ bucks. And the recommended graphic card mentioned by Roboyto cannot be shipped to SG. Not only that graphic card I also tried other R9 280 and R9 270x cards and they too cannot be shipped to SG. So Newegg is out of the box for me. I really appreciate Roboyto and JackCY for their rigs thanks a lot guys. Anyways if anybody else has any other suggestion for rig under $1000 SGD, you can use this website http://sgpcmart.com/ so that I can get free shipment unlike Newegg and its way more indated then the previous website. (If really cannot, I can stretch my budget by another $200-$250)
> Thanks and Happy New Year ?


Why shop in the US when you are from SG? These lists are more a recommendation of parts than of shops. If a different GPU is cheaper, just grab it; there's no need to stick precisely to what's on the list. There is little to no difference between, for example, the GB, MSI, and ASUS 970s. Just look for the best deals from shops that you like, prefer, or that offer you the best deal. Shopping abroad is usually a bad idea due to shipping and customs costs and the lack of (or expensive) warranty service if you need it.

I'm always surprised that people from SG here link to shops that are often tiny corner shops with no selection, haha.
Don't you have lots of options there? In HK there seem to be crazy tons of options for buying PC hardware. Maybe SG isn't as electronics-centered.

HDDs: only WD? Is that a joke? What a shop, lol.
It's pointless to select parts there; they have only a few brands and that's it, and their prices aren't that good either.

You've got to figure out who the biggest sellers of computer parts in SG are first; every country and region has some. Or at least find a price-comparison website for your country/region that compares prices from many shops in your area. I use these to shop for anything, to get the best deals and find the most popular products in some categories.

You can fit a 970 into that budget even with an i5. Or get a much slower 280X or 290 second hand, dirt cheap.
Quote:


> Originally Posted by *buttface420*
> 
> would a 750w psu be enough for 2 r9 280x's and a fx8350?
> 
> how would two 280x's compare to a gtx-980?


They should smoke the 980, if not in performance then in heat, lol.

They say 280X CF is a little faster, but I have not found any real benchmarks yet that run both 280X CF and a 980.
You might have to ask people to run a bench, or look for the same bench and settings across multiple reviews = PITA.
Plus the 980 is a bad deal; get a 970 until new cards arrive.


----------



## Spork13

280X/280 in CrossFire is frankly a PITA.
It's good for benchmarking, but does not work in a lot of games.
i.e. it works great in FC3, Metro LL, and BF4,
but doesn't help in Shadow of Mordor, FC4, or WoT.

So while theoretically 280 CrossFire > 980, in real-world use this may often not be the case.

Of course, we can expect drivers and patches to improve CrossFire compatibility over time, but in my (limited) experience, if you want to play new-release games, a single GPU is the way to go.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Spork13*
> 
> 280x/280 in x-fire is frankly a PITA.
> Good for benchmarking, but does not work in a lot of games.
> ie: Works great in FC3, Metro LL and BF4.
> Doesn't help in Shadow of Mordor, FC4, WoT
> 
> so while theoretically 280 x fire > 980, in real world use this may not often be the case.
> 
> Of course, we can expect drivers and patches to improve xfire compatability over time, but in my (limited) experience, if you want to play new release games, single GPU is the way to go.


FC3 and Metro do use CrossFire, and it works well.

World of Tanks, well, it's an F2P single-threaded game; most of those games never support CrossFire, and a single 280X is more than plenty for them.


----------



## Mr-Dark

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> FC3 and Metro.. do use crossfire and it works well
> 
> World of Tanks, well.. its a F2P single threaded game.. most of those types of games never support such, and a single 280x is more than plenty for those games
> 
> So you must be a troll like Tivan, maybe you two are the same person


+1. My 280X/7950 CrossFire works great all the time. Playing BF4 with Mantle on ultra, capped at 130 FPS, at 80% GPU usage; the GTX 980 can't do that.


----------



## Spork13

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> FC3 and Metro.. do use crossfire and it works well
> 
> World of Tanks, well.. its a F2P single threaded game.. most of those types of games never support such, and a single 280x is more than plenty for those games
> 
> So you must be a troll like Tivan, maybe you two are the same person


Yes, that's what I said. It also works perfectly well in BF4. Sure, there are plenty of others too, but I haven't played them since going CrossFire, or haven't needed the second card working to run them well.

Fair call; a shame though, as I don't get a solid 60 FPS in WoT at highest settings (1440p monitor) using a single 280X.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Spork13*
> 
> Yes, that's what I said. Also works perfectly well on BF4. Sure there are plenty of others too, but I haven't palyed them since going xfire, or haven't needed the second card working to run them well.
> 
> Fair call, shame though, I don't get a solid 60 FPS on WoT with highest settings (1440 monitor) using single 280x
> 
> Did we get up on the wrong side of the bed pumpkin? Troll my arse.


Missed the ie

fixed


----------



## buttface420

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Look at my sig, so yes
> 
> and umm.. my 2 280x's are a little better than a 290x so that should help so I am thinking about the same with more power consumption but mantle support


Ah okay. I just ordered a second 280X, and I was very nervous about two 280Xs plus an 8350 at 4.2GHz on an 80 Plus Bronze 750W power supply... I hope it's enough!

Okay, please excuse the total noob question, as I have never CrossFired: do I need two CrossFire ribbons, and do they have to be the same? My first card came with a ribbon, but my second one is card only (off eBay).


----------



## Spork13

That "ribbon" is a CrossFire bridge, and you only need to connect one between two cards. The reason the cards can fit two is for people who run 3+ GPUs; the middle one(s) need to connect to the cards above and below them.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *buttface420*
> 
> ah okay. just ordered a second 280x and i was very nervous about 2-280xs plus 8350 at 4.2ghz with a 80 bronze 750w power supply... hope its enough!
> 
> okay please excuse total noob question as i have never crossfired...do i need two crossfire ribbons and do they have to be the same? my first one came with a ribbon but my second one is card only(off teh ebay)
> Quote:
> 
> 
> 
> Originally Posted by *Spork13*
> 
> That "ribbon" is a Crossfire bridge, and you only need to connect one between two cards. The reason the cards can fit two is for people who run 3+ GPU's, the middle one/s need to connect to the cards above and below them.
Click to expand...

This should shed some light





----------



## buttface420

okay thanks guys!


----------



## ozlay

Request: could more R9 280 BIOSes be uploaded with GPU-Z? If anyone has one, would you please consider uploading your BIOS?

I'm looking for a stock 280 BIOS to flash to my Sapphire 7950.


----------



## Mr-Dark

Quote:


> Originally Posted by *ozlay*
> 
> request for more R9-280 bios to be uploaded with gpu-z if anyone has them would you plz consider uploading your bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> im look for a stock 280 bios to flash to my sapphire 7950


This is a Sapphire 280 Dual-X BIOS:

http://dc392.gulfup.com/XNzwXE.rar?gu=KyT_BJRIpNowUvHcxSfxzA&e=1420843570&n=66696c656e616d652a3d5554462d38272731323334352e726172


----------



## ozlay

Quote:


> Originally Posted by *Mr-Dark*
> 
> This sapphire 280 dual-x bios
> 
> http://dc392.gulfup.com/XNzwXE.rar?gu=KyT_BJRIpNowUvHcxSfxzA&e=1420843570&n=66696c656e616d652a3d5554462d38272731323334352e726172


The flash went fine, but the HDMI port doesn't seem to work. Is that a 280 or a 280X BIOS?


----------



## Mr-Dark

Quote:


> Originally Posted by *ozlay*
> 
> flash was good but hdmi port doesn't seem to work is that a 280 or a 280x bios


It's a 280 BIOS. I flashed it to my 7950 and all is fine, but the temperature reading doesn't work any more.


----------



## diggiddi

Quote:


> Originally Posted by *Mr-Dark*
> 
> this 280 i flash them to my 7950 all fine but no temp any more


Any benefit to flashing?


----------



## Mr-Dark

Quote:


> Originally Posted by *diggiddi*
> 
> Any benefit to flashing?


Nothing; just the name changes to R9 200.


----------



## MiladEd

Hey guys, I have a Sapphire R9 280X Dual-X. I was wondering: when I want to upgrade later (I recently acquired this PC), would it be wiser to add another R9 280X and run them in CrossFire, or to buy a new card altogether? My PSU is only 550 watts, so I would need a new one to run two GPUs. How about with a single new card, would I need one?

FYI, my CPU is AMD FX-8320. Full PC details in sig.


----------



## Spork13

Quote:


> Originally Posted by *MiladEd*
> 
> Hey guys, I've a Sapphire R9 280X Dual-X. I was wondering, if later when I want to upgrade (I recently acquired this PC) Would it be wiser to add another R9 280X and run it in CrossFire or buy a new card all together? My PSU is only 550 watts so I would need a new one for running 2 GPUs. How about getting a new card, would I need one?
> 
> FYI, my CPU is AMD FX-8320. Full PC details in sig.


550W might struggle. Does it even have 2 x 8-pin and 2 x 6-pin PCIe connectors?
I have a 4790K with a _very_ mild OC, a 280X, and a 280, all air cooled. Max power draw is 680W, but it mostly stays at least 100W below that when running CrossFire.
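For anyone doing this math themselves, a rough worst-case budget can be sketched like this. It is a back-of-the-envelope estimate using assumed round-number TDPs, not measured draw, and real systems (like the 680W figure above) can vary:

```python
# Assumed round-number TDPs, not measurements
CPU_W = 125    # FX-8320/8350-class CPU
GPU_W = 250    # one Tahiti card (280/280X) under heavy load
REST_W = 75    # motherboard, RAM, drives, fans

def system_draw_w(n_gpus: int) -> int:
    """Rough worst-case system draw for an FX rig with n Tahiti cards."""
    return CPU_W + n_gpus * GPU_W + REST_W

print(system_draw_w(1))  # single card: 450 W
print(system_draw_w(2))  # CrossFire: 700 W, tight on a 550 W unit
```

By this estimate a 550W unit is clearly short for CrossFire, while a quality 750W unit leaves some headroom, which matches the advice in this thread.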


----------



## MiladEd

Quote:


> Originally Posted by *Spork13*
> 
> 550w might struggle. Does it even have 2 x 8 pin and 2 x 6 pin PCIe connectors?
> I have a 4790K with a _very_ mild OC, a 280x and a 280. All air cooled. Max current draw is 680w, but mostly stays at least 100w less than that when running Xfire.


I don't think it has 2 x 8-pin and 2 x 6-pin connectors, but I'm not sure. 550W will suffer, as the recommended PSU rating when adding another R9 280X would be 730 watts, so I'd need a new PSU for that. What I was asking is: if I don't add another 280X and instead get a 290X or GTX 980 or something, will it provide better performance? How about the performance/price ratio? That's quite important as well. And would I need a new PSU in that case?


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *MiladEd*
> 
> I don't think it has 2 x 8 pin and 2 x 6 pin connectors but I'm not sure. 550 will suffer, as the max rating for adding another R9 280X will be 730 watt. I do need a new PSU if I'm to add that. What I was asking, is that in case I don't add another 280X and just take a 290X or GTX 980 or something instead, will it provide better performance? How about performance/price ratio as that's quite important as well. Do I need a new PSU in that case?


If you are going for cost/performance and only have a 550W power supply, you will come out slightly ahead with the single-card option.


----------



## buttface420

Quote:


> Originally Posted by *MiladEd*
> 
> Hey guys, I've a Sapphire R9 280X Dual-X. I was wondering, if later when I want to upgrade (I recently acquired this PC) Would it be wiser to add another R9 280X and run it in CrossFire or buy a new card all together? My PSU is only 550 watts so I would need a new one for running 2 GPUs. How about getting a new card, would I need one?
> 
> FYI, my CPU is AMD FX-8320. Full PC details in sig.


I just ordered my second 280X. I was gonna go with a GTX 970, but honestly two 280Xs beat it, and I already owned a 280X. I figure I will use them both until newer, better GPUs are cheaper.


----------



## End3R

Quote:


> Originally Posted by *buttface420*
> 
> i just ordered my second 280x , i was gonna go with a gtx 970 but honestly 2-280x beats it and i already owned a 280x..i figure i will use them both until newer better gpus are cheaper.


Hate to burst your bubble, but subjectively a single GTX 970 will outperform CrossFired 280s in almost any game, assuming you're playing at 1080p.

I could be wrong, but I'm fairly certain you'd only see better results with CrossFire 280s in benchmarks, and even then I'm not sure; the GTX 970s are beasts. Nvidia got tired of ATI having the best prices, and for the price, ATI has nothing that comes close to the GTX 970.

Not saying the 280s aren't beasts as well; the only reason I haven't snagged a 970 yet is that I've yet to find a game that can bring my *270X* to its knees (again, at 1080p). But I'm still pretty sure the 970 would outperform CrossFire 280s.


----------



## rdr09

Quote:


> Originally Posted by *End3R*
> 
> Hate to burst your bubble, but subjectively a single gtx970 will outperform crossfired 280s in almost any game - assuming you're playing @1080p.
> 
> I could be wrong, but I'm fairly certain you'd only see better results with xfire 280s in benchmarks, and even then I'm not sure, the gtx 970s are beasts. Nvidia got tired of ATI having the best prices, and for the price, ati has nothing that comes close to the gtx970.
> 
> Not saying 280s aren't beasts as well, only reason I haven't snagged up a 970 yet is because I've yet to find a game that can bring my *270X* to it's knees. (again @1080p) But still pretty sure the 970 would outperform xfire 280s.


No, two 280s in CrossFire are at least 30% more powerful.

Edit: I know because I've owned 7900 cards (still own one) and 290s. I have not seen any 970 on OCN beat my 290s - at least in synthetic benches.
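The "30% more powerful" claim is really just scaling arithmetic: a second card adds some fraction of a card, and whether the pair beats a faster single GPU depends entirely on that fraction. A rough sketch, with purely illustrative numbers (the fps figures and scaling factors below are assumptions, not benchmarks):

```python
# Rough CrossFire scaling arithmetic (illustrative numbers, not benchmarks).
# A second card never doubles performance; it adds some fraction of a card.

def effective_fps(single_card_fps, scaling):
    """FPS for two cards, where `scaling` is the fraction of the second
    card's performance actually realized (0.0 to 1.0)."""
    return single_card_fps * (1 + scaling)

fps_280x = 60.0                                # hypothetical single-280X average
good_profile = effective_fps(fps_280x, 0.85)   # well-supported title: ~111 fps
poor_profile = effective_fps(fps_280x, 0.10)   # missing/bad CF profile: ~66 fps

# Suppose the competing single card is ~35% faster per GPU. The pair only
# wins in titles where scaling exceeds that gap:
fps_970 = fps_280x * 1.35                      # assumed gap, not a measured delta
print(good_profile > fps_970)  # True
print(poor_profile > fps_970)  # False
```

Which is why both sides of this argument can be right at once: the pair wins where the CrossFire profile is good and loses where it isn't.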


----------



## End3R

Quote:


> Originally Posted by *rdr09*
> 
> No, two 280s in CrossFire are at least 30% more powerful.
> 
> Edit: I know because I've owned 7900 cards (still own one) and 290s. I have not seen any 970 on OCN beat my 290s - at least in synthetic benches.


That's why I said subjectively, and said the only place he'd probably see improved results is benchmarking. As far as actually playing any game out there, assuming he's playing @1080p, the single 970 will outperform the CrossFire setup.


----------



## REDEFINE

Quote:


> Originally Posted by *jamponget9*
> 
> Sorry if this has been asked before but...
> 
> I've had this card, an R9 270X Toxic, for almost a year now, and I never even bothered to look under the hood while it was on... not until now. As far as I know, all three fans are supposed to be spinning, but just this morning I noticed one fan not spinning at boot, at idle, and while gaming. I tried fiddling with MSI fan control, and only when the fan speed is at 80% do all three fans fully spin. Here are the pics...
> 
> I tried firing up ArcheAge until the card ran hot at 60 degrees, but the fan won't spin...
> 
> Now for the question: is my card broken, or is my fan malfunctioning?
> 
> Video card at idle and gaming... note that one fan is not spinning.
> 
> 
> Video card when fan speed is forced to 80%.
> 
> 
> It would be so nice if someone could enlighten me on this matter... TIA!!!!


It's not normal. My R9 280X Toxic's fans all spin all the time; the Toxic doesn't have the feature that shuts down two fans in idle mode like the Vapor-X does.


----------



## rdr09

Quote:


> Originally Posted by *End3R*
> 
> That's why I said subjectively, and said the only place he'd probably see improved results is benchmarking. As far as actually playing any game out there, assuming he's playing @1080p, the single 970 will outperform the CrossFire setup.


No, FPS-wise the 280s are stronger; only a highly OC'ed 980 will match them. I've owned them. Tahitis scale even better than Hawaiis.


----------



## End3R

Quote:


> Originally Posted by *rdr09*
> 
> No, FPS-wise the 280s are stronger; only a highly OC'ed 980 will match them. I've owned them. Tahitis scale even better than Hawaiis.


What part of the word subjectively do you not understand?


----------



## rdr09

Quote:


> Originally Posted by *End3R*
> 
> What part of the word subjectively do you not understand?


You're wrong; don't spread misinformation. Now, if you'd used any, I might reconsider... but you haven't.


----------



## End3R

Quote:


> Originally Posted by *rdr09*
> 
> You're wrong; don't spread misinformation. Now, if you'd used any, I might reconsider... but you haven't.


Ok, you clearly don't understand what that word means.

In most cases SLI or CrossFire hurts game performance until a specific profile is released, and even then you usually get more micro-stuttering than you do with a single card. My 270X can play every game out there at max settings (adjusting AA) @1080p with a smooth frame rate; so can a single 280, and a 970. The only place he will see performance increases is benchmarks.

Now, if he were gaming at 4K, I don't know which would be better, but at 1080p I can guarantee you that when it comes to playing games, the 970 is better. And when I say playing games, I mean PLAYING them, not staring at an FPS counter.
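The micro-stutter point above is really a frame-time argument: stutter is variance that an average frame rate hides. A minimal sketch of the idea, using made-up frame-time numbers (nothing here is measured data):

```python
# Why average FPS hides micro-stutter: two runs with the same average
# frame rate but very different smoothness. Frame times are in milliseconds
# and are made-up illustrative data, not measurements.

def avg_fps(frame_times_ms):
    """Average frame rate over the whole run."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low(frame_times_ms):
    """FPS equivalent of the slowest 1% of frames (at least one frame,
    since this sample is tiny)."""
    n = max(1, len(frame_times_ms) // 100)
    worst = sorted(frame_times_ms)[-n:]
    return 1000.0 / (sum(worst) / len(worst))

steady  = [16.7] * 12         # single card: even pacing, ~60 fps every frame
stutter = [8.0, 25.4] * 6     # alternating fast/slow frames, same total time

print(round(avg_fps(steady)), round(avg_fps(stutter)))                  # 60 60
print(round(one_percent_low(steady)), round(one_percent_low(stutter)))  # 60 39
```

Both runs report the same average, but the second one feels like a 39 fps game during its worst frames, which is the complaint usually leveled at alternate-frame-rendering setups.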


----------



## rdr09

Quote:


> Originally Posted by *End3R*
> 
> Ok, you clearly don't understand what that word means.
> 
> In most cases SLI or CrossFire hurts game performance until a specific profile is released, and even then you usually get more micro-stuttering than you do with a single card. My 270X can play every game out there at max settings (adjusting AA) @1080p with a smooth frame rate; so can a single 280, and a 970. The only place he will see performance increases is benchmarks.
> 
> Now, if he were gaming at 4K, I don't know which would be better, but at 1080p I can guarantee you that when it comes to playing games, the 970 is better. And when I say playing games, I mean PLAYING them, not staring at an FPS counter.


Then it's not subjective, because there will be games where two 280s scale better and play smoothly. The thing is... you have not used them.


----------



## End3R

Do yourself a favor and look up the definition of subjective.


----------



## rdr09

Quote:


> Originally Posted by *End3R*
> 
> Do yourself a favor and look up the definition of subjective.


It's really very simple: we both claim to be subjective. The difference is that I've experienced using CrossFire Tahitis while you have not.


----------



## End3R

Quote:


> Originally Posted by *rdr09*
> 
> It's really very simple: we both claim to be subjective. The difference is that I've experienced using CrossFire Tahitis while you have not.


Correct - I've done enough support work fixing people's CrossFire and SLI setups to be smart enough to avoid them. A stronger single card is always better than two weaker cards. Again, I'm not talking about benchmarks, because yes, you will get higher scores. Just because you choose not to use something doesn't make you ignorant about it.

If a single 270X can run everything out there at a smooth frame rate @1080p, then you'd have to be daft to think that a 280 or 970 couldn't. Aside from benchmarks, the only way he'd get any additional use out of the cards while gaming is by increasing his resolution beyond 1080p or adding unneeded levels of anti-aliasing - all of which results in a very, very slightly better-looking image at the cost of FPS.

And all of that is only noticeable if you're actually watching your FPS; when you're just playing, whether the FPS is 30, 60, or 120, smooth is smooth.


----------



## rdr09

Quote:


> Originally Posted by *End3R*
> 
> Correct - *I've done enough support work fixing people's CrossFire and SLI setups* to be smart enough to avoid them. A stronger single card is always better than two weaker cards. Again, I'm not talking about benchmarks, because yes, you will get higher scores. Just because you choose not to use something doesn't make you ignorant about it.
> 
> If a single 270X can run everything out there at a smooth frame rate @1080p, then you'd have to be daft to think that a 280 or 970 couldn't. Aside from benchmarks, the only way he'd get any additional use out of the cards while gaming is by increasing his resolution beyond 1080p or adding unneeded levels of anti-aliasing - all of which results in a very, very slightly better-looking image at the cost of FPS.


Good nite.


----------



## End3R

Quote:


> Originally Posted by *rdr09*
> 
> Good nite.


Sweet dreams


----------



## buttface420

Yes, in games that don't support CrossFire, the 970 will beat it.

In games that do support it, 280X CrossFire is about 8% better than a GTX 980 - especially in Battlefield 4, because AMD has Mantle, and that's my favorite game.

I will still probably get 970s when they're cheaper, unless AMD's new GPUs are better.

This is all from what I've read; I've never CrossFired before, so I guess it's a learning process.


----------



## tsm106

Quote:


> Originally Posted by *End3R*
> 
> I could be wrong...


Shakes magic 8 ball, yes. There you go!


----------



## End3R

lol, you guys really need to learn more about CrossFire and SLI. Having two cards isn't automatically better, and it doesn't automatically double your power. You're still just using two WEAKER cards.

At no point have I said CrossFire 280s are bad - it's more than enough to play anything out there at the moment - but consistently better performance is not something you will find. And when I say consistent, I mean going forward, with new games that have yet to be released. Just look at every AAA title released this year: almost every person who complained about performance issues in any of them was using an SLI or CrossFire setup. A single 4GB 970 is more than enough for anything out there @1080p; even a single 280 is. Unless you're going above 1080p, or forcing 16x+ AA into a game where it isn't needed, you really have no reason to upgrade at all.

Again, just to be perfectly clear, I'm not saying ANYTHING bad about CrossFire 280s - like I said, I love my 270X. But when it comes to new games being released, I can promise you most of them will perform better on a stronger single card than on two weaker ones.


----------



## MiladEd

Lol, my display isn't even 1080p - it's 1680x1050 (yes, it's quite old). But I really couldn't care less about resolution, especially when I don't have enough money for a hi-res display and GPUs good enough to drive it. For now, a single 280X is more than enough for any game I can think of; I run Far Cry 4 on Ultra at an average of 45 FPS. What I was talking about was one or two years from now, when the 280X gets more dated and more demanding games come out. By then, a used 280X will cost less than the average newer card, and buying a new PSU would be pretty much essential whether I run two GPUs or just a newer one (I only have a 550 W PSU), so I think running two 280Xs will be the better choice. Still, there are more important upgrades I need to make to my PC before adding a GPU, like an SSD.


----------



## tsm106

Quote:


> Originally Posted by *End3R*
> 
> lol, you guys really need to learn more about CrossFire and SLI. Having two cards isn't automatically better, and it doesn't automatically double your power. You're still just using two WEAKER cards.
> 
> At no point have I said CrossFire 280s are bad - it's more than enough to play anything out there at the moment - but consistently better performance is not something you will find. And when I say consistent, I mean going forward, with new games that have yet to be released. Just look at every AAA title released this year: almost every person who complained about performance issues in any of them was using an SLI or CrossFire setup. A single 4GB 970 is more than enough for anything out there @1080p; even a single 280 is. Unless you're going above 1080p, or forcing 16x+ AA into a game where it isn't needed, you really have no reason to upgrade at all.
> 
> Again, just to be perfectly clear, I'm not saying ANYTHING bad about CrossFire 280s - like I said, I love my 270X. But when it comes to new games being released, I can promise you most of them will perform better on a stronger single card than on two weaker ones.


It's a yawn reading about you creating your own argument and then running around in circles with yourself.


----------



## End3R

Quote:


> Originally Posted by *tsm106*
> 
> I'm not smart enough to keep debating, so I'll feign intelligence by saying your argument is invalid, without providing any actual rebuttal of my own.


Fixed that for ya. GG, get some sleep.


----------



## tsm106

Quote:


> Originally Posted by *End3R*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I'm not smart enough to keep debating, so I'll feign intelligence by saying your argument is invalid, without providing any actual rebuttal of my own.
> 
> 
> 
> Fixed that for ya. GG, get some sleep.

You're not very smart, are you?


----------



## End3R

Quote:


> Originally Posted by *tsm106*
> 
> You're not very smart, are you?


I'm still waiting for a reply from you that isn't an insult. Go ahead - my statements are still waiting. I'll even quote them again in case you have trouble finding them at the top of the page.
Quote:


> Originally Posted by *End3R*
> 
> lol, you guys really need to learn more about CrossFire and SLI. Having two cards isn't automatically better, and it doesn't automatically double your power. You're still just using two WEAKER cards.
> 
> At no point have I said CrossFire 280s are bad - it's more than enough to play anything out there at the moment - but consistently better performance is not something you will find. And when I say consistent, I mean going forward, with new games that have yet to be released. Just look at every AAA title released this year: almost every person who complained about performance issues in any of them was using an SLI or CrossFire setup. A single 4GB 970 is more than enough for anything out there @1080p; even a single 280 is. Unless you're going above 1080p, or forcing 16x+ AA into a game where it isn't needed, you really have no reason to upgrade at all.
> 
> Again, just to be perfectly clear, I'm not saying ANYTHING bad about CrossFire 280s - like I said, I love my 270X. But when it comes to new games being released, I can promise you most of them will perform better on a stronger single card than on two weaker ones.


----------



## tsm106

You keep telling everyone they need to learn more... more of what? How to be a great debater like you? Or that you're arguing a silly point? One big card for 1080p - that's freaking great. What do you do when your resolution is so large that even the single greatest card isn't enough? Welp, back to the drawing board to figure out something else everyone must learn, eh? Omg, wait, what? Stop the press: a 970 is all you need for 1080p. Omg, I never ever? We are so enlightened, thank you.


----------



## End3R

Quote:


> Originally Posted by *tsm106*
> 
> You keep telling everyone they need to learn more... more of what? How to be a great debater like you? Or that you're arguing a silly point? One big card for 1080p - that's freaking great. What do you do when your resolution is so large that even the single greatest card isn't enough? Welp, back to the drawing board to figure out something else everyone must learn, eh? Omg, wait, what? Stop the press: a 970 is all you need for 1080p. Omg, I never ever? We are so enlightened, thank you.


Are you mentally damaged? Are you even aware of what we are talking about? We are talking about a specific situation where buttface420 was debating whether he should go CrossFire 280s or switch to a single 970 - and he agrees a single 970 would give better performance in games that don't properly support CrossFire/SLI.

Either you aren't reading everything I'm saying, are intentionally ignoring part of it, or are genuinely too dull to understand it, because we're talking about SLI/CrossFire setups here.

Just to be perfectly clear about what I'm saying:
Going CrossFire 280s is totally fine, but considering a single 280 is strong enough to play every game available at max settings at 1080p, there is no actual need to upgrade unless he wants higher resolutions or to inject unneeded levels of AA into a game. That said, if he still wants to upgrade for those higher resolutions, more AA, or just plain bragging rights, chances are that new games released this year and in the years to come will run more smoothly out of the box on a single 970 - hell, even a single 280 - than on a CrossFire 280 setup. The only place he will see consistently higher results with the 280 CrossFire setup is in benchmarks. Every AAA game released in the past year suffered from performance issues on CrossFire and SLI machines.

Now, do you have anything else to say other than continuing to insult me? Because that is still all you have done; even in the text I'm quoting, nowhere do you say anything to give validity to your point of view, which hasn't even been made clear. All you're doing is showing what little capacity for thought you really have.


----------



## diggiddi

End3R, you are wrong. I have a 7950/280 and it is not enough to power Crysis 3, BF4, and even BF3 multiplayer (sometimes) at 1080p max settings, and I don't think a 970 is better overall than CF. It might win some benches, but have a peek at what Anand has to say (those are 1440p and 4K results, but you should see the trend).


----------



## tsm106

Quote:


> Originally Posted by *End3R*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You keep telling everyone they need to learn more... more of what? How to be a great debater like you? Or that you're arguing a silly point? One big card for 1080, that's freaking great. What do you do when your resolution is so large that one greatest card isn't enough? Welp, back to the drawing table to figure out something else everyone must learn heh? Omg wait what, stop the press a 970 is all you need for 1080. Omg, I never ever? We are so enlightened thank you.
> 
> 
> 
> Are you mentally damaged? Are you even aware of what we are talking about? We are talking about a specific situation where buttface420 was debating whether he should go CrossFire 280s or switch to a single 970 - and he agrees a single 970 would give better performance in games that don't properly support CrossFire/SLI.
> 
> Either you aren't reading everything I'm saying, are intentionally ignoring part of it, or are genuinely too dull to understand it, because we're talking about SLI/CrossFire setups here.
> 
> Just to be perfectly clear about what I'm saying:
> Going CrossFire 280s is totally fine, but considering a single 280 is strong enough to play every game available at max settings at 1080p, there is no actual need to upgrade unless he wants higher resolutions or to inject unneeded levels of AA into a game. That said, if he still wants to upgrade for those higher resolutions, more AA, or just plain bragging rights, chances are that new games released this year and in the years to come will run more smoothly out of the box on a single 970 - hell, even a single 280 - than on a CrossFire 280 setup. The only place he will see consistently higher results with the 280 CrossFire setup is in benchmarks. Every AAA game released in the past year suffered from performance issues on CrossFire and SLI machines.
> 
> Now, do you have anything else to say other than continuing to insult me? Because that is still all you have done; even in the text I'm quoting, nowhere do you say anything to give validity to your point of view, which hasn't even been made clear. All you're doing is showing what little capacity for thought you really have.

Christ, you are a great candidate for the ignore button.

A 7870 is not going to max the top triple-A games from last gen at 1080p, let alone the new ones just out. Go fire up Metro or C3 and see if you can break 35 fps. And about you creating an argument: you make up scenarios and treat them as fact. For example, you write that new triple-A titles all suck for SLI/CrossFire, when in reality that's just not true for all games. And for those that are not optimized, it takes a short time for the driver to catch up - it is the nature of the beast. And there are those of us that like AA, especially at low resolutions. Those that say AA is not needed or is a waste...

It's an opinion that you have, and good for you, but don't discount what others hold valuable and then run around telling everyone they need to learn, lol.


----------



## Catscratch

Umm, sorry to barge into an ongoing debate, but...

What do you think about the Asus 280 ROG Strix (STRIX-R9280-OC-3GD5) and the Sapphire 280X Tri-X (11221-22-20G)? The Sapphire wouldn't fit a HAF 912 Advanced with the HDD cage in, I suppose. I'm going to replace my late 6850 :/ And I always wanted a 280(X).


----------



## End3R

Quote:


> Originally Posted by *diggiddi*
> 
> End3R, you are wrong. I have a 7950/280 and it is not enough to power Crysis 3, BF4, and even BF3 multiplayer (sometimes) at 1080p max settings, and I don't think a 970 is better overall than CF. It might win some benches, but have a peek at what Anand has to say (those are 1440p and 4K results, but you should see the trend).


You're wrong, because my 270X is strong enough to run those games and more at max settings with a smooth framerate, so obviously a 280 can as well. Keep in mind you don't need to max out AA for the game to be at max settings; setting 8x+ AA is just gimping your performance.
Quote:


> Originally Posted by *tsm106*
> 
> Christ, you are a great candidate for the ignore button.
> 
> A 7870 is not going to max the top triple-A games from last gen at 1080p, let alone the new ones just out. Go fire up Metro or C3 and see if you can break 35 fps. And about you creating an argument: you make up scenarios and treat them as fact. For example, you write that new triple-A titles all suck for SLI/CrossFire, when in reality that's just not true for all games. And for those that are not optimized, it takes a short time for the driver to catch up - it is the nature of the beast. And there are those of us that like AA, especially at low resolutions. Those that say AA is not needed or is a waste...
> 
> It's an opinion that you have, and good for you, but don't discount what others hold valuable and then run around telling everyone they need to learn, lol.


You guys clearly have no idea what you're talking about, because you most certainly can break 35 fps in both C3 and BF4 at max settings when playing with 2-4x AA. Increasing AA beyond that at 1080p does nothing but stroke your e-peen and hurt your performance.


----------



## Agent Smith1984

Quote:


> Originally Posted by *End3R*
> 
> You're wrong, because my 270X is strong enough to run those games and more at max settings with a smooth framerate, so obviously a 280 can as well. Keep in mind you don't need to max out AA for the game to be at max settings; setting 8x+ AA is just gimping your performance.
> You guys clearly have no idea what you're talking about, because you most certainly can break 35 fps in both C3 and BF4 at max settings when playing with 2-4x AA. Increasing AA beyond that at 1080p does nothing but stroke your e-peen and hurt your performance.


You can probably break 35 FPS, but my 280X (the card I used before getting my 290) was running at 1250/1800 and would run Crysis 3 on max with 4x MSAA and get a consistent 50-65 FPS; however, it would STILL dip down to 35-38 FPS at times, and that is on a HIGHLY overclocked card.
There is no way a 7870 can stay much above 35 without some 20-25 FPS dips in certain areas.


----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You can probably break 35 FPS, but my 280X (the card I used before getting my 290) was running at 1250/1800 and would run Crysis 3 on max with 4x MSAA and get a consistent 50-65 FPS; however, it would STILL dip down to 35-38 FPS at times, and that is on a HIGHLY overclocked card.
> There is no way a 7870 can stay much above 35 without some 20-25 FPS dips in certain areas.


Ok, now we're getting somewhere. You see, I have no problem if a game dips occasionally, so the behavior you just described I still consider 100% acceptable.


----------



## Agent Smith1984

Quote:


> Originally Posted by *End3R*
> 
> Ok, now we're getting somewhere. You see, I have no problem if a game dips occasionally, so the behavior you just described I still consider 100% acceptable.


Well, that's more of an opinion. People have different levels of what they believe to be acceptable framerates.....

I don't require that FPS be at or above 60 at all times, but when I'm getting fragged on a multiplayer map because my FPS has dipped down into the 20s, I find that unacceptable.
The lowest I can deal with is a dip to 30 FPS, and even that seems to bother me when framerates are typically above 45 FPS.
Even though it is said that the human eye can't see any faster than 30 FPS, I believe anyone who games on PC can certainly feel the difference between a game running above 45 FPS and a game running below 45 FPS.....

I don't mind the dips nearly as much in campaign play, though, so long as the difficulty is on normal and the AI isn't hawking my every move.....

I will admit that my very high-performing 280X did satisfy my personal performance goals for the most part, except in a few situations..... Now that I have the 290, I never see anything dip below 45 FPS, which is exactly what I was hoping to achieve with this card.

I'm not the type to spend a lot on hardware, though. I bought my 280X on eBay for $125 back in July 2014, sold it in Oct 2014 for $130, and then bought my 290 on Craigslist for $160.
Had it not been for scoring some awesome deals, I'd be running a much cheaper card right now.

I even managed to get my son's 7950 on CL for $60, and it was like new in the box with all the papers and accessories.

The used hardware market can bring a different level of performance within a given budget, for those who are okay with used hardware.
Everything in all of my systems was purchased used, or given to me used by family members.....


----------



## JackCY

^ Exactly. I bought a new PC too, but the ASUS 280X DC2T is second-hand - it was three months old at the time of purchase, and I got it for no more than 60% of the price of a new one: $225 used versus nearly $400 new, because Europe prices suck.
A new 970 costs around $500 here now.

Agent Smith, $125 - that's nuts.

$160 for a 290, haha - those went for around $375+ here for non-reference models.

Second-hand prices were still almost the same over half a year later, last time I checked.


----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, that's more of an opinion. People have different levels of what they believe to be acceptable framerates.....
> 
> I don't require that FPS be at or above 60 at all times, but when I'm getting fragged on a multiplayer map because my FPS has dipped down into the 20s, I find that unacceptable.
> The lowest I can deal with is a dip to 30 FPS, and even that seems to bother me when framerates are typically above 45 FPS.
> Even though it is said that the human eye can't see any faster than 30 FPS, I believe anyone who games on PC can certainly feel the difference between a game running above 45 FPS and a game running below 45 FPS.....
> 
> I don't mind the dips nearly as much in campaign play, though, so long as the difficulty is on normal and the AI isn't hawking my every move.....
> 
> I will admit that my very high-performing 280X did satisfy my personal performance goals for the most part, except in a few situations..... Now that I have the 290, I never see anything dip below 45 FPS, which is exactly what I was hoping to achieve with this card.
> 
> I'm not the type to spend a lot on hardware, though. I bought my 280X on eBay for $125 back in July 2014, sold it in Oct 2014 for $130, and then bought my 290 on Craigslist for $160.
> Had it not been for scoring some awesome deals, I'd be running a much cheaper card right now.
> 
> I even managed to get my son's 7950 on CL for $60, and it was like new in the box with all the papers and accessories.
> 
> The used hardware market can bring a different level of performance within a given budget, for those who are okay with used hardware.
> Everything in all of my systems was purchased used, or given to me used by family members.....


Finally, another person who speaks sense. I agree with everything you're saying. So wouldn't you also rather use a single 970 than CrossFire 280s? With our levels of what we find acceptable, wouldn't you rather have the smoother frame rates of the single card than deal with all the CrossFire issues? Again, I'm not saying anything is wrong with going 280 CrossFire for those who want to; this was just about one specific person who was thinking of either buying another 280 to CrossFire with his current one, or switching to a 970.


----------



## Agent Smith1984

Well, I live in an area where so many were at one point unemployed, and really needed to generate some cash to go along with their unemployment checks, I guess.....

So they all set up mining rigs between 2012 and early 2014 and made cash that way. Then in mid-2014 that market crashed, and since most of these people had no interest whatsoever in PC gaming, they liquidated all of their stuff.....

Now that all of that hardware has been sold off, used prices have held steady on most cards. That's mainly because used hardware is now being sold from one gamer to another again, versus the "guy that's just liquidating unused equipment."

Most of the gamers selling used hardware have no intention of giving stuff away at too low a price, even if they aren't using it and it just sits there.....

Say I didn't ride motorcycles at all, and I inherited one from an uncle...... I'd be more likely to dump it off to someone for the first decent offer I got. However, if I were a big-time bike guru, then I'd probably want top dollar for it, even if I never rode it.....


----------



## Agent Smith1984

Quote:


> Originally Posted by *End3R*
> 
> Finally, another person who speaks sense. I agree with everything you're saying. So wouldn't you also rather use a single 970 than CrossFire 280s? With our levels of what we find acceptable, wouldn't you rather have the smoother frame rates of the single card than deal with all the CrossFire issues? Again, I'm not saying anything is wrong with going 280 CrossFire for those who want to; this was just about one specific person who was thinking of either buying another 280 to CrossFire with his current one, or switching to a 970.


Well,

I could see going 280X CrossFire, and even considered it myself when I had mine, but here are the variables I would keep in mind...

If you have less than a decently overclocked i5 (and I am including the FX-8*** in the "less" category), then you want to go with a single card, since dual GPUs require a little more CPU headroom than what AMD brings to the table for the most part. That's not the case in ALL situations, but in most....

Also, keep in mind the power usage, and make sure you are working with a decent PSU over 800 W - preferably 900-1000 W if you are going to do any overclocking/volting....

Also consider the titles you play, or plan on playing, and see if they are optimized for CF.

If you don't fit the bill in all three categories, then I would suggest not using CF, selling the single 280X, and applying the money towards a single-card solution.

That is exactly what I did, and I am pretty happy.... Even if you go up to a 290 with an overclock, you will still see a 20-30% improvement across the board, and I usually qualify a 20% improvement on any piece of hardware as a good, worthwhile upgrade.
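The PSU advice above can be sanity-checked with back-of-the-envelope wattage math. A small sketch; the board-power figures and overclocking margin below are rough assumptions, not measured draw:

```python
# Back-of-the-envelope PSU sizing for a CrossFire build.
# TDP-style figures are rough assumptions; real draw varies with load and OC.

def system_draw(gpu_tdp, num_gpus, cpu_tdp, rest=100, oc_margin=1.25):
    """Estimated peak system draw in watts. `rest` covers motherboard,
    drives, and fans; `oc_margin` adds headroom for overclocking/volting."""
    return (gpu_tdp * num_gpus + cpu_tdp + rest) * oc_margin

# Two 280X-class cards (~250 W each assumed) plus an FX-8320 (~125 W):
draw = system_draw(gpu_tdp=250, num_gpus=2, cpu_tdp=125)
print(round(draw))  # 906 -> lands right in the suggested 900-1000 W range

# A single card on the same margins:
print(round(system_draw(gpu_tdp=250, num_gpus=1, cpu_tdp=125)))  # 594
```

The single-card estimate also shows why MiladEd's 550 W unit gets marginal once overclocking headroom is factored in.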


----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well,
> 
> I could see going 280X CrossFire, and even considered it myself when I had mine, but here are the variables I would keep in mind...
> 
> If you have less than a decently overclocked i5 (and I am including the FX-8*** in the "less" category), then you want to go with a single card, since dual GPUs require a little more CPU headroom than what AMD brings to the table for the most part. That's not the case in ALL situations, but in most....
> 
> Also, keep in mind the power usage, and make sure you are working with a decent PSU over 800 W - preferably 900-1000 W if you are going to do any overclocking/volting....
> 
> Also consider the titles you play, or plan on playing, and see if they are optimized for CF.
> 
> If you don't fit the bill in all three categories, then I would suggest not using CF, selling the single 280X, and applying the money towards a single-card solution.
> 
> That is exactly what I did, and I am pretty happy.... Even if you go up to a 290 with an overclock, you will still see a 20-30% improvement across the board, and I usually qualify a 20% improvement on any piece of hardware as a good, worthwhile upgrade.


I will say, though, don't sell all FX processors short. I agree if you're talking about the 6000 series or lower, but the 8320 and higher are absolute beasts - they're on the same level as a lot of i5s and i7s.


----------



## Agent Smith1984

Quote:


> Originally Posted by *End3R*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will say though, don't sell all FX processors short. I agree if you're talking the 6000 series or lower, but 8320 and higher are absolute beasts.


If the FX8* is overclocked above 4.6 GHz it does pretty well, but you will see huge performance scaling differences between Intel and AMD when using multiple GPUs.
Single GPU, not so much.....


----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If the FX8* is overclocked above 4.6 GHz it does pretty well, but you will see huge performance scaling differences between Intel and AMD when using multiple GPUs.
> Single GPU, not so much.....


Last month I was thinking of upgrading to a 970 as well; I added the card to my "cart" a couple of times but could never bring myself to pull the trigger, since my 270x can handle all the games out there. I'd be playing them and think: if I upgraded, that 45 fps might turn into 90, but that's only in the most demanding games. There would be little to no difference in most games.

I still kinda want to, but I'm waiting for a game to release that it can't handle.


----------



## Agent Smith1984

Well, it's all relative to your settings....

I mean, I wouldn't dare try to run Crysis 3 on max settings with 4x MSAA on a 270x.
But high settings with no AA is probably doable.....

The 970 would let you have your way with any game you want, but I would advise that your CPU will bottleneck a higher end GPU in some cases if you don't have it overclocked a good bit....
You would pretty much see a 100% performance increase in demanding games with max settings when going to the 970 though.....


----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, it's all relative to your settings....
> 
> I mean, I wouldn't dare try to run Crysis 3 on max settings with 4x MSAA on a 270x.
> But high settings with no AA is probably doable.....
> 
> The 970 would let you have your way with any game you want, but I would advise that your CPU will bottleneck a higher end GPU in some cases if you don't have it overclocked a good bit....
> You would pretty much see a 100% performance increase in demanding games with max settings when going to the 970 though.....


It's more than doable







- keep in mind I'm talking about 1080p. There isn't a game out there I can't run maxed with around 2-4x AA: TheWorse-modded Watch_Dogs, Shadows of Mordor, Lords of the Fallen, Dragon Age: Inquisition, etc. I haven't picked up FC4 or ACU yet, but that's just because I have such a backlog already.


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, it's all relative to your settings....
> 
> *I mean, I wouldn't dare try to run Crysis 3 on max settings with 4x MSAA on a 270x.
> But high settings with no AA is probably doable.....
> *
> The 970 would let you have your way with any game you want, but I would advise that your CPU will bottleneck a higher end GPU in some cases if you don't have it overclocked a good bit....
> You would pretty much see a 100% performance increase in demanding games with max settings when going to the 970 though.....


Lmao, no matter how many times you repeat that... it doesn't seem to stick for some.


----------



## End3R

Quote:


> Originally Posted by *tsm106*
> 
> Lmao, no matter how many times you repeat that... it doesn't seem to stick for some.


If you wanna show up late to the party, read all the posts. He and I agree about this stuff. Haha. Keep believing you need more hardware than you really do.


----------



## Agent Smith1984

Quote:


> Originally Posted by *End3R*
> 
> If you wanna show up late to the party, read all the posts. Him and I agree about this stuff. Haha. Keep believing you need more hardware than you really do.


I agree with some of the things both of you have said.
For the record though, I disagree that 270x is enough for max setting gameplay @ 1080p.

I say that because my son is using a 7950, and I have to keep his settings in check at 1080p for multiplayer, so I know first hand.
I also ran the card on my system to see if there would be differences in performance going from his X4 to my X6, and in the demanding titles it's not even relevant.

I have hands on experience with 7950, 280x, and my 290 on recent games, and I don't see any way possible the 270x could deliver consistent gameplay above 30FPS with titles like Watch Dogs, SOM, C3, etc....
But if you are experiencing satisfactory levels of performance, then more power to you!


----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I agree with some of the things both of you have said.
> For the record though, I disagree that 270x is enough for max setting gameplay @ 1080p.
> 
> I say that because my son is using a 7950, and I have to keep his settings in check at 1080P for multiplayer, so I know first hand.
> I also ran the card on my system to see if there would be differences in performance going from his x4 to my x6, and in the demanding titles, it's not even relevant.
> 
> I have hands on experience with 7950, 280x, and my 290 on recent games, and I don't see any way possible the 270x could deliver consistent gameplay above 30FPS with titles like Watch Dogs, SOM, C3, etc....
> But if you are experiencing satisfactory levels of performance, then more power to you!


Here ya go, straight from fraps: all @1080p

TheWorse Modded Watch dogs with ultra textures
Frames: 5816
Time (ms): 120000
Min: 24
Max: 62
Avg: 48.467

Shadows of Mordor maxed (just using high instead of ultra textures, which, for the record, look exactly the same as ultra; the ultra textures are just uncompressed)
Frames: 6345
Time (ms): 120000
Min: 35
Max: 91
Avg: 52.875

Lords of the Fallen maxed
Frames: 4236
Time (ms): 120000
Min: 19
Max: 45
Avg: 35.300
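Those averages are just total frames divided by elapsed seconds; here's a quick sanity-check sketch (hypothetical script, not anything from FRAPS itself):

```python
# Sanity-check the FRAPS averages above: avg fps = total frames / elapsed seconds
benches = {
    "Watch_Dogs (TheWorse mod)": (5816, 120000),
    "Shadows of Mordor": (6345, 120000),
    "Lords of the Fallen": (4236, 120000),
}
for name, (frames, time_ms) in benches.items():
    avg = frames / (time_ms / 1000.0)
    print(f"{name}: {avg:.3f} avg fps")
```

The printed averages come out to 48.467, 52.875, and 35.300, matching the FRAPS logs above.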


----------



## Agent Smith1984

Looks right in line with this:
http://www.hardocp.com/article/2014/08/11/asus_radeon_r9_270x_directcu_ii_top_video_card_review/4#.VKw41CvF9lY

What I was getting at is that I qualify acceptable FPS as never dipping below 30 FPS, while running no less than 2x AA. And to do that, you need more.
Good to see the 270x putting up the numbers it does, though. You should really look at overclocking your CPU and GPU and see if you can close in on that 30 FPS minimum.


----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks right in line with this:
> http://www.hardocp.com/article/2014/08/11/asus_radeon_r9_270x_directcu_ii_top_video_card_review/4#.VKw41CvF9lY
> 
> I guess what I was getting at, is I qualify acceptable FPS as not dipping below 30FPS, and running no less than 2x AA. And to do that, you need more.
> Good to see the 270x putting up the numbers it does though. You should really look at overclocking your CPU and GPU and see if you can close in on that 30FPS minimum.


I have the same expectations, but I look at avg fps numbers. A hiccup here or there is not a deal breaker. Out of all of my games, Lords of the Fallen is probably the worst performing; I've played Watch Dogs and Shadows of Mordor for hours without the fps ever (noticeably) dropping below 30, smooth gameplay the whole time. Really, it's only when looking at benches that you even notice drops to 25-30. And even Lords of the Fallen plays very smoothly 99% of the time; the avg in that bench was 35.


----------



## Agent Smith1984

Quote:


> Originally Posted by *End3R*
> 
> I have the same expectations, but I look at avg fps numbers. A hiccup here or there is not a deal breaker. Out of all of my games Lords of the Fallen is probably the worst performing, I've played watch dogs and shadows of mordor for hours without the fps ever (noticeably) dropping below 30, smooth gameplay the whole time. Really it's only when looking at benches you even noticed drops to 25-30.


Do you have a firestrike score to post?


----------



## tsm106




----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Do you have a firestrike score to post?


I only have the demo for 3DMark, which won't let you run anything other than their performance bench, not extreme. But I do have Valley.


Spoiler: Warning: Spoiler!


----------



## Agent Smith1984

So what does your firestrike performance scores look like?
I am more curious about the physics and combined score than the graphics score to be honest....


----------



## tsm106

Just to give ppl an idea of the difference in performance, twice the perf... lol.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Just to give ppl an idea of the difference in performance, twice the perf... lol.


Is that on one 79**?

I gotta run Valley when I get home..... I haven't run that bench on any of my cards. Does it see any CPU advantages?


----------



## tsm106

That is one GPU, a 7970 or 280x depending on the BIOS I was running at the time. AMD CPUs are very far behind unfortunately, especially as the GPU count goes up. That run was years ago. If I ran one now it would prolly be even faster.


----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So what does your firestrike performance scores look like?
> I am more curious about the physics and combined score than the graphics score to be honest....


Just ran it, and I wish I could run the extreme one, because this bench is extremely misleading. Without PhysX or xfire a good score in this is not very possible, but it is also not reflective of the performance in any existing game, unless you're forcing PhysX effects, which as we all know forces your CPU to try and handle it, which even the best struggle with.


Spoiler: Warning: Spoiler!






These are my in-game benches again, all with (what I consider) acceptable, smooth avg framerates.
Quote:


> Originally Posted by *End3R*
> 
> Here ya go, straight from fraps: all @1080p
> 
> TheWorse Modded Watch dogs with ultra textures
> Frames: 5816
> Time (ms): 120000
> Min: 24
> Max: 62
> Avg: 48.467
> 
> Shadows of Mordor maxed (just using high instead of ultra textures, which for the record, look the exact same as ultra, the ultra are just uncompressed)
> Frames: 6345
> Time (ms): 120000
> Min: 35
> Max: 91
> Avg: 52.875
> 
> Lords of the Fallen maxed
> Frames: 4236
> Time (ms): 120000
> Min: 19
> Max: 45
> Avg: 35.300






And very nice score tsm


----------



## tsm106

Quote:


> Originally Posted by *End3R*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> So what does your firestrike performance scores look like?
> I am more curious about the physics and combined score than the graphics score to be honest....
> 
> 
> 
> Just ran it, and I wish I could run the extreme one because this bench is extremely misleading . *Without physx or xfire a good score in this is not very possible*, but it is also not reflective of the performance in any existing game, unless you're forcing phsyx effects which as we all know forces your cpu to try and handle it, which even the best struggle with.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> These are my in-game benches again, all with (what I consider) acceptable, smooth avg framerates.
> Quote:
> 
> 
> 
> Originally Posted by *End3R*
> 
> Here ya go, straight from fraps: all @1080p
> 
> TheWorse Modded Watch dogs with ultra textures
> Frames: 5816
> Time (ms): 120000
> Min: 24
> Max: 62
> Avg: 48.467
> 
> Shadows of Mordor maxed (just using high instead of ultra textures, which for the record, look the exact same as ultra, the ultra are just uncompressed)
> Frames: 6345
> Time (ms): 120000
> Min: 35
> Max: 91
> Avg: 52.875
> 
> Lords of the Fallen maxed
> Frames: 4236
> Time (ms): 120000
> Min: 19
> Max: 45
> Avg: 35.300
> 
> Click to expand...
> 
> 
> 
> 
> 
> And very nice score tsm
Click to expand...

There is no PhysX in this benchmark. Btw, I can break a 9K pscore or over a 10K gscore in this bench. My specific card has broken the 10K pscore wall and 11K gscore on water.

http://www.3dmark.com/fs/2720459


----------



## Agent Smith1984

Well, benchmarks are better for measuring your own performance gains from overclocking than for comparing performance against other people, in my opinion.

Firestrike is actually a good benchmark because it pushes the GPU very well independently from the rest of your system, but also does a good job of measuring CPU performance independently.

Crossfire scales 70-100% on Firestrike depending on your CPU, so it does help a lot, but Firestrike does not use physx at all. The "physics" test uses your CPU only.
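To put that 70-100% scaling range in numbers (the single-card score here is a made-up example, not anyone's actual result):

```python
# Illustrate 70-100% Crossfire scaling on top of a single-card score.
# single_gscore is a hypothetical Firestrike graphics score, not a real run.
single_gscore = 4000
low = single_gscore * (1 + 0.70)   # +70% scaling (CPU-limited case)
high = single_gscore * (1 + 1.00)  # +100% scaling (ideal case)
print(f"Expected crossfire graphics score: {low:.0f} to {high:.0f}")
```

So a second card roughly lands you between 1.7x and 2x of whatever your single-card graphics score is, depending on how much your CPU holds it back.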

Your score is decent for that setup though. You should really consider overclocking everything... you could see something closer to this:
http://www.3dmark.com/fs/2136509

Good luck!


----------



## End3R

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, benchmarks are better for measuring your own performance gains from overclocking than they are comparing performance between other people in my opinion.
> 
> Firestrike is actually a good benchmark because it pushes the GPU very well independently from the rest of your system, but also does a good job of measuring CPU performance independently.
> 
> Crossfire scales 70-100% on Firestrike depending on your CPU, so it does help a lot, but Firestrike does not use physx at all. The "physics" test uses your CPU only.
> 
> Your score is decent for that setup though. You should really consider overclocking everything,..... you could see something closer to this:
> http://www.3dmark.com/fs/2136509
> 
> Good luck!


Yeah, I know I could overclock to get higher benches, but I'm all about gameplay. Even a slight OC would give me better performance all around, but until my card is struggling to keep avg fps over 30 with games maxed out, I don't see the need.

I think by the time they release a game capable of doing that, it'd be worth switching to a 970 just to get the extra 2 GB of memory.

Edit: and sorry if I worded the PhysX stuff weird. I know there isn't PhysX in Firestrike, but I'm pretty sure NVIDIA cards will use their PhysX tech to render that before offloading it to the CPU, which AMD cards can't do. But that could just be a misconception; if anyone with an 8320 and a 760 gets a near-identical score to me, that would answer it. - Found out you can search for exact setups, hah; looks like it is comparable, so they probably aren't getting much of a boost because of PhysX. http://www.3dmark.com/fs/2102182


----------



## diggiddi

Quote:


> Originally Posted by *End3R*
> 
> It's more than doable
> 
> 
> 
> 
> 
> 
> 
> - keep in mind I'm talking about @1080p. There isn't a game out there *I can't run maxed with around 2-4x AA,* TheWorse modded Watch_Dogs, Shadows of Mordor, Lord of the Fallen, Dragon Age Inquisition, etc etc - I haven't picked up FC4 or ACU yet but that's just because I have such a backlog already


That is not maxed; learn the meaning of the word.
e.g. 2. To reach a point from which no additional growth, improvement, or benefit is possible

I like to run at full AA if possible. I can see the difference in C3 between MSAA 4x and 8x, and no, 4x is not maxing the game out, 8x is, which a 7950 cannot do. So please, enough with the "270x can handle everything" spiel. Stahpp IT! I believe at 1080p you most likely need something like an overclocked 290 not to dip below 40fps at any point in the game, but deffo not a 270x as you keep insisting.


----------



## End3R

Quote:


> Originally Posted by *diggiddi*
> 
> That is not maxed, learn the meaning of the word
> eg 2. To reach a point from which no additional growth, improvement, or benefit is possible
> 
> I like to run at full AA if possible, I can see the difference in C3 between MsAA 4x and 8x, and no, 4x is not maxing the game out, 8x is, which a 7950 cannot do so, please enough with the 270x can handle everything spiel. Stahpp IT! I believe At 1080p you most likely need something like an overclocked 290 not to dip below 40fps at any point in the game but deffo not a 270x as you keep insisting


You clearly have no idea what AA is actually doing then. All it does is smooth out jagged edges. Once you stop getting jagged edges, increasing AA does nothing but hurt performance, so yes if all your other settings are maxed, then it's still "maxed" it doesn't have to be full AA.


----------



## diggiddi

Once again, if you max something then it cannot be improved upon; if there is any improvement to be had, no matter how small, then it is not maxed out.
That's like saying I maxed out my car at 105 mph when the car can still get to 110, and no, it doesn't matter if it needs 2x as much fuel to attain that speed.
Also, it does not only smooth jaggies; the water effects are more realistic when you increase the AA, which improves the IQ. I will repeat again: 4x is still not the max IQ setting, despite your re-definition of the term, and unfortunately even the 7950 struggles sometimes at 4x, therefore your 7870 can do no better.


----------



## End3R

Ok, I'm going to let someone else tell you you're wrong and spare a longwinded explanation, so here is a short one.

Yes, you are correct about the definition of the word, but the effect increasing AA has on a game is subjective to the individual person, and once they stop noticing jagged edges, they will notice only a fractional improvement, if any at all. That being said, you do not have to "max" out AA for the rest of your settings to be called "maxed" when they are. But keep telling yourself whatever you need to, to make yourself feel better. I'll be off playing Shadows of Mordor at 60 fps.


----------



## tsm106

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *End3R*
> 
> *I could be wrong...*
> 
> 
> 
> Shakes magic 8 ball, yes. There you go!
Click to expand...

It's like we've come full circle.


----------



## End3R

Quote:


> Originally Posted by *tsm106*
> 
> It's like we've come full circle.


At least diggi tries to back up what they say rather than tossing insults. Yesterday you asked me what I wanted people to learn: I want you to learn how to contribute to a discussion in a non-destructive manner. Get back to me when you can do that. Until then, enjoy some Shadows of Mordor screenshots I took while @60fps









Spoiler: Warning: Spoiler!

We can split hairs all day about something not having the sliders in every category maxed out. But those look pretty maxed out to me.


----------



## tsm106

Seriously, someone needs to google what clueless means, and probably delusional for that matter.


----------



## diggiddi

Quote:


> Originally Posted by *End3R*
> 
> Ok, I'm going to let someone else tell you you're wrong and spare a longwinded explanation, so here is a short one.
> 
> Yes, you are correct about the definition of the word, but the effect increasing *AA has on a game is subjective to the individual person, and once they stop noticing jagged edges, they will notice only a fraction improvement, if any at all.* That being said, you do not have to "max" out AA for the rest of your settings to be called "maxed" when they are. But keep telling yourself whatever you need to, to make yourself feel better. I'll be off playing Shadows of Mordor at 60 fps.


You know what, I'm done with you. You evidently subscribe to the groupthink view that AA does not improve IQ; all your arguments are subjective, you have not produced any facts to support your point, but you still want to impose your standard on me. I'm telling you I can see a difference, especially in the water effects in C3; maybe your eyes are bad.
There is a distinct difference between 8x and 4x AA, that is why the devs put it in there. SMAA 4x actually does the water effects very well but is not as good with the grass and other stuff.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *End3R*
> 
> At least diggi tries to backup what they say other than tossing insults. Yesterday you asked me what I wanted people to learn, I want you to learn how to contribute to a discussion on a non destructive manner. Get back to me when you can do that, until then, enjoy some Shadows of Mordor screenshots I took while @60fps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> We can split hairs all day about something not having the sliders in every category maxed out. But those look pretty maxed out to me.


So you show pictures of full CGI and not in-game. CONGRATS









Here.. let me pull a few on you, since you think no one here is educated..
Quote:


> This offset is smaller than the actual size of the pixels. By averaging all these samples, the result is a smoother transition of the colors at the edges. Unlike supersampling (SSAA) which can result in the same pixel being shaded multiple times per pixel, multisampling runs the fragment program just once per pixel rasterized. However with MSAA multiple depth/stencil comparisons are performed per pixel, one for each of the subsamples, which gives you sub-pixel spatial precision on your geometry and nice, smoothed edges on your polygons.


source - for, you know, the people that actually develop it

So this does allow further effects such as water and terrain to look better: with more GPU performance and more samples rendered per pixel, you get even more accurate polygon shapes.


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> So you show pictures of full CGI and not in game CONGRATS


All of those are gameplay... it has a photo mode... you know that right? Nice try though.

I also never said increasing AA isn't improving the image, I said it's subjective to the person. And when playing @1080p most people won't notice a difference going higher than 4x. All I said is once you stop noticing the improvement, increasing it beyond that is just hurting performance.

I don't get why everyone is being so negative. I'm not attacking anyone, I'm not saying you SHOULDN'T go higher than 4x, I'm just saying in most cases you don't need to. And if you do, the improvement is usually negligible (again, this is assuming we're talking about 1080p)

If you don't believe me: you JUST called my gameplay screenshots "full CGI" because you thought they were TOO GOOD to be gameplay. They were all taken with my 270x while getting 60fps.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *End3R*
> 
> All of those are gameplay... it has a photo mode... you know that right? Nice try though.


Look at you ignoring the rebuttal that proved you wrong. And that also renders in CGI; you know, 'cause Instagram photos are ALL NATURAL LIGHTING too, right?


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Look at you ignoring the rebuttal that proved you wrong, and that also renders in CGI, you know cause instagram photos are ALL NATURAL LIGHTING too right?


Re-read what I said. Proved me wrong how? Never, not once anywhere have I said that increasing your AA is literally doing nothing. I've been saying that once YOU the specific person looking at the image stops noticing a difference, continuing to increase it is only hurting your performance.

And yes, the entire game is a computer generated image, but fact remains those are all gameplay screenshots, you just press a button to freeze everything going on so you can change the camera angle etc.

But ok, feel free thinking that isn't gameplay.

Go ask anyone else that plays SoM if those graphics aren't gameplay lol.

http://shadowofmordor.wikia.com/wiki/How_to:_Capture_images_and_share_from_Photo_Mode


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *End3R*
> 
> Re-read what I said. Proved me wrong how? Never, not once anywhere have I said that increasing your AA is literally doing nothing. I've been saying that once YOU the specific person looking at the image stops noticing a difference, continuing to increase it is only hurting your performance.
> 
> And yes, the entire game is a computer generated image, but fact remains those are all gameplay screenshots, you just press a button to freeze everything going on so you can change the camera angle etc.
> 
> But ok, feel free thinking that isn't gameplay.
> 
> Go ask anyone else that plays SoM if those graphics aren't gameplay lol.
> 
> http://shadowofmordor.wikia.com/wiki/How_to:_Capture_images_and_share_from_Photo_Mode


Like I said, Instagram is a true example of lighting naturally occurring.........


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> like I said instagram is a true example of the lighting naturally occuring.........


I don't even know what you're trying to say. What does instagram have to do with anything?

Whether I'm in photo mode or playing the game, it's 60fps, and the graphics don't magically change when entering photo mode; it just pauses the game. Any alterations to the lighting via filters are just added post-processing and not something you have to do. It's still all gameplay graphics.

It's not a pre-rendered cgi cutscene like you're trying to claim.

I'm at work right now but if I have to I'll upload a video when I get home from work showing you exactly what my gameplay is like.


----------



## F3ERS 2 ASH3S

So I noticed that I am getting 30-45 FPS in Dragon Age: Inquisition. Anyone else notice this with similar hardware? Me: 2x 280x


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> So I noticed that I am getting 30-45FPS in Dragon Age inquisition, anyone else notice with similar hardware? me 2x 280x


I get better than that with a single 270x with ultra settings and 4x MSAA. I usually average 50fps - but I'm also at 1080p; I dunno what res you're using. If it's the same, I bet it's xfire causing the issue.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *End3R*
> 
> I get better than that with a single 270x with ultra settings and 4x MSAA. I usually average 50fps.


I should add, on full top graphics (meaning yes, all the settings maxed). I switched it to Mantle and saw it; need to switch back to DX11 to see if I notice any difference.


----------



## End3R

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> I should add, on full top graphics (meaning yes, all the settings maxed). I switched it to Mantle and saw it; need to switch back to DX11 to see if I notice any difference.


My settings are on ultra with 4x AA @1080p. I'd still suggest disabling xfire.


----------



## buttface420

I just got my second 280x in today... I'm trying to find which games support it.

So far in Black Ops 2 I went from 115-130 fps with one 280x to 150-200 fps with 2x 280x.

In Battlefield 4 I didn't see much improvement, but I'm mostly around 100-135 fps when before I was 80-120 with lows of 60.

Single 280x Unigine Valley 1.0 score:
1808

280x crossfire Unigine Valley 1.0 score:
2711

I admit I was hoping for better performance; this is with an FX-8350 @ 4.5 GHz.
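For what it's worth, those two Valley scores work out to about 50% scaling, not a third (quick check on the numbers above):

```python
# Crossfire scaling from the Valley scores above: 1808 single vs 2711 xfire
single, xfire = 1808, 2711
scaling_pct = (xfire - single) / single * 100
print(f"Crossfire gain: {scaling_pct:.0f}%")  # about 50%
```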


----------



## gnemelf

Crossfire is like a one-third improvement, I believe.


----------



## diggiddi

Try for 4.6 GHz on the CPU and see if that makes a difference.


----------



## xLPGx

I'm sitting here with a spare ~$100. The Kraken G10 and the H55 together cost slightly less than that.
Worth it? My Gigabyte 280x rev 1 isn't noisy, but still..


----------



## JackCY

Put it toward the fund for a better GPU


----------



## BruceB

Quote:


> Originally Posted by *buttface420*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> i just got my second 280x in today...im trying to find what games support it.....
> 
> so far black ops 2 i went from 115-130 fps with one 280x to 150-200fps with 2x 280x
> 
> in battlefield 4, i didnt see much improvement but im mostly around 100-135fps when before i was 80-120 with lows of 60.
> 
> single 280x unigine valley 1.0 score:
> 1808
> 
> 280x crossfire unigine valley 1.0 score
> 2711
> 
> 
> i admit i was hoping for better performance, this is with a fx-8350 @ 4.5 ghz


Quote:


> Originally Posted by *gnemelf*
> 
> cross fire is like a 1 third improvement I believe.


Crossfire support/performance is almost entirely dependent on which game you're looking at; it's very variable. Check xfire benchmarks of the games you play to see if there's any real improvement (I'm sure you did this already though).

Quote:


> Originally Posted by *diggiddi*
> 
> try for 4.6 ghz on the CPU see if that makes a difference


^^That's a good point. Two 280X's in xfire is _a lot_ of graphics power; your CPU might be bottlenecking (135 fps in BF4 is gonna be hard on any CPU).


----------



## gnemelf

Well yeah, obviously it has to be a crossfire-compatible game. I'm just saying, out of all the games that I play that are crossfire capable, it seems to be a 1/3 improvement... The person asked for input and I gave that...


----------



## rdr09

Quote:


> Originally Posted by *buttface420*
> 
> i just got my second 280x in today...im trying to find what games support it.....
> 
> so far black ops 2 i went from 115-130 fps with one 280x to 150-200fps with 2x 280x
> 
> in battlefield 4, i didnt see much improvement but im mostly around 100-135fps when before i was 80-120 with lows of 60.
> 
> single 280x unigine valley 1.0 score:
> 1808
> 
> 280x crossfire unigine valley 1.0 score
> 2711
> 
> i admit i was hoping for better performance, this is with a fx-8350 @ 4.5 ghz


Something might be wrong with the setup. My crossfire 7970/7950 scored 3292 in Valley with the 7950 OC'ed to 1000 core and the 7970 at stock.


----------



## aaronsta1

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> So I noticed that I am getting 30-45FPS in Dragon Age inquisition, anyone else notice with similar hardware? me 2x 280x


DA:I is a horrible port... lots of bugs, and not really compatible with xfire. It works, but not so well.

I myself get around 80 FPS with my dual 7870 XTs, but it crashes a lot and textures flash.

Even though I take a performance hit, I've been playing with xfire off; it's smoother.


----------



## Devildog83

Quote:


> Originally Posted by *buttface420*
> 
> i just got my second 280x in today...im trying to find what games support it.....
> 
> so far black ops 2 i went from 115-130 fps with one 280x to 150-200fps with 2x 280x
> 
> in battlefield 4, i didnt see much improvement but im mostly around 100-135fps when before i was 80-120 with lows of 60.
> 
> single 280x unigine valley 1.0 score:
> 1808
> 
> 280x crossfire unigine valley 1.0 score
> 2711
> 
> i admit i was hoping for better performance, this is with a fx-8350 @ 4.5 ghz


The CPU is almost a non-issue in Valley. I got this score with a dual-core Pentium G3258 @ 4.3 GHz and a single R9 290, and got close to the same with an 8350 @ 4.8 and a 7870/R9 270x in X-fire. Not quite, but almost.


----------



## tsm106

CPU can make a huge difference. Those two are similar because the Pentium's IPC is stupid high overclocked, allowing it to match many other CPUs. Not sure you realize it, but you're making AMD CPUs look bad.


----------



## Devildog83

Quote:


> Originally Posted by *tsm106*
> 
> CPU can make a huge difference. Those two are similar because the Pentium IPC is stupid high overclocked allowing it to match many other cpus. Not sure you realize it but you making AMD cpus look bad.


It does make a difference, just not that much in Valley. I didn't mean to make AMD look bad.

I love my 8350. In other benches like Firestrike you can really tell the difference between a dual core and an 8-core.


----------



## tsm106

It gets progressively worse the more you add cards unfortunately. Maybe this year when Firestrike's new Mantle support comes online, things will get shaken up.


----------



## Devildog83

Quote:


> Originally Posted by *tsm106*
> 
> It gets progressively worse the more you add cards unfortunately. Maybe this year when Firestrike's new Mantle support comes online, things will get shaken up.


I have old Firestrike scores with 1 GPU and Crossfire, my old Devils

Single


X-Fire


If you look strictly at the graphics score it's almost exactly double.

I wish I could find my Firestrike score with the 290 and 8350


----------



## buttface420

Quote:


> Originally Posted by *rdr09*
> 
> something might be wrong with the setup. my crossfire 7970/7950 scored 3292 in Valley with the 7950 oc'ed to 1000 core and the 7970 at stock.


i wonder what could be wrong? i have:
990fx motherboard that runs both cards at x16
omega drivers
win 8.1

everything is hooked up right.
i only have a 750w bronze power supply, could it be underpowered or something?

guess i got a lot of playing around to do.


----------



## buttface420

Quote:


> Originally Posted by *Devildog83*
> 
> I have old Firestrike scores with 1 GPU and Crossfire, my old Devils
> 
> Single
> 
> 
> X-Fire
> 
> 
> If you look strictly at the graphics score it's almost exactly double.
> 
> I wish I could find my Firestrike score with the 290 and 8350


this is my firestrike (not extreme i used the free version)

http://www.3dmark.com/3dm/5427019

i still cant figure out why my scores are so low


----------



## rdr09

Quote:


> Originally Posted by *buttface420*
> 
> this is my firestrike (not extreme i used the free version)
> 
> http://www.3dmark.com/3dm/5427019
> 
> i still cant figure out why my scores are so low


a highly oc'ed 290 will top out around 14K in FS GScore. your 280Xs are already at almost 17K. it could be your psu. i only had a 700W back when i had the 7900s crossfired and couldn't oc both, just the 7950 to match the 7970. along with the 8300-series chip, you might be asking too much from the 750.

in Valley, there is a tweak you can do to make sure you use the full potential of 2 7900-series cards: you have to set optimized 1X1 in CCC. your cards are running fine.

edit: in BF4 MP, lower your MSAA. i suggest using just one app to monitor temps. i recommend HWiNFO64.

monitor your temps like this in benches and games. let it run in the background and check after . . .



that's BF4 MP


----------



## buttface420

okay, i found the valley benchmark tweak thread. i turned frame pacing "off" for xfire.. it brought my score up more, so maybe if i keep playing with it, it will get better.

are we supposed to have frame pacing on for xfire?

before score: 2711

after turning off frame pacing:


----------



## JackCY

My 280x doesn't like Package C-state C7: there is always a chance, and not a small one, that when it wakes up after being idle for a long time, the monitor either doesn't even light up or the image has pattern artifacts. That is with the OC already tuned down to 1100/1750MHz @ 1.2V (min 1.13V real). C6 seems to work fine so far. C7, nope; I just tried switching to it and got artifacts again today. Maybe it's due to the OC, maybe my card simply doesn't like C7.


----------



## Catscratch

Quote:


> Originally Posted by *buttface420*
> 
> i just got my second 280x in today...im trying to find what games support it.....
> 
> so far black ops 2 i went from 115-130 fps with one 280x to 150-200fps with 2x 280x
> 
> in battlefield 4, i didnt see much improvement but im mostly around 100-135fps when before i was 80-120 with lows of 60.
> 
> single 280x unigine valley 1.0 score:
> 1808
> 
> 280x crossfire unigine valley 1.0 score
> 2711
> 
> i admit i was hoping for better performance, this is with a fx-8350 @ 4.5 ghz


Got a brand new Sapphire 280x 11221-22, and it turns out we got almost the same single-card performance.



Stock card 1020/1500 with a 4ghz 2500k and 1866 c9-10-9 28-2t ripjaws.


----------



## Devildog83

Quote:


> Originally Posted by *buttface420*
> 
> this is my firestrike (not extreme i used the free version)
> 
> http://www.3dmark.com/3dm/5427019
> 
> i still cant figure out why my scores are so low


I don't think your scores are that low really. With your hardware that's not bad at all. Just my opinion though.


----------



## aaronsta1

Quote:


> Originally Posted by *buttface420*
> 
> this is my firestrike (not extreme i used the free version)
> 
> http://www.3dmark.com/3dm/5427019
> 
> i still cant figure out why my scores are so low


your couple of 280xs are faster than my 7870XTs









http://www.3dmark.com/fs/3528574

now i think my results are low, lol, why can't i hit 10k


----------



## gnemelf

http://www.3dmark.com/fs/3717806


----------



## JackCY

You have to look at the graphics score, not the overall score, to even begin comparing GPUs. The overall score also counts the other sub-scores, which are far more CPU-limited.

I don't like 3DMark personally; it used to be great back in the day, but the recent versions not so much.
I prefer game benchmarks, or at least Heaven. Dunno why you guys run Valley? It's a bit dated now.
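For anyone wondering why a weak combined score drags the total down so hard: 3DMark builds the overall number from a weighted harmonic mean of the sub-scores, so the overall tilts toward the lowest one. A minimal sketch in Python, assuming the approximate Fire Strike weights from 3DMark's public technical guide (the exact weights may differ by version):

```python
# Sketch of why the overall 3DMark score tracks the weakest sub-score.
# Fire Strike combines its sub-scores with a weighted harmonic mean, so a
# low combined (CPU+GPU) score drags the total down disproportionately.
# The weights below are approximations from the public technical guide,
# and the sample scores are hypothetical, not taken from this thread.

def fire_strike_overall(graphics, physics, combined,
                        w_g=0.75, w_p=0.15, w_c=0.10):
    """Weighted harmonic mean of the three sub-scores."""
    total_w = w_g + w_p + w_c
    return total_w / (w_g / graphics + w_p / physics + w_c / combined)

# Same graphics and physics scores, but a halved combined score:
print(round(fire_strike_overall(17000, 9000, 8000)))
print(round(fire_strike_overall(17000, 9000, 4000)))
```

Halving the hypothetical combined score costs roughly 2,000 points overall even though the graphics score never moved, which matches the pattern people are seeing with the FX chips.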


----------



## aaronsta1

Quote:


> Originally Posted by *JackCY*
> 
> You have to look at the graphics score not the overall score to even begin comparing GPUs. The overall counts in the other scores as well which are CPU limited way more.
> 
> I don't like 3DMark personally, it used to be great back in the day but the recent versions are not so much.
> I prefer game benchmarks or at least Heaven. Dunno why you guys run Valley? It's a bit dated now.


unigine is NOT good benchmark software..
you can change your settings in your CP and boost its score. it's like cheating.. plus the results are in plain html, i can put anything i want in there.. lol.


----------



## buttface420

uh oh..new high score for me with firestrike!

i reached 11000+

http://www.3dmark.com/3dm/5486852


----------



## JackCY

Quote:


> Originally Posted by *aaronsta1*
> 
> unigine is NOT a good benchmark software..
> you can change your settings in your CP and boost its score. its like cheating.. plus the results are in plain html, i can put anything i want in there.. lol.


You can always change things in your GPU configuration for anything you run, even 3DMark; I don't know what you would change for Heaven, though. Maybe set the preference to performance instead of quality (that used to be the one option for changing preferences in the processing), or try forcibly turning features off.
If you want to cheat, you can cheat anything, even online-reported scores, haha.


----------



## rdr09

Quote:


> Originally Posted by *buttface420*
> 
> uh oh..new high score for me with firestrike!
> 
> i reached 11000+
> 
> http://www.3dmark.com/3dm/5486852


very nice cards. they're only 15% off 2 stock 290s.


----------



## Agent Smith1984

Quote:


> Originally Posted by *aaronsta1*
> 
> your couple of 280xs are faster then my 7870XTs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/3528574
> 
> now i think my results are low, lol, why cant i hit 10k


No offense, but it's your CPU....

If you look, you have a great graphics score and a really good physics score for an AMD, but your combined score is low. That's the result of the Vishera using 4 modules with 8 semi-cores versus having 8 actual cores.
For some reason the combined test only benefits from physical cores in combination with the GPU.....

Look at my run with a single 290....
http://www.3dmark.com/fs/3654067
Your graphics and physics scores are both higher than mine, but I break over the 10k mark with my combined score.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No offense, but it's your CPU....
> 
> If you look, you have a great graphics and a really good physics score for an AMD, but your combined score is low. That's the result of the vishera using 4 modules with 8 semi-cores, versus having 8 actual cores.
> For some reason the combined test only benefits from physical cores in combination with the GPU.....
> 
> Look at my run with a single 290....
> http://www.3dmark.com/fs/3654067
> Your graphics and physics scores are both higher than mine, but I break over the 10k mark with my combined score.


His point will be proven here

http://www.3dmark.com/fs/2907439


----------



## buttface420

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> His point will be proven here
> 
> http://www.3dmark.com/fs/2907439


nice!

i do notice the higher the cpu overclock, the higher the score. unfortunately i'm maxed out on my cpu oc due to cooling... but my gpus are still stock, so maybe with an oc i could get a lil higher, but not as high as your score!


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *buttface420*
> 
> nice!
> 
> i do notice the higher the overclock of cpu the higher the score,unfortunately im am maxed on my cpu oc due to cooling...but my gpus are still stock maybe with a oc i could get a lil higher but not as high as your score!


graphics score you have 1k more.. not sure why that is hrmmmm


----------



## Recr3ational

this is mine

http://www.3dmark.com/3dm/5496178?


----------



## tsm106

Quote:


> Originally Posted by *Recr3ational*
> 
> this is mine
> 
> http://www.3dmark.com/3dm/5496178?


Intel setups are on a whole different level. Here's mine on old drivers.

http://www.3dmark.com/fs/2519782


----------



## Recr3ational

Quote:


> Originally Posted by *tsm106*
> 
> Intel setups are on a whole different level. Here's mine on old drivers.
> 
> http://www.3dmark.com/fs/2519782


That hexa doe.


----------



## tsm106

But you get FS Ivy boost on gscore.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Recr3ational*
> 
> this is mine
> 
> http://www.3dmark.com/3dm/5496178?


now I really need to see why my graphics score is so low hrmmmm


----------



## Recr3ational

Quote:


> Originally Posted by *tsm106*
> 
> But you get FS Ivy boost on gscore.


Your clocks are ridiculous. That's some increase. Nice work.


----------



## tsm106

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Recr3ational*
> 
> this is mine
> 
> http://www.3dmark.com/3dm/5496178?
> 
> 
> 
> now I really need to see why my graphics score is so low hrmmmm

Your gscore isn't bad though. Rec is running a bit higher clocks and he's on an Ivy chip which gets a slight gscore bias in FS.

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> But you get FS Ivy boost on gscore.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your clocks are ridiculous. That's some increase. Nice work.

Thanks man. One card from that duo actually hit 1400 core on low ambient when I loaned it to a friend.

http://www.3dmark.com/3dm/4282003


----------



## kamil234

asus 280x here







1050/1600


----------



## Recr3ational

Quote:


> Originally Posted by *tsm106*
> 
> Your gscore isn't bad though. Rec is running a bit higher clocks and he's on an Ivy chip which gets a slight gscore bias in FS.
> Thanks man. One card from that duo actually hit 1400 core on low ambient when I loaned it to a friend.
> 
> http://www.3dmark.com/3dm/4282003


What voltage was that running?


----------



## buttface420

Quote:


> Originally Posted by *Recr3ational*
> 
> this is mine
> 
> http://www.3dmark.com/3dm/5496178?


oh wow nice score!!!
Quote:


> Originally Posted by *tsm106*
> 
> Intel setups are on a whole different level. Here's mine on old drivers.
> 
> http://www.3dmark.com/fs/2519782


holy crappoly, i can't believe they are capable of a score that high! maybe i do need to switch to intel, jezus


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *buttface420*
> 
> oh wow nice score!!!
> holy crappoly i cant believe they are capable for a score that high! maybe i do need to switch to intel jezus


as stated above, it's the cpu boost lol, but outside of benchmarks switching really doesn't gain you that much. def not worth the cost unless you can reclaim what you spent on the AMD

Before there is hate, I will add: this is just from what I have seen. we all know, yada yada


----------



## tsm106

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Your gscore isn't bad though. Rec is running a bit higher clocks and he's on an Ivy chip which gets a slight gscore bias in FS.
> Thanks man. One card from that duo actually hit 1400 core on low ambient when I loaned it to a friend.
> 
> http://www.3dmark.com/3dm/4282003
> 
> 
> 
> What voltage was that running?

That was at the max of 1.38v

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Quote:
> 
> 
> 
> Originally Posted by *buttface420*
> 
> oh wow nice score!!!
> holy crappoly i cant believe they are capable for a score that high! maybe i do need to switch to intel jezus
> 
> 
> 
> as stated above, its the cpu boost lol, but outside of benchmarks it really is not that much of a boost switch. *def not worth the cost unless you can reclaim the cost that you spent for the AMD*
> 
> Before there is hate, I will add from what I have seen, in addition. we all know yada yada

That's pretty subjective there. AMD CPUs have terrible IPC, and we live in a software age where IPC is king because developers are either too lazy or too incompetent to utilize multithreading well.
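To make the IPC point concrete: single-thread performance is roughly IPC times clock, so a higher-IPC chip can win even at a much lower clock. A toy sketch with purely illustrative numbers (assumptions for the sake of the example, not measurements of any real CPU):

```python
# Rough single-thread performance model: perf ≈ IPC × clock.
# Both chips below are hypothetical; the point is only that a large
# IPC deficit can't be made up with a moderately higher clock.

def single_thread_perf(ipc, clock_ghz):
    # Instructions retired per second, in arbitrary units.
    return ipc * clock_ghz

high_ipc = single_thread_perf(ipc=2.0, clock_ghz=3.5)    # 7.0 units
high_clock = single_thread_perf(ipc=1.3, clock_ghz=4.8)  # 6.24 units
print(high_ipc > high_clock)
```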


----------



## Recr3ational

Quote:


> Originally Posted by *tsm106*
> 
> That was at the max of 1.38v
> That's pretty subjective there. AMD cpus have terrible IPC and we live in a software age where IPC is king because developers are either too lazy or incompetent to utilize multi threading well.


wait, 1.38?
What were you using to achieve that?


----------



## tsm106

Quote:


> Originally Posted by *Recr3ational*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That was at the max of 1.38v
> That's pretty subjective there. AMD cpus have terrible IPC and we live in a software age where IPC is king because developers are either too lazy or incompetent to utilize multi threading well.
> 
> 
> 
> wait 1.38?
> What was you using to achieve that?

If you have a reference card or one that uses the reference VR controller you can use this version of trixx bottom of 2nd post.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *tsm106*
> 
> That was at the max of 1.38v
> That's pretty subjective there. AMD cpus have terrible IPC and we live in a software age where IPC is king because developers are either too lazy or incompetent to utilize multi threading well.


You are correct, and I should have been more tactful with what I was saying. But if you are selective enough about which programs you run, the line blurs quite a bit..


----------



## Sky-way

Took the plunge a couple weeks ago and bought an msi r9 270x 2gb gaming. Very impressed with the build quality, everything installed correctly and is working perfectly. I don't have a lot of games right now, mostly free to play stuff from steam and starter editions of the blizzard games but it plays all those flawlessly on ultra without hesitation. I'm glad I went with msi, top notch stuff. Also, in case anyone cares, I got it for $170 on newegg.


----------



## JackCY

Quote:


> Originally Posted by *tsm106*
> 
> That was at the max of 1.38v
> That's pretty subjective there. AMD cpus have terrible IPC and we live in a software age where IPC is king because developers are either too lazy or incompetent to utilize multi threading well.


Not everything can be parallelized, and sure, some things should be, but will someone pay for it? I guess not yet.
Plus Intel has its own very good compiler. Does AMD have a compiler tailored to their CPUs? I've never seen one.

AMD kept boosting clocks and adding more cores instead of doing more work per clock. This is what happens. Neither have they gotten their hands on better process tech yet. Still putting out 28nm... ok, my bad, not even 28nm but 32nm. That's nuts.
They had really better switch to 14nm or lower, or whatever others already have, and skip anything they've "missed" by now.
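The "not everything can be parallelized" point is exactly Amdahl's law: if only a fraction p of a workload runs in parallel, n cores give a speedup of 1 / ((1 - p) + p / n). A quick sketch with illustrative fractions (assumptions, not measurements of any real game):

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work is parallelizable. The serial remainder (1 - p) caps the gain,
# which is why per-core IPC still matters so much in games.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even with 8 cores, a half-serial workload barely doubles in speed:
print(amdahl_speedup(0.5, 8))   # ~1.78x
print(amdahl_speedup(0.95, 8))  # ~5.93x
```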
Quote:


> Originally Posted by *tsm106*
> 
> Your gscore isn't bad though. Rec is running a bit higher clocks and he's on an Ivy chip which gets a slight gscore bias in FS.
> Thanks man. One card from that duo actually hit 1400 core on low ambient when I loaned it to a friend.
> 
> http://www.3dmark.com/3dm/4282003


That's some gold mine card. What's your ASIC? Very high, I guess.
Mine is so low the card almost doesn't clock, at least not without running into glitches/artifacts.

The power consumption must be insane, though, with anything above 1.2GHz and the voltage shooting north of 1.3V.


----------



## tsm106

The ASIC on that card, and really on all my 7970s, especially the quad array, is around 75%. I found that for my purposes (watercooling), 75% gave me the best voltage scaling up to 1.38v.


----------



## Devildog83

Quote:


> Originally Posted by *aaronsta1*
> 
> your couple of 280xs are faster then my 7870XTs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/3528574
> 
> now i think my results are low, lol, why cant i hit 10k


FX processors don't do well in Firestrike's overall score; the combined score seems to kill it. Very nice graphics score, though, for 2 x 7870's.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Devildog83*
> 
> For an overall score FX processors don't do well in firestrike. The combined score seems to kill it. Very nice graphics score though for 2 x 7870's


I wonder why that is. in 3dmark11 the combined score was relative to the physics and actual graphics.. seems Firestrike is a bit different, but not sure how


----------



## Devildog83

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> graphics score you have 1k more.. not sure why that is hrmmmm


My 7870/270x used to be the same way, but the combined score and overall score sucked. Never could figure out why. Maybe the CPU overclock, or the high GPU overclock, tapped out the CPU's performance.

Firestrike really likes Intel though.


----------



## Devildog83

Quote:


> Originally Posted by *kamil234*
> 
> asus 280x here
> 
> 
> 
> 
> 
> 
> 
> 1050/1600


You are in. Do you have a pic for all to see?


----------



## jumpy2219

I got a Reference 270; I posted a pic a while back. Also did a review of it.

I'm running it at stock clocks because I got a bad bin, plus the reference cooler isn't the best.

Not an AD, but since joining requires a picture, why not show a video instead?


----------



## Devildog83

Quote:


> Originally Posted by *jumpy2219*
> 
> I got a Reference 270, I posted a pic a while back. Also did a review of it.
> 
> Running it on stock clocks, because I got bad bin, plus the reference cooler isn't the best.
> 
> Not an AD, but since joining requires a picture, why not show a video instead?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


I added you, welcome. Sorry I don't know the stock memory clock off the top of my head.


----------



## jumpy2219

If i'm not mistaken, it is 1400 MHz


----------



## Devildog83

Quote:


> Originally Posted by *jumpy2219*
> 
> If i'm not mistaken, it is 1400 MHz


K done.


----------



## tabascosauz

Quote:


> Originally Posted by *jumpy2219*
> 
> If i'm not mistaken, it is 1400 MHz


That is very cool. The only other place I've seen Pitcairn's new and elusive reference blower is in an AMD blog build with the R9 270X and 7850K.

What drove you to buy the R9 270?


----------



## MiladEd

I'd like to be listed as well!

Sapphire R9 280X Dual-X 1030/1500

Here's a few pics:


----------



## Devildog83

Quote:


> Originally Posted by *MiladEd*
> 
> I' d like to be listed as well!
> 
> Sapphire R9 280X Dual-X 1030/1500
> 
> Here's a few pics:
> 
> 


You have been added, Welcome !!!


----------



## MiladEd

Quote:


> Originally Posted by *Devildog83*
> 
> 
> You have been added, Welcome !!!


Thanks!


----------



## jumpy2219

Quote:


> Originally Posted by *tabascosauz*
> 
> That is very cool. The only other place I've seen Pitcairn's new and elusive reference blower is in an AMD blog build with the R9 270X and 7850K.
> 
> What drove you to buy the R9 270?


Sorry for this sort-of late response lol.

But I did not actually buy it. AMD has been supporting my YouTube channel for a while now, so they sent me this GPU.

I am a tech reviewer for YouTube, not really big, but I'm having fun sharing what I love.


----------



## jason387

Quote:


> Originally Posted by *jumpy2219*
> 
> Sorry for this sort-of late response lol.
> 
> But, I did not buy it actually. AMD has been supporting me with my YouTube channel for a while now. So they sent me this GPU.
> 
> I am a tech reviewer for YouTube, not really big, but I'm having fun sharing what I love.


Have you tried editing the bios to achieve higher overclocks? I have the Sapphire R9 270, and I've managed to overclock it to 1150MHz. I used 1.3v, but because these cards suffer from vdroop, the actual voltage is around 1.25v. Here is my Firestrike score. Look only at the GPU score, since the overall score is also affected by the CPU.
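On the vdroop point: under load, the voltage at the die sags below the set value by roughly I_load × R_loadline, which is how a 1.3v setting can read around 1.25v in practice. A minimal sketch; the load current and load-line resistance below are illustrative assumptions chosen to reproduce that ~0.05v droop, not measured values for this card:

```python
# Vdroop sketch: effective voltage under load drops by I × R along the
# VRM load line. The figures are hypothetical, picked only to mirror
# the ~0.05 V sag described above (1.30 V set -> ~1.25 V real).

def v_under_load(v_set, i_load_amps, r_loadline_ohms):
    return v_set - i_load_amps * r_loadline_ohms

print(v_under_load(1.30, 100, 0.0005))
```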


----------



## jumpy2219

Quote:


> Originally Posted by *jason387*
> 
> Have you tried editing the bios to achieve higher overclocks? I have the Sapphire R9 270. I've managed to overclock it to 1150Mhz. I used 1.3v but because these cards suffer from vdroop, the actual voltage is around 1.25v. Here is my Firestrike Score. Only have a look at the GPU Score since overall score is also affected by the CPU.


In my video I explain the physical issues I had when OC'ing this card. Sadly, the issues with overheating and overall strange behavior were just too much, so I decided it would be safer not to OC the card.


This is what happened while just browsing after playing BF4, OC'd to 945 MHz....


----------



## jason387

The thing is, this card has boost, and the voltages for the boost and non-boost clocks are different, so while switching between them the voltages don't always set right. You could try keeping the base clock and boost clock the same, at the same voltage. If you want help editing the bios, I'll do it for you.


----------



## jumpy2219

Quote:


> Originally Posted by *jason387*
> 
> The thing is this card has boost. Now the voltages for boost and non boost clocks are different. So while switching between them the voltages don't set in right. You could try keeping the base clock and boost clock to the same at the same voltage. If you want help editing the bios I'll do it for you.


I see. I haven't really gone into much detail with this card; being a reviewer, it will probably be replaced soon.

I've thought about editing the BIOS, but my stock temperatures are already really high, so I'd rather not cause thermal throttling.

Thanks for the offer though


----------



## jason387

Not a problem. As a fellow reviewer for a tech site I can relate.


----------



## JackCY

That's one awful card, ADR.

I couldn't review it; I would put it in the graveyard. "This screams performance" should rather be "This screams low end", or "This screams performance. Note: unbelievably loud."
No power, no OC, not a great reference design, coil whine, probably loud, ...

They wouldn't send me another card for review, lol.


----------



## jumpy2219

Quote:


> Originally Posted by *JackCY*
> 
> That's one awful card ADR.
> 
> I couldn't review it, I would put into graveyard. "This screams performance" would rather be: "This screams low end" or "This screams performance, Note: Unbelievably loud"
> No power, no OC, not a great reference design, coil whine, probably loud, ...
> 
> They wouldn't send me another card for review lol.


In my video I did mention all these downsides, of course, and they're accurate lol. But when it comes to performance, I was surprised. It's not a bad card at all; I would recommend it over a 270X. The cheapest they go for is around $110 USD, and I get 60 FPS at high-to-medium settings in almost every game I've played. This card isn't bad at all.

Also, most of those issues, i.e. "no OC, not a great reference design, coil whine, probably loud", can easily be fixed by buying a non-reference card.

When judging graphics cards, I point out positives and negatives, but I also look at them from a price standpoint, and for the price, this card is great.


----------



## jumpy2219

Just a note, the graphic on my thumbnail is referring to the design of the reference blower. I'm personally pretty fond of reference fans. lol


----------



## Devildog83

Quote:


> Originally Posted by *jumpy2219*
> 
> In my video I did mention all these downsides of course, which actually are really accurate lol. But when it comes to performance, I was surprised. Its not a bad card at all. I would recommend it more than a 270X. The cheapest they can go for is around $110 USD. For high to medium setting at 60 FPS for almost every game I've played. This card isn't bad at all.
> 
> Also, when it comes to all the issues, ie. " no OC, not a great reference design, coil whine, probably loud" this can easily be fixed by buying a non-reference card.
> 
> When judging graphics cards, I point out positives and negatives. But I also look at it from a price standpoint, and for the price, this card is great.


Why would you recommend it over a 270x, especially with all of those issues? Heck, you can get a Sapphire Dual-X for about $130 and get better performance with fewer issues.


----------



## jumpy2219

Quote:


> Originally Posted by *Devildog83*
> 
> Why would you recommend it over a 270x? Especially with all of the issues. Heck you can get a Sapphire dual X for about $130 and can better performance with less issues.


The 270x and the 270 are essentially the same card, as they are both based on the same chip (it was one of the HD-series cards; the name escapes me). Basically, the difference between the two is the operating frequencies.

Linus has a video basically explaining this here:



Also, again, you have to realize that issues and overclocking capabilities can vary between cards. Just because someone got a bad bin doesn't mean others will too. The 270, overall, doesn't have bad reviews, and in my experience and use this isn't a bad card at all, especially for a reference card.


----------



## aaronsta1

Quote:


> Originally Posted by *jumpy2219*
> 
> In my video I explain physical issues I had when Oc'ing this card. Sadly the issues I had with overheating, and just overall strange things happening was just too much. I decided that it would be safer not to OC the card.
> 
> 
> 
> This is what happened while just browsing after playing BF4, OC to 945mHz....


i'd return it for RMA.


----------



## jumpy2219

Quote:


> Originally Posted by *aaronsta1*
> 
> id return it for RMA.


This GPU is one sent for review, so I cannot return it for RMA.

It works fine at stock though. The one thing I can blame it on is the crappy cooling of the reference cooler. But they don't sell this card to the public anyways lol.


----------



## aaronsta1

Quote:


> Originally Posted by *jumpy2219*
> 
> This GPU is one sent for review, so I cannot return it for RMA.
> 
> It works fine at stock though. The one thing I can blame it on is the crappy cooling of the reference cooler. But they don't sell this card to the public anyways lol.


did you try to re-paste it?


----------



## tsm106

Quote:


> Originally Posted by *jumpy2219*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaronsta1*
> 
> id return it for RMA.
> 
> 
> 
> This GPU is one sent for review, so I cannot return it for RMA.
> 
> It works fine at stock though. The one thing I can blame it on is the crappy cooling of the reference cooler. But they don't sell this card to the public anyways lol.

lol rma...

Read this. This looks like classic hw accel breaking powerstates.

http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996


----------



## jumpy2219

Quote:


> Originally Posted by *tsm106*
> 
> lol rma...
> 
> Read this. This looks like classic hw accel breaking powerstates.
> 
> http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996


Just read it.

The difference between my case and that one is that my GPU acts up randomly. I could be running it fine for an hour, playing games, then move on to a YT video, watch for around 20-30 minutes while searching the web, and then BAM, random crash. (All this happens only during an OC.)

*Also, this isn't consistent; it just happens randomly.

Just as an FYI, i use ASUS GPU Tweak for my OC'ing


----------



## jumpy2219

Quote:


> Originally Posted by *aaronsta1*
> 
> dd you try to re-paste it?


Nope, I have done no modifications to this card. Doesn't seem worth it IMO


----------



## rdr09

Quote:


> Originally Posted by *jumpy2219*
> 
> Just read it.
> 
> The difference between my case and that one is; that my GPU acts up randomly. I could be running it for an hour fine, playing games. Then move on to a YT video, maybe watch for around 20 -30 minutes while searching the web. Then BAM, random crash. (All this happens, only during an OC)
> 
> *Also, this isn't consistent, I just happens randomly.
> 
> Just as an FYI, i use ASUS Gpu Tweak for my OC'ing


you might have Overdrive enabled in CCC, or another app like GPU-Z running.


----------



## tsm106

Quote:


> Originally Posted by *jumpy2219*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> lol rma...
> 
> Read this. This looks like classic hw accel breaking powerstates.
> 
> http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996
> 
> 
> 
> Just read it.
> 
> The difference between my case and that one is; that my GPU acts up randomly. I could be running it for an hour fine, playing games. Then move on to a YT video, maybe watch for around 20 -30 minutes while searching the web. Then BAM, random crash. (All this happens, only during an OC)
> 
> *Also, this isn't consistent, I just happens randomly.
> 
> Just as an FYI, i use ASUS Gpu Tweak for my OC'ing

?

You can lead a horse to water, but you can't make him drink...


----------



## jumpy2219

Quote:


> Originally Posted by *tsm106*
> 
> ?
> 
> You can lead a horse to water, but you can't make him drink...


I am going to try your suggestion. I'm just not at my computer right now.


----------



## JackCY

Quote:


> Originally Posted by *jumpy2219*
> 
> In my video I did mention all these downsides of course, which actually are really accurate lol. But when it comes to performance, I was surprised. Its not a bad card at all. I would recommend it more than a 270X. The cheapest they can go for is around $110 USD. For high to medium setting at 60 FPS for almost every game I've played. This card isn't bad at all.
> 
> Also, when it comes to all the issues, ie. " no OC, not a great reference design, coil whine, probably loud" this can easily be fixed by buying a non-reference card.
> 
> When judging graphics cards, I point out positives and negatives. But I also look at it from a price standpoint, and for the price, this card is great.


Around $170 on Newegg for a non-reference design from MSI, GB, or Asus. I bet in the US you can get a used 280x, still under warranty, for that cash. Same here: what a 270/270x costs new will buy a used 280x.

I don't think it's worth buying anything apart from the 280 and 970 now, if one wants a new, unused card. Even the 280x sometimes struggles to run high settings at 1080p for me; the 270 has no chance. It would be playable, but below 60fps = more lag, which is bad for me. Less than 3GB VRAM is a no-go when stock games hover around the 2GB mark, sometimes 1.6GB, sometimes 2.2GB.

270/270x ~= 7870
280/280x ~= 7970

Nothing new under the sun for years. Same old chips/cards with tiny changes. Nvidia pulled the same trick with the 770 ~= 680.


----------



## HitJacker

Hi!

I'm trying to find out what the max vcore is for a 280X under water. Can I push it to 1.3 V 24/7?


----------



## Devildog83

Quote:


> Originally Posted by *jumpy2219*
> 
> The 270X and the 270 are essentially the same card, as they are both based on the same chip (it was an HD-series card; the name escapes me). Basically, the difference between the two is the operating frequencies.
> 
> Linus has a video basically explaining this here:
> 
> 
> 
> Also, again, you have to realize that issues and overclocking capabilities can vary between cards. Just because someone got a bad bin doesn't mean others will too. The 270, overall, doesn't have bad reviews, and in my experience and use, this isn't a bad card at all, especially for a reference card.


All of that is true, but NONE of it would make me choose a 270 over a 270X, because plain and simple you can get way higher clocks and higher performance, period. My 270X ran every day at 1200/1450 and was stable at 1225/1550 if I needed it. There has not been a 270 built that could match it.

Regardless of what Linus says a 270 at $110 is not better than a 270x at $130.


----------



## Devildog83

Quote:


> Originally Posted by *JackCY*
> 
> Around $170 on Newegg for a non-reference design from MSI, GB, or Asus. I bet in the US you can get a used 280X, still under warranty, for that cash. Same here: what a 270/270X costs new will buy you a used 280X.
> 
> I don't think it's worth buying anything apart from a 280 or 970 now if one wants a new, unused card. Even a 280X starts to struggle to run high settings at 1080p for me sometimes; a 270 has no chance, it would be playable but below 60 FPS = more lag, bad for me. Less than 3GB VRAM is a no-go when stock games hover around the 2GB mark, sometimes 1.6GB, sometimes 2.2GB.
> 
> 270/270X ~= 7870
> 280/280X ~= 7970
> 
> Nothing new under the sun for years. Same old chips/cards with tiny changes. Nvidia pulled the same trick with the 770 ~= 680.


I agree. I got a 290 flashed to a 290X for $230 used and it rocks price to performance. When they were new I spent just as much on my 270X Devil.

By the way I believe the 270 = 7850. The 270x's are binned higher and that's why in most cases, not all, the 270x is a much better performing card.


----------



## jumpy2219

Quote:


> Originally Posted by *JackCY*
> 
> Around $170 on Newegg for a non-reference design from MSI, GB, or Asus. I bet in the US you can get a used 280X, still under warranty, for that cash. Same here: what a 270/270X costs new will buy you a used 280X.
> 
> I don't think it's worth buying anything apart from a 280 or 970 now if one wants a new, unused card. Even a 280X starts to struggle to run high settings at 1080p for me sometimes; a 270 has no chance, it would be playable but below 60 FPS = more lag, bad for me. Less than 3GB VRAM is a no-go when stock games hover around the 2GB mark, sometimes 1.6GB, sometimes 2.2GB.
> 
> 270/270X ~= 7870
> 280/280X ~= 7970
> 
> Nothing new under the sun for years. Same old chips/cards with tiny changes. Nvidia pulled the same trick with the 770 ~= 680.


When it comes to price, getting more for less is always a plus. I think anyone would choose a used 280 over a 270 at the same price. But IMO, for the casual gamer with a budget in mind, the 270 isn't a bad card.


----------



## Devildog83

The 270 is not a bad card at all; I don't think anyone said it is. If you're not going to high res and don't have to have everything at max on demanding games, it should do just fine. Where the lines really get blurred is with the 290/290X: the performance really is close, and you can flash some 290s to 290X.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> The 270 is not a bad card at all; I don't think anyone said it is. If you're not going to high res and don't have to have everything at max on demanding games, it should do just fine. Where the lines really get blurred is with the 290/290X: the performance really is close, and you can flash some 290s to 290X.


How's your 290s going? Didn't you say you bought one a while back?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> How's your 290s going? Didn't you say you bought one a while back?


Yes sir, my 290 is awesome. I don't have a game or task it cannot handle. Even with a G3258 dual-core Pentium @ 4.3 GHz I get over 100 FPS in BF4 at max settings w/Mantle @ 1080p. Needless to say, if I went to 4K I might not be able to run at max, but with 4K, who cares, right?


----------



## austinmrs

I noticed something..

When I had an Nvidia card, with this same monitor, when I activated GPU scaling I could use 75 Hz at any resolution without any problem.

Now with the AMD card, when I activate GPU scaling I can only use 60 Hz, no matter what resolution I try. Even at 640x480, with GPU scaling, I can only get 60 Hz...

What can I do? Already tried PowerStrip and still can't..


----------



## diggiddi

Sounds like a difference between catalyst and nvidia's control center


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Yes sir, my 290 is awesome. I don't have a game or task it cannot handle. Even with a G3258 dual-core Pentium @ 4.3 GHz I get over 100 FPS in BF4 at max settings w/Mantle @ 1080p. Needless to say, if I went to 4K I might not be able to run at max, but with 4K, who cares, right?


4K is 4K! I was on the brink of buying one, but my 280Xs are doing pretty well actually, considering. Plus I'm too lazy to change everything again.


----------



## Spork13

I'm looking forward to gaming @ 4K, but not just yet. Not willing to fork out for 2x 980s, or 295s, to get decent FPS.
I'll wait a generation or two for true 4K-capable GPUs to become more affordable.
I compromised and got a (cheap) 2K (1440p) 32" monitor and couldn't be happier.
Still manage 60 FPS on High/Ultra settings in most games on the 280X + 280.
Big screen, high (although not highest) resolution, and it didn't break the bank.


----------



## aaronsta1

Quote:


> Originally Posted by *austinmrs*
> 
> I noticed something..
> 
> When I had an Nvidia card, with this same monitor, when I activated GPU scaling I could use 75 Hz at any resolution without any problem.
> 
> Now with the AMD card, when I activate GPU scaling I can only use 60 Hz, no matter what resolution I try. Even at 640x480, with GPU scaling, I can only get 60 Hz...
> 
> What can I do? Already tried PowerStrip and still can't..


If your monitor is connected with HDMI, it was always 60 Hz no matter what the software said.


----------



## austinmrs

Quote:


> Originally Posted by *aaronsta1*
> 
> If your monitor is connected with HDMI, it was always 60 Hz no matter what the software said.


I know that.

It's connected with VGA. With GPU scaling off it gives 75 Hz; if I turn it on, 60 Hz, no more. With Nvidia, on the same monitor, I can use 75 Hz.

This is a problem with ATI software..


----------



## jprovido

is this score ok? I think it's a bit low

my r9 280x (1145/7000) vs. gtx 970 (1545/7500)


----------



## Devildog83

edit: I got about 755/30fps out of a 7870 back in the day. I guess it's not horrible but you might be able to change some settings to improve that.


----------



## jprovido

Quote:


> Originally Posted by *Devildog83*
> 
> edit: I got about 755/30fps out of a 7870 back in the day. I guess it's not horrible but you might be able to change some settings to improve that.


I benched them at the same settings, maxed out at 1080p, for a good comparison. I didn't think it would be almost 40% weaker than a GTX 970; I expected it to be a bit closer. Both are overclocked to the max, and I think 1145 MHz core is a good OC for an R9 280X, correct?


----------



## rdr09

Quote:


> Originally Posted by *jprovido*
> 
> I benched them at the same settings, maxed out at 1080p, for a good comparison. I didn't think it would be almost 40% weaker than a GTX 970; I expected it to be a bit closer. Both are overclocked to the max, and I think 1145 MHz core is a good OC for an R9 280X, correct?


1145 core on a 280X is decent, but I think it's below average; most can do 1200. Even a 7950 can beat that score.

Your 1545 core on the 970 is about even with my 290 at 1250.


----------



## jprovido

Quote:


> Originally Posted by *rdr09*
> 
> 1145 core on a 280X is decent, but I think it's below average; most can do 1200. Even a 7950 can beat that score.
> 
> Your 1545 core on the 970 is about even with my 290 at 1250.


So with the 280X there's nothing wrong? I had a GTX 680 before, and I know for a fact that the 7970/280X beats it when both are overclocked. I don't remember my GTX 680 being this bad in Heaven. Oh well.


----------



## rdr09

Quote:


> Originally Posted by *jprovido*
> 
> So with the 280X there's nothing wrong? I had a GTX 680 before, and I know for a fact that the 7970/280X beats it when both are overclocked. I don't remember my GTX 680 being this bad in Heaven. Oh well.
> 
> +reps btw ty


At those clocks... seems normal. Here was my 7950 @ 1230...

A 7970 would score prolly 3-4 pts more at the same clocks.


----------



## jprovido

Quote:


> Originally Posted by *rdr09*
> 
> at those clocks . . . seems normal. here was my 7950 @ 1230 . . .
> 
> 
> 
> a 7970 would score prolly 3 - 4 pts more at same clocks.


thanks + rep


----------



## alizubair

Will an [email protected] handle an R9 280X, or will it face a bottleneck issue?


----------



## jprovido

Quote:


> Originally Posted by *alizubair*
> 
> Will an [email protected] handle an R9 280X, or will it face a bottleneck issue?


It will bottleneck for sure. I had a Q6600 @ 3.6 GHz + GTX 480 years ago, and upgrading to an i7 950 made it fly.


----------



## demitrisln

Hey all I was wondering if you could advise what would be a one card upgrade over my two R9 270X Gigabyte 4GB OC setup I have right now? Looking at Nvidia... Maybe 980? What are some suggestions / thoughts.


----------



## tsm106

Quote:


> Originally Posted by *demitrisln*
> 
> Hey all I was wondering if you could advise what would be a one card upgrade over my two R9 270X Gigabyte 4GB OC setup I have right now? Looking at Nvidia... Maybe 980? What are some suggestions / thoughts.


If your dual 270x is still working fine, I would stay with it until the next gen cards are out. The 980 will be slower at low or typical resolutions, so it's more of a really expensive side grade.


----------



## demitrisln

Quote:


> Originally Posted by *tsm106*
> 
> If your dual 270x is still working fine, I would stay with it until the next gen cards are out. The 980 will be slower at low or typical resolutions, so it's more of a really expensive side grade.


Sweet, that was what I was thinking, but I wanted to do some water cooling and the 270X just isn't that high-end a card. But since nothing will match my performance, I will stick with what I've got.

On a side note has anybody done any water cooling on a R9 270X?


----------



## aaronsta1

Quote:


> Originally Posted by *demitrisln*
> 
> Sweet, that was what I was thinking, but I wanted to do some water cooling and the 270X just isn't that high-end a card. But since nothing will match my performance, I will stick with what I've got.
> 
> On a side note has anybody done any water cooling on a R9 270X?


This is my current situation:
one of my computers has dual 7870 XTs, and one has dual 270Xs.

I'd wait till the new 300-series AMD GPUs come out..

I wouldn't get anything with less than 8GB of RAM..


----------



## diggiddi

Quote:


> Originally Posted by *tsm106*
> 
> If your dual 270x is still working fine, I would stay with it until the next gen cards are out. *The 980 will be slower at low or typical resolutions*, so it's more of a really expensive side grade.


I've never heard that before


----------



## gnemelf

welp, broke 10k on firestrike today. http://www.3dmark.com/3dm/5571903?
pretty happy with it considering the setup.


----------



## Devildog83

Quote:


> Originally Posted by *gnemelf*
> 
> welp, broke 10k on firestrike today. http://www.3dmark.com/3dm/5571903?
> pretty happy with it considering the setup.


That works!


----------



## Devildog83

Quote:


> Originally Posted by *demitrisln*
> 
> Sweet, that was what I was thinking, but I wanted to do some water cooling and the 270X just isn't that high-end a card. But since nothing will match my performance, I will stick with what I've got.
> 
> On a side note has anybody done any water cooling on a R9 270X?


I agree; even my 290 was a slight downgrade from the 7870/270X in max performance. The main reason was, well, the 7870 pooped out, and it eliminated the X-Fire issues. The 980 does use way, way less power, but the cost of the card is ridiculous.


----------



## jumpy2219

Quote:


> Originally Posted by *alizubair*
> 
> Will an [email protected] handle an R9 280X, or will it face a bottleneck issue?


Get a 270x and this processor : http://www.newegg.com/Product/Product.aspx?Item=N82E16819117374&nm_mc=KNC-GoogleAdwords-PC&cm_mmc=KNC-GoogleAdwords-PC-_-pla-_-Processors+-+Desktops-_-N82E16819117374&gclid=CjwKEAiA8_KlBRD9z_jl_fKBhQkSJABDKqiXvWSJA-EBf_3iGt9gm0L68XyJby4Is2XlWPaFqKSq5BoCFtbw_wcB&gclsrc=aw.ds

Or save up some more money?


----------



## radier

I have just flashed a new hybrid BIOS for my MSI 270X Gaming 2GB.

My previous version was *TV303MH.101*.
The new one I received from the MSI Official Forum.

Anyone who owns an MSI card can request the latest BIOS.

The new BIOS version is *TV303MH.103*.

Now my OC is far more stable.

For now I am testing 1230/1500 MHz in games. So far so good.

Fire Strike at those clocks: http://www.3dmark.com/fs/3843117
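For anyone repeating this, the usual route is to back up the current vBIOS before programming the new one. Here is a minimal dry-run sketch using ATIFlash's standard save/program flags; the ROM filename is hypothetical, and the commands are only echoed rather than executed, since flashing needs admin rights and a ROM matched to your exact card:

```shell
#!/bin/sh
# Dry run: print the ATIFlash commands instead of executing them.
ADAPTER=0                   # adapter index, as reported by `atiflash -i`
NEW_ROM="TV303MH.103.rom"   # hypothetical filename for the new BIOS

# Always save the current vBIOS first, so you can flash back if the OC misbehaves.
echo "atiflash -s $ADAPTER backup.rom"

# Program the new vBIOS, then reboot for it to take effect.
echo "atiflash -p $ADAPTER $NEW_ROM"
```

GPU-Z can also save the current BIOS, as mentioned elsewhere in the thread, which is handy on cards with a dual-BIOS switch.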


----------



## aaronsta1

Quote:


> Originally Posted by *radier*
> 
> I have just flashed a new hybrid BIOS for my MSI 270X Gaming 2GB.
> 
> My previous version was *TV303MH.101*.
> The new one I received from the MSI Official Forum.
> 
> Anyone who owns an MSI card can request the latest BIOS.
> 
> The new BIOS version is *TV303MH.103*.
> 
> Now my OC is far more stable.
> 
> For now I am testing 1230/1500 MHz in games. So far so good.
> 
> Fire Strike at those clocks: http://www.3dmark.com/fs/3843117


Is it just the MH hybrid BIOS, or do they have the MS standard BIOS update as well?
Or can you just use the MH BIOS and VBE7 and flash it over the standard one?

also care to upload it here?
http://www.techpowerup.com/vgabios/index.php?manufacturer=MSI&model=R9+270X


----------



## radier

Already made an upload with GPU-Z. Don't know how long the verification will take.

Why don't you switch to the hybrid BIOS? It's safe on Windows 7 too.

The default voltage was raised by about 23 mV (across the whole range from idle to stress), which gives me more headroom when overclocking with Afterburner.


----------



## JackCY

I find it weird, don't you get the latest BIOS updates from some MSI utility? And isn't the card running a hybrid BIOS from the get-go, supporting both UEFI and legacy?


----------



## aaronsta1

Quote:


> Originally Posted by *JackCY*
> 
> I find it weird, don't you get the latest BIOS updates from some MSI utility?? And isn't the card from get go running a hybrid BIOS supporting both UEFI and legacy?


It has a switch..
And I edited my regular BIOS with VBE7 to overclock it.

On some cards, not sure about this one, you can't flash the hybrid BIOS when it's edited..


----------



## neurotix

What's up guys. Been a while. Sorry I haven't been checking this thread.

Just posting here to bump it up in my thread list.


----------



## austinmrs

I have the MSI R9 270X Hawk.

Is it worth it to flash any firmware? What are the benefits?

If it is worth it, what should I flash?


----------



## radier

Make a request on MSI forum just like I did:

https://forum-en.msi.com/index.php?board=123.0

You have to provide the serial number of your card + upload your current BIOS (you can save it with GPU-Z).


----------



## austinmrs

Quote:


> Originally Posted by *radier*
> 
> Make a request on MSI forum just like I did:
> 
> https://forum-en.msi.com/index.php?board=123.0
> 
> You have to provide the serial number of your card + upload your current BIOS (you can save it with GPU-Z).


But what's the advantage of updating the firmware on my card?!


----------



## marine88

Does the Asus DCUII TOP version of the 270X support 1300 mV core voltage? Has anyone managed to change the voltage on this card? I'm using GPU Tweak, and when I run a stress test the voltage drops from 1200 mV; I can never get more than 1200 even if I set it in GPU Tweak. Is this normal? Has it happened to anyone else?


----------



## JackCY

All GPUs and CPUs, practically all high-power chips that draw a lot of current, are subject to voltage droop. -0.07 V is not rare on a 280X. Some manufacturers may do better, some worse, but so far I haven't seen a GPU with minimal vdroop. That's also why for extreme OC you get a separate power delivery board that can handle the insane power demands better.


----------



## marine88

But even if I don't do any OC, just stock, when I run a graphics test the voltage sometimes drops a lot, to 1158 mV, yet when it's not benching it sits at 1200 mV. This isn't right, right?
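For context, the droop described here can be sanity-checked with quick arithmetic. A small sketch using the figures from this post, compared against the -0.07 V ballpark quoted earlier in the thread:

```python
# Quick sanity check on voltage droop (vdroop) under load.
set_mv = 1200      # voltage requested in GPU Tweak (mV)
load_mv = 1158     # voltage observed during a stress test (mV)

droop_mv = set_mv - load_mv
droop_pct = 100 * droop_mv / set_mv

print(f"droop: {droop_mv} mV ({droop_pct:.1f}%)")
# 42 mV (3.5%) is well inside the -70 mV (-0.07 V) figure quoted above as
# "not rare" on these cards, i.e. normal load-line behaviour, not a fault.
```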


----------



## radier

After the new BIOS my card can go as high as 1.320 V.
I didn't notice any vdroop.


----------



## marine88

What BIOS did you put on it? I have the Asus version!


----------



## radier

So you have a problem.


----------



## SunBro

Guys, after the glorious disappointment of seeing the local prices of the GTX 960, I decided to go Team Red. I currently have a GTX 750 Ti, and although it's *really* great for its price, I want to step up. I've looked around a lot and my mind is made up. It will be an R9 280 (*definitely* not the X, just the 280). Do you think I should bite? The only thing worrying me is drivers. Nvidia's drivers are legendary, and all my friends with AMD cards had some headaches at some point during their ownership. Does the current state of the drivers make this card a good choice?


----------



## Agent Smith1984

Quote:


> Originally Posted by *SunBro*
> 
> Guys, after the glorious disappointment i've experienced by seeing the local prices of gtx 960, i decided to go teamred. I currently have a gtx 750 ti and although it's *really* great for its price, i want to step up. I've looked around a lot and my mind is made. It will be r9 280(*definitely* not x, just 280) Do you think i should bite? The only thing worrying me are drivers. Nvidia's drivers are legendary and all my friends with amd cards had some headaches at some moment during their ownership. Do the current state of the drivers make this card a good choice?


How much are you paying for a 280 vs the 960??


----------



## SunBro

The MSI R9 280 is 650 Turkish liras, while the MSI GTX 960 is 800 Turkish liras.

edit: here are my choices: http://www.vatanbilgisayar.com/ekran-kartlari/?stk=True&srt=UP&page=4

my *absolute* budget limit is 700 tl


----------



## Agent Smith1984

Quote:


> Originally Posted by *SunBro*
> 
> msi r9 280 is 650 turkish liras while msi gtx 960 is 800 turkish liras.
> 
> edit: here are my choices: http://www.vatanbilgisayar.com/ekran-kartlari/?stk=True&srt=UP&page=4
> 
> my *absolute* budget limit is 700 tl


Well, looking at your choices, that 280 is the way to go.
Definitely get the MSI.... The Asus 280x I used to have gave me issues.


----------



## M1kuTheAwesome

How many issues does 280X Crossfire have compared to a strong single card? Which games have you guys had problems in? Any noticeable microstutter? I've been delaying grabbing a second 280X to see what the next-gen cards will bring, but 280X CF should still be better bang for the buck. So far I planned to go for a 970, or AMD's answer to it, because I was hoping to get a higher-res monitor later on and then upgrade to dual cards as well, but realistically I won't have the money for all that anytime soon. So unless Crossfire generates a lot of headaches, 2x 280X would be a better idea.


----------



## radier

Stay away from CFX/SLi if you don't want problems.


----------



## Particle

I'll say this. I've tried crossfire across four generations of Radeon cards now, and I've yet to be satisfied with it. Most of the games I play don't even utilize it. Those that do sometimes feel worse than playing with a single card. Those that don't certainly don't feel any better. The frame pacing push doesn't seem to have made any difference in the real world. Go with a single card.


----------



## rdr09

Quote:


> Originally Posted by *Particle*
> 
> I'll say this. I've tried crossfire across four generations of Radeon cards now, and I've yet to be satisfied with it. Most of the games I play don't even utilize it. Those that do sometimes feel worse than playing with a single card. Those that don't certainly don't feel any better. The frame pacing push doesn't seem to have made any difference in the real world. Go with a single card.


Not Hawaii. Hawaii in Crossfire is like one big card, unless the game does not support multi-GPU, which happens in both camps. Prolly more with AMD.


----------



## tsm106

I think most ppl have a hard enough time configuring their cards in the first place. Two-card CFX is very mature at this point. I don't have a problem with it on Tahiti.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> I think most ppl have a hard enough time configuring their cards in the first place. Two card cfx is very matured at this point. I don't have a problem with it on tahiti.


that too, tsm. also, the other components like the cpu will have to keep up.


----------



## Particle

It's hard for me to get onboard after being burned with each new series. Each time it was always claimed that they behaved better in crossfire than the previous generations. Each time it still sucked.


----------



## aaronsta1

Quote:


> Originally Posted by *Particle*
> 
> I'll say this. I've tried crossfire across four generations of Radeon cards now, and I've yet to be satisfied with it. Most of the games I play don't even utilize it. Those that do sometimes feel worse than playing with a single card. Those that don't certainly don't feel any better. The frame pacing push doesn't seem to have made any difference in the real world. Go with a single card.


Hmm, in my experience, having dual cards makes games run almost twice as fast.

There are actually only a couple of games I play that throw a fit over dual cards, and those games run at 60 FPS with just one, so it doesn't matter anyway.

You have to make sure to run the game in full screen, not a maximized window..


----------



## JackCY

Single > dual. Nuf' said.
Until they iron out the age-old bugs and finally give us a fast SFR or similar rendering mode, it's always gonna have some stutter in one form or another, or an artificial delay like NV adds to make the frames appear evenly spaced. With AFR you get either microstutter or lag; pick at least one.
Sell the card and buy a 970; that's a much better option than investing in another old card like a 7970/280X now, or the power-hungry 290 series.


----------



## rdr09

Quote:


> Originally Posted by *JackCY*
> 
> Single > dual. Nuf' said.
> Until they iron out the ages old bugs and finally give a fast SFR or similar mode of rendering it's always gonna have some stutter in one form or other or an artificial delay like NV does to make the frames appear evenly spaced. With AFR you either get microstutter or lag, pick at least one.
> Sell the card and buy a 970, much better option than investing in another old card like 7970/280x now or power hungry 290 series.


I used to recommend the 970, but I think they are having issues . . .

http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/200


----------



## neurotix

Quote:


> Originally Posted by *rdr09*
> 
> Not Hawaii. Hawaii in Crossfire is like one big card, unless the game does not support multi-GPU, which happens in both camps. Prolly more with AMD.


Quote:


> Originally Posted by *tsm106*
> 
> I think most ppl have a hard enough time configuring their cards in the first place. Two card cfx is very matured at this point. I don't have a problem with it on tahiti.


Quote:


> Originally Posted by *rdr09*
> 
> that too, tsm. also, the other components like the cpu will have to keep up.


I second this, have absolutely no issues with Crossfire with my setup. You're right, it's like having one big card.

There are certain games that don't utilize Crossfire well (Divinity: Original Sin comes to mind) but most of the big budget, AAA titles get double the performance compared to a single card. (Dragon Age Inquisition, Battlefield 4, Crysis 3, etc)

For older games, it absolutely chews through everything I throw at it, even at 5760x1080. Just Cause 2 and Sleeping Dogs come to mind.

Essentially, only crappy, poorly optimized games, DX9 games, bad console ports etc don't benefit from Crossfire. If you plan on playing the most recent big name releases, Crossfire is worth it.

I have no stuttering and no issues with my setup. Maybe if you have no idea what you're doing, no clue how to tweak Windows, no idea how to overclock properly, and a bunch of unnecessary apps or spyware running while you're playing games, then it would be bad.


----------



## llTheGOOSE

Jesus, there's a club on these boards for everything!
Anyway, hey all. Pretty new to the OCN forums; thought I'd sign up to the club! My main card is an MSI 280X, but that's been sent back due to a fault. Currently using my girlfriend's 270X.


----------



## tsm106

Quote:


> Originally Posted by *llTheGOOSE*
> 
> Jesus, There's a club on these boards for everything!
> Anyway, Hey all, Pretty new to the OCN forums, thought I'd sign up to the club! Main Card is an MSI 280x, however that's been sent back due to a fault. Currently using my Girlfriends 270x.


It's even more ironic that ppl felt the need for their own club for their rebranded cards.


----------



## aaronsta1

would you RMA this..

I have an MSI R9 270X Gaming 2G.
It says on their site that the card has a 1080 boost and 1120 OC.
The problem is, I've been playing a few games and I've noticed that when it's set to more than 1080 it crashes.
The card runs all day long at 1080, but anything over that and it's not happy.

Keep in mind this would be the 3rd time I've sent this card in.
On the first card the fan stopped spinning; the 2nd one was DOA.

What do you guys think? I'm trying the new 103 BIOS now, but I think this card has a defect.


----------



## jprovido

Can you guys help me? I'm having a very annoying issue with my newly bought R9 280X:
http://www.overclock.net/t/1537426/weird-gpu-load-issue-graph-causing-my-screen-to-flicker-help/0_30#post_23445879


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *JackCY*
> 
> Single > dual. Nuf' said.
> Until they iron out the ages old bugs and finally give a fast SFR or similar mode of rendering it's always gonna have some stutter in one form or other or an artificial delay like NV does to make the frames appear evenly spaced. With AFR you either get microstutter or lag, pick at least one.
> Sell the card and buy a 970, much better option than investing in another old card like 7970/280x now or power hungry 290 series.


Selling this card might be... problematic. I replaced the original fans with two 120mm ones zip-tied in place to keep the VRMs cool without it being too loud. In the process I lost one of the screws that attaches the fans. I imagine a missing screw voids the warranty. So a card with one fan a bit loose, possibly rattling as a result, and no warranty isn't going to sell well. Selling would be the best thing to do, though.
Quote:


> Originally Posted by *neurotix*
> 
> I second this, have absolutely no issues with Crossfire with my setup. You're right, it's like having one big card.
> 
> There are certain games that don't utilize Crossfire well (Divinity: Original Sin comes to mind) but most of the big budget, AAA titles get double the performance compared to a single card. (Dragon Age Inquisition, Battlefield 4, Crysis 3, etc)
> 
> For older games, it absolutely chews through everything I throw at it, even at 5760x1080. Just Cause 2 and Sleeping Dogs come to mind.
> 
> Essentially, only crappy, poorly optimized games, DX9 games, bad console ports etc don't benefit from Crossfire. If you plan on playing the most recent big name releases, Crossfire is worth it.
> 
> I have no stuttering and no issues with my setup. Maybe if you have no idea what you're doing, no clue how to tweak Windows, no idea how to overclock properly, and a bunch of unnecessary apps or spyware running while you're playing games, then it would be bad.


The big AAA titles are the only reason I want more power anyways. For some older games I could always disable Crossfire and easily run them with just one card. It seems all games that would actually benefit from a second card run Crossfire just fine as far as FPS is concerned, I'm just worried that there might be stuttering or artifacts.
Maybe I should just stop troubling myself with upgrade plans until I actually have the money to do something, which won't happen until the end of February.


----------



## neurotix

Quote:


> Originally Posted by *M1kuTheAwesome*
> 
> The big AAA titles are the only reason I want more power anyways. For some older games I could always disable Crossfire and easily run them with just one card. It seems all games that would actually benefit from a second card run Crossfire just fine as far as FPS is concerned, I'm just worried that there might be stuttering or artifacts.
> Maybe I should just stop troubling myself with upgrade plans until I actually have the money to do something, which won't happen until the end of February.


You don't even really need to "disable" Crossfire anymore, you can just define a new profile for that game and set Crossfire to disabled in the profile. Then, that game won't use Crossfire and will only use your topmost card. This is handy because actually disabling Crossfire completely through the driver usually rearranges and messes up your desktop icons. At least on Win7 it does. The same thing usually happens when installing new drivers (I just install them over the old ones) or disabling Eyefinity.

As I mentioned M1ku, there are PLENTY of older games that support Crossfire well and get a big boost from it. It depends on what you want to play but generally, any big budget title from the last 5 years should support Crossfire well. Even some good games from smaller devs support Crossfire, like Shadowrun Returns and Torchlight II. Mind you, if you plan on playing Battlefield or Dragon Age, not only do they support Crossfire but they also support Mantle.

I can understand why people wouldn't suggest it because a strong single card IS generally better, but people forget that the sweet spot for Crossfire/SLI scaling is 2 cards. Anything more is a waste. I've even heard of people adding a 4th card and only seeing a 10 fps increase in Valley, for example. And you'd have to be an idiot to argue that one 970/980 is better than 2x 280X. 2x 280X will destroy any single card solution, and even when the new cards drop it will likely still be better (by the generation after this- R9 4xx and Nvidia 1080- we will have single cards that are as good or better.) The problem with going dual card is that you need a good CPU to drive it and a power supply that can handle it all as well.

You say you don't have the money now, and that's fine, but when you do, I would highly suggest buying used (eBay). You can get a used 280X or 7970 for about $150, which is a steal. I paid $450 for mine in March 2013 (it's in my backup rig). If you shop around you can easily get them for even less. http://www.overclock.net/t/1510010/sapphire-dual-x-280x-oc-video-cards-price-drop I've seen them for even less in the classifieds here. If you have the power supply for them and the room in your case, this is a lot of bang for the buck.


----------



## Spork13

AAA titles work well with crossfire - eventually.
Shadow of Mordor didn't work well with Xfire until I'd finished it; by then patches/drivers had it working better (but still nowhere close to twice as well).
Far Cry 4, same story.
However, I find if I wait a few months (and get games at 1/2 price or less) the drivers/patches seem to make them run perfectly with Xfire.


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *neurotix*
> 
> You don't even really need to "disable" Crossfire anymore, you can just define a new profile for that game and set Crossfire to disabled in the profile. Then, that game won't use Crossfire and will only use your topmost card. This is handy because actually disabling Crossfire completely through the driver usually rearranges and messes up your desktop icons. At least on Win7 it does. The same thing usually happens when installing new drivers (I just install them over the old ones) or disabling Eyefinity.
> 
> As I mentioned M1ku, there are PLENTY of older games that support Crossfire well and get a big boost from it. It depends on what you want to play but generally, any big budget title from the last 5 years should support Crossfire well. Even some good games from smaller devs support Crossfire, like Shadowrun Returns and Torchlight II. Mind you, if you plan on playing Battlefield or Dragon Age, not only do they support Crossfire but they also support Mantle.
> 
> I can understand why people wouldn't suggest it because a strong single card IS generally better, but people forget that the sweet spot for Crossfire/SLI scaling is 2 cards. Anything more is a waste. I've even heard of people adding a 4th card and only seeing a 10 fps increase in Valley, for example. And you'd have to be an idiot to argue that one 970/980 is better than 2x 280X. 2x 280X will destroy any single card solution, and even when the new cards drop it will likely still be better (by the generation after this- R9 4xx and Nvidia 1080- we will have single cards that are as good or better.) The problem with going dual card is that you need a good CPU to drive it and a power supply that can handle it all as well.
> 
> You say you don't have the money now- that's fine- but when you do, I would highly suggest buying used (Ebay). You can get a used 280X or 7970 for about $150, which is a steal. I paid $450 for mine in March 2013 (it's in my backup rig). If you shop around you can easily get them for even less. http://www.overclock.net/t/1510010/sapphire-dual-x-280x-oc-video-cards-price-drop I've seen them for even less in the classifieds here. If you have the power supply for them and the room in your case, this a lot of bang for the buck.


I've been keeping one eye on the classifieds for a while actually and around here, prices are pretty similar. $150 is about €133, which is roughly the price all these cards change hands for. Really is a bargain price.
PSU _should_ be okay, it's a Seasonic M12 II EVO 750W. The highest power draw I've seen was 580 watts from the wall with Prime95 + Furmark and it's 80+ Bronze so I should have plenty of headroom. I don't mind the cards staying at stock voltage if that's needed.
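A quick sanity check on that headroom, assuming roughly 85% efficiency at that load for an 80+ Bronze unit (an assumption for illustration, not this PSU's measured curve):

```python
# Estimate the DC-side load from a wall-power reading, assuming ~85%
# efficiency at this load for an 80+ Bronze unit (assumed, not measured).
def dc_load_watts(wall_watts, efficiency=0.85):
    """Watts actually delivered to the components (DC side)."""
    return wall_watts * efficiency

wall = 580   # measured at the wall under Prime95 + FurMark
rated = 750  # Seasonic M12 II EVO rated DC output
load = dc_load_watts(wall)
print(f"DC load ~{load:.0f} W, headroom ~{rated - load:.0f} W")
```

So only about 490 W of the 750 W budget is used even in that worst-case stress test, leaving roughly 260 W spare.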
Quote:


> Originally Posted by *Spork13*
> 
> AAA titles work well with crossfire - eventually.
> Shadow of Mordor didn't work well with Xfire - until I'd finished it, by then patches / drivers had it working better (but still nowhere close to twice as well).
> Far cry IV same story.
> However, I find if I wait a few months (and get games 1/2 price or less) the drivers / patches seem to make them run perfectly on Xfire.


This.
Wait a couple of months and get more game for less money.


----------



## Particle

I like to play Mass Effect 2, Natural Selection 2, DOTA2, and CSGO but none of them seem to benefit from crossfire. Some of them show utilization totaling more than 100% with crossfire enabled, but not all. Other games like Battlefield 4 have their frame rate increase and show high utilization approaching 200% with two cards, but it doesn't feel any smoother. When 40 fps feels as choppy as 20 fps, it's a waste.


----------



## diggiddi

Guys, I'm considering picking up a Vapor-X R9 280X. How is the quality compared to Asus? I had a couple of 7970 Matrix Platinums and they simply oozed quality; I must say Asus knows how to build a quality product. My other option is the Asus 280X V2 triple-slot card with the backplate.


----------



## buttface420

Quote:


> Originally Posted by *Particle*
> 
> I like to play Mass Effect 2, Natural Selection 2, DOTA2, and CSGO but none of them seem to benefit from crossfire. Some of them show utilization totaling more than 100% with crossfire enabled, but not all. Other games like Battlefield 4 have their frame rate increase and show high utilization approaching 200% with two cards, but it doesn't feel any smoother. When 40 fps feels as choppy as 20 fps, it's a waste.


almost all my games work very well with crossfire.

omega drivers have helped quite a bit in most games with crossfire.

games I have that benefit from Crossfire: BF4, BF3, BC2, Black Ops 2, BioShock Infinite, Watch Dogs, Crysis 3, Shadow of Mordor

games that don't: Advanced Warfare


----------



## Agent Smith1984

Quote:


> Originally Posted by *diggiddi*
> 
> Guys I'm considering picking up a Vapor X R9 280x, how is the quality compared to Asus. I had a couple of 7970 matrix platinum's and they simply oozed quality. I must say Asus knows how to build a quality product. My other option is the Asus 280x V2 triple slot card with the backplate


Stay as far the hell away from the Asus 280x as you possibly can!!! TRUST ME


----------



## Particle

Quote:


> Originally Posted by *diggiddi*
> 
> Guys I'm considering picking up a Vapor X R9 280x, how is the quality compared to Asus. I had a couple of 7970 matrix platinum's and they simply oozed quality. I must say Asus knows how to build a quality product. My other option is the Asus 280x V2 triple slot card with the backplate


What is your motivating factor behind selling 7970s in favor of 280X's? They're nearly the same card.


----------



## diggiddi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Stay as far the hell away from the Asus 280x as you possibly can!!! TRUST ME


I'm talking about this one here though

Quote:


> Originally Posted by *Particle*
> 
> What is your motivating factor behind selling 7970s in favor of 280X's? They're nearly the same card.


I had a pair of Matrixes that I got off eBay but returned them cos they were faulty. There's no way I'd sidegrade from those cards to any 280X.


----------



## aaronsta1

Quote:


> Originally Posted by *Particle*
> 
> I like to play Mass Effect 2, Natural Selection 2, DOTA2, and CSGO but none of them seem to benefit from crossfire. Some of them show utilization totaling more than 100% with crossfire enabled, but not all. Other games like Battlefield 4 have their frame rate increase and show high utilization approaching 200% with two cards, but it doesn't feel any smoother. When 40 fps feels as choppy as 20 fps, it's a waste.


these titles you specify can be maxed out with a single card.

the newer games, such as BF4, FC4 or DA:I, get a significant speed boost with dual cards over a single card... almost twice as fast.

perhaps your cpu was not able to feed dual cards?


----------



## sage101

What the best OC you guys got on the 270X?


----------



## Particle

Quote:


> Originally Posted by *aaronsta1*
> 
> these titles you specify can be maxed out with a single card.
> 
> the newer games, such as BF4 or FC4 or DA:I have a significant speed boost with dual cards than with a single card.. almost twice as fast.
> 
> perhaps your cpu was not able to feed dual cards?


It seems unlikely to me that my CPU would exactly limit me to 100% utilization of one GPU or 50% utilization each of two, especially when I've only got a pair of 270X GPUs. The 6970s I had before behaved the same way.


----------



## aaronsta1

Quote:


> Originally Posted by *Particle*
> 
> It seems unlikely to me that my CPU would exactly limit me to 100% utilization of one GPU or 50% utilization each of two, especially when I've only got a pair of 270X GPUs. The 6970s I had before behaved the same way.


1 GPU
http://www.3dmark.com/fs/3374492

2 GPUS
http://www.3dmark.com/fs/3864435

take this info however you want.


----------



## Devildog83

Quote:


> Originally Posted by *sage101*
> 
> What the best OC you guys got on the 270X?


1235/1590 was the best I ever got and still stayed stable enough to play games and run heaven.


----------



## sage101

Quote:


> Originally Posted by *Devildog83*
> 
> 1235/1590 was the best I ever got and still stayed stable enough to play games and run heaven.


Nice! @ what voltage?


----------



## neurotix

Quote:


> Originally Posted by *aaronsta1*
> 
> 1 GPU
> http://www.3dmark.com/fs/3374492
> 
> 2 GPUS
> http://www.3dmark.com/fs/3864435
> 
> take this info how ever you want.


This. I mean, every modern version of 3dmark and every version of Unigine will see huge increases (nearly double) with two cards. I'm #54 on our HWBOT team so I know this intimately.

There are plenty of games that will see the same scaling too. Both old and new.

But y'know, some people are never happy and will never believe what you say. Other people are just idiots and have no idea how to even configure their system for it properly. Or something else with their system is wrong hardware wise (weak PSU, weak CPU) and the cards aren't performing correctly because of it.

Sure, there's some games that don't benefit, but usually these are the ones that are weaker graphically and can be maxed out on a single card anyway.

Devildog, what's your experience with certain games and Crossfire on your dual 270X setup? How was the fps increase?


----------



## jprovido

I'm getting flickers at 2D clocks. Is there a way for me to mod the BIOS? This is freakin' annoying


----------



## neurotix

Quote:


> Originally Posted by *jprovido*
> 
> im getting flickers at 2d clocks. is there a way for me to mod the bios? this is freakin annoying


Well, I tried to help you in the other thread and so did TSM.

It sucks, but unfortunately if you did what we said there and still have screen flicker, you may just have to RMA the card. It happens.


----------



## jprovido

Quote:


> Originally Posted by *neurotix*
> 
> Well, I tried to help you in the other thread and so did TSM.
> 
> It sucks, but unfortunately if you did what we said there and still have screen flicker, you may just have to RMA the card. It happens.



The thing is I'm moving to the US next month. If I send it in for RMA I won't have a GPU to use for my HTPC while I'm still here in the Philippines. I guess I'll just ask my brother to RMA it, but hopefully I can fix it myself; he's busy with work too. I'm really regretting this purchase


----------



## neurotix

As I said, it happens, people get bad cards. You can keep trying to fix it through software, power states, and so on. Ultimately, it may have to be RMA'ed or returned (if you still can) if software issues don't fix it.

You may need to reinstall Windows and install fresh 14.12 drivers.

It all depends on how much effort you want to put in, but honestly it shouldn't be YOUR problem, it's probably defective and just needs to be replaced. If you have a backup or can use integrated graphics it might not be so bad.


----------



## jprovido

Quote:


> Originally Posted by *neurotix*
> 
> As I said, it happens, people get bad cards. You can keep trying to fix it through software, power states, and so on. Ultimately, it may have to be RMA'ed or returned (if you still can) if software issues don't fix it.
> 
> *You may need to reinstall Windows and install fresh 14.12 drivers*.
> 
> It all depends on how much effort you want to put in, but honestly it shouldn't be YOUR problem, it's probably defective and just needs to be replaced. If you have a backup or can use integrated graphics it might not be so bad.


tbh that's something I didn't even consider. I've seen stranger things get fixed with a clean install of Windows. I will do this now


----------



## neurotix

Yeah, sweet.

I've been using PCs actively since about 1997 and TRUST ME things were MUCH MUCH WORSE then.

Standard procedure for things like this in the XP era was to reinstall Windows. Heck, even if you had a gaming rig back then (which I didn't, unless you count a Pentium 4 1.8GHz and a GeForce 6200), you had to reinstall Windows once every few months because for whatever reason it would slow itself to a crawl after about 2 months.

Someone even did some tests and showed that Windows XP in its default service configuration, connected to the internet without a firewall, would be infected through NetBIOS in about *10 minutes*. Even today I make a habit of disconnecting my LAN cord when reinstalling Windows, and before doing it I put my antivirus on an external drive and install that first.

So yeah, sometimes wiping Windows is the best thing you can do.

(Don't even get me started on Windows 95/98 days and how many times I had to reinstall simply because Windows suddenly refused to boot.)


----------



## Particle

Quote:


> Originally Posted by *neurotix*
> 
> This. I mean, every modern version of 3dmark and every version of Unigine will see huge increases (nearly double) with two cards. I'm #54 on our HWBOT team so I know this intimately.
> 
> There are plenty of games that will see the same scaling too. Both old and new.
> 
> But y'know, some people are never happy and will never believe what you say. Other people are just idiots and have no idea how to even configure their system for it properly. Or something else with their system is wrong hardware wise (weak PSU, weak CPU) and the cards aren't performing correctly because of it.
> 
> Sure, there's some games that don't benefit, but usually these are the ones that are weaker graphically and can be maxed out on a single card anyway.
> 
> Devildog, what's your experience with certain games and Crossfire on your dual 270X setup? How was the fps increase?


Sort of a passive dig at me, and that's fine, but I would say this: I of course see the same kind of raw fps scaling in benchmarks including in Firestrike, but how does that help me play _real games_ better when they usually don't scale anything like this? Or when they do scale well but don't get any smoother (ie Battlefield)? It's just a waste in those situations, and no amount of magical "configuration" is going to fix it.


----------



## neurotix

There's plenty that scale well, both old and new, both SLI and Crossfire.

I can tell you this much, maybe it wouldn't matter at 1080p, but at my resolution (5760x1080p) having two cards makes a HUGE difference compared to having just one in MANY games.

Again, some people aren't going to believe anything you say no matter what you tell them, or what evidence you present to them. You've obviously decided that Crossfire is not for you, and nothing I say will change your opinion. But that doesn't change the facts.


----------



## Wicked_Bass

Just purchased my 2nd MSI R9 270X 2GB and I am trying to find out if I need a Crossfire bridge? Neither GPU came with one, and neither did my motherboard. Do I need to buy one to Crossfire these?


----------



## Arkanon

Yep, they still need crossfire bridges


----------



## Wicked_Bass

Quote:


> Originally Posted by *Arkanon*
> 
> Yep, they still need crossfire bridges


Thank you


----------



## Arkanon

Ok, recap after more or less running my 270x for a year:

While the card was perfectly stable @ 1.4V, I've toned it down a bit and I'm running daily clocks of 1300MHz @ 1.3V. Card still running strong and no sign of any wear and tear from the added voltage. I did however mod the cooling, as the Twin Frozr fans started to act up after a couple of months of use (a known problem with fan speed dropping and eventually just not spinning at all anymore). I thought a while about RMA'ing the card, but then I'd risk losing one of the few cards out there that actually manage 1350MHz and above.


----------



## Recr3ational

Quote:


> Originally Posted by *Arkanon*
> 
> Ok, recap after more or less running my 270x for a year:
> 
> While the card was perfectly stable @ 1.4v i've toned it down a bit and i'm running daily clocks of 1300mhz @ 1.3V. Card still running strong and no sign of any wear and tear by the added voltage. Did however mod the cooling as the Twin Frozr fans started to act up after a couple of months of usage ( known problem with fanspeed dropping and eventually just not spinning at all anymore). Thought a while of RMA'ing the card, but then i'd risk losing one of the few cards out there that actually manage 1350mhz and above.


Aftermarket coolers?


----------



## Arkanon

I could look for some 92mm aftermarket coolers and put them on the stock mounting. Not a priority though. The card is modded in such a way that it's cooled by two 120mm fans controlled by my fan controller. Barely audible, even at full speed, and it cools just about the same, maybe a tiny bit better than stock.


----------



## Recr3ational

Quote:


> Originally Posted by *Arkanon*
> 
> Could look for some 92mm aftermarket coolers and put them on the stock mounting for it. Not a priority though. Card is modded in such a way it's cooled by 2 120mm fans controlled by my fancontroller. barely audible, even on full speed and cools just about the same, maybe just a tiny bit better than stock.


If its not a problem don't worry about it.


----------



## daffy.duck

Got a used 280x from ebay for $150 last week.
Great card!
Handles my 1080p gaming without a hitch.
Only problem is Afterburner doesn't seem to apply my overclocks properly.
Sometimes the new speed will apply but reset after a couple of runs of anything graphically strenuous; other times the new speed will say applied but clearly has not been.
Will have to look into modding the card's BIOS it seems.


----------



## Arkanon

Quote:


> Originally Posted by *daffy.duck*
> 
> Got a used 280x from ebay for $150 last week.
> Great card!
> Handles my 1080p gaming without a hitch.
> Only problem is Afterburner seems to not apply my overclocks properly.
> Sometimes the new speed will apply but reset after a couple runs of anything graphically strenuous, other times the new speed will say applied but clearly it has not been applied.
> Will have to look into modding the cards BIOS it seems.


Internet browser open? Whenever I open an internet browser and run a benchmark/play a game my clocks reset to stock. Weird thing is that I still have the clocks applied and GPU-Z still shows the card as overclocked, but when you go to the sensor page you clearly see it doesn't go above stock clocks. After I close the internet browser and rerun the game/benchmark it does use the overclocked settings.


----------



## daffy.duck

Quote:


> Originally Posted by *Arkanon*
> 
> Internet browser open ? Whenever i open an internet browser and run a benchmark/play a game my clocks reset to stock. Weird thing is that i still have the clocks applied and gpu-z still shows the card as overclocked, but then when you go to the sensor page you clearly see it doesn't go above stock clocks. After i close the internet browers and rerun the game/benchmark it does use to overclocked settings.


Hmm definitely think I had the browser open, so that may be it.
Will have to do some testing later when I get home.
If all else fails, I will have to do some BIOS flashing to test.


----------



## Particle

Quote:


> Originally Posted by *neurotix*
> 
> There's plenty that scale well, both old and new, both SLI and Crossfire.
> 
> I can tell you this much, maybe it wouldn't matter at 1080p, but at my resolution (5760x1080p) having two cards makes a HUGE difference compared to having just one in MANY games.
> 
> Again, some people aren't going to believe anything you say no matter what you tell them, or what evidence you present to them. You've obviously decided that Crossfire is not for you, and nothing I say will change your opinion. But that doesn't change the facts.


You don't really understand my position I think. I like the idea of crossfire, and I'm not considering it a dead technology by any means. I just have yet to see it be implemented in a way that successfully accelerates many of the games that I like to play. That's my only complaint. They might fix it in the future, but at this point it's hard for me to want to spend more money on it after being burned so many times.

You say that some people won't accept an assertion regardless of the evidence. I would agree and point out that the same logic applies to your position. My direct experience demonstrates how crossfire can show a lack of improvement, and it seems that despite this real world example with real hardware you will not accept that my experience could be just as correct as your own given the difference in games, user sensitivities, and other hardware details. I play at WQHD resolution, have used both an i7-3930K and FX-9590 as a CPU for recent card generations, and have tried 4850s, 5850s, 6970s, and 270X's in crossfire over the years.

You can't simply dismiss a game like Battlefield where, for me, both GPUs show full utilization and the frame rate increases by a substantial amount, but the jitter (i.e. frame timing) deviates far enough from ideal that it doesn't feel any smoother. What good does a doubling of frame rate achieve if it's delivered like this: [Frame]..[Frame].................................[Frame].....[Frame]..............................etc

Likewise, you can't dismiss other games like DOTA2 and CSGO that see no net increase in frame rate and have exactly 50% utilization of each card. The games aren't CPU bound when GPU utilization is at exactly 100%.
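To make the jitter point concrete, here's a toy comparison of two frame-time traces with identical average fps but very different pacing (the numbers are invented for illustration):

```python
# Two frame-time traces (milliseconds) with the same average frame rate
# but very different pacing. Numbers are invented for illustration.
smooth  = [25.0] * 8                        # steady 25 ms per frame
jittery = [10, 10, 55, 25, 10, 10, 55, 25]  # also averages 25 ms per frame

def avg_fps(frame_times_ms):
    """Average frames per second over the trace."""
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def worst_frame_ms(frame_times_ms):
    """Longest single frame, i.e. the stall you actually feel."""
    return max(frame_times_ms)

for name, trace in (("smooth", smooth), ("jittery", jittery)):
    print(f"{name}: avg {avg_fps(trace):.0f} fps, worst frame {worst_frame_ms(trace)} ms")
```

Both traces report 40 fps on average, but the jittery one stalls for 55 ms at a time (an instantaneous ~18 fps), which is exactly the "40 fps feels like 20 fps" complaint.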


----------



## Falconx50

Quote:


> Originally Posted by *Sky-way*
> 
> Took the plunge a couple weeks ago and bought an msi r9 270x 2gb gaming. Very impressed with the build quality, everything installed correctly and is working perfectly. I don't have a lot of games right now, mostly free to play stuff from steam and starter editions of the blizzard games but it plays all those flawlessly on ultra without hesitation. I'm glad I went with msi, top notch stuff. Also, in case anyone cares, I got it for $170 on newegg.


Nice!

I also have the 270x, PowerColor Devil version, and I think if you can deal with turning down some of the eye candy, just less than MAX settings, You will have a great experience with it.
I have read a lot on this thread, and others, and unless you are extremely picky on what you consider "a playable game" then that card will last you a long time.

If you get the chance share your experience with any game or benchmark.

Here's mine for those to critique. My Firestrike score: http://www.3dmark.com/3dm/5656998


----------



## radier

Your CPU is a huge bottleneck.

Unigine is not CPU dependent but Fire Strike and real games are.
Just compare my single card results: http://www.3dmark.com/fs/3843117


----------



## daffy.duck

Quote:


> Originally Posted by *Arkanon*
> 
> Internet browser open ? Whenever i open an internet browser and run a benchmark/play a game my clocks reset to stock. Weird thing is that i still have the clocks applied and gpu-z still shows the card as overclocked, but then when you go to the sensor page you clearly see it doesn't go above stock clocks. After i close the internet browers and rerun the game/benchmark it does use to overclocked settings.


Quote:


> Originally Posted by *daffy.duck*
> 
> Hmm definitely think I had the browser open, so that may be it.
> Will have to do some testing later when I get home.
> If all else fails, I will have to do some BIOS flashing to test.


Worked for a while then reverted back to stock speeds.
Guess I have some flashing to do.


----------



## bichael

I've also been enjoying my R9 270x which I've had for a couple of months now. Pretty perfect for 1080p and high settings.

Got it running at 1150/1475 which gives me 5491 in firestrike (6775 graphics). So I think my overclocked pentium is a little bit of a bottleneck but not major, actually seems a pretty good pairing in terms of performance/cost.

There are some quite impressive overclocks on here though so think I may need to try pushing it a bit further...

I'm also toying with the idea of putting an Accelero Hybrid II on it to hopefully achieve near silence (and just as a bit of a project if I'm honest).

edit: regarding clocks dropping to stock when overclocked I had similar issue when I first tried it. In my case it seemed to be Afterburner which was causing issues. After making sure that Afterburner was fully closed I overclocked with CCC and it worked fine. May be something to try if you haven't already.


----------



## Falconx50

Quote:


> Originally Posted by *radier*
> 
> Your CPU is huge bootleneck.
> 
> Unigine is not CPU dependent byt Fire Strike and real games are.
> Just compare my single card results: http://www.3dmark.com/fs/3843117


Yes I know.

I can't decide if I should go with an FX-8320 or spend a little more on the FX-8350. I don't think my mobo can handle the 220 watts from anything else.
I have searched in vain for the wattage specs. I have the GA-990XA-UD3.
Does anyone know where to get them?


----------



## End3R

Quote:


> Originally Posted by *Falconx50*
> 
> Yes I know.
>
> I can't decide if I should go with a FX8320 or spend a little more on the FX8350. I don't think my Mobo can handle the 225 watts from anything else.
> I have searched in vein for the wattage specs. I have the GA 990XA-UD3
> Does anyone knows where to get them?


I can't speak for the 8350 but my 8320 is an absolute beast. I haven't had any trouble playing every game out there, maxed settings (sans AA) @1080p with smooth fps even while streaming. And I'm not even overclocking it.


----------



## Particle

Quote:


> Originally Posted by *Falconx50*
> 
> Yes I know.
>
> I can't decide if I should go with a FX8320 or spend a little more on the FX8350. I don't think my Mobo can handle the 225 watts from anything else.
> I have searched in vein for the wattage specs. I have the GA 990XA-UD3
> Does anyone knows where to get them?


Revision 1.x: http://www.gigabyte.com/support-downloads/cpu-support-popup.aspx?pid=3901
Revision 3.0: http://www.gigabyte.com/support-downloads/cpu-support-popup.aspx?pid=4434


----------



## aaronsta1

Quote:


> Originally Posted by *Falconx50*
> 
> Yes I know.
>
> I can't decide if I should go with a FX8320 or spend a little more on the FX8350. I don't think my Mobo can handle the 225 watts from anything else.
> I have searched in vein for the wattage specs. I have the GA 990XA-UD3
> Does anyone knows where to get them?


it says both the 8350 and 8320 are supported on that board.

you can pretty much run the 8320 at 8350 speeds guaranteed, although that might be pushing it on that board.
you might be better off just running the 8350 at stock speeds; then if you decide to get a better board you can move up to overclocking it.


----------



## Arkanon

Firestrike:

i5-750 @ 4GHz
270X @ 1320 core / 1500 mem

About stock GTX 960 levels still.


----------



## Arkanon

double post


----------



## Arkanon

Quote:


> Originally Posted by *End3R*
> 
> I can't speak for the 8350 but my 8320 is an absolute beast. I haven't had any trouble playing every game out there, maxed settings (sans AA) @1080p with smooth fps even while streaming. And I'm not even overclocking it.


Afaik the 8320 is the exact same CPU as the 8350 apart from having a lower multiplier. 8350s might just be better binned, allowing for higher clocks at the same voltage.


----------



## Devildog83

Quote:


> Originally Posted by *Falconx50*
> 
> Yes I know.
>
> I can't decide if I should go with a FX8320 or spend a little more on the FX8350. I don't think my Mobo can handle the 225 watts from anything else.
> I have searched in vein for the wattage specs. I have the GA 990XA-UD3
> Does anyone knows where to get them?


I would spend a few bucks more and get the 8350, just because higher clocks are pretty much guaranteed. Both can take an awful lot of torture, and I have seen 8320s clock as well as some 8350s, but you have to get lucky because most won't. They are both 125W TDP. By the way, the UD3 isn't the best overclocking board, but it should do at least OK with either the 8320 or 8350.


----------



## Nafu

Bought a used Sapphire Vapor-X R9 280X. upgraded from 7950.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Nafu*
> 
> Bought a used Sapphire Vapor-X R9 280X. upgraded from 7950.


Do you still have the 7950? May as well crossfire them


----------



## Nafu

Nope, it's already sold. Now time to fire up the 280X. BTW, how high can it go with 1.300 voltage?


----------



## BruceB

Quote:


> Originally Posted by *Falconx50*
> 
> Yes I know.
>
> I can't decide if I should go with a FX8320 or spend a little more on the FX8350. I don't think my Mobo can handle the 225 watts from anything else.
> I have searched in vein for the wattage specs. I have the GA 990XA-UD3
> Does anyone knows where to get them?


If your MB manufacturer's webpage says that the board can support the FX-8350 then it can supply enough power to it too, easy as that!
Overclocking on the other hand: you might want to see what other owners have to say about power draw on that specific board.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Nafu*
> 
> nopes. its alreAdy SOLD. Now time to Fire up 280x. btw how max it can go with 1.300 voltage?


Probably around 1180-1250MHz depending on the chip.

Hopefully you can get around 1600+ on the VRAM too. That will help a lot.
Otherwise, it won't be that big of an improvement over the 7950.


----------



## jprovido

http://www.overclock.net/forum/newestpost/1537426

I fixed my 2D flickering problem. VTX3D were great! They gave me a new BIOS and the flickering is totally gone


----------



## Catscratch

Quote:


> Originally Posted by *Arkanon*
> 
> Firestrike:
> 
> I5-750 @ 4Ghz
> 270x @ 1320 core - 1500 mem
> 
> 
> 
> about stock gtx960 levels still.


Great nonetheless. You are above a reference 280X, and almost at my stock 280X Tri-X (1020MHz, no boost) level: 6430 in Firestrike (with a 2500K at 4GHz).

According to the 3DMark site, there are people with a 2500K and a 280X that score 7500, with clocks around 1100 on the cards and some at 4.9GHz on the CPU. That shouldn't make an 1100-point difference though.

*WOAH*

SCRATCH THAT !!!!

The above was in 64-bit 3DMark. Look at 32-bit below. Darn, 3DMark11 was like this too. Didn't notice :/


----------



## Arkanon

Stock 280x beaten


----------



## Falconx50

Quote:


> Originally Posted by *Particle*
> 
> Revision 1.x: http://www.gigabyte.com/support-downloads/cpu-support-popup.aspx?pid=3901
> Revision 3.0: http://www.gigabyte.com/support-downloads/cpu-support-popup.aspx?pid=4434


Thanks! That is exactly what I was looking for!
And as expected, my motherboard can't handle the 220 watts of the FX-9xxx series.


----------



## Falconx50

Quote:


> Originally Posted by *BruceB*
> 
> If your MB manufacturer's webpage says that the board can Support the FX8350 then it can supply enough power to it too, easy as that!
> Overclocking on the other Hand, you might want to see what other owners have to say about powerdraw on that specific board.


I will look into that further; however, I was able to push my FX-6100 to 4.0 GHz with no problems. I'm sure that if I learn more about overclocking CPUs I might be able to get a little more out of it.








Of course there's the pucker factor to contend with too.


----------



## vintageclass

I was able to squish out 1325MHz core with 1500MHz memory on air cooling reaching max temps of about 55C.


----------



## Particle

I've put my 270X GPUs on the shelf temporarily as I play with my new toy that I got from eBay this week: A Radeon HD 2900 XT







It was still NIB.

I've played CS:S, CS:GO, and Skyrim so far. Surprisingly, they all three run. They all three defaulted to more ambitious settings than are appropriate for the card (mainly AA), but they run. I was even playing CS:GO today at WQHD around 60 fps with a mix of mostly high settings.


----------



## Nafu

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Probably around 1180-1250MHz depending on the chip.
> 
> Hopefully you can get around 1600+ on VRAM too. That will help a lot.
> Otherwise, it won't be that big of an improvement over the 7950.


Yes, I understand the difference between the two, as a highly overclocked 7950 can even beat an overclocked 7970's score, like I did with my 7950 before. I've owned two 7950s at different times; both easily overclocked to 1250MHz on the core, and the scores were even better than a 7970's.

Let's see how the 280X overclocks.


----------



## daffy.duck

Quote:


> Originally Posted by *vintageclass*
> 
> I was able to squish out 1325MHz core with 1500MHz memory on air cooling reaching max temps of about 55C.


Very nice oc and that temperature is great too.
Let's see some benches


----------



## neurotix

Quote:


> Originally Posted by *daffy.duck*
> 
> Very nice oc and that temperature is great too.
> Let's see some benches


I second that. Is that a 280X Toxic?


----------



## daffy.duck

Quote:


> Originally Posted by *neurotix*
> 
> I second that. Is that a 280X Toxic?


Looks like a 270X Toxic, judging by the number of shaders.


----------



## Nafu

Now this is weird.

My Sapphire 280X Vapor-X gives artifacts and crashes only when I increase the power limit.

Here are the specs and MSI AB:

Any suggestions, please?

AMD driver 14.1 Omega.


----------



## vintageclass

It's an R9 270X Toxic edition.
Quote:


> Originally Posted by *daffy.duck*
> 
> looks like 270x toxic judging by the number of shaders


Quote:


> Originally Posted by *neurotix*
> 
> I second that. Is that a 280X Toxic?


Quote:


> Originally Posted by *daffy.duck*
> 
> Very nice oc and that temperature is great too.
> Let's see some benches


Any benches you'd like to see? I ran Unigine Valley. I actually got 1425MHz yesterday on a separate overclocking boot drive, but all the driver conflicts on my main drive get in the way of a full overclock. I'll try to stretch it some more.


----------



## radier

Low score.
With 1230/1500 I get a score of around 1580 at the Extreme HD preset.


----------



## Devildog83

Quote:


> Originally Posted by *Nafu*
> 
> Bought a used Sapphire Vapor-X R9 280X. upgraded from 7950.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


You have been added. Welcome to the club !!! Nice Card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Nafu*
> 
> Now this is weird.
> 
> My Sapphire 280X Vapor-X gives artifacts and crashes only when I increase the power limit.
> 
> Here are the specs and MSI AB:
> 
> Any suggestions, please?
> 
> AMD driver 14.1 Omega.


Couple bits of advice from having one....

14.7 beta benched the best and gave the best stability for my 280X, which I ran at 1250/1800 with 1.3V and a +20% power limit. Mine was an Asus TOP with the same core speed as yours, and with your cooling you can probably get similar results on the core, but I'm not sure about the VRAM, since you have Elpida chips.

I would try using AB with AMD Overdrive disabled, and then try it with AMD Overdrive enabled. See if the power limit increase causes the crash either way (also apply the increase using CCC with AB closed and see what happens).

I've got some serious trench time with those cards, so I'm glad to help.


----------



## vintageclass

Quote:


> Originally Posted by *radier*
> 
> Low score.
> With 1230/1500 I get around 1580 score at Extreme HD preset.


You're also running an unlocked i5


----------



## neurotix

Quote:


> Originally Posted by *vintageclass*
> 
> You're also running an unlocked i5


Shouldn't really matter; even with an FX-8350 (which has weaker single-core performance than your i3) I was getting 40 fps with my 270X.



270X Vapor-X, 84% ASIC, 1300mhz

Try again and make your CCC settings look like this:



That should bring your score up.


----------



## radier

@vintageclass
Unigine tests do not depend on the CPU.

And don't use his cheating settings from CCC.

Sent from my GT-N7000


----------



## neurotix

Quote:


> Originally Posted by *radier*
> 
> @vintageclass
> Unigine tests do not depend on the CPU.
> 
> And don't use his cheating settings from CCC.
> 
> Sent from my GT-N7000


LOL? Those settings are simply driver tweaks, and on HWBOT disabling tessellation is allowed on AMD cards.

I'm in the top 200 in the US Enthusiast League on HWBOT and number 55 on our HWBOT team overall.... where's your profile?

Based on that, vintageclass, you can decide who you want to believe or what you want to do.

(And radier, accusing random people of cheating isn't gonna make you any friends, especially people who have been in this particular thread since the beginning. Also, people who have been in the Valley thread since the beginning and using Valley since it was first available, on probably 10 different AMD cards.)









EDIT: Oh, here's his profile. Yeah, you can decide who you want to listen to.


----------



## Nafu

Quote:


> Originally Posted by *Devildog83*
> 
> [/SPOILER]
> 
> You have been added. Welcome to the club !!! Nice Card.


You need to correct the clocks of my card in the first post to 1070/1550MHz.

Thanks!


----------



## aaronsta1

Quote:


> Originally Posted by *neurotix*
> 
> LOL? Those settings are simply driver tweaks, and on HWBOT disabling tessellation is allowed on AMD cards.
> 
> I'm in the top 200 in the US Enthusiast League on HWBOT and number 55 on our HWBOT team overall.... where's your profile?
> 
> Based on that, vintageclass, you can decide who you want to believe or what you want to do.
> 
> (And radier, accusing random people of cheating isn't gonna make you any friends, especially people who have been in this particular thread since the beginning. Also, people who have been in the Valley thread since the beginning and using Valley since it was first available, on probably 10 different AMD cards.)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Oh, here's his profile. Yeah, you can decide who you want to listen to.


Seriously, you are only fooling yourself if you put the settings on Extreme and then disable them in the control panel.

I can show you high scores too, but it doesn't matter if it's not accurate.


----------



## neurotix

Quote:


> Originally Posted by *aaronsta1*
> 
> seriously, you are only fooling yourself if you put the settings on extreme and then disable the settings in the control panel.
> 
> i can show you high scores too, but it doesnt matter if its not accurate.


For that bench in particular, see here: http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

Tweaks are allowed, and it doesn't say anywhere that you can't disable tessellation; in fact, it even says tweaks are encouraged.

It's not just Valley, but also all 3dmark benches, and again this comes from hwbot.

Am I the only person here who has any clue about benchmarking?

(And besides that, his score was WAY too low for his clocks to just be the result of not tweaking, there was something else wrong with his card or system, a stock 270X should do around 35 fps in Valley on Extreme HD, even with tess on.)


----------



## tabascosauz

Quote:


> Originally Posted by *Nafu*
> 
> Bought a used Sapphire Vapor-X R9 280X. upgraded from 7950.


I see that you got the 280X. THE 280X.

Trust me, I would know. This card is the temperamental a$$ of a card among all the temperamental a$$ cards. Want a howling banshee? Now you've got one









But boy does it run cool. And damn does it have a great VRM. Go and OC the snot out of this bad boy.


----------



## benhiggs

Hey guys, nice to see a big owners club.

Just managed to get hold of a Gigabyte 280X WindForce OC edition for a good price.

Anyone had any experience with this card?

One thing I already hate is that it doesn't have a backplate, which seems almost dumb given the massive cooler it carries.









Has anyone seen any backplates for this card, or seen anyone make one to fit it?

It's not a problem for me now as I have a Prodigy, but when I put it in my NZXT Phantom it's gonna sag badly.


----------



## vintageclass

Quote:


> Originally Posted by *neurotix*
> 
> Shouldn't really matter; even with an FX-8350 (which has weaker single-core performance than your i3) I was getting 40 fps with my 270X.
> 
> 
> 
> 270X Vapor-X, 84% ASIC, 1300mhz
> 
> Try again and make your CCC settings look like this:
> 
> 
> 
> That should bring your score up.


Quote:


> Originally Posted by *radier*
> 
> @vintageclass
> Unigine tests do not depend on the CPU.
> 
> And don't use his cheating settings from CCC.
> 
> Sent from my GT-N7000


It seems that the CPU does make a difference in many cases. And LOL, I had an FAH core running when I did my tests, so I was able to get up to a 39 FPS average without tweaking driver settings yet.


----------



## radier

Tweaks are good for no-lifes.

You have an example a few posts above.

"Show me your stats and I will tell you who you are."

Pleaseeeee.........


----------



## Devildog83

Got the G3258 back in and under water with a new motherboard and new colors.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Got the G3258 back in and under water with a new motherboard and new colors.


Prefer the white over the red. What happened to the 8350?


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Prefer the white over the red. What happened to the 8350?


Traded it in for Intel; it's kind of a long story, but I never ended up going with X99 like I was supposed to, and by that time the 8350 was gone. I will be upgrading to a 4690K or 4790K soon, but for now the G3258 is fun to play with.


----------



## diggiddi

Quote:


> Originally Posted by *tabascosauz*
> 
> I see that you got the 280X. THE 280X.
> 
> Trust me, I would know. This card is the temperamental a$$ of a card among all the temperamental a$$ cards. Want a howling banshee? Now you've got one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But boy does it run cool. And damn does it have a great VRM. Go and OC the snot out of this bad boy.


What's the story man, I'm looking at picking up either this one or an Asus 280x


----------



## mhfr

Hi
Gigabyte released a new BIOS, FA1, for the Radeon R9 270X OC 2GB card (GV-R927XOC-2GD): http://www.gigabyte.com/products/product-page.aspx?pid=4795#bios
I currently have the F1 BIOS. Is this BIOS compatible with my card? The website says: "If your VBIOS version is: F1, it can only be updated with VBIOS versions F2-F9. F10, it can only be updated with VBIOS versions F11-F19. F20, it can only be updated with VBIOS versions F21-F29."

But the new version is FA1, and my BIOS is F1.
Also, the Subsystem ID of this BIOS differs from F1's.
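Gigabyte's quoted family rule is mechanical enough to sketch as a small check. This is only my reading of the quoted text (the `can_update` helper and the grouping into tens are my own assumptions, not anything Gigabyte ships), and notably a letter-suffixed version like FA1 falls outside every stated family, which is exactly why the page's rule doesn't answer the question:

```python
def vbios_family(version: str):
    """Return the 'tens' family of a plain F-number VBIOS version
    (F1-F9 -> 0, F10-F19 -> 1, F20-F29 -> 2), or None for versions
    like 'FA1' that are not plain F-numbers and fall outside the
    rule quoted from Gigabyte's page."""
    body = version[1:]
    if not (version.startswith("F") and body.isdigit()):
        return None  # e.g. "FA1": not covered by the stated rule
    return int(body) // 10

def can_update(current: str, target: str) -> bool:
    """True only when both versions sit in the same stated family."""
    cur, tgt = vbios_family(current), vbios_family(target)
    return cur is not None and tgt is not None and cur == tgt

# F1 may take F2-F9, but the rule says nothing about FA1:
print(can_update("F1", "F5"))    # True
print(can_update("F1", "F12"))   # False
print(can_update("F1", "FA1"))   # False: outside the stated families
```

By the letter of the quoted rule, F1 to FA1 simply isn't covered, so checking the Subsystem ID against the card (or asking Gigabyte support) is the sensible next step.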


----------



## radier

Use a bootable pen drive with atiflash. Type the command:

atiflash -f -p 0 R927XO2D.FA1


----------



## tomytom99

I love my Sapphire R9 270x. I got the Toxic edition, and I love it. I've had it for over a year now, and the darn thing just doesn't heat up, even when I want it to. I'll add the info when I get time today.


----------



## Devildog83

Quote:


> Originally Posted by *tomytom99*
> 
> I love my Sapphire R9 270x. I got the Toxic edition, and I love it. I've had it for over a year now, and the darn thing just doesn't heat up, even when I want it to. I'll add the info when I get time today.


Cool.

Love the avatar - Skittles and Arizona brings the Superbowl to mind. Go Hawks!!!


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> Traded it in for Intel; it's kind of a long story, but I never ended up going with X99 like I was supposed to, and by that time the 8350 was gone. I will be upgrading to a 4690K or 4790K soon, but for now the G3258 is fun to play with.


Sweet. It looks good. 290 treating you well?
I've cleaned up my rig a little bit and swapped my pump and res.


----------



## Devildog83

Quote:


> Originally Posted by *Recr3ational*
> 
> Sweet. It looks good. 290 treating you well?
> I've cleaned up my rig a little bit and swapped my pump and res.


I like it. I still have to get the lighting right and get my camera back from my stepson, the pics are crappy.


----------



## Recr3ational

Quote:


> Originally Posted by *Devildog83*
> 
> I like it. I still have to get the lighting right and get my camera back from my stepson, the pics are crappy.


Ugh, tell me about it. I still have no idea how to take proper pictures.


----------



## rdr09

Quote:


> Originally Posted by *Recr3ational*
> 
> Ugh, tell me about it. I still have no idea how to take proper pictures.


nice rig. it deserves better shots . . .

http://www.overclock.net/t/912437/how-to-photograph-your-rig


----------



## jprovido

My HTPC. It maxes out any game I throw at it at 1080p.


----------



## Spork13

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> my htpc. maxes out any game I throw at it at 1080p


Is my sarcasm detector malfunctioning?
Or haven't you played many recent release games?


----------



## CGabry

Here is my PowerColor TurboDuo...

It runs perfectly at 1200MHz core / 6400MHz (effective) memory...


----------



## Recr3ational

Quote:


> Originally Posted by *rdr09*
> 
> nice rig. it deserves better shots . . .
> 
> http://www.overclock.net/t/912437/how-to-photograph-your-rig


Thank you. Yeah, I have that thread bookmarked. I must say my pictures have slowly gotten better, though still not as good as I would like. I think I just lack the photographer's eye.


----------



## rdr09

Quote:


> Originally Posted by *Recr3ational*
> 
> Thank you. Yeah, I have that thread bookmarked. I must say my pictures have slowly gotten better, though still not as good as I would like. I think I just lack the photographer's eye.


You really do have a nice-looking rig. I can't follow the loop. lol

Edit: Rec, you have a 360 and a 240, right? What pump are you using? Thanks.


----------



## JackCY

Quote:


> Originally Posted by *jprovido*
> 
> my htpc. maxes out any game I throw at it at 1080p


280x? DREAM ABOUT IT.
980? Yeah maybe.


----------



## Nafu

Quote:


> Originally Posted by *JackCY*
> 
> 280x? DREAM ABOUT IT.
> 980? Yeah maybe.


Never underestimate an AMD GPU; that horsepower will crush the rival and may perform above your expectations. I don't mean the 980. lol


----------



## Nafu

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Couple bits of advice from having one....
> 
> 14.7 beta benched the best and gave the best stability for my 280X, which I ran at 1250/1800 with 1.3V and a +20% power limit. Mine was an Asus TOP with the same core speed as yours, and with your cooling you can probably get similar results on the core, but I'm not sure about the VRAM, since you have Elpida chips.
> 
> I would try using AB with AMD Overdrive disabled, and then try it with AMD Overdrive enabled. See if the power limit increase causes the crash either way (also apply the increase using CCC with AB closed and see what happens).
> 
> I've got some serious trench time with those cards, so I'm glad to help.


Thanks for your response, much appreciated.

What I tried last time: I increased the voltage to 1.125V and slid the power limit up +50, and there were no more problems after that. So I guess it's a voltage issue; the stock voltage might be a little low for the clocks. How can I set this voltage permanently? A BIOS mod?


----------



## Recr3ational

Quote:


> Originally Posted by *rdr09*
> 
> you really do have a nice looking rig. i can't follow the loop. lol
> 
> edit: Rec, you have a 360 and a 240, right? what pump are you using? thanks.


Thank you.
I have a 360 and a Phobya 200; the loop is weird, but it goes under my PSU compartment. I use an XSPC D5 with an EK top.


----------



## Devildog83

Quote:


> Originally Posted by *CGabry*
> 
> 
> 
> Here is mine Powercolor TurboDuo...
> 
> Runs perfectly at 1200Mhz Core / 6400 memory...


You have been added, Welcome!!!


----------



## Devildog83

Quote:


> Originally Posted by *JackCY*
> 
> 280x? DREAM ABOUT IT.
> 980? Yeah maybe.


290x Yeah!!!


----------



## benhiggs

Got my WindForce 280X installed.

It looks big in my lil Prodigy.



Sorry about the glare :/


----------



## DiceAir

Yesterday I was playing Borderlands: The Pre-Sequel and one of my R9 280Xs went to 95C at 100% fan speed, with all case fans at 100% too. I already tried changing the thermal paste and nothing changed. So what can I do? They are Club3D R9 280X royalKing cards and I don't overclock whatsoever. The CPU runs at 50-60C max, so I don't know what to do anymore. I was thinking of just getting rid of these cards and going with Nvidia 980s, but performance-wise these cards should still be enough.

I have 3x Cougar Vortex PWM fans in front and the stock fan of my Air 540 in the back. At idle my temps seem fine: about 40C on the top card and 30-35C on the bottom card. Ambient temps are between 26C and 30C (I know it's hot, but I'm from South Africa and it can get very hot here). When I game, all fans are at 100%, and I checked that they are spinning at the correct speed; the GPU fans run at about 4000RPM on both cards, so all is fine there. I think the only solution is to go water cooling, but that's a no-go. So what can I do, or should I just sell these cards and go over to Nvidia?


----------



## aaronsta1

Quote:


> Originally Posted by *DiceAir*
> 
> Yesterday I was playing Borderlands: The Pre-Sequel and one of my R9 280Xs went to 95C at 100% fan speed, with all case fans at 100% too. I already tried changing the thermal paste and nothing changed. So what can I do? They are Club3D R9 280X royalKing cards and I don't overclock whatsoever. The CPU runs at 50-60C max, so I don't know what to do anymore. I was thinking of just getting rid of these cards and going with Nvidia 980s, but performance-wise these cards should still be enough.
> 
> I have 3x Cougar Vortex PWM fans in front and the stock fan of my Air 540 in the back. At idle my temps seem fine: about 40C on the top card and 30-35C on the bottom card. Ambient temps are between 26C and 30C (I know it's hot, but I'm from South Africa and it can get very hot here). When I game, all fans are at 100%, and I checked that they are spinning at the correct speed; the GPU fans run at about 4000RPM on both cards, so all is fine there. I think the only solution is to go water cooling, but that's a no-go. So what can I do, or should I just sell these cards and go over to Nvidia?


Crossfire really likes cases with side fans. Without a side fan bringing in fresh air, that top card just recycles the hot air from the bottom card.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *aaronsta1*
> 
> Crossfire really likes cases with side fans. Without a side fan bringing in fresh air, that top card just recycles the hot air from the bottom card.


In most scenarios, yes; however, it also depends on the card and the actual static pressure in the case.


----------



## Devildog83

Quote:


> Originally Posted by *DiceAir*
> 
> Yesterday I was playing Borderlands: The Pre-Sequel and one of my R9 280Xs went to 95C at 100% fan speed, with all case fans at 100% too. I already tried changing the thermal paste and nothing changed. So what can I do? They are Club3D R9 280X royalKing cards and I don't overclock whatsoever. The CPU runs at 50-60C max, so I don't know what to do anymore. I was thinking of just getting rid of these cards and going with Nvidia 980s, but performance-wise these cards should still be enough.
> 
> I have 3x Cougar Vortex PWM fans in front and the stock fan of my Air 540 in the back. At idle my temps seem fine: about 40C on the top card and 30-35C on the bottom card. Ambient temps are between 26C and 30C (I know it's hot, but I'm from South Africa and it can get very hot here). When I game, all fans are at 100%, and I checked that they are spinning at the correct speed; the GPU fans run at about 4000RPM on both cards, so all is fine there. I think the only solution is to go water cooling, but that's a no-go. So what can I do, or should I just sell these cards and go over to Nvidia?


Just guessing, but if you are running that hot just playing a game with no overclock and 100% fan, I would think there is a problem with the card: either the temp sensor is bad or the card is giving up. Did you try doing a driver sweep and reinstalling the drivers? Also uninstall and reinstall the monitoring software while you're at it. Sometimes there can be conflicts, so try running only one monitoring tool until you are sure what's going on.

Also try disabling CrossFire and running just the one card that is getting hot, and see if that makes a difference.


----------



## Spork13

My top (280X) card was hitting the low 90s too while gaming.
I added a fan on top of the PSU shroud (you could use the back of a drive bay in other cases) to blow air from the front intakes directly over and between the GPUs.
That dropped temps by close to 10C. Some time I think I'll raise that fan up a little so it's also blowing on the back/top of the GPU, and see if that drops temps even further.
The bottom card (280) doesn't run anywhere near as warm.


----------



## DiceAir

Quote:


> Originally Posted by *Devildog83*
> 
> Just guessing, but if you are running that hot just playing a game with no overclock and 100% fan, I would think there is a problem with the card: either the temp sensor is bad or the card is giving up. Did you try doing a driver sweep and reinstalling the drivers? Also uninstall and reinstall the monitoring software while you're at it. Sometimes there can be conflicts, so try running only one monitoring tool until you are sure what's going on.
> 
> Also try disabling CrossFire and running just the one card that is getting hot, and see if that makes a difference.


Yes, I tried all of those steps. The funny thing is there's nothing wrong with the cards' performance. Maybe I should just leave it at 95C. I also contacted Club3D support, and all I can say is they suck, because they never got back to me the last time I contacted them, but I will try again.


----------



## daffy.duck

Here is a pic of my card, an R9 280X. Sorry for the quality, but it's the best I could do.


----------



## neurotix

Tbh, this is why I avoid some of the generic cheaper brands like VTX, Club3D, Powercolor (Look up the problems with the TurboDuo), Visiontek and so on... even avoid XFX, the DD cooler sucks...

Even in Newegg reviews, I see people with tons of problems with these cards from these companies.

I'm a Sapphire fan, but I'd also consider ASUS, MSI or possibly Gigabyte... I mean think about it, the last three make motherboards and other stuff too... they're much larger.

I wouldn't advise you sell your cards and get a different brand or anything, but ultimately, that might solve the problem...

Your system looks good otherwise, and you have a pretty nice Corsair case, I'm not sure what to tell you. Maybe you need more powerful fans to exhaust the hot air from the cards. Or maybe you need a case with a side fan like some people suggested. The cards could possibly be defective as well.

You might try putting your fans at 100% and running some benchmark (Valley), laying your case on its side, and making sure the fans on both cards are working properly.


----------



## DiceAir

Quote:


> Originally Posted by *neurotix*
> 
> Tbh, this is why I avoid some of the generic cheaper brands like VTX, Club3D, Powercolor (Look up the problems with the TurboDuo), Visiontek and so on... even avoid XFX, the DD cooler sucks...
> 
> Even in Newegg reviews, I see people with tons of problems with these cards from these companies.
> 
> I'm a Sapphire fan, but I'd also consider ASUS, MSI or possibly Gigabyte... I mean think about it, the last three make motherboards and other stuff too... they're much larger.
> 
> I wouldn't advise you sell your cards and get a different brand or anything, but ultimately, that might solve the problem...
> 
> Your system looks good otherwise, and you have a pretty nice Corsair case, I'm not sure what to tell you. Maybe you need more powerful fans to exhaust the hot air from the cards. Or maybe you need a case with a side fan like some people suggested. The cards could possibly be defective as well.
> 
> You might try putting your fans to 100% and running some benchmark (Valley), lay your case on it's side and make sure the fans on both cards are working properly.


I made sure the fans are working as they should, and they are. Feeling between the cards and the front fans, I can feel the air pushing through. But yes, I agree with you: the cheaper brands are a no-go. They are super cheap for a reason, and the warranty is also better on the other brands. I will never buy from some run-of-the-mill company again, only from Gigabyte, MSI, Asus, and so on; they spend more time getting things right. I had an MSI card fail on me, but that was my own fault: I overclocked too much and it started artifacting. The supplier gave me a new card anyway, even though I could have just lowered the clocks, but hey, I paid in a bit more and got a better card. Anyway, I will never buy Club3D or PowerColor again.

Also, what do you think of Galax? One of my clan mates has the Galax 980 SOC version, and the card already goes up to 80C, though at least at low fan speed, and they are designed to clock up to 80C.

Oh, and I won't get another 280X if I sell these. I would rather go back to Nvidia, but one 980 is not enough for my needs and two of them cost too much. I'm running 1440p @ 96Hz.


----------



## neurotix

Quote:


> Originally Posted by *DiceAir*
> 
> I made sure the fans are working as they should, and they are. Feeling between the cards and the front fans, I can feel the air pushing through. But yes, I agree with you: the cheaper brands are a no-go. They are super cheap for a reason, and the warranty is also better on the other brands. I will never buy from some run-of-the-mill company again, only from Gigabyte, MSI, Asus, and so on; they spend more time getting things right. I had an MSI card fail on me, but that was my own fault: I overclocked too much and it started artifacting. The supplier gave me a new card anyway, even though I could have just lowered the clocks, but hey, I paid in a bit more and got a better card. Anyway, I will never buy Club3D or PowerColor again.
> 
> Also, what do you think of Galax? One of my clan mates has the Galax 980 SOC version, and the card already goes up to 80C, though at least at low fan speed, and they are designed to clock up to 80C.
> 
> Oh, and I won't get another 280X if I sell these. I would rather go back to Nvidia, but one 980 is not enough for my needs and two of them cost too much. I'm running 1440p @ 96Hz.


Yeah, I'm sorry, but there DO exist brands that are absolute garbage. I don't care if this offends anyone, because in my experience it's largely true. I've bought Sapphire since 2008 and had very few problems, and when I DID have problems I got my cards replaced by Althon Micro within 2-3 weeks. I mean, some of these brands don't even have hardware reps on our forums.

As an example, look at some of the negative reviews here. That's more 1-egg reviews than any other kind of review. On top of that, look at the feedback for your exact Club3D cards here. Two of the reviews mention temperature problems. One review even details your EXACT problem (hope that review isn't you), that is, cards overheating in Crossfire.

I checked Google for reviews of your cards and didn't find much; even Kitguru didn't have them, and they review everything. I looked up pictures of the heatsink and didn't find much either. I would wager the reason they run so hot is that the heatsink is cheap: not enough heatpipes, heatpipes that are too small, poor sink material, and only two fans.

Remember, Tahiti still runs very hot under normal conditions. For example, my 7970 Vapor-X always ran around 65C while gaming, and that's with a moderate overclock @ 1.3V with 100% fan.

You also mentioned high ambient temps because South Africa, this might be a BIG part of your problem, throw one of those small AC units in a window near your computer and run it full blast 24/7, in fact doing this might just solve your whole overheating problem.

As far as Galaxy, I would say avoid. If I were going to buy Nvidia, I would buy EVGA, no contest. They're the best. Don't cheap out and don't settle for less. They've been around forever, they make quality cards, they allow waterblocks, and (from what I've heard) they have excellent customer service. There's a good reason why so many benchmarking records were broken with 780ti k|ngp|n Classified. I have also heard that the ASUS Strix cards are excellent, as well as MSI's offerings if you don't want EVGA. I would say avoid PNY and Zotac.

2560x1440p 96hz eh? If I were you, I'd throw two 970s at that. They will outperform your 280xs while using less power. For only $200 more than a single 980 you could have dual 970s, afaik. If you get $250 for each 280x on Amazon or something, you'd only have to barf up $200 more for a 970 setup. Consider it.

(No single card is going to drive that monitor at that resolution and refresh very well. Not even the new stuff from Nvidia and AMD that's around the corner.)

Hope this helps.


----------



## JackCY

Quote:


> Originally Posted by *DiceAir*
> 
> Yesterday I was playing Borderlands: The Pre-Sequel and one of my R9 280Xs went to 95C at 100% fan speed, with all case fans at 100% too. I already tried changing the thermal paste and nothing changed. So what can I do? They are Club3D R9 280X royalKing cards and I don't overclock whatsoever. The CPU runs at 50-60C max, so I don't know what to do anymore. I was thinking of just getting rid of these cards and going with Nvidia 980s, but performance-wise these cards should still be enough.
> 
> I have 3x Cougar Vortex PWM fans in front and the stock fan of my Air 540 in the back. At idle my temps seem fine: about 40C on the top card and 30-35C on the bottom card. Ambient temps are between 26C and 30C (I know it's hot, but I'm from South Africa and it can get very hot here). When I game, all fans are at 100%, and I checked that they are spinning at the correct speed; the GPU fans run at about 4000RPM on both cards, so all is fine there. I think the only solution is to go water cooling, but that's a no-go. So what can I do, or should I just sell these cards and go over to Nvidia?


4000rpm?!?!?!
Mine go to "only" 3000rpm and that already sounds like an airplane taking off and me hoping the fans don't disintegrate








Anyway, my Asus 280X DC2T sits at 1600RPM/below 80C during benchmarks and 1200RPM/below 70C during gaming.

I don't think one exhaust fan is enough; I have one exhaust fan, and the air can get quite hot with just one 280X, and the top of the case warms up.
You will need some serious airflow for CF, both intake and exhaust, plus getting cold air to the suffocated card.
Still, I think those Club3D cards have a bad heatsink design. Could be that the heatpipes on the sides don't touch the chip, a small contact area, not enough mounting pressure... who knows.


----------



## tabascosauz

Quote:


> Originally Posted by *DiceAir*
> 
> Yesterday I was playing Borderlands: The Pre-Sequel and one of my R9 280Xs went to 95C at 100% fan speed, with all case fans at 100% too. I already tried changing the thermal paste and nothing changed. So what can I do? They are Club3D R9 280X royalKing cards and I don't overclock whatsoever. The CPU runs at 50-60C max, so I don't know what to do anymore. I was thinking of just getting rid of these cards and going with Nvidia 980s, but performance-wise these cards should still be enough.
> 
> I have 3x Cougar Vortex PWM fans in front and the stock fan of my Air 540 in the back. At idle my temps seem fine: about 40C on the top card and 30-35C on the bottom card. Ambient temps are between 26C and 30C (I know it's hot, but I'm from South Africa and it can get very hot here). When I game, all fans are at 100%, and I checked that they are spinning at the correct speed; the GPU fans run at about 4000RPM on both cards, so all is fine there. I think the only solution is to go water cooling, but that's a no-go. So what can I do, or should I just sell these cards and go over to Nvidia?


Ouch, sorry to hear of your woes.

25-30°C is not too high in terms of ambient temps. I am very sensitive to temps, so I would be extremely uncomfortable working and gaming in such an environment, but even my 280X did not suffer like that while it was housed in the H440. At worst it would peak at around 70°C on a pretty silent fan profile, compared to 67-68°C in the exact same game under "normal" conditions. But keep in mind that my card sports the Vapor-X cooler on a beefy custom PCB.

I think you should RMA. I have heard Club3D's policies are pretty good, comparable to XFX's. Anyway, I think Club3D does allow you to take off the cooler to change the paste, so if you really don't want any more RMA hassle, you could get hold of an H55 and a Kraken G10 and see if it really is Club3D's cooler that is at fault.

Good luck.


----------



## MadRabbit

2 x MSI Gaming 3G 280x


----------



## DiceAir

Quote:


> Originally Posted by *neurotix*
> 
> Yeah, I'm sorry but there DO exist brands that are absolute garbage. I don't care if this offends anyone, because in my experience it's largely true. I've bought Sapphire since 2008 and had very few problems, and when I DID have problems I got my cards replaced by Althon Micro within 2-3 weeks. I mean, some of these brands don't even have hardware reps on our forums. As an example, look at some of the negative reviews here. That's more 1 egg reviews than any other kind of review. On top of that, look at the feedback for your exact Club3D cards here. Two of the reviews mention temperature problems. One review even details your EXACT problem (hope that review isn't you), that is, cards overheating in Crossfire. I checked Google for reviews of your cards and didn't find much, even Kitguru didn't have them and they review everything. I looked up pictures of the heatsink and didn't find much either. I would wager the reason they run so hot is that the heatsink is cheap: not enough heatpipes, heatpipes are too small, sink is poor material, and it only has two fans. Remember, Tahiti still runs very hot under normal conditions, for example my 7970 Vapor-X always ran around 65C while gaming, and that's with a moderate overclock @ 1.3v with 100% fan.
> 
> You also mentioned high ambient temps because South Africa, this might be a BIG part of your problem, throw one of those small AC units in a window near your computer and run it full blast 24/7, in fact doing this might just solve your whole overheating problem.
> 
> As far as Galaxy, I would say avoid. If I were going to buy Nvidia, I would buy EVGA, no contest. They're the best. Don't cheap out and don't settle for less. They've been around forever, they make quality cards, they allow waterblocks, and (from what I've heard) they have excellent customer service. There's a good reason why so many benchmarking records were broken with 780ti k|ngp|n Classified. I have also heard that the ASUS Strix cards are excellent, as well as MSI's offerings if you don't want EVGA. I would say avoid PNY and Zotac.
> 
> 2560x1440p 96hz eh? If I were you, I'd throw two 970s at that. They will outperform your 280xs while using less power. For only $200 more than a single 980 you could have dual 970s, afaik. If you get $250 for each 280x on Amazon or something, you'd only have to barf up $200 more for a 970 setup. Consider it.
> 
> (No single card is going to drive that monitor at that resolution and refresh very well. Not even the new stuff from Nvidia and AMD that's around the corner.)
> 
> Hope this helps.


Thanks for the info, man. I appreciate it. I completely agree with you there. I will never buy anything other than ASUS, MSI, Gigabyte or EVGA; I'd rather pay more and get good quality. I'm just wary of the 970 because of the memory issue: I'm afraid I might later need more than 3.5GB and get stutter, so I will wait a bit. At least my performance isn't worse than normal, so 95°C is not throttling the cards at all. I only hit 95°C when playing certain games that really put strain on the cards. I think I can safely say I will wait for the next-gen Nvidia or AMD cards, but I'm thinking of changing back to Nvidia.

I wanted to get the Sapphire R9 280X Toxic for CrossFire, and I regret not doing so. Never will I cheap out again. Even the ASIC quality is really bad at 55% per card, and that's low. Although my purchase didn't work out as I wanted, it's still good. Now I've learned never to cheap out on PC parts: get what you want, as long as you don't go overboard.


----------



## dzigg

Hey guys, I'm interested in a 280X now to replace my aging Nvidia GTX 285, since there's been a price drop on that particular card in my country.

But will my i5 750 bottleneck it? I'm curious whether there are still any i5 750 (or other Lynnfield) users here.

Note: I can overclock it to about 3.6 GHz. I can push it further, but I prefer to keep it as cool as I can.


----------



## JackCY

Quote:


> Originally Posted by *DiceAir*
> 
> I wanted to get the Sapphire R9 280X Toxic for CrossFire, and I regret not doing so. Never will I cheap out again. Even the ASIC quality is really bad at 55% per card, and that's low. Although my purchase didn't work out as I wanted, it's still good. Now I've learned never to cheap out on PC parts: get what you want, as long as you don't go overboard.


ASUS 280X DC2T, 59% ASIC, and I have again reverted to stock clocks because even 1.1GHz was throwing artifacts in one game, so I'm testing whether it's the core (which I think it is) or the memory OC to 7GHz, which can otherwise run even 7.2GHz. ASIC quality and overclockability are pure luck of the draw with any brand.
You can get a coil-whining ASUS or Club3D too, or Nvidia or AMD; none of this really matters. Sometimes this or that particular design and batch is more susceptible and another is less, and that's all there is to it.
When it comes to coolers, yeah, it pays to do a bit of research on temperature, noise and power data first.

Personally I'm looking forward to the die shrink; until then, no upgrade. Sure, the 980 is nice but insanely expensive, and the 970 is a cut-down 980, but a botched one. I think they should have made it 3.5GB and dropped the price by leaving that one memory chip out altogether, like they used to with most designs, to avoid weird issues and slowdowns.

---

dzigg: I would say yes, but it's only a guess. It's a very old platform.


----------



## DiceAir

Quote:


> Originally Posted by *JackCY*
> 
> ASUS 280X DC2T, 59% ASIC, and I have again reverted to stock clocks because even 1.1GHz was throwing artifacts in one game, so I'm testing whether it's the core (which I think it is) or the memory OC to 7GHz, which can otherwise run even 7.2GHz. ASIC quality and overclockability are pure luck of the draw with any brand.
> You can get a coil-whining ASUS or Club3D too, or Nvidia or AMD; none of this really matters. Sometimes this or that particular design and batch is more susceptible and another is less, and that's all there is to it.
> When it comes to coolers, yeah, it pays to do a bit of research on temperature, noise and power data first.
> 
> Personally I'm looking forward to the die shrink; until then, no upgrade. Sure, the 980 is nice but insanely expensive, and the 970 is a cut-down 980, but a botched one. I think they should have made it 3.5GB and dropped the price by leaving that one memory chip out altogether, like they used to with most designs, to avoid weird issues and slowdowns.
> 
> ---
> 
> dzigg: I would say yes, but it's only a guess. It's a very old platform.


The die shrink might only come next year.


----------



## Arkanon

Quote:


> Originally Posted by *dzigg*
> 
> Hey guys, I'm interested in a 280X now to replace my aging Nvidia GTX 285, since there's been a price drop on that particular card in my country.
> 
> But will my i5 750 bottleneck it? I'm curious whether there are still any i5 750 (or other Lynnfield) users here.
> 
> Note: I can overclock it to about 3.6 GHz. I can push it further, but I prefer to keep it as cool as I can.


An i5 750 is still a decent performer, and especially when overclocked it won't bottleneck a 280X.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Arkanon*
> 
> An i5 750 is still a decent performer, and especially when overclocked it won't bottleneck a 280X.


You're going to find that it will indeed bottleneck a 280X unless you have it running somewhere in the realm of 3.6+ GHz, and even then it'll have its hands full.
It will definitely still run some games decently, though!


----------



## dzigg

Quote:


> Originally Posted by *JackCY*
> 
> dzigg: I would say yes, but it's only a guess. It's a very old platform.


Quote:


> Originally Posted by *Arkanon*
> 
> An i5 750 is still a decent performer, and especially when overclocked it won't bottleneck a 280X.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> You're going to find that it will indeed bottleneck a 280X unless you have it running somewhere in the realm of 3.6+ GHz, and even then it'll have its hands full.
> It will definitely still run some games decently, though!


Thanks, guys!

How much of a bottleneck should I expect, though? 10-15%? 30-40%? I think I can live with about 5-10%; if not, I'll just buy a lesser card.


----------



## Spork13

Start Windows Task Manager.
Run MSI Afterburner (or similar).
Play some games and run a benchmark or two.
Have a look at the statistics: if there is a bottleneck, the CPU will be at 100% and the GPU below 100%.
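That check can be turned into a tiny script. This is only a sketch: the sample log below is made up, and in practice you'd feed it the utilization columns from an Afterburner/RivaTuner log.

```python
def bottleneck_ratio(samples, cpu_floor=99.0, gpu_ceiling=99.0):
    """samples: (cpu_util_pct, gpu_util_pct) pairs captured while gaming.
    Returns the fraction of samples where the CPU is pegged while the
    GPU still has headroom -- the classic sign of a CPU bottleneck."""
    if not samples:
        return 0.0
    flagged = sum(1 for cpu, gpu in samples
                  if cpu >= cpu_floor and gpu < gpu_ceiling)
    return flagged / len(samples)

# Made-up readings, one per second during a benchmark run:
log = [(72, 99), (100, 84), (100, 80), (65, 99), (100, 78), (70, 99)]
print(f"CPU-bound {bottleneck_ratio(log):.0%} of the time")
# CPU-bound 50% of the time
```

A sustained nonzero ratio means the GPU is waiting on the CPU; isolated spikes are usually harmless.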


----------



## neurotix

Quote:


> Originally Posted by *DiceAir*
> 
> Thanks for the info, man. I appreciate it. I completely agree with you there. I will never buy anything other than ASUS, MSI, Gigabyte or EVGA; I'd rather pay more and get good quality. I'm just wary of the 970 because of the memory issue: I'm afraid I might later need more than 3.5GB and get stutter, so I will wait a bit. At least my performance isn't worse than normal, so 95°C is not throttling the cards at all. I only hit 95°C when playing certain games that really put strain on the cards. I think I can safely say I will wait for the next-gen Nvidia or AMD cards, but I'm thinking of changing back to Nvidia.
> 
> I wanted to get the Sapphire R9 280X Toxic for CrossFire, and I regret not doing so. Never will I cheap out again. Even the ASIC quality is really bad at 55% per card, and that's low. Although my purchase didn't work out as I wanted, it's still good. Now I've learned never to cheap out on PC parts: get what you want, as long as you don't go overboard.


95°C is outrageous and I'm surprised your cards aren't throttling. I had problems with my 7970 (which is a 280X) when it passed 70°C on the core: I would get artifacts, mainly in Crysis 3.

55% ASIC is also very poor. I've used 3 different 7970 Vapor-X cards and the lowest I saw was 66%.

If you can't afford the 970s, or are worried about the memory problem (I think it has been blown way out of proportion), you might consider selling your cards and replacing them. You might even be able to pick up two 290 Tri-X for around $250 each if you buy used; I've seen them for less than that on eBay, the marketplace here, and Amazon. If you get a 290, just don't get reference. I don't see 290s here anymore, but there's this listing. If you got $400 for your current cards, you could buy those, have the same setup, and be less than $100 out of pocket.


Good luck, I hope you find a solution.


----------



## tabascosauz

Quote:


> Originally Posted by *dzigg*
> 
> Thanks, guys!
> 
> How much of a bottleneck should I expect, though? 10-15%? 30-40%? I think I can live with about 5-10%; if not, I'll just buy a lesser card.


A bottleneck, yes, though perhaps in a different sense than what an X4 860K would cause in Star Swarm. Not too much, though. If you can push 3.3GHz+ on your CPU, it's just a matter of getting fewer frames than with today's top-end CPUs.

I would wait, though, honestly. The HD 7970 has been around for way too long and AMD's next cards are just around the corner.


----------



## dzigg

Quote:


> Originally Posted by *tabascosauz*
> 
> A bottleneck, yes, though perhaps in a different sense than what an X4 860K would cause in Star Swarm. Not too much, though. If you can push 3.3GHz+ on your CPU, it's just a matter of getting fewer frames than with today's top-end CPUs.
> 
> I would wait, though, honestly. The HD 7970 has been around for way too long and AMD's next cards are just around the corner.


I tried MSI Afterburner in BioShock Infinite with my old card, and the CPU is only at about 50-60% on all cores. Granted, it's an old game.

I don't mind getting 5-15 FPS less; considering I'm using a 5-year-old CPU, it's kind of a bargain.

Yeah I'd probably wait until the 300 series, thanks for the suggestion.


----------



## tabascosauz

Quote:


> Originally Posted by *dzigg*
> 
> I tried MSI Afterburner in BioShock Infinite with my old card, and the CPU is only at about 50-60% on all cores. Granted, it's an old game.
> 
> I don't mind getting 5-15 FPS less; considering I'm using a 5-year-old CPU, it's kind of a bargain.
> 
> Yeah, I'd probably wait until the 300 series. Thanks for the suggestion.


What are your clocks?


----------



## DiceAir

Quote:


> Originally Posted by *neurotix*
> 
> 95°C is outrageous and I'm surprised your cards aren't throttling. I had problems with my 7970 (which is a 280X) when it passed 70°C on the core: I would get artifacts, mainly in Crysis 3.
> 
> 55% ASIC is also very poor. I've used 3 different 7970 Vapor-X cards and the lowest I saw was 66%.
> 
> If you can't afford the 970s, or are worried about the memory problem (I think it has been blown way out of proportion), you might consider selling your cards and replacing them. You might even be able to pick up two 290 Tri-X for around $250 each if you buy used; I've seen them for less than that on eBay, the marketplace here, and Amazon. If you get a 290, just don't get reference. I don't see 290s here anymore, but there's this listing. If you got $400 for your current cards, you could buy those, have the same setup, and be less than $100 out of pocket.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck, I hope you find a solution.


I have the money for the GTX 970, but it's not fast enough for my needs. I want to crank up the details in games, except for one or two settings if need be. I know there are some games I will not be able to max, and maybe I'm asking a bit much, but I think the GTX 970 is not enough of an upgrade for what I'd pay. I must admit it doesn't hit 88-95°C all the time, so in many cases it stays under 95°C. The cards are still under warranty, so I'll just use them as-is until they either break or the next AMD/Nvidia cards come.

I've undervolted my GPUs and it works more or less fine; maybe one or two games will crash, but most games I play are fine and temps go down a bit. Stock voltage is 1.265V for 1100MHz core.


----------



## neurotix

Quote:


> Originally Posted by *DiceAir*
> 
> I have the money for the GTX 970, but it's not fast enough for my needs. I want to crank up the details in games, except for one or two settings if need be. I know there are some games I will not be able to max, and maybe I'm asking a bit much, but I think the GTX 970 is not enough of an upgrade for what I'd pay. I must admit it doesn't hit 88-95°C all the time, so in many cases it stays under 95°C. The cards are still under warranty, so I'll just use them as-is until they either break or the next AMD/Nvidia cards come.
> 
> I've undervolted my GPUs and it works more or less fine; maybe one or two games will crash, but most games I play are fine and temps go down a bit. Stock voltage is 1.265V for 1100MHz core.


Well, there's a couple of things I see from this.

A GTX 970 would be quite a bit faster, use less power, and run cooler (even on air) compared to your 280x's. It should totally be doable to get them up to 1500mhz core, on air, from everything I've heard. They overclock phenomenally. At those kinds of speeds, they outperform pretty much everything out there. They should be able to push your monitor no sweat at those clocks, even with graphic settings maxed out etc.

The other thing that I noticed, we never established what your fan speeds are. Don't use a default fan profile. When I game, I manually set my cards to 100% fan. I don't care about the noise, since I have full ear covering headphones that go up pretty loud. I can't hear the fans over the game, ever.

Additionally, 1.265v for 1100mhz is a lot. You don't need to be at 1100mhz, you have two cards. My cards stay within reasonable temps at 100% fan at 1100mhz, and my voltage is much lower, so it's not a problem. In your case, with Tahiti, I would recommend you try lowering the clocks to 1000mhz and reduce voltage to about 1.22v (or less), use 100% fan and see what your temps are like. I believe my 7970 Vapor-X has stock clocks of 1050mhz/1400mhz and 1.2v. It also seems to have a much better cooler.

Give what I said a try, the extra 100mhz won't make THAT much of a performance difference, and if it makes your cards run cooler and last longer, then it's better anyway.
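As a rough sanity check on why dropping clocks and voltage together helps so much: to a first order, dynamic power in a chip scales with clock times voltage squared. Plugging in the numbers from this thread is only a back-of-the-envelope estimate (leakage and memory power ignored), not a measurement:

```python
def relative_dynamic_power(f_new, v_new, f_old, v_old):
    """First-order CMOS dynamic power model: P ~ C * f * V^2.
    Returns new power as a fraction of old power (leakage ignored)."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Stock 1100 MHz @ 1.265 V vs the suggested 1000 MHz @ 1.22 V:
rel = relative_dynamic_power(1000, 1.22, 1100, 1.265)
print(f"~{1 - rel:.0%} less dynamic power")
# ~15% less dynamic power
```

So a ~9% clock cut plus a small undervolt plausibly shaves around 15% off the heat output per card, which matters a lot for the top card in CrossFire.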


----------



## DiceAir

Quote:


> Originally Posted by *neurotix*
> 
> Well, there are a couple of things I see from this.
> 
> A GTX 970 would be quite a bit faster, use less power, and run cooler (even on air) compared to your 280Xs. From everything I've heard, it should be totally doable to get them up to 1500MHz core on air; they overclock phenomenally. At those speeds they outperform pretty much everything out there and should push your monitor no sweat, even with graphics settings maxed out.
> 
> The other thing I noticed is that we never established what your fan speeds are. Don't use a default fan profile. When I game, I manually set my cards to 100% fan. I don't care about the noise, since I have full ear-covering headphones that go up pretty loud; I can't hear the fans over the game, ever.
> 
> Additionally, 1.265V for 1100MHz is a lot, and you don't need to be at 1100MHz since you have two cards. My cards stay within reasonable temps at 100% fan at 1100MHz, and my voltage is much lower, so it's not a problem. In your case, with Tahiti, I would recommend lowering the clocks to 1000MHz, reducing the voltage to about 1.22V (or less), using 100% fan, and seeing what your temps are like. I believe my 7970 Vapor-X has stock clocks of 1050MHz/1400MHz at 1.2V. It also seems to have a much better cooler.
> 
> Give what I said a try: the extra 100MHz won't make THAT much of a performance difference, and if it makes your cards run cooler and last longer, then it's better anyway.


100MHz per card is actually a lot compared to what I tested before, as far as I can remember, but I will try again. Maybe I'll clock something like 1050MHz at first and see what voltages I can run.

The longer I can hold on, the better, and the more money I'll have in the end to spend on GPU(s) if I must. I eagerly await DirectX 12; as I said before, it may solve some of my performance issues and possibly the heat issues, but I don't know.

I also run my fans at 100% (4000 RPM) on the top card and about 75% on the bottom card. The bottom card goes to about 70-75°C max, but the top card can go as high as 85-95°C+. I think maybe I should run both at 100% when gaming, because if I lower the temps on the bottom card, the top card won't run as hot.

I did a bit of searching, and according to Guru3D's overclocking test you lose up to 6 FPS per card by downclocking (in CrossFire it might be a bit more), so I might downclock anyway just to save my GPUs from getting too hot. I will also try 1050MHz at 1.2V to see the temps and whether it's stable.

Just to note: in BF4 I play on high graphics with no AA and HBAO, I get my 95 FPS, and the card tops out at 85°C, which I think is still fine. It's just some other games; the other day I played Borderlands: The Pre-Sequel and saw my GPU hit 90°C, though I must admit it was super hot in my room. If I had to guess it was over 30°C in the room, but I might be exaggerating a bit. Tonight I'll run all GPU fans at 100% while gaming and compare temps against my normal fan profile.

The VRMs only go to around 70°C max.


----------



## Mr-Dark

Quote:


> Originally Posted by *DiceAir*
> 
> i have them money for the gtx 970 but it's not fast enough for my needs. I want to crank up the details in the games excepts 1 or 2 settings if need be. I know some games I will not be abnle to do max and maybe asking a bit much but I think the gtx 970 is not enough of an upgrade for what I'm going to pay. I must admit it's not all the time it goes 88C-95C so in many cases it's under 95C. Cards still under warranty so i must just use it as is until it either breaks or AMD/Nvidia next cards comes.
> 
> I've undervolted my gpu's and it works sort of fine maybe 1 or 2 games it will crash butt he most games I play it's fine and temps goes down a bit. stock volts is 1.265V for 1100mhz core


I just upgraded from a heavily overclocked 280X/7950 CrossFire setup to a single GTX 970 G1, and it is an upgrade: I see a good boost in the same games, and the gameplay is smooth now.

The card now peaks at 58°C and draws around 170W; my CrossFire setup ran hot at 80°C with high fan speeds.

Look at these 3DMark scores from my CF build.

Stock CF:

http://www.3dmark.com/3dm/4297625

OC'd 1100/1600 CF:

http://www.3dmark.com/3dm/4911966

Stock 970:

http://www.3dmark.com/3dm/5687964

And the GTX 970 OC'd to 1500/1900MHz, super easy on air, with a 60°C max temp on the stock fan profile:

http://www.3dmark.com/3dm/5688137


----------



## DiceAir

Quote:


> Originally Posted by *Mr-Dark*
> 
> I just upgraded from a heavily overclocked 280X/7950 CrossFire setup to a single GTX 970 G1, and it is an upgrade: I see a good boost in the same games, and the gameplay is smooth now.
> 
> The card now peaks at 58°C and draws around 170W; my CrossFire setup ran hot at 80°C with high fan speeds.
> 
> Look at these 3DMark scores from my CF build.
> 
> Stock CF:
> 
> http://www.3dmark.com/3dm/4297625
> 
> OC'd 1100/1600 CF:
> 
> http://www.3dmark.com/3dm/4911966
> 
> Stock 970:
> 
> http://www.3dmark.com/3dm/5687964
> 
> And the GTX 970 OC'd to 1500/1900MHz, super easy on air, with a 60°C max temp on the stock fan profile:
> 
> http://www.3dmark.com/3dm/5688137


How come the CF 3DMark results say 7970 (2x) and not 7950 (2x)? Nice results, too. I've heard lots of people say that going from HD 7000 / R9 series CrossFire to a single Nvidia card is smoother, so maybe it is.


----------



## rdr09

Quote:


> Originally Posted by *DiceAir*
> 
> How come the CF 3DMark results say 7970 (2x) and not 7950 (2x)? Nice results, too. I've heard lots of people say that going from HD 7000 / R9 series CrossFire to a single Nvidia card is smoother, so maybe it is.


Even Hawaii CrossFire is smoother because of XDMA. Here is one of my 290 runs at 1200 on the core:

http://www.3dmark.com/3dm/4079609?


----------



## Mr-Dark

Quote:


> Originally Posted by *DiceAir*
> 
> How come the cf 3dmark results saying 7970(2x) and not not 7950(2x)? Also nice results. I've been hearing lot's of people say going from 7000, r9 series crossfire to single Nvidia cards and it's smoother. So maybe it is.


I don't know; every benchmark program says 2x 7970 or 2x 280X.


----------



## DiceAir

Quote:


> Originally Posted by *Mr-Dark*
> 
> I don't know; every benchmark program says 2x 7970 or 2x 280X.


Oh, OK.

I'm really tempted to upgrade, but I keep getting this gut feeling that I should wait a bit until Nvidia releases its next cards (next year or whenever), or wait for AMD to release its next cards.


----------



## Decerto

I just got mine today: a Gigabyte R9 270. I overclocked it from 975 to 1050 (that's the max it will go) and the memory from 1400 to 1500. Works like a charm. Lots of people told me that my old Q6600 (overclocked to 3.4GHz) would bottleneck it, but as I guessed, people throw that term around a lot nowadays without even knowing what they are talking about. I ran the Unigine Valley benchmark on the Extreme preset: 98% GPU usage and a 40 FPS average. More than happy with it.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *DiceAir*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr-Dark*
> 
> I don't know; every benchmark program says 2x 7970 or 2x 280X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, OK.
> 
> I'm really tempted to upgrade, but I keep getting this gut feeling that I should wait a bit until Nvidia releases its next cards (next year or whenever), or wait for AMD to release its next cards.

The 280X is a rebrand of the 7970 GHz Edition and the 280 is a rebrand of the 7950, so in benchmarks you will see the 280/280X show up as a 79xx series card; they are Tahiti GPUs.

As for a better card, there are the 290 and 290X that are out now. However, AMD and Nvidia will both launch new cards soon, and when they do, current card prices should drop shortly after.


----------



## Arkanon

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You're going to find that it will indeed bottleneck a 280X unless you have it running somewhere in the realm of 3.6+ GHz, and even then it'll have its hands full.
> It will definitely still run some games decently, though!


Well, after some decent testing with an i5-750 @ 4GHz in some of the latest games (BF Hardline beta, Dragon Age: Inquisition, ...), there is no CPU bottleneck whatsoever; the CPU didn't go over 80% usage at any point. Granted, the tests were done with a heavily overclocked 270X that should match, or be around, stock 280X speeds, so it's pretty safe to assume it won't bottleneck a 280X. However, you might get somewhat better frames with a newer CPU due to architectural improvements, though in games the difference would be fairly minor (5-10 FPS at most, I assume).


----------



## BornOfScreams

Sapphire R9 280X Toxic Edition. Overclocked to 1175 from 1150 on the core. Haven't touched the memory. This card doesn't have much headroom.


----------



## tomytom99

Alright, I've got the pics. The thing is at factory clocks all around, 1150 core and 1500 RAM. An outstanding 270X.


Love the thing.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Arkanon*
> 
> Well, after some decent testing with an i5-750 @ 4GHz in some of the latest games (BF Hardline beta, Dragon Age: Inquisition, ...), there is no CPU bottleneck whatsoever; the CPU didn't go over 80% usage at any point. Granted, the tests were done with a heavily overclocked 270X that should match, or be around, stock 280X speeds, so it's pretty safe to assume it won't bottleneck a 280X. However, you might get somewhat better frames with a newer CPU due to architectural improvements, though in games the difference would be fairly minor (5-10 FPS at most, I assume).


Overclock the 280X, though, and it is going to walk away from that 270X by a great deal...

The first thing to do with the 280X is to check GPU utilization during gameplay, using fully playable settings. I say this because you could max the settings out in a given game and get 100% GPU usage, but the game may run like ass if the card isn't capable of running those settings to begin with.

Once you have found the highest playable settings, use Afterburner to monitor GPU and CPU usage. If you see CPU spikes, that's okay, as long as those spikes are not forcing the GPU usage to drop below 99% and are not causing bothersome skips/dips.

Once you have found the max game settings for stock clocks on the GPU, then begin overclocking it (if you are not seeing a CPU bottleneck already). Once you find your max stable overclock, go back and look at the GPU usage again during gameplay. If there are spots where usage drops below 99% (they will directly coincide with CPU spikes of 99%+), that means the overclock has left you with no CPU overhead to push the card any harder.

The next step is to see if the overclock has made the card strong enough to run any of the settings higher. Sometimes this is the case, and by turning a few settings up a notch or two (with acceptable framerates, of course), you may lean back on the GPU enough to get full utilization again...

The only other options are to just accept the bottleneck if gameplay is still good, or to try to overclock the CPU some more.

Let us know how you fare with it...
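That tuning loop boils down to a simple decision rule per monitoring pass. A toy sketch of it (the function name and return strings are mine, not any tool's API; utilization lists would come from your monitoring log):

```python
def next_step(gpu_util, cpu_util, playable):
    """Toy decision helper for the tuning loop described above.
    gpu_util / cpu_util: utilization percentages sampled together.
    playable: whether current settings give acceptable framerates."""
    if not playable:
        return "lower settings first"
    # GPU dipping below 99% while the CPU is pegged = CPU bottleneck.
    starved = any(g < 99 and c >= 99 for g, c in zip(gpu_util, cpu_util))
    if starved:
        return "raise settings a notch or overclock the CPU"
    return "GPU fully utilized; try a higher GPU overclock"

print(next_step([99, 99, 100], [70, 80, 85], True))
# GPU fully utilized; try a higher GPU overclock
```

Run it after each change (settings bump or clock bump) on a fresh set of samples, exactly as the post describes.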


----------



## Geniebeard

First post at Overclock.net so hey

Just got my XFX R9 280X Black Edition this morning and I'm loving it; such a good upgrade from my GTX 660.

Sorry for crappy quality picture.


----------



## dzigg

Quote:


> Originally Posted by *tabascosauz*
> 
> What are your clocks?


For that BioShock Infinite test I didn't even OC my CPU, just stock 2.66 GHz.

Quote:


> Originally Posted by *Decerto*
> 
> I just got mine today: a Gigabyte R9 270. I overclocked it from 975 to 1050 (that's the max it will go) and the memory from 1400 to 1500. Works like a charm. Lots of people told me that my old Q6600 (overclocked to 3.4GHz) would bottleneck it, but as I guessed, people throw that term around a lot nowadays without even knowing what they are talking about. I ran the Unigine Valley benchmark on the Extreme preset: 98% GPU usage and a 40 FPS average. More than happy with it.


This is great to hear, now I'm really tempted to upgrade!
Quote:


> Originally Posted by *Arkanon*
> 
> Well, after some decent testing with an i5-750 @ 4GHz in some of the latest games (BF Hardline beta, Dragon Age: Inquisition, ...), there is no CPU bottleneck whatsoever; the CPU didn't go over 80% usage at any point. Granted, the tests were done with a heavily overclocked 270X that should match, or be around, stock 280X speeds, so it's pretty safe to assume it won't bottleneck a 280X. However, you might get somewhat better frames with a newer CPU due to architectural improvements, though in games the difference would be fairly minor (5-10 FPS at most, I assume).


Music to my ears

Especially since I have my eyes on the 280X. Yep, I can live with a 5-10 FPS difference. Thanks!


----------



## tabascosauz

Quote:


> Originally Posted by *Arkanon*
> 
> Well, after some decent testing with an i5-750 @ 4GHz in some of the latest games (BF Hardline beta, Dragon Age: Inquisition, ...), there is no CPU bottleneck whatsoever; the CPU didn't go over 80% usage at any point. Granted, the tests were done with a heavily overclocked 270X that should match, or be around, stock 280X speeds, so it's pretty safe to assume it won't bottleneck a 280X. However, you might get somewhat better frames with a newer CPU due to architectural improvements, though in games the difference would be fairly minor (5-10 FPS at most, I assume).


Alrighty... unless you received a magical 270X capable of 1750MHz+, I really don't think the OCed 270X and the 280X are even remotely on the same level...

270X owners with well-binned cards capable of 1150-1200MHz would usually compare their results to the R9 280 or HD 7950, not the 280X. And considering most 280Xs are closer to the HD 7970 GHz Edition than the HD 7970, the gap widens further.
Quote:


> Originally Posted by *dzigg*
> 
> For that BioShock Infinite test I didn't even OC my CPU, just stock 2.66 GHz.
> This is great to hear, now I'm really tempted to upgrade!
> Music to my ears
> 
> Especially since I have my eyes on the 280X. Yep, I can live with a 5-10 FPS difference. Thanks!


Regardless, you should still look to OCing that i5-750. It really will help.


----------



## tabascosauz

Quote:


> Originally Posted by *tabascosauz*
> 
> Alrighty... unless you received a magical 270X capable of 1750MHz+, I really don't think the OCed 270X and the 280X are even remotely on the same level...
> 
> 270X owners with well-binned cards capable of 1150-1200MHz would usually compare their results to the R9 280 or HD 7950, not the 280X. And considering most 280Xs are closer to the HD 7970 GHz Edition than the HD 7970, the gap widens further.
> Regardless, you should still look to OCing that i5-750. It really will help.


----------



## dzigg

Quote:


> Originally Posted by *tabascosauz*
> 
> Regardless, you should still look to OCing that i5-750. It really will help.


Definitely, I already saved an OC profile in the BIOS, so hopefully I'm good.

Now it's about choosing between a GTX 960, GTX 770, or 280X, or waiting for the 300 series to arrive.


----------



## Roboyto

Quote:


> Originally Posted by *dzigg*
> 
> Definitely, I already saved an OC profile in the BIOS, so hopefully I'm good.
> 
> Now it's about choosing between a GTX 960, GTX 770, or 280X, or waiting for the 300 series to arrive.


Wait for the 300 series even if you want a green card, because prices will drop.


----------



## Arkanon

Quote:


> Originally Posted by *tabascosauz*
> 
> Alrighty...unless you received a magical 270X capable of 1750MHz+, I really don't think that the OCed 270X and 280X are even remotely on the same level....


Then I'd suggest looking up the benches I posted in here of my 270X doing 1360 on core. It comes damn close to stock 280Xs. And yes, that's game stable.


----------



## Arkanon

Quote:


> Originally Posted by *Catscratch*
> 
> Great nonetheless. You are above reference 280X, almost at my stock 280X Tri-X 1020MHz (no-boost) level: 6430 in Firestrike (with a 2500K at 4GHz).
> 
> 
> 
> According to the 3DMark site, there are people with a 2500K and a 280X scoring 7500, with the cards clocked around 1100; some have the CPU at 4.9GHz. That shouldn't make an 1100-point difference though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *WOAH*


Quote:


> Originally Posted by *Arkanon*
> 
> Stock 280x beaten


there we go.


----------



## rdr09

Quote:


> Originally Posted by *Recr3ational*
> 
> Thank you.
> I have a 360 and a phobya 200, the loop is weird but it goes under my psu compartment. I use a XSPC d5 with an EK top.


Thanks for the info.


----------



## tabascosauz

Quote:


> Originally Posted by *Arkanon*
> 
> there we go.


Well, I'll attach a screenie of my results on the 280X.

Anyways, your R9 270X is a lot stronger than I thought it would be, so I'm not going to hold it against you. I stand corrected









The 7k result is downclocked to 1020 core/1550 mem; the 8k result is 1150 core/1575 mem. It appears that my 4790K vs your i5-750 might have something to do with the physics score.





EDIT: Thought you had a 2500K. My 4790K scores 5000 higher than your i5-750 lol


----------



## Arkanon

Physics scores are irrelevant anyway








I just tried to prove the point that a 270X isn't as slow as some people think. Of course that's with a fairly 'insane' overclock on it, which I'm guessing 95% of other 270Xs won't/can't reach.


----------



## Devildog83

Quote:


> Originally Posted by *tomytom99*
> 
> Alright, I've got the pics, the thing is all at the factory clocks, 1150 Core, and 1500 RAM, outstanding 270x.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Love the thing.


You have been added!! Gotta love a "Toxic". Welcome.


----------



## Devildog83

Quote:


> Originally Posted by *Arkanon*
> 
> Then I'd suggest looking up the benches I posted in here of my 270X doing 1360 on core. It comes damn close to stock 280Xs. And yes, that's game stable.


I agree there are some 270Xs that will clock like crazy; I had one that was pretty good. But a 384-bit bus, 3 GB of memory and 2048 stream processors, compared to 256-bit, 2 GB and 1280, means crazy clocks alone will not make a 270X perform nearly as well as a 280X, regardless of the benches you may have posted. Also, running a card at those extremes for any length of time will shorten its lifespan dramatically.

Having said that, 1360 core on a 270X is crazy good!!! I could get mine to 1260 with some added volts and a lower memory clock, but it was best at 1235 core/1590 memory.


----------



## tomytom99

Quote:


> Originally Posted by *Devildog83*
> 
> I agree there are some 270Xs that will clock like crazy; I had one that was pretty good. But a 384-bit bus, 3 GB of memory and 2048 stream processors, compared to 256-bit, 2 GB and 1280, means crazy clocks alone will not make a 270X perform nearly as well as a 280X, regardless of the benches you may have posted. Also, running a card at those extremes for any length of time will shorten its lifespan dramatically.
> 
> Having said that, 1360 core on a 270X is crazy good!!! I could get mine to 1260 with some added volts and a lower memory clock, but it was best at 1235 core/1590 memory.


I remember my old 7870 would get a mad OC, and my friend's only got a few extra MHz. The whole inequality thing pretty much goes for anything that uses a processing chip.


----------



## Devildog83

Quote:


> Originally Posted by *tomytom99*
> 
> I remember my old 7870 would get a mad OC, and my friend's only got a few extra MHz. The whole inequality thing pretty much goes for anything that uses a processing chip.


True: it's silicon and what gets stuck in it.

I had a 7870 Devil and a 270X Devil. They're supposed to be about the same, but the 7870 would do 1275+ core with 1450 mem, and as mentioned earlier the 270X was best at a high of 1235/1590. I ran them every day at 1200/1400; finally the 7870 gave out after many months of hard use.


----------



## StriickeN

If I could get a visiontek r9 280x for $100 would that be a good price?


----------



## tabascosauz

Quote:


> Originally Posted by *StriickeN*
> 
> If I could get a visiontek r9 280x for $100 would that be a good price?


Sure. That's an insane price.

But the seller probably has a reason for selling it so low. There's no reason why a perfectly good 280X cannot fetch $150 on the market. Maybe he mined extensively on it without any regard for temperatures?


----------



## StriickeN

Quote:


> Originally Posted by *tabascosauz*
> 
> Sure. That's an insane price.
> 
> But the seller probably has a reason for selling it so low. There's no reason why a perfectly good 280X cannot fetch $150 on the market. Maybe he mined extensively on it without any regard for temperatures?


It's on Craigslist; the pics show it fully wrapped in the plastic and everything. I'm guessing he maybe got it as a gift and really needs money lol. Luckily he's gonna hold it for me.







It's hard to find reviews on the visiontek model so I had no idea of the current pricing.. amazon never helps


----------



## tabascosauz

Quote:


> Originally Posted by *StriickeN*
> 
> It's on Craigslist; the pics show it fully wrapped in the plastic and everything. I'm guessing he maybe got it as a gift and really needs money lol. Luckily he's gonna hold it for me.
> 
> 
> 
> 
> 
> 
> 
> It's hard to find reviews on the visiontek model so I had no idea of the current pricing.. amazon never helps


Anyone can wrap it back up in the bubble wrap or ESD bag it originally came in. I'd advise you to inquire a little bit as to how he has been using the card.

The Visiontek 280X looks to be AMD's reference PCB, so if you're into fullcover waterblocks...it'd be a good choice.


----------



## StriickeN

Quote:


> Originally Posted by *tabascosauz*
> 
> Anyone can wrap it back up in the bubble wrap or ESD bag it originally came in. I'd advise you to inquire a little bit as to how he has been using the card.
> 
> The Visiontek 280X looks to be AMD's reference PCB, so if you're into fullcover waterblocks...it'd be a good choice.


Thanks for the info







I'll make sure to ask a billion questions and test it before buying; I wouldn't want a mining card.


----------



## Recr3ational

Quote:


> Originally Posted by *StriickeN*
> 
> Thanks for the info
> 
> 
> 
> 
> 
> 
> 
> I'll make sure to ask a billion questions and test it before buying; I wouldn't want a mining card.


I don't want to be one of those guys, but I bought 3 ex-mining cards, 2 x 7950 and a 280X, and all three have had no problems. In my experience, miners tend to look after their cards... The only problem is that they tend to run the fans at 100% 24/7.


----------



## JackCY

^ that

I got mine from someone who ran CrossFire but let the top card sag. Since his mobo was the silly type with no free PCI-E slot between the cards, they were right next to each other, and I have no idea how the top card could get any air intake. Needless to say, the top card sagged until one of its fans hit the bottom card, the one I now have (with a 3-year warranty, only 4 months old); the fan on the top card broke and he RMA'd that card. He's selling both and getting a 290 or something instead, or keeping the single RMA'd 280X.
Damage on mine? Almost none: one or two bent pins on the rear of the PCB, no scratches or anything. You won't notice unless someone points it out to you.
Plus I think all these high-power cards look like they've been through a heat process after a while of use, since they reach such high temperatures, as seen in reviews with thermal camera photos. The PCBs tend to show it, whether you mine or game.


----------



## StriickeN

Quote:


> Originally Posted by *Recr3ational*
> 
> I don't want to be one of those guys, but I bought 3 ex-mining cards, 2 x 7950 and a 280X, and all three have had no problems. In my experience, miners tend to look after their cards... The only problem is that they tend to run the fans at 100% 24/7.


You're not considered "one of those guys" to me, i'm open to opinion and tips as I am pretty much a noob with gpu's lol. I'll keep that in mind, thanks!


----------



## Devildog83

Quote:


> Originally Posted by *StriickeN*
> 
> You're not considered "one of those guys" to me, i'm open to opinion and tips as I am pretty much a noob with gpu's lol. I'll keep that in mind, thanks!


I also bought a 290 that was used for mining for about a month, and I have no issues except that it's a reference card, so it runs warm unless the fans ramp up to keep it cool. That will be solved soon, as I plan on adding it to my loop in the near future along with a 240mm rad in the front.


----------



## Recr3ational

Quote:


> Originally Posted by *StriickeN*
> 
> You're not considered "one of those guys" to me, i'm open to opinion and tips as I am pretty much a noob with gpu's lol. I'll keep that in mind, thanks!


Good! It's nice that you're taking it positively. I love my 280x's.


----------



## BruceB

Quote:


> Originally Posted by *Devildog83*
> 
> True: it's silicon and what gets stuck in it.
> 
> I had a 7870 Devil and a 270X Devil. They're supposed to be about the same, but the 7870 would do 1275+ core with 1450 mem, and as mentioned earlier the 270X was best at a high of 1235/1590. I ran them every day at 1200/1400; *finally the 7870 gave out after many months of hard use*.


Could you tell me (roughly) how long your 7870 held out with an OC? I'm starting to worry my 280X might give up the ghost earlier than it should because of the OC.


----------



## Mr-Dark

Quote:


> Originally Posted by *BruceB*
> 
> Could you tell me (roughly) how long your 7870 held out with an OC? I'm starting to worry my 280X might give up the ghost earlier than it should because of the OC.


My MSI 280X has in the past given me artifacts after 10 days of OC every time; then I go back to stock.


----------



## Agent Smith1984

Quote:


> Originally Posted by *BruceB*
> 
> Could you tell me (roughly) how long your 7870 held out with an OC? I'm starting to worry my 280X might give up the ghost earlier than it should because of the OC.


Brother's 7950 ran at 1200/1825 with 50% power limit for several months no problem.

My 280x ran at 1.3v 1250/1800 with 20% power limit for over 4 months no problem.

As long as temps are good, the hardware should hold up for a few years minimum....


----------



## tabascosauz

Quote:


> Originally Posted by *Recr3ational*
> 
> I don't want to be one of those guys, but I bought 3 ex-mining cards, 2 x 7950 and a 280X, and all three have had no problems. In my experience, miners tend to look after their cards... The only problem is that they tend to run the fans at 100% 24/7.


In that sense, of course. No one wants their cards to melt prematurely, so they run them at 100% fan speed.

Now that's a huge problem in itself, because only about 1 or 2 AIB partners actually implement ball-bearing fans; the rest are all sleeve bearings. I wouldn't want to buy a card that has been mining at 100% fan speed for a year and then try to make the fans last 2 more years.

They start making funny noises and doing funny things, and then they go.


----------



## JackCY

Quote:


> Originally Posted by *tabascosauz*
> 
> In that sense, of course. No one wants their cards to melt prematurely, so they run them at 100% fan speed.
> 
> Now that's a huge problem in itself, because only about 1 or 2 AIB partners actually implement ball-bearing fans; the rest are all sleeve bearings. I wouldn't want to buy a card that has been mining at 100% fan speed for a year and then try to make the fans last 2 more years.
> 
> They start making funny noises and doing funny things, and then they go.


And you collect your nice RMA before the 3 years of warranty are up, and get a card from a completely new generation, or at worst the same one if it's still sold









Certainly easier to explain that the fan died than to explain why the card artifacts at stock clocks


----------



## bigaza2151

There was a recent "build a cheap PC" type video that Tek Syndicate did that has me kind of confused.

Basically Logan recommends an R9 290 Toxic edition for the build and then adds, "for the people saying AMD runs too hot and loud, who the hell runs a stock cooler anyhow!?"

Now what has me confused is: if I were to go buy an R9 290, is he implying that even the non-reference high-end coolers are not good enough? The issue for me in Australia has always been high temps in summer, and when I hear things like this it really discourages me from buying a new AMD card, seeing as I don't really want to tinker with my cards to put them under water cooling. I've built 3 rigs now, and about the only thing I haven't messed with is card cooling; it just seems a little tricky once you get into pulling apart your card.


----------



## Canon007

I returned to BF3 and I see my dual GPUs never surpassing 50 percent usage. Is it time to upgrade my CPU? If so, which one should I choose?


----------



## diggiddi

I don't know what speed you have it at, but first I'd try overclocking to 4.0-4.2 ghz if your cooling is sufficient


----------



## Alfshizzle

I've been playing with the Omega driver on 2x Asus 280X TOP cards; performance seems great, but I was wondering which drivers others have been using / have found to give the best performance?

I've seen 14.7 suggested in this thread, and 14.9 also.


----------



## JackCY

Quote:


> Originally Posted by *bigaza2151*
> 
> There was a recent "build a cheap PC" type video that Tek Syndicate did that has me kind of confused.
> 
> Basically Logan recommends an R9 290 Toxic edition for the build and then adds, "for the people saying AMD runs too hot and loud, who the hell runs a stock cooler anyhow!?"
> 
> Now what has me confused is: if I were to go buy an R9 290, is he implying that even the non-reference high-end coolers are not good enough? The issue for me in Australia has always been high temps in summer, and when I hear things like this it really discourages me from buying a new AMD card, seeing as I don't really want to tinker with my cards to put them under water cooling. I've built 3 rigs now, and about the only thing I haven't messed with is card cooling; it just seems a little tricky once you get into pulling apart your card.


Reference blowers are awful; it doesn't matter if it's AMD or NV.
Lots of NV cards run hotter than AMD; it's all about what fan profiles the cards ship with by default, or what you choose.

Some non-reference coolers are fine and some are not so great, but they are often better than the reference blowers.

I think they meant "reference", not "stock".

A second-hand 290 (4GB) is a good buy. A new one? Probably not, since it's in the price range of the better 970 (3.5+0.5GB).


----------



## Devildog83

Quote:


> Originally Posted by *BruceB*
> 
> Could you tell me (roughly) how long your 7870 held out with an OC? I'm starting to worry my 280X might give up the ghost earlier than it should because of the OC.


I had it for more than a year; I did a lot of benches with it, so it ran at 100% a lot.

I really tortured it and then X-Fire and tortured it some more.


----------



## bigaza2151

yeah i know reference models generally suck

here is the video in question, at about the 5:20 mark

http://youtu.be/SsiHegrG3gs?t=5m22s


----------



## The Pook

Got my R9 280 a month or so ago and they already dropped $60... I'm buying an i5 or an i7 (can't decide) before GTA V comes out, and I might grab a second R9 280 if it drops another ~5-10%. It's sitting at $159.99 with MIR at the moment.


----------



## tabascosauz

Quote:


> Originally Posted by *JackCY*
> 
> Reference blowers are awful; it doesn't matter if it's AMD or NV.
> Lots of NV cards run hotter than AMD; it's all about what fan profiles the cards ship with by default, or what you choose.
> 
> Some non-reference coolers are fine and some are not so great, but they are often better than the reference blowers.
> 
> I think they meant "reference", not "stock".
> 
> A second-hand 290 (4GB) is a good buy. A new one? Probably not, since it's in the price range of the better 970 (3.5+0.5GB).


For Pete's sake, don't get started on Nvidia's reference coolers. Everything about NVTTM aside from its appearance is garbage. Enhanced vapor chamber my ass. The only reason any of the GTX 770/780/780 Ti/970/980/Titan (Black) stock coolers manage to stay at 80/82°C is because Boost caps it there. It's basically the same mechanism as AMD's throttling, except AMD's occurs at 94°C and is much more aggressive. And that's all assuming that 82°C is acceptable in the first place. It's only now considered "acceptable" because Nvidia wants us to believe that it is, and everyone trusts Nvidia of course. Basically the same technology as the R9 290 reference coolers, just on top of a GPU with more palatable thermal characteristics.

And the lower end GTX 670/760 coolers are just deplorable abominations that shouldn't exist in the first place. I'm honestly pretty glad that AMD didn't push out reference blower R9 270 cards, because they'd probably be just as cheaply built. Just a stupid slab of metal. Nothing else.

I guess you could argue that those cards are the ones built for FC waterblocks, but most people buy them without that intention.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Alfshizzle*
> 
> Been playing with the omega driver on 2x 280x top Asus cards, peformance seems great but I was wondering what drivers others have been using/ have been found to give the best performance?
> 
> I've seen 14.7 be suggested in this thread, 14.9 also.


14.7 ran best on my Asus 280X, and gave the best stability too. Not sure if you have the 1600MHz 1.6V memory version or the 1500MHz 1.5V revision, but mine was a 1600 model and it was super finicky with drivers. The VRAM would artifact on all other drivers I tried. I also found that I actually had to clock it up to 1800 to use that extra voltage. Very strange card.... It was fast as hell for a Tahiti though. The core did 1250 on 1.3V.


----------



## Alfshizzle

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 14.7 ran best on my Asus 280X, and gave the best stability too. Not sure if you have the 1600MHz 1.6V memory version or the 1500MHz 1.5V revision, but mine was a 1600 model and it was super finicky with drivers. The VRAM would artifact on all other drivers I tried. I also found that I actually had to clock it up to 1800 to use that extra voltage. Very strange card.... It was fast as hell for a Tahiti though. The core did 1250 on 1.3V.


Mine are the 1600mhz versions, never had any artifacting problems with them, I seem to be one of the lucky ones as many have had issues.

I'll try the 14.7 betas and see if they improve performance over the omega drivers.

Mine will also do 1250 on 1.3v +20%, but the vram will only do 1700mhz.


----------



## Devildog83

Big scare here yesterday. I thought my GPU had gone bad. The system was unstable for a while and I kept trying everything I could; you know the drill. It turns out the POS Krait Edition Z97, the beautiful black and white mobo I bought, is crap after all. I tested the RAM slots, and #2 and #4 are goners: constant freezes, BSODs and/or restarts. Tried slot 1 and it was good; slots 1 and 3 are good to go. The board recommends slots 2 and 4 when using only 2 sticks, but that's where I am stuck. I want to RMA the thing but don't want to be without a PC if I don't have to. At least the R9 290 is still running perfectly.


----------



## frankenstein406

I just picked up a Sapphire Dual-X 280X, and it seems like the frames keep dropping even if I lower the game settings. Would anyone have any ideas on what could cause this before I make a thread, please?


----------



## MiladEd

I have the exact same model, and I don't have any specific problems with it. What games are you having this problem with?


----------



## frankenstein406

Quote:


> Originally Posted by *MiladEd*
> 
> I have the exact same model, and I don't have any specific problems with it. What games are you having this problem with?


So far CoD 2 multiplayer; it seems to get high frame rates in San Andreas though.


----------



## tabascosauz

Quote:


> Originally Posted by *Devildog83*
> 
> Big scare here yesterday. I thought my GPU had gone bad. The system was unstable for a while and I kept trying everything I could; you know the drill. It turns out the POS Krait Edition Z97, the beautiful black and white mobo I bought, is crap after all. I tested the RAM slots, and #2 and #4 are goners: constant freezes, BSODs and/or restarts. Tried slot 1 and it was good; slots 1 and 3 are good to go. The board recommends slots 2 and 4 when using only 2 sticks, but that's where I am stuck. I want to RMA the thing but don't want to be without a PC if I don't have to. At least the R9 290 is still running perfectly.


Sorry to hear that. I had my suspicions about the Z97 Krait SLI. White color scheme or not, matte PCB coating or not, it's still based on the cheap and flimsy Z97 SLI. The Sabertooth Mark S and Z97-AR look like much better boards.

I'd tell you to buy a H81 or B85 board, but it seems that aesthetics are a priority for you so that won't work :/


----------



## Agent Smith1984

Quote:


> Originally Posted by *frankenstein406*
> 
> I just picked up a Sapphire Dual-X 280X, and it seems like the frames keep dropping even if I lower the game settings. Would anyone have any ideas on what could cause this before I make a thread, please?


Have you checked your CPU usage?
If the CPU is topping out during any part of the game, you are going to get skips.....
An i5-750 is going to pose a bottleneck in some cases, I would think....


----------



## Devildog83

Quote:


> Originally Posted by *tabascosauz*
> 
> Sorry to hear that. I had my suspicions about the Z97 Krait SLI. White color scheme or not, matte PCB coating or not, it's still based on the cheap and flimsy Z97 SLI. The Sabertooth Mark S and Z97-AR look like much better boards.
> 
> I'd tell you to buy a H81 or B85 board, but it seems that aesthetics are a priority for you so that won't work :/


The Z97-AR doesn't look bad, but I would have to get a refund from Newegg and then purchase the AR from Tiger Direct or something, because Newegg is out of them. The heatsink would be easy to paint white with a ceramic-based paint. It also has 8 phases as opposed to 6, so a 4690K/4790K shouldn't have any overclocking issues.


----------



## crazyxelite

Hi everyone, can I flash an R9 280 or 280X BIOS onto my MSI 7950?


----------



## frankenstein406

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Have you checked your CPU usage?
> If the CPU is topping out during any part of the game, you are going to get skips.....
> An i5-750 is going to pose a bottleneck in some cases, I would think....


No. What would be a good program for that? I have it overclocked to 3.6, but it is an older CPU. The thing I can't figure out is that even going from the main screen to the server page it drops from 125 fps to 111 fps. It's like it will hold steady at 125 and then all of a sudden dip down to 34 fps.


----------



## tabascosauz

Quote:


> Originally Posted by *frankenstein406*
> 
> No. What would be a good program for that? I have it overclocked to 3.6, but it is an older CPU. The thing I can't figure out is that even going from the main screen to the server page it drops from 125 fps to 111 fps. It's like it will hold steady at 125 and then all of a sudden dip down to 34 fps.


Lol if you're already at 3.6GHz there isn't much more you can do. And that framerate dip, if I understand correctly, isn't in game but kinda like on the map loading screen in BF3 where the FPS count is there and displays <30fps even though there's actually nothing going on except the word LOADING... and a little animation on the dots. So it doesn't even matter lol

Unless you're willing to push it to 4GHz, I think there shouldn't be any problems with what you have now. At stock (3.2GHz), the i5-650 still manages to best newer Steamroller CPUs like the 7850K's pseudo-core in single-threaded Cinebench. I'd imagine Lynnfield's core performs similarly to Clarkdale's, so at 3.6GHz you're already at a healthy performance level.


----------



## tabascosauz

Quote:


> Originally Posted by *Devildog83*
> 
> The Z97-AR doesn't look bad, but I would have to get a refund from Newegg and then purchase the AR from Tiger Direct or something, because Newegg is out of them. The heatsink would be easy to paint white with a ceramic-based paint. It also has 8 phases as opposed to 6, so a 4690K/4790K shouldn't have any overclocking issues.


Yeah, the Z97-A has a pretty standard VRM with the IR3548 and pretty good ratings across caps and mosfets, and I'd imagine the Z97-AR is identical in that regard, so no problems there. God knows what kind of horrible VRMs MSI has devised to put on their low-end Z97 boards; back in the 7-series days the Z77A-G45 and the like were some truly deplorable boards. I'm guessing MSI hasn't really changed its approach to budget boards since then.

Anyways, best of luck to you, because I know Newegg has some pretty dubious service when it comes to boards. But I'm pretty sure it'll work out well, as long as no pins are bent







you know how Newegg handles THOSE boards.


----------



## frankenstein406

Quote:


> Originally Posted by *tabascosauz*
> 
> Lol if you're already at 3.6GHz there isn't much more you can do. And that framerate dip, if I understand correctly, isn't in game but kinda like on the map loading screen in BF3 where the FPS count is there and displays <30fps even though there's actually nothing going on except the word LOADING... and a little animation on the dots. So it doesn't even matter lol
> 
> Unless you're willing to push it to 4GHz, I think there shouldn't be any problems with what you have now. At stock (3.2GHz), the i5-650 still manages to best newer Steamroller CPUs like the 7850K's pseudo-core in single-threaded Cinebench. I'd imagine Lynnfield's core performs similarly to Clarkdale's, so at 3.6GHz you're already at a healthy performance level.


No, it is doing it in game. It will hold at 125 and then all of a sudden drop down to 34 or so. CoD 2 seems to be the only one having this issue; it holds 60 fps on ultra in Skyrim.


----------



## dzigg

Quote:


> Originally Posted by *frankenstein406*
> 
> No. What would be a good program for that? I have it overclocked to 3.6, but it is an older CPU. The thing I can't figure out is that even going from the main screen to the server page it drops from 125 fps to 111 fps. It's like it will hold steady at 125 and then all of a sudden dip down to 34 fps.


I have the same CPU, currently with a GTX 285 (the old Nvidia one, heh), and I'm experiencing similar problems in Bioshock Infinite.

Have you tried MSI Afterburner? Mine indicates that somehow the GPU utilization drops from 100% to 30%, and that's what causes the frame drops. In my case, I suspect it's the GPU throttling because of high temps (mine goes to 92 degrees sometimes!). Maybe you should check if that's also the case for you.

I'm hoping it's not the CPU, because I'm not planning on upgrading anytime soon.
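The diagnostic described above boils down to a simple rule of thumb: a frame-rate dip during which GPU utilization also collapses points at the CPU or at thermal throttling, not at the GPU being too slow for the scene. A minimal sketch of that rule (the log samples and thresholds are made up for illustration; real numbers would come from an Afterburner-style monitoring log):

```python
def suspect_samples(samples, fps_floor=60, util_floor=90):
    """Return indices of log samples where the frame rate dipped
    while GPU utilization also collapsed -- the pattern that suggests
    a CPU limit or thermal throttling rather than GPU load."""
    return [i for i, (fps, gpu_util) in enumerate(samples)
            if fps < fps_floor and gpu_util < util_floor]

# Hypothetical log of (fps, gpu_util_percent): steady 125 fps,
# then one dip where utilization falls from ~99% to 31%
log = [(125, 99), (124, 98), (34, 31), (120, 97)]
print(suspect_samples(log))  # -> [2]
```

If the dips instead happen while GPU utilization stays pinned near 100%, the card itself is the limiter and lowering settings should help.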


----------



## BookerCZ

Hi guys!
I had a 7870 XT, which burned out. I bought an R9 280X Vapor-X. What are the maximum clocks for this card?


----------



## MiladEd

Quote:


> Originally Posted by *BookerCZ*
> 
> Hi guys!
> I had a 7870 XT, which burned out. I bought an R9 280X Vapor-X. What are the maximum clocks for this card?


Feel free to go as high as you can while staying around 80°C. I've OCed mine (not a Vapor-X, just a Dual-X) to 1060 MHz and I'm getting about 70°C under high load. You should get worried if your temps reach 85°C or so.


----------



## tabascosauz

Quote:


> Originally Posted by *BookerCZ*
> 
> Hi guys!
> I had a 7870 XT, which burned out. I bought an R9 280X Vapor-X. What are the maximum clocks for this card?


I run mine at 1150 core/ 1575 mem/ 1.25V for very demanding games. The fan profile for those clocks is 45% continuous.

As it can get quite loud, I tend to stick to the stock 1070, which I can run for hours at around 38% and not have it break 74°C in most games.

I've tried 1200 core/1575 mem/1.3V for benchmarking (albeit at 50% fan speed which is WAY too loud), but I consider it beyond the point of diminishing returns. It produces massive amounts of fan noise in return for a handful of FPS.


----------



## frankenstein406

Quote:


> Originally Posted by *dzigg*
> 
> I have the same CPU, currently using gtx 285 (the old nvidia ones, heh), and experiencing similiar problems on Bioshock infinite.
> 
> Have you tried MSI Afterburner? Mine indicates that somehow the GPU utilization drops from 100% to 30%, that's what makes the framedrops. In my case, I suspect it's the GPU throttling because of high temps (mine goes to 92 degrees sometimes!). Maybe you should check if that's also the case with you.
> 
> Hoping it's not the CPU, cause I'm not planning upgrading anytime soon.


Haven't tried Afterburner, but Hardware Monitor is showing a max of 62°C for the GPU and a max of 67°C for the CPU.


----------



## Derek129

Using Asus GPU Tweak with my ASUS R9 270 OC card, I just achieved a very stable overclock of 1155 core and 1650 memory, up from stock clocks of 975 core and 1400 mem. Temperature does not break past 75°C while stress testing with FurMark for 15 minutes. This is also my first ever GPU overclock. Very stoked with the outcome!


----------



## neurotix

Quote:


> Originally Posted by *Derek129*
> 
> Using Asus GPU Tweak with my ASUS R9 270 OC card, I just achieved a very stable overclock of 1155 core and 1650 memory, up from stock clocks of 975 core and 1400 mem. Temperature does not break past 75°C while stress testing with FurMark for 15 minutes. This is also my first ever GPU overclock. Very stoked with the outcome!


1650 RAM clock on Pitcairn is an extremely rare and excellent overclock. You are very lucky.

My 270X won't do anything over 1500 without black screening.

My R7 265 (7850, same GPU) will also not pass 1500 RAM.

I think most people here can't get much higher than 1500 without severe problems.


----------



## Derek129

I am very pleased to hear that then! I followed this guide and it was smooth sailing from there.
http://rog.asus.com/96782012/gaming-graphics-cards-2/a-simple-guide-to-overclocking-your-graphics-card-with-gpu-tweak/


----------



## radier

GDDR5 memory has an error-correction mechanism: you can keep raising the clock, but past the point of stability performance will actually decrease rather than improve. Unigine Heaven is a good benchmark for finding the real maximum clock.

Sent from my GT-N7000
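Because of that error-correction behavior, the "real" max memory clock is the one where benchmark throughput peaks, not the highest clock that runs without crashing. A minimal sketch of that idea (the function and the Heaven numbers below are hypothetical, just to illustrate the method):

```python
def best_memory_clock(fps_by_clock):
    """Return the memory clock (MHz) with the highest benchmark FPS.

    GDDR5's error detection/retry means an unstable memory overclock often
    shows up as lower FPS rather than a crash, so the real maximum is the
    clock where measured FPS peaks, not the highest clock that runs.
    """
    return max(fps_by_clock, key=fps_by_clock.get)

# Hypothetical Unigine Heaven averages for a Pitcairn card: FPS climbs with
# memory clock until error retries start eating bandwidth, then falls off.
heaven_runs = {1400: 61.2, 1450: 62.0, 1500: 62.6, 1550: 61.1, 1600: 58.4}
print(best_memory_clock(heaven_runs))  # -> 1500
```

In practice you'd run the benchmark at each clock step yourself and note the average FPS; the point is to back off to the clock where the numbers stop improving.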


----------



## oshit

Hi guys, I've got a few questions on the R9 270X, particularly the Asus DirectCU II TOP.

A) Is the max wattage and amperage used by this card equal to the two PCIe 6-pin connectors plus the PCIe slot,

= ((2 x 80) + 75) watts
= 235 watts?

and

235 watts / 12 V = 19.58 A total?

If my PSU has three 12 V rails with 18 A each, do I divide the 19.58 A by 3? (Let's assume the three PCIe connectors/slot draw the same amps and wattage; I can recalculate later.)

Are the two formulas above correct?

B) For the motherboard *alone*, how do you calculate the wattage and amperage used at its peak (non-overclocked)?

C) For the CPU, how do you get the wattage and amperage used at its peak (non-overclocked)? Is that in the CPU specs?

Thanks.
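For what it's worth, the PCI Express spec rates a 6-pin connector at 75 W (not 80 W) and the x16 slot at 75 W, so the connector-limited ceiling for a 2x 6-pin card comes out a little lower than the figures above. A rough sketch of the arithmetic (this is a worst-case ceiling, not the card's actual draw, which tracks its rated TDP):

```python
# Worst-case board power from connector ratings, then amps on the 12 V side.
# Per the PCI Express spec: 75 W per 6-pin plug and 75 W through the x16
# slot. This is a ceiling, not what the card actually pulls in games.
PCIE_6PIN_W = 75
PCIE_SLOT_W = 75

def max_board_power(six_pin_count):
    """Connector-limited power ceiling in watts."""
    return six_pin_count * PCIE_6PIN_W + PCIE_SLOT_W

def amps_at_12v(watts):
    """Current drawn from the 12 V rail(s) at a given wattage."""
    return watts / 12.0

watts = max_board_power(2)  # a 2x 6-pin card like the 270X DirectCU II TOP
print(watts, round(amps_at_12v(watts), 2))  # 225 18.75
```

Whether that current splits evenly across three rails depends on which rail physically feeds each connector, so check the PSU's label or manual rather than simply dividing by the rail count.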


----------



## JackCY

What PSU is that? You need to know which rails feed which connectors. Some PSUs are marketed as multi-rail but are actually built as single-rail.


----------



## frankenstein406

delete


----------



## MiladEd

Hey, can I update my stats? Clock to 1060 MHz and memory clock to 1600 MHz. Thanks!


----------



## afaque

Hello guys!
I'm new to this forum, and I'm in real trouble right now. I have a Gigabyte R9 280X rev 2.0, and I get artifacts in my games after 15 or 20 minutes of gaming; sometimes the games crash to desktop.
I've tried reinstalling Windows, older drivers, and so on, but with no luck.
I've read on many forums that the memory voltage may be set by Gigabyte to something that isn't good for the GPU, so I want to know how I can edit the BIOS, if possible, and solve this issue.
The card has Hynix memory, with 1.2 V max adjustable at 1100/1500. If anyone can help me solve this, please do!


----------



## Catscratch

Quote:


> Originally Posted by *afaque*
> 
> Hello guys!
> I'm new to this forum, and I'm in real trouble right now. I have a Gigabyte R9 280X rev 2.0, and I get artifacts in my games after 15 or 20 minutes of gaming; sometimes the games crash to desktop.
> I've tried reinstalling Windows, older drivers, and so on, but with no luck.
> I've read on many forums that the memory voltage may be set by Gigabyte to something that isn't good for the GPU, so I want to know how I can edit the BIOS, if possible, and solve this issue.
> The card has Hynix memory, with 1.2 V max adjustable at 1100/1500. If anyone can help me solve this, please do!


You can't change the memory voltage. What you can try is reducing the memory speed to, say, 1490 and testing your games. You should also give us the details of your system: power supply, CPU, motherboard, etc.


----------



## afaque

Quote:


> Originally Posted by *Catscratch*
> 
> You can't change the memory voltage. What you can try is reducing the memory speed to, say, 1490 and testing your games. You should also give us the details of your system: power supply, CPU, motherboard, etc.


OK, so my PC is as follows:
i3-2120
6 GB DDR3 RAM
Gigabyte R9 280X rev 2.0
PC Power & Cooling 610 W SLI/CrossFire-certified PSU
Gigabyte GA-B75M-D3H
and a CM USP 100 case, modded by me.
I've also tried the card in my friend's PC and the issue remains the same: artifacts in games.


----------



## afaque

Quote:


> Originally Posted by *Catscratch*
> 
> You can't change the memory voltage. What you can try is reducing the memory speed to, say, 1490 and testing your games. You should also give us the details of your system: power supply, CPU, motherboard, etc.


Do you think my PSU is enough for this GPU, and could it be the cause of the artifacts? It's 4 years old, an OCZ Silencer EPS 610 W, and I bought the GPU just two months back.
Thanks in advance.


----------



## Wicked_Bass

Quote:


> Originally Posted by *afaque*
> 
> do u think that my psu is enough for this gpu
> can it be the cause of artifacts and all
> cause my psu is 4 years old and i bought the gpu just two months back!
> my psu is an ocz silencer eps 610w!
> thanks in advance


Your psu should be enough for one card.


----------



## JackCY

Quote:


> Originally Posted by *afaque*
> 
> hello guys!
> im new to this forum!
> and i m in great problem right now! i have got a r9 280x gigabyte rev 2
> and i have artifacts in my games after 15 or 20 mins of gaming.. sometimes the games crash to desktop
> i have tried reinstalling windows, trying old drivers and stuff but with no luck..
> i read many forums online that maybe the memory voltage is set to something by gigabyte thats not gud for the gpu to run!
> so i wanna noe that how can i edit bios if possible and solve this issue!
> 
> 
> 
> 
> 
> 
> 
> 
> im in great trouble right now! plzz anyone help!
> ive got hynix memory and 1.2v max adjustable at 1100/1500
> and i wanna noe how to solve this issue plzz if anyone could help me


1100/1500 MHz is stock, and 1.2 V on the core should be stock too; at least, that's how AMD specs it. If it can't run that, RMA it.
If it's not the card but your PC, then it's probably a low-quality PSU or some other instability unrelated to the GPU.

ASUS runs some 280Xs with higher memory voltage, I think 1.6 V on mine, so it can do 7.2 GHz effective instead of the stock 6.4 GHz. Most Nvidia cards run 7 GHz; they probably pay a little extra for better memory chips and run them at higher voltage than AMD does.


----------



## afaque

Quote:


> Originally Posted by *JackCY*
> 
> 1100/1500MHz is stock, 1.2V on core should be stock at least that's what AMD manufactures it as. If it can't run that, RMA it.
> If it's not the card but your PC, then probably a crappy low quality PSU or some other instability not connected to GPU.
> 
> ASUS runs some 280x with higher memory voltage, I think 1.6V on mine, so it can do 7.2GHz instead of stock 6.4GHz. Most Nvidia cards run 7GHz as they probably pay the little extra for better memory chips and volt them higher than AMD.


What if I can't RMA it?

I think there must be some way to solve this issue.


----------



## JackCY

Quote:


> Originally Posted by *afaque*
> 
> what if i cant rma
> 
> 
> 
> 
> 
> 
> 
> 
> i think there must be a way to solve this issue anywhere


Does it run with the core underclocked, say to only 1000 MHz?
What about only 1400 MHz on the VRAM?

If it still crashes even with significantly lower clocks, then it's probably not the GPU at all. You can always ask someone to test the GPU in their system to be sure.
VBIOS editing on the 280X? As far as I know, the last working tools were for the previous generation; the 7970 and 280X, with their legacy and UEFI VBIOSes, don't work very well with them, and you can brick the card unless it has a dual VBIOS to recover from. I also don't think the PowerTune section was ever adjustable, so it was a sort of hack to force boost voltages and clocks on selected cards via the VBIOS. You can do the same thing the soft way, on the fly, with Afterburner, GPU Tweak, etc. I think VBIOS editing was mainly a thing for Linux miners.


----------



## afaque

Quote:


> Originally Posted by *JackCY*
> 
> Does it run with underclocked core? To say 1000MHz only?
> What about only 1400MHz VRAM?
> 
> If it still crashes even with significantly lower clocks, then it's probably not even due to the GPU. You can always ask someone to test the GPU in their system to be sure.
> VBIOS on 280x? As far as I know the last ones working were for the previous generation, 7970 and 280x with their legacy and UEFI VBIOS do not work very well, plus you can brick the card too unless it has dual VBIOS so you could recover. Plus I think the powertune part was never able to be adjusted. So sort of a hack to input/force boost voltages and clocks on some selected cards via VBIOS. You can do that with AB, GPUTweak, etc. the soft way via software on the fly. I think it was mainly a thing for Linux miners to change the VBIOSes.


I have dual BIOS on the card. Actually, I'm thinking all the artifacts I'm getting may be due to the memory or VRMs heating up, so by reducing the voltage a bit I might get rid of them, since temperature drops as voltage drops. What I want to know is this: from the factory my card is an OC version running 1100/1500 at 1.2 V. How much can I decrease the voltage at those clocks? I'll only edit my own VBIOS in VBE7 so the card can still run, and I'll also modify the fan curve to keep the card as cool as possible.
I need guidance on which voltage to start from in order to keep those speeds.


----------



## afaque

Quote:


> Originally Posted by *JackCY*
> 
> Does it run with underclocked core? To say 1000MHz only?
> What about only 1400MHz VRAM?
> 
> If it still crashes even with significantly lower clocks, then it's probably not even due to the GPU. You can always ask someone to test the GPU in their system to be sure.
> VBIOS on 280x? As far as I know the last ones working were for the previous generation, 7970 and 280x with their legacy and UEFI VBIOS do not work very well, plus you can brick the card too unless it has dual VBIOS so you could recover. Plus I think the powertune part was never able to be adjusted. So sort of a hack to input/force boost voltages and clocks on some selected cards via VBIOS. You can do that with AB, GPUTweak, etc. the soft way via software on the fly. I think it was mainly a thing for Linux miners to change the VBIOSes.


And what do you think is the advantage of a UEFI BIOS over a non-UEFI one? I mean, what's the difference? I don't know anything about UEFI; can you clarify?


----------



## tomytom99

Quote:


> Originally Posted by *afaque*
> 
> And what do you think is the advantage of a UEFI BIOS over a non-UEFI one? I mean, what's the difference? I don't know anything about UEFI; can you clarify?


There are only subtle differences, which you'll notice if you're running Windows 8 or if your motherboard supports it. It's essentially just an updated BIOS interface.


----------



## KingCry

Guess I'm a little late to the party? R9 280 Gaming from MSI


----------



## daffy.duck

Quote:


> Originally Posted by *KingCry*
> 
> Guess I'm a little late to the party? R9 280 Gaming from MSI


Holy crap! Crazy overclock!


----------



## KingCry

Quote:


> Originally Posted by *daffy.duck*
> 
> Holy crap! Crazy overclock!


Thank you!

Yeah, that was for benching; for gaming it's 1250 MHz core / 1850 memory, and it's stable as all hell too.


----------



## daffy.duck

^^^ Sounds like a hell of an overclocker


----------



## KingCry

Quote:


> Originally Posted by *daffy.duck*
> 
> ^^^ Sounds like a hell of an overclocker


Yeah, I feel like I could get more if I could get voltage control back. I lost it, so it's locked at 1.25 V.


----------



## Devildog83

Quote:


> Originally Posted by *MiladEd*
> 
> Hey, can I update my stats? Clock to 1060 MHz and memory clock to 1600 MHz. Thanks!


Done


----------



## MiladEd

Quote:


> Originally Posted by *Devildog83*
> 
> Done


Thanks a lot man.


----------



## KingCry

So the MSI Gaming cards have a hard-locked limit on the minimum fan speed.


----------



## miklkit

Hello all!

I have a question. The fans on my MSI R9 280X have started rattling when they rev up beyond a certain speed. I'm not ready to buy a new VC yet so, where can I get replacement fans? I have searched but came up empty.


----------



## KingCry

Quote:


> Originally Posted by *miklkit*
> 
> Hello all!
> 
> I have a question. The fans on my MSI R9 280X have started rattling when they rev up beyond a certain speed. I'm not ready to buy a new VC yet so, where can I get replacement fans? I have searched but came up empty.


You should be able to find them on ebay.


----------



## The Pook

Quote:


> Originally Posted by *KingCry*
> 
> Guess I'm a little late to the party? R9 280 Gaming from MSI


Nice OC, beats mine @ 1.3v







... I do 1230 MHz in benches and 1200 MHz in games; 1275 mV @ 1160 MHz









Haven't tried OCing memory yet, just started OCing this card today. Kind of a waste with a G3258 but I should have my i7 in a few weeks


----------



## daffy.duck

@ 1.3V, what kind of temps do you guys get?
@ 1.225V I max out @ 70°C but then again my room isn't the coolest either.


----------



## The Pook

62-66C during games.

Sapphire Dual-X here, 20C ambient. Kinda chilly


----------



## daffy.duck

I might do some testing @ 1.3V.
My room definitely ain't 20° C but as long as I stay below 76° C, I'll be happy.


----------



## oshit

Quote:


> Originally Posted by *JackCY*
> 
> What PSU is that? You gotta know what lines are on what connectors. Some PSUs are marked as multi rail but in fact are manufactured as single rail.


Corsair HX520


----------



## KingCry

Quote:


> Originally Posted by *The Pook*
> 
> Nice OC, beats mine @ 1.3v
> 
> 
> 
> 
> 
> 
> 
> ... I do 1230mhz in benches and 1200mhz in games, 1275mv @ 1160mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Haven't tried OCing memory yet, just started OCing this card today. Kind of a waste with a G3258 but I should have my i7 in a few weeks


Quote:


> Originally Posted by *daffy.duck*
> 
> @ 1.3V, what kind of temps do you guys get?
> @ 1.225V I max out @ 70°C but then again my room isn't the coolest either.


I can't get anything past 1.25 V to be happy with this 280. I flashed a custom BIOS I made yesterday, and it wasn't happy even trying to boot at 1.27 V. But of course I got super lucky: it's the highest-clocked R9 280 Gaming anywhere, even out of all the guys on HWBot. With a custom fan curve, temps hit about 68°C, and that's at 11% fan speed.


----------



## The Pook

Was gonna wait a few weeks but said screw it and ordered my 4690K. Should be here Saturday.

Might see higher in game temps now that I have a CPU to run it harder


----------



## jprovido

lemme join the club. my VTX3D R9 280x <3


----------



## daffy.duck

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> lemme join the club. my VTX3D R9 280x <3


Is that memory overclock at the default voltage? That's a great overclock for crappy Elpida.


----------



## jprovido

Quote:


> Originally Posted by *daffy.duck*
> 
> Is that memory overclock at the default voltage? That's a great overclock for crappy elpida


Really? That's a very conservative overclock (I'm at stock voltage in that GPU-Z screenshot). Mine can do 7 GHz effective (1750 MHz) without a problem.


----------



## Devildog83

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> lemme join the club. my VTX3D R9 280x <3


You have been added. Welcome to the club!!!


----------



## Devildog83

Quote:


> Originally Posted by *The Pook*
> 
> Nice OC, beats mine @ 1.3v
> 
> 
> 
> 
> 
> 
> 
> ... I do 1230mhz in benches and 1200mhz in games, 1275mv @ 1160mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Haven't tried OCing memory yet, just started OCing this card today. Kind of a waste with a G3258 but I should have my i7 in a few weeks


I have a G3258 too. It's been fun, but it just can't handle my 290, nor could it handle the 270X/7870 CrossFire. At 4.3 GHz at least I didn't freeze, black screen, or BSOD, but with this new motherboard I can't hit 4.3, so an i7 is in the cards for me too. I'm also considering an Asus board to go with it; I'm not very happy with MSI. The ASRock Pro4 was OK, but it's so narrow it looked funny in my case.


----------



## KingCry

Quote:


> Originally Posted by *Devildog83*
> 
> You have been added. Welcome to the club!!!


Would you mind adding me?
R9 280 Gaming


----------



## Devildog83

Quote:


> Originally Posted by *KingCry*
> 
> Would you mind adding me?
> R9 280 Gaming
> 
> 
> Spoiler: Warning: Spoiler!


No problem, Done!!! Welcome.


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> I have a G3258 too, it has been fun but just isn't able to handle my 290 nor was it able to handle the 270x/7870 X-Fire. At 4.3 Ghz at least I didn't freeze, black screen or BSOD but with this new motherboard I can't hit 4.3 so an i7 is in the cards for me too. I am also considering an Asus board to go with it. Not very happy with MSI. The Asrock pro4 was OK but it's so narrow it looked funny in my case.


Keep in mind, your i7 might not be able to hit more than 4.3 GHz either.

Haswell overclocks worse than Ivy and Sandy; some samples can't even do 4.1 GHz, no matter the voltage. Personally, my chip takes crazy voltage (1.4 V+) to do 4.5 GHz, and it's not fully stable in x264: it will crash eventually if I stress it. But for daily use and everything else, it's stable.

As long as you keep temps under 100°C you should be fine, even with high voltage. My OC has been like this for roughly the last year, and I haven't noticed any degradation.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Keep in mind, your i7 might not be able to hit more than 4.3ghz either.
> 
> Haswell overclocks poorer than Ivy and Sandy. Some samples can't even do 4.1ghz, no matter the voltage. Personally, my chip takes crazy voltage to do 4.5ghz (1.4v+) and its not fully stable in x264. It will crash eventually if I stress it. But for daily use and everything else, it's stable.
> 
> As long as you keep temps under 100C you should be fine, even with high voltage. My OC has been like this for the last year, roughly, and I haven't noticed any degradation.


Yeah, but a 4.3 GHz i7 is going to be far better than a dual-core Pentium at 4.3, or at least it freakin' better be. To be honest, the i5-4670K looks like a pretty nice deal now, since I can get one for less than $200. Like I said, the G3258 was fun to play with and very cheap, but it's time. Heck, I was at 4.3 GHz and never went above 55°C at 100% load, so I had some headroom there if I needed it with the Pro4. I'm sure I could get more out of this if I wanted, but I don't. Intel sucks for overclocking; AMD is way more fun.


----------



## KingCry

Quote:


> Originally Posted by *neurotix*
> 
> Keep in mind, your i7 might not be able to hit more than 4.3ghz either.
> 
> Haswell overclocks poorer than Ivy and Sandy. Some samples can't even do 4.1ghz, no matter the voltage. Personally, my chip takes crazy voltage to do 4.5ghz (1.4v+) and its not fully stable in x264. It will crash eventually if I stress it. But for daily use and everything else, it's stable.
> 
> As long as you keep temps under 100C you should be fine, even with high voltage. My OC has been like this for the last year, roughly, and I haven't noticed any degradation.


My 4770K takes 1.75 V to bench at 4.9 GHz, and it's borderline unstable.


----------



## oshit

Anyone have the Sapphire R9 280 OC 3GB Dual-X (the non-X version)?

How is it?
What are its power draw and amperage?

I'm trying to decide between it and the Asus R9 270X 4GB DirectCU II TOP.

Which should I choose? I'm replacing my current GPU, and I can carry it over to a new system once Skylake arrives.

FYI, I've got a Corsair HX-520 PSU, and my system is an old, non-overclocked E8400.


----------



## neurotix

Realistically, the 280 is only going to be maybe 5% better than a 270X.

You can overclock the 270X to match or exceed the 280.

The only thing the 280 has over the 270X is 3gb RAM on a faster bus, as opposed to 2gb RAM.

Core wise, though, the 280 isn't much faster.

Personally, I'd consider the 280X instead of either the 270X or 280.

A used 280X/7970 should be like $150 or less now, which is less than the price of a new 270X.


----------



## oshit

Quote:


> Originally Posted by *neurotix*
> 
> Realistically, the 280 is only going to be maybe 5% better than a 270X.
> 
> You can overclock the 270X to match or exceed the 280.
> 
> The only thing the 280 has over the 270X is 3gb RAM on a faster bus, as opposed to 2gb RAM.
> 
> Core wise, though, the 280 isn't much faster.
> 
> Personally, I'd consider the 280X instead of either the 270X or 280.
> 
> A used 280X/7970 should be like $150 or less now, which is less than the price of a new 270X.


Well, the 280X is outside my consideration due to my PSU's limits.
Also, I'll be using the old PSU for a future upgrade.

I don't buy used GPUs.
Thanks for the suggestion, though.

Anyway, I'm wondering:

Sapphire R9 280 3GB OC Dual-X
vs.
Asus R9 270X 4GB DirectCU II TOP

Which of them has the better hardware (memory, capacitors, inductors, VRM, etc.)?

I've read somewhere that the Sapphire cools better.

----------



## KingCry

Quote:


> Originally Posted by *o*****
> 
> Well 280X is outside my consideration due to PSU limit.
> Also i will be using the old PSU for future upgrade.
> 
> I dont buy used GPU.
> Thanks for the suggestion, though.
> 
> Anyway i am wondering :-
> 
> Sapphire R9 280 3GB OC Dual-X
> vs
> Asus R9 270X 4GB DirectCu 2 TOP
> 
> Which of them have better hardware. ( memory, capacitor, inductor, VRM etc )
> 
> I read some where that Sapphire cools better..


The Asus card has a custom VRM, MOSFETs, and voltage controller.


----------



## tabascosauz

Quote:


> Originally Posted by *KingCry*
> 
> The Asus card has a custom VRM, MOSFETs, and voltage controller.


None of which matters as long as

a. user isn't pushing for the highest possible clocks

b. card is not from an obscure AIB that doesn't know how to make VRMs, coolers and cards properly

AMD has a history of implementing well-constructed VRMs on its higher end Tahiti / Hawaii reference designs (note that it doesn't mean that the cards' VRMs run cool since phase counts are low). I wouldn't be so quick to suggest Asus over Sapphire on this; Sapphire doesn't skimp on VRM components, and the whole point behind Asus rebranding all its voltage controllers into oblivion is so you don't know exactly how good/atrocious the controller actually is.

No one actually knows if Asus' Super Alloy Chokes actually make a difference, or are even superior to Cooper Bussmanns on AMD reference designs (which are reputable outside of the video card market). I'm not sure if Asus is even telling the truth when they say that virtually all of their cards, budget to enthusiast, use the same Digital PWMs. Most Asus cards offer voltage control, but an old uPI PWM can offer voltage control as an IR3567 can. As I've said above, Asus rebrands the controllers so they are all ASxxxx without a trace of what the real model and manufacturer might be.

It's Nvidia that tends to take advantage of every opportunity to cut corners on the VRM. This includes cutting down as much as possible on phase count wherever possible, and that's why GTX 570s, 580s and 590s had burning/exploding MOSFETs.


----------



## neurotix

Quote:


> Originally Posted by *tabascosauz*
> 
> None of which matters as long as
> 
> a. user isn't pushing for the highest possible clocks
> 
> b. card is not from an obscure AIB that doesn't know how to make VRMs, coolers and cards properly
> 
> AMD has a history of implementing well-constructed VRMs on its higher end Tahiti / Hawaii reference designs (note that it doesn't mean that the cards' VRMs run cool since phase counts are low). I wouldn't be so quick to suggest Asus over Sapphire on this; Sapphire doesn't skimp on VRM components, and the whole point behind Asus rebranding all its voltage controllers into oblivion is so you don't know exactly how good/atrocious the controller actually is.
> 
> No one actually knows if Asus' Super Alloy Chokes actually make a difference, or are even superior to Cooper Bussmanns on AMD reference designs (which are reputable outside of the video card market). I'm not sure if Asus is even telling the truth when they say that virtually all of their cards, budget to enthusiast, use the same Digital PWMs. Most Asus cards offer voltage control, but an old uPI PWM can offer voltage control as an IR3567 can. As I've said above, Asus rebrands the controllers so they are all ASxxxx without a trace of what the real model and manufacturer might be.
> 
> It's Nvidia that tends to take advantage of every opportunity to cut corners on the VRM. This includes cutting down as much as possible on phase count wherever possible, and that's why GTX 570s, 580s and 590s had burning/exploding MOSFETs.


+1

I'm not even sure about all that stuff myself, it's a little too technical for me, but what I can say is that the Dual-X cooler is no slouch. (I have a R7 265 Dual-X and even with a huge overclock I don't remember ever seeing it pass 55C)

Let's also consider some other facts, I may not be a VRM guru but I'm pretty into overclocking, and this is the small stuff they DON'T tell you.

1. A lot of the time, ASUS custom components actually make the card WORSE, look up the guy who got so frustrated with his 7970 Matrix Platinum that he bent it in half, at the time it was a $600 card. A lot of the really high end ASUS cards are actually terrible overclockers. I have heard they generally have low ASIC. Look into reviews on Newegg and Amazon of user experiences with some of the Radeon cards from this and the last generation. No amount of custom components can salvage a poorly binned/crap core.

2. ASUS coolers aren't the best. Look into the reviews of the ASUS 290X DCUII: only 2 of 5 heatpipes touch the core. I'm not sure their lower spec designs are very good, either. The Dual-X isn't the best cooler, but it IS constantly improved between hardware generations afaik (instead of just slapping last generation's cooler, or a 780 cooler, on a card). I'd say, screw the Dual-X and pay a little more for a Vapor-X 270X if you are going to stick with air. (I have one, and it's amazing: 84% ASIC)

3. ASUS cards are frequently locked into using ASUS GPU Tweak (which flippin' sucks) for voltage control, fan control and so on. Again, I'm not sure because I've only heard about the 7970 Matrix Platinum having this problem, but apparently these cards can't be controlled via Afterburner or other utilities. Sapphire has the excellent Trixx OC utility, and you can of course use MSI Afterburner if you prefer that.

4. Sapphire does not voltage lock their cards. This is the *BIGGEST reason* I buy and recommend Sapphire (even with the bad warranty). I have had 4670s, a 6870, a 7770, three 7970s, 4 R9 290s, and a R7 265 (7850) from them. *NONE* of them have been voltage locked. Furthermore, in all my dealings on these forums for five years I don't think I've ever encountered anyone with a Sapphire card who had a voltage locked card. I'm fairly certain that ASUS is "overclocker friendly" and doesn't voltage lock their cards anymore, they want to appeal to enthusiasts and have the OC branding et cetera. However, this goes for any manufacturer really, I've heard XFX and HIS are bad about voltage locking some cards.

In short, I would recommend the Sapphire over the ASUS, but I would still recommend ASUS over XFX, Powercolor, HIS and so on. At this point, if I couldn't get Sapphire my next choice would probably be MSI (Lightning).

Hope this helps.


----------



## JackCY

The ASUS 290 uses an older cooler, from, I don't remember what anymore, the 780? Some chip that was larger. So not all the heatpipes touch the 290's chip directly.

Afterburner can be used with ASUS cards, but the way AB handles voltage control is silly (it has to be hard-locked and whatnot); GPU Tweak does it better and can also update the VBIOS. I don't know what's special about Trixx: it kept crashing on me, and its features didn't seem extensive, just regular clocks and core voltage. But then, what more do you really need?

If some cards are voltage-locked, well, as with VRMs and coolers, you have to do your research before buying.
It doesn't matter what brand: some brand/model combos have coil whine, a bad cooler, or whatever, and some don't. There is no golden manufacturer. Look at the 960s; half of them suck in terms of VRM temps and coil whine, by design.

Sapphire doesn't seem to make cards, at least not anymore, with anything but AMD/ATI chips, much like EVGA only does Nvidia cards.


----------



## tomytom99

Quote:


> Originally Posted by *neurotix*
> 
> +1
> 
> ~Snip~
> 
> Hope this helps.


Very much so agreed. My old ASUS 5450 had a better cooler design than some of their better cards. My 270X Toxic (Sapphire) is really nice. I do agree the warranty is a little sketchy, as I had a 270X where the fans went bad (wobbled and ran slow). My XFX 7870 was a nice card: it had a great cooler design and really nice warranty service too. I got the Double D 7870 Black Edition, which luckily wasn't voltage-locked, but it didn't OC much at all. Before my 7870 were my 6950s: a Sapphire one (the version with the rear-firing blower) and a Double D XFX, and that one was voltage-locked. If anything, Sapphire's support isn't that great, but its hardware is. XFX, meanwhile, is like a luxury restaurant: you have to go by what they give you, but it's a nice quality card either way (at least the 7870, with its metal cooler assembly).
Here's a picture of the ASUS 290 cooler:








Notice how the two outer heat pipes are almost useless. Also, I looked and it is the same as the cooler on their 780.


----------



## oshit

Thanks, guys.

Yeah, I agree about the Sapphire Dual-X cooling; I've got an HD 6770 myself, and it runs as cool as ever.

On Asus: if I remember correctly, I read somewhere that a certain model of theirs spikes its power draw through the PCIe slot above the 75 W limit, and quite frequently, not randomly, while the similar Gigabyte model also spikes but doesn't even reach the 75 W limit.

Guess I have to be careful with Asus products.

OK, back to the Sapphire R9 280 3GB OC Dual-X:

I'm trying to find a thermal image of the card, particularly of the VRM. I can only find teardown images of the card from ocholicnet or something; I couldn't find a single thermal image anywhere.
Anyone have links?

The reason I'm looking is to add cooling to the VRM or any other hotspot, if there is one.

Oh yeah, one more thing:

Does the Sapphire R9 280 3GB OC Dual-X work in my old G41M-Combo board with PCIe 1.1?
I'm replacing a Sapphire HD 6770 1GB. I won't be replacing my system until Skylake arrives.


----------



## tabascosauz

Quote:


> Originally Posted by *o*****
> 
> I am trying to look for the thermal image of the card, particularly on the VRM. I can only find dissecting images of the card (taking apart) from ocholicnet or something. But could not find a single thermal image and elsewhere.
> Any1 have link(s)?
> 
> The reason why i am looking for it, is to provide cooling on the VRM or other hotspot, if any.
> 
> Oh yah, wanted to add.
> 
> Does the Sapphire R9 280 3GB OC Dual-X works in my old board G41M-combo with PCIe 1.1?
> I am replacing the Sapphire HD6770 1GB. Until skylake arrive i wont be replacing my system.


Not quite sure. I think almost all cards are backwards compatible all the way to PCIe 1.1, but you should check. And seriously, maybe a G3220/3258 + H81 board could serve as an inexpensive stopgap upgrade until Skylake? It looks like Skylake is going to be further delayed, so you might be looking at a whole year until its release.

I am not in any way accusing Asus of lying to its customers or neutering its cards' VRMs; I'm just expressing some skepticism over whether their "technologies" are actually as good as they claim. Much of it also applies to their motherboards: the X99-Deluxe, Z97-WS, and some others are true high-end boards integrating IR3550Ms, 60 A chokes, and the latest IR PWMs (I think), but even much of their ROG lineup is based around the same "Digi+ ASxxxx" rebranded PWMs that cast a shadow over otherwise excellent boards.

Sapphire doesn't have any noteworthy technologies like Asus and MSI do, but they tend to be solidly built cards (aside from their crappy fans).

The 280 Dual-X should be identical to the 280X Dual-X. They are both baseline cards in their respective categories and should only differ by the GPU (Tahiti Pro vs Tahiti XT). There are probably images out there of the 280X Dual-X PCB; I wouldn't expect any more than something that looks like the Tahiti reference design.


----------



## oshit

Quote:


> Originally Posted by *tabascosauz*
> 
> Not quite sure. I think almost all cards are backwards compatible all the way to PCIe 1.1, but you should check. And seriously, maybe a G3220/3258 + H81 board could serve as an inexpensive stopgap upgrade until Skylake? It looks like Skylake is going to be further delayed, so you might be looking at a whole year until its release.
> 
> I am not in any way accusing Asus of lying to its customers or neutering its cards VRMs; I'm just expressing some skepticism over whether their "technologies" are actually as good as they claim. Much of it also applies to their motherboards; the X99-Deluxe as well as Z97-WS and some others are true high end boards integrating IR3550Ms, 60A chokes and latest IR PWMs (I think), but even much of their ROG lineup is based around the same "Digi+ ASxxxx" rebranded PWMs that really cast a shadow over otherwise excellent boards.
> 
> Sapphire doesn't have any noteworthy technologies like Asus and MSI do, but they tend to be solidly built cards (aside from their crappy fans).


The 3258 is only slightly better than my E8400. What I need for the future is a 4-core CPU.
Moreover, I would need to fork out more $$ for an OS + the M/B. I don't think the stopgap is worth it.
I'd rather save up for Skylake. Anyway, I can always fall back on Broadwell if I'm desperate by then. But I will hold out till then.

Thanks for the advice.

As for Asus, yeah, I don't like their lack of transparency in those areas. Their products are hit and miss if you're not careful. Gigabyte also changes components between revisions, but at least they are transparent about it.


----------



## oshit

Quote:


> Originally Posted by *tabascosauz*
> 
> Sapphire doesn't have any noteworthy technologies like Asus and MSI do, but they tend to be solidly built cards (aside from their crappy fans).
> 
> The 280 Dual-X should be identical to the 280X Dual-X. They are both baseline cards in their respective categories and should only differ by the GPU (Tahiti Pro vs Tahiti XT). There are probably images out there of the 280X Dual-X PCB; I wouldn't expect any more than something that looks like the Tahiti reference design.


o.0 I didn't know their fans were crappy; maybe it's because I don't OC my GPU anymore.

I think the 280 should be fine, just a guess, as it's clocked lower than the 270X and 280X.
Lower clock speed = less stress on the VRM and lower temperatures.

Their components may not be the best, but they'll do for me, since I stopped OCing GPUs back in the 9800GT days.
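
The clock-vs-VRM-stress intuition above can be put in rough numbers: to a first approximation, a GPU's dynamic power scales with frequency times voltage squared. A minimal sketch (the clocks and voltages here are illustrative round numbers, not measured values for these specific cards):

```python
# Back-of-envelope: dynamic power scales roughly with f * V^2, so a
# lower-clocked (and usually lower-volted) Tahiti Pro stresses its
# VRM less than a fully clocked Tahiti XT.

def relative_dynamic_power(freq_mhz, vcore, ref_freq_mhz, ref_vcore):
    """Ratio of dynamic power versus a reference clock/voltage point."""
    return (freq_mhz / ref_freq_mhz) * (vcore / ref_vcore) ** 2

# Illustrative numbers only: R9 280 boost vs. a typical 280X boost.
ratio = relative_dynamic_power(940, 1.20, 1000, 1.25)
print(f"280 draws ~{ratio:.0%} of the 280X's dynamic power")  # ~87%
```

By that estimate, a 940 MHz Tahiti Pro at slightly lower voltage draws somewhere around 85-90% of the dynamic power of a 1000 MHz Tahiti XT, which is why the non-X has an easier time on the VRM and temperatures.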


----------



## tabascosauz

Quote:


> Originally Posted by *o*****
> 
> o.0 i didnt know their fans are crappy, maybe i just dont OC my gpu anymore.
> 
> I think the 280 should be fine, just a guess, as they are clocked lower than 270X and 280X.
> Since lower clocked speed = lesser stress to VRM, lower temperature.
> 
> Their component may be not the best, but they will do for me since i have stop OC GPU since the 9800GT days.


Tahiti Pro is an ice cube compared to Tahiti XT. The HD7950 is pretty easy to cool, but the Tahiti XT is almost as temperamental as Hawaii.

Doesn't matter as much since Sapphire's warranty is short and their RMA is garbage, so their cards are a one-off lol.


----------



## neurotix

Quote:


> Originally Posted by *tabascosauz*
> 
> Sapphire doesn't have any noteworthy technologies like Asus and MSI do, but they tend to be solidly built cards (aside from their crappy fans).


I beg to differ...

"Black Diamond Choke & Full Solid Cap Design

Choke is an important component of the graphics card. By working with the component engineer, Sapphire's patent pending choke is 10% cooler and offers 25% more power efficiency than a normal choke. The graphics card will be more reliable and save energy.
Improved reliability and better overclocking are possible by using only high-polymer, aluminum capacitors which posses far superior characteristics than regular aluminum capacitor for a longer product life. When operational temperatures drop by 20°C, the product life span is extended by a factor of ten, when the operational temperature increases by 20°C, the product life span only decreases by 10%."

From http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2167

I believe I heard something about a redone custom 10-phase (or possibly 12-phase) power delivery system on the 290/X Vapor-X cards. They also have a phenomenal passive heatsink on the VRMs on these cards; my VRMs usually run about 15°C cooler than the core.

That doesn't include stuff like the custom LEDs, backplate, etc., which are all bonus features.

Pretty sure stuff like this makes it into the lower-model R9 cards too.
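
For context on the capacitor-life claim quoted above: the commonly cited rule of thumb for electrolytic and polymer caps is that expected life roughly doubles for every 10 °C drop in operating temperature (an Arrhenius approximation, not a guarantee). A quick sketch, using hypothetical rating numbers:

```python
# The widely used "10-degree rule" for capacitor life: expected life
# roughly doubles for every 10 C drop below the rated temperature.

def cap_life_hours(rated_hours, rated_temp_c, operating_temp_c):
    """Estimate capacitor life from its rated life at rated temperature."""
    return rated_hours * 2 ** ((rated_temp_c - operating_temp_c) / 10)

# e.g. a hypothetical 5,000 h @ 105 C part running at 85 C:
print(cap_life_hours(5000, 105, 85))  # -> 20000.0
```

By the standard rule, a 20 °C drop buys about 4x life, so Sapphire's "factor of ten" figure is noticeably more generous than the usual industry estimate.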


----------



## oshit

omg, didn't realise my name got blocked: o *****.

lol

I think I'll go change the i to a 1. brb.


----------



## oshit

Quote:


> Originally Posted by *neurotix*
> 
> I beg to differ...
> 
> "Black Diamond Choke & Full Solid Cap Design
> 
> .


Not all cards have that. The one I'm buying doesn't have it.


----------



## oshit

Quote:


> Originally Posted by *o*****
> 
> omg didnt reliease my name got block o *****.
> 
> lol
> 
> i think i go change the i to 1. brb.


Crap, doesn't work.


----------



## tabascosauz

Quote:


> Originally Posted by *neurotix*
> 
> I beg to differ...
> 
> "Black Diamond Choke & Full Solid Cap Design
> 
> Choke is an important component of the graphics card. By working with the component engineer, Sapphire's patent pending choke is 10% cooler and offers 25% more power efficiency than a normal choke. The graphics card will be more reliable and save energy.
> Improved reliability and better overclocking are possible by using only high-polymer, aluminum capacitors which posses far superior characteristics than regular aluminum capacitor for a longer product life. When operational temperatures drop by 20°C, the product life span is extended by a factor of ten, when the operational temperature increases by 20°C, the product life span only decreases by 10%."


Of course I would know...lol...my 280X Vapor-X incorporates 8 of these. Except they aren't nearly as significant as Sapphire claims. They are better than your average inductor, but we know next to nothing about the MOSFETs that get paired with these. And without that knowledge, we can't pass judgment on the quality.

Sapphire still has its fair share of marketing (Asus and MSI just use more than their fair share). For a long time (and it's still going on), MSI's VRMs have been notoriously bad. They appear to have excellent, beefy capacitance ratings, but that's to compensate for their horrific VRM designs; they are the polar opposite of Gigabyte. Their 970 boards, for example, are infamous for VRM failures and their inability to handle overclocked 4-module FX chips. As of X99, the PWM on the MPower has been downgraded to an analog ISL PWM. And MSI always hides cheap MOSFETs (in the case of the Z77 budget boards, D-PAK) below fancy heatsinks and their "Super Ferrite Chokes". This applies to their video cards as well.

/end rant

TLDR: chokes don't mean anything unless the MOSFETs and PWM are up to the same standard. 60A Cooper Bussmanns/Blackwing chokes + IR3550M + IR3570 =/= 60A Cooper Bussmanns + D-PAK doubled + ISL6366


----------



## tabascosauz

Quote:


> Originally Posted by *neurotix*
> 
> I believe I heard something about a redone custom 10-phase (or possibly 12-phase) power delivery system on the 290/X Vapor-X cards. Also, they have a phenomenal passive heatsink on the VRMs on these cards, my VRMs are usually 15C cooler than the core is.
> 
> That doesn't include stuff like the custom LEDs, backplate etc which are all bonus features.
> 
> Pretty sure stuff like this makes it into the lower-model R9 cards too.


12-phase VRMs, as long as they are of good quality, are more or less beyond the point of diminishing returns. Plus, it's not like the 290X cards implement 12-phase PWMs, because I don't think they even exist; these are obviously doubled and not as good as a true 12-phase would be.

LEDs are nice, like the one on my 280X, and so are backplates, which also made it onto my card, but they don't change the fact that Sapphire couldn't care less about the quality of the fans they use. They have a bad track record because they're all sleeve-bearing fans, and even the new Tri-X models of the 280X and 290/290X are bringing in reports of fan failure, where one of the fans operates erratically or simply refuses to spin despite the card being only a few months old.

Sapphire's cards aren't the ones succumbing to burning/exploding MOSFETs, so I'm not sure why these additional "features" even matter. Sapphire sticks to a good VRM design philosophy.


----------



## neurotix

Quote:


> Originally Posted by *tabascosauz*
> 
> Of course I would know...lol...my 280X Vapor-X incorporates 8 of these. Except they aren't nearly as significant as Sapphire claims. They are better than your average inductor, but we know next to nothing about the MOSFETs that get paired with these. And without that knowledge, we can't pass judgment on the quality.
> 
> Sapphire still has its fair share of marketing (Asus and MSI just use more than their fair share). For a long time (and still going on), MSI's VRMs have been notoriously bad. They appear to have excellent and beefy capacitance ratings, but that's to compensate for their horrific VRM designs. They are the polar opposite of Gigabyte. Their 970 boards, for example are infamous for VRM failures and their incapability to handle overclocked 4-module FX chips. As of X99, the PWM on MPower has been downgraded to an analog ISL PWM. And MSI always hides cheap MOSFETs (in the case of the Z77 budget boards, D-PAK) below fancy heatsinks and their "Super Ferrite Choke". This applies to their videocards as well.
> 
> /end rant
> 
> TLDR: chokes don't mean anything unless the MOSFETs and PWM are up to the same standard. 60A Cooper Bussmanns/Blackwing chokes + IR3550M + IR3570 =/= 60A Cooper Bussmanns + D-PAK doubled + ISL6366


Oh, well I was more talking about GPUs than motherboards.

Yeah, I realize there's a lot of marketing spin and hype, and we don't know for sure what's in the custom cards compared to the reference designs.

For mobos, pretty much all I use and recommend are ASUS and I've never had problems with any of mine. Never had to RMA, ran my chips heavily overclocked (see sig rigs), benched on them and everything. To say I'm a ROG fan is an understatement. In my experience, the reputation holds true.

It doesn't surprise me that MSI's 970 motherboards are bad, because a few years ago pretty much ALL of their AMD motherboards were bad. See this link and note how many MSI 790, 785, 770 etc. failures there are. There's another "MSI hate" thread around here somewhere, but I'm too lazy to dig it up.
Quote:


> Originally Posted by *o*****
> 
> Not all cards having that. The one i m buying doesnt have it.


Again, that's why I'm recommending you skip the Dual-X and get a Vapor-X model instead, especially if you plan on keeping your card on air cooling. Pay a little extra and you WON'T be disappointed with the cooling and performance. (I'm not sure, but I think Sapphire bins chips for the Vapor-X and Toxic models too.)


----------



## JackCY

Dual-X? No thanks. There isn't much more to add: it's one of the entry-level, cheapo, low-clock cards. Expect nothing more.


----------



## maximsilentfoot

First club I'm joining in here... Just picked up an MSI 270 OC 2GB Gaming yesterday, loving it so far!


----------



## Trayeon

Hey guys, so I'm new here. I was stumbling across the internet looking for a big, experienced community and finally ended up here. Just in case you're still adding people...

As far as I know, the R9 270 is exactly the same card as the R9 270X, except for the core clock and the TDP, which is 150 W for the non-X. So far I have only seen vanilla 270s with only 1x 6-pin PCIe connector on them, but I believe there were some OC editions of the vanilla 270 that shipped with 2x 6-pin PCIe. Correct me if I'm wrong, but the missing power connector is a major problem for overclocking the 270, since you need it in order to exceed the 150 W TDP (probably with a BIOS mod). So my question is: is it possible to add (by soldering) an extra 6-pin power connector to the graphics card PCB? I was thinking about soldering another cable to the back of the PCB, where the ends of the existing connector's pins are, but I'm not sure how those circuits work or whether the power is divided into lanes. Any ideas? Don't worry about the PSU; it's 600 W with 1x 8-pin and 1x 6-pin PCIe.
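
For reference, the power math behind the connector question: the PCIe spec allows roughly 75 W from the x16 slot, 75 W per 6-pin connector, and 150 W per 8-pin. A small sketch of the resulting budgets:

```python
# Nominal PCIe power budget per the spec: the x16 slot supplies up to
# 75 W, each 6-pin connector adds 75 W, each 8-pin adds 150 W.

CONNECTOR_WATTS = {"slot": 75, "6pin": 75, "8pin": 150}

def board_power_budget(*connectors):
    """Spec power available to a card (slot power included)."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in connectors)

print(board_power_budget("6pin"))          # vanilla 270: 150 W
print(board_power_budget("6pin", "6pin"))  # 2x 6-pin OC edition: 225 W
```

That's why the single-6-pin 270 lines up exactly with its 150 W TDP, while the 2x 6-pin OC editions have 225 W of headroom on paper.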

Basically, I want to get the maximum performance out of this card. The only thing I've tried is AMD OverDrive, which gives me 1050/1500, but I would like to get to somewhere around 1150-1200 on the core.
How can I achieve this? I already have VBE7, but I'm really nervous about the fact that I only have a single GFX BIOS, and all those options in VBE7 are still mysterious to me

. Anybody tried crazy stuff with their 270s, like flashing the BIOS of a 270X? Any help, link, or PM would be much appreciated.

I usually do not sign up on forums, but this one seems to be the s**t! Sorry for not reading through all those pages; I also tried the search function, but my questions were kind of too specific.

Kind regards, Tray


----------



## tabascosauz

Quote:


> Originally Posted by *Trayeon*
> 
> Hey Guys, so i'm new here, i was stumblin across the internet looking for a big experienced community and finally ended up here. Just in Case you're still adding people...
> 
> As far as i know, the R9 270 is exactly the same card as the R9 270X, except of the Core Clock and the TDP, which is 150 Watt for the non-X. So far i have only seen 270's vanilla with only 1x6-pin pci-e connector on them. But i believe there were some OC Editions of the Vanilla 270, which were shipped with 2x6-pin pci-e. Correct me if i'm wrong, but it's a major problem for overclocking the 270 because of a missing power connector, in order to exceed (probably with a bios mod) the TDP of 150 Watt. So my Question is: Is it possible to add (by soldering) an extra 6 pin power connector to the graphicscard PCB. Was thinking about soldering another cable to the back of the PCB, where the ends of the already present connectors are, but I'm not sure bout how those circuits work or if it is divided in Lanes? Any Ideas? Don't bother about the PSU, its 600 WATT with 1x8pin & 1x6pin PCI-E.
> 
> Basically i want to get the max. performance out of this card. The only thing i tried is the amd overdrive which gives me 1050/1500, but i would like to go to somewhere like 1150-1200 on the core.
> How can i achieve this, i already have VBE7, but i'm really scared about the fact, that i only have a single BIOS GFX and all these options in VBE7 are still mysterious
> 
> Anybody tried crazy stuff with their 270s, like flashing the BIOS of a 270X? Any help, link, or PM would be much appreciated.
> 
> I usually do not sign up in Forums, but this one seems to be the s**t! Sorry for not reading thru all those pages, i also tried the search function, but my questions, were kinda to specific.
> 
> Kind regards Tray


Welcome!

I wouldn't consider the missing power connector to be THE limiting factor in OC; plenty of "vanilla" R9 270 cards have reached respectable clock speeds, albeit short of the R9 270X's potential OC.

I'd say the soldering isn't worth it. Those "OC" cards are probably designed by their manufacturers as another confusing wrench in the works: an R9 270 that is no different from an R9 270X, with the same BIOS, VRM, and binned Pitcairn. And you can't guarantee that your own GPU is binned.


----------



## neurotix

Quote:


> Originally Posted by *Trayeon*
> 
> Hey Guys, so i'm new here, i was stumblin across the internet looking for a big experienced community and finally ended up here. Just in Case you're still adding people...
> 
> As far as i know, the R9 270 is exactly the same card as the R9 270X, except of the Core Clock and the TDP, which is 150 Watt for the non-X. So far i have only seen 270's vanilla with only 1x6-pin pci-e connector on them. But i believe there were some OC Editions of the Vanilla 270, which were shipped with 2x6-pin pci-e. Correct me if i'm wrong, but it's a major problem for overclocking the 270 because of a missing power connector, in order to exceed (probably with a bios mod) the TDP of 150 Watt. So my Question is: Is it possible to add (by soldering) an extra 6 pin power connector to the graphicscard PCB. Was thinking about soldering another cable to the back of the PCB, where the ends of the already present connectors are, but I'm not sure bout how those circuits work or if it is divided in Lanes? Any Ideas? Don't bother about the PSU, its 600 WATT with 1x8pin & 1x6pin PCI-E.
> 
> Basically i want to get the max. performance out of this card. The only thing i tried is the amd overdrive which gives me 1050/1500, but i would like to go to somewhere like 1150-1200 on the core.
> How can i achieve this, i already have VBE7, but i'm really scared about the fact, that i only have a single BIOS GFX and all these options in VBE7 are still mysterious
> 
> Anybody tried crazy stuff with their 270s, like flashing the BIOS of a 270X? Any help, link, or PM would be much appreciated.
> 
> I usually do not sign up in Forums, but this one seems to be the s**t! Sorry for not reading thru all those pages, i also tried the search function, but my questions, were kinda to specific.
> 
> Kind regards Tray


Isn't the 270X only like $20 more MSRP than the 270?

Just sell the 270 and get a 270X instead.


----------



## Trayeon

Thanks @ tabascosauz

@Neurotix: Why should I? My R9 270 is already running faster than an R9 270X at stock. I'm wondering if someone has ever tried flashing a 270X BIOS onto a 270. Basically, AMD OverDrive won't let me go past 1050/1500. I would really like to try VBE7, but I just don't know which settings to set :/


----------



## bichael

Quote:


> Originally Posted by *Trayeon*
> 
> Thanks @ tabascosauz
> 
> @ Neurotix: Why should I? My R9 270 is already running faster then an R9 270X at stock. I'm wondering if someone ever tried flashing a 270X bios on a 270. Basically AMD Overdrive don't let go past 1050/1500. I would really like to try VBE7 but i just dont know which settings to set :/


I think you would need to edit the power limit in the BIOS with VBE7 as well as slowly upping the core clocks, but it's not something I have ever tried. I trust you have looked at the big VBE7 thread on TechPowerUp. It seems potentially a bit risky for fairly limited benefit, but of course it's your choice!

Complete shot in the dark, but it could be worth trying different software such as Sapphire TriXX etc. in case it offers a bit more than OverDrive?

Is your card voltage unlocked? It would be a shame to do a BIOS mod and then find it wasn't stable past 1050 anyway...

I have a 270X, and the numbers below should give you some idea of how performance may scale. I run at 1150 because, although 1170 was fine in Firestrike, I was getting some driver crashes in Tomb Raider. Pretty sure mine is completely voltage-locked.

Firestrike graphics scores:
1100/1425 (stock for me) - 6470
1140/1450 - 6664
1150/1475 - 6775
1170/1475 - 6841

Edit: Had a quick check on what clocks 270s are likely to reach and came across this thread, which you should check out. The 1050 limit seems a bit odd, as lots have gone past it, though 1100 seems to be about average, I guess due to the power supply limit and maybe lower-binned chips.
http://www.overclock.net/t/1465987/msi-r9-270-overclocking-result/10
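
As a rough check on how linearly those Firestrike numbers scale, here's a small script over the scores listed above:

```python
# bichael's Firestrike graphics scores vs. core/mem clocks: a quick
# look at how closely score gains track core-clock gains.

runs = [(1100, 1425, 6470), (1140, 1450, 6664),
        (1150, 1475, 6775), (1170, 1475, 6841)]

base_core, _, base_score = runs[0]
for core, mem, score in runs[1:]:
    clock_gain = core / base_core - 1
    score_gain = score / base_score - 1
    print(f"{core}/{mem}: +{clock_gain:.1%} core -> +{score_gain:.1%} score")
```

Score gains track core-clock gains fairly closely across these runs (the memory bumps muddy it a little), which suggests the card isn't badly bottlenecked elsewhere at these settings.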


----------



## Trayeon

@ bichael: Thanks for that link, much appreciated

I'll go ahead and read a bit, but I might end up NOT flashing the BIOS; you've kinda convinced me.


----------



## bichael

No worries!

Yeah, I would suggest you look into how far you can go with the standard +20% power limit first, as it doesn't seem that 1050 should be a hard limit. I guess it may be an OverDrive thing, so definitely try Afterburner, TriXX etc. and play with some of their settings if needed.

I also looked at using VBE7 for the potential extra power limit etc., but decided it wasn't worth the risk and the loss of the UEFI BIOS!


----------



## Devildog83

Quote:


> Originally Posted by *bichael*
> 
> I think you would need to edit the power limit in the BIOS with VBE7 as well as slowly upping the core clocks but not something I have ever tried. Trust you have looked at the big VBE7 thread on techpowerup. Seems potentially a bit risky for fairly limited benefit but of course it's your choice!
> 
> Complete shot in the dark but it could be worth trying different software such as Sapphire TriXX etc. in case it offers a bit more than Overdrive?
> 
> Is your card voltage unlocked? It would be a shame to do a BIOS mod and then find it wasn't stable once you got past 1050 anyway...
> 
> I have a 270x and the below should give you some idea of how performance may scale. I run at 1150 as although 1170 was fine in firestrike I was getting some driver crashes in Tomb Raider. Pretty sure myne is completely voltage locked.
> 
> Firestrike graphics scores:
> 1100/1425 (stock for me) - 6470
> 1140/1450 - 6664
> 1150/1475 - 6775
> 1170/1475 - 6841
> 
> Edit: Had a quick check on what clocks 270s are likely to get and came across this thread which you should check out. The 1050 limit thing seems a bit odd as lots have gone past that though 1100 seems to be about average I guess due to power supply limit and maybe lower binned chips.
> http://www.overclock.net/t/1465987/msi-r9-270-overclocking-result/10


Pretty much worth the extra money, in my opinion. My 270X was stable at 1225/1550 and I ran it every day CrossFired with a 7870 at 1200/1450 with zero in-game issues or benching crashes, unless I went beyond 1235 on the core clock. For a few extra bucks you get a better, faster card, so why not?


----------



## Devildog83

Quote:


> Originally Posted by *Trayeon*
> 
> Hey Guys, so i'm new here, i was stumblin across the internet looking for a big experienced community and finally ended up here. Just in Case you're still adding people...
> 
> As far as i know, the R9 270 is exactly the same card as the R9 270X, except of the Core Clock and the TDP, which is 150 Watt for the non-X. So far i have only seen 270's vanilla with only 1x6-pin pci-e connector on them. But i believe there were some OC Editions of the Vanilla 270, which were shipped with 2x6-pin pci-e. Correct me if i'm wrong, but it's a major problem for overclocking the 270 because of a missing power connector, in order to exceed (probably with a bios mod) the TDP of 150 Watt. So my Question is: Is it possible to add (by soldering) an extra 6 pin power connector to the graphicscard PCB. Was thinking about soldering another cable to the back of the PCB, where the ends of the already present connectors are, but I'm not sure bout how those circuits work or if it is divided in Lanes? Any Ideas? Don't bother about the PSU, its 600 WATT with 1x8pin & 1x6pin PCI-E.
> 
> Basically i want to get the max. performance out of this card. The only thing i tried is the amd overdrive which gives me 1050/1500, but i would like to go to somewhere like 1150-1200 on the core.
> How can i achieve this, i already have VBE7, but i'm really scared about the fact, that i only have a single BIOS GFX and all these options in VBE7 are still mysterious
> 
> Anybody tried crazy stuff with their 270s, like flashing the BIOS of a 270X? Any help, link, or PM would be much appreciated.
> 
> I usually do not sign up in Forums, but this one seems to be the s**t! Sorry for not reading thru all those pages, i also tried the search function, but my questions, were kinda to specific.
> 
> Kind regards Tray


I am still adding folks; please tell us what card you have and post a pic if you have one.


----------



## Devildog83

Quote:


> Originally Posted by *maximsilentfoot*
> 
> First club I'm joining in here... Just picked up an MSI 270 OC 2GB Gaming yesterday, loving it so far!


I will add you. Welcome !!! Do you have any pics?


----------



## oshit

Quote:


> Originally Posted by *Devildog83*
> 
> I will add you. Welcome !!! Do you have any pics?


Hi Lord Devil,

Can you add me too?
I just got the Sapphire R9 280 3GB OC Dual-X (280 non-X).
But sadly I won't be OCing it.

End.

To the rest of the community,

Thanks for the feedback and suggestions, guys. I finally got the above card (won't be OCing).
It works fine on my old system (non-OC) with PCIe v1.1:

775 board (G41M-Combo)
G.Skill 2x2GB
Intel C2D E8400
WinXP Pro SP3
Corsair HX-520
5 system fans, 120mm
CPU cooler: Xigmatek Dark Knight II Night Hawk ed. with a Corsair SP120 PWM (high performance)

Until Skylake, I will wait and see.


----------



## Devildog83

Quote:


> Originally Posted by *o*****
> 
> Hi Lord Devil,
> 
> Can add me too?
> I just got the Sapphire R9 280 3GB OC Dual-X (280 non-X).
> But sadly i wont be OC it.
> 
> End.
> 
> To the rest of the community,
> 
> Thanks for the feedbacks and suggestions guys. I finally got the above card (wont be OCing).
> Works fine on my old system (non-OC), with PCIe v1.1
> 
> 775 board (G41M-combo).
> G Skill 2x2GB
> intel C2D E8400
> WinXP pro sp3.
> 
> Until skylake, i will wait and see.


Could you tell me what the specs are for that card and do you have a pic? I would love to see that old school system.


----------



## oshit

Quote:


> Originally Posted by *Devildog83*
> 
> Could you tell me what the specs are for that card and do you have a pic? I would love to see that old school system.


Spec?
Err, I am not OCing it, so are you referring to the base spec?

I can upload the photo later, when I get home.


----------



## Devildog83

Quote:


> Originally Posted by *o*****
> 
> Spec?
> Err. I am not OCing it., so u r refering to the base spec?
> 
> I can upload the photo later.. when i get home.


Yes, whatever you run it at. I don't know the specs of every card and need them to add you. I will definitely add you as soon as I have them.


----------



## oshit

Added
Quote:


> Originally Posted by *Devildog83*
> 
> Yes what ever you run it at, I don't know the specs of every card and need them to add you. I will definitely add you as soon as I have them.


From HWinfo

ATI/AMD Radeon HD 7950 (TAHITI PRO) <== R9 280
GPU Thermal Diode 43.0 °C
GPU VDDC 1.213 V
GPU Fan 1348 RPM
GPU Clock 940.0 MHz
GPU Memory Clock 1250.0 MHz
GPU Utilization 0.0 %
GPU Fan Speed 29.000 %
PCIe Link Speed 2.500 Gbps
Memory Bus Width: 384-bit
Number Of Pixel Pipelines: 32
Number Of Unified Shaders: 1792
Video Bus: PCIe v3.0 x16 (8.0 Gb/s) @ x16 (2.5 Gb/s) <== note: running at 2.5 Gb/s because my board is PCIe x16 v1.1.

From Sapphire site
http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&pid=2194&lid=1

GPU
850(Boost:940) MHz Core Clock
28 nm Chip
1792 x Stream Processors

Video Memory
3072 MB Size
384 -bit GDDR5
5000 MHz Effective

Dimension
262(L)X112(W)X34(H) mm Size. 2 x slot

Connection required
2 x 6pin PCIe connectors


----------



## Devildog83

Quote:


> Originally Posted by *o*****
> 
> Added
> From HWinfo
> 
> ATI/AMD Radeon HD 7950 (TAHITI PRO) <== R9 280
> GPU Thermal Diode 43.0 °C
> GPU VDDC 1.213 V
> GPU Fan 1348 RPM
> GPU Clock 940.0 MHz
> GPU Memory Clock 1250.0 MHz
> GPU Utilization 0.0 %
> GPU Fan Speed 29.000 %
> PCIe Link Speed 2.500 Gbps
> Memory Bus Width: 384-bit
> Number Of Pixel Pipelines: 32
> Number Of Unified Shaders: 1792
> Video Bus: PCIe v3.0 x16 (8.0 Gb/s) @ x16 (2.5 Gb/s) <== note due to my board is PCIe x16 v1.1.
> 
> From Sapphire site
> http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&pid=2194&lid=1
> 
> GPU
> 850(Boost:940) MHz Core Clock
> 28 nm Chip
> 1792 x Stream Processors
> 
> Video Memory
> 3072 MB Size
> 384 -bit GDDR5
> 5000 MHz Effective
> 
> Dimension
> 262(L)X112(W)X34(H) mm Size. 2 x slot
> 
> Connection required
> 2 x 6pin PCIe connectors


I am really sorry I wasn't more specific; I just meant the clock speeds, like it says in the 1st post. Sorry you went to all that trouble, but nice job anyhow.


----------



## Devildog83

Quote:


> Originally Posted by *Devildog83*
> 
> I am really sorry I wasn't more specific, I just meant the clock speeds like it says on the 1st post. Sorry you went to all of that trouble but nice job anyhow.


You have been added! Welcome!!!


----------



## oshit

lol, it's ok.

I was doing some housekeeping anyway.


----------



## maximsilentfoot

Quote:


> Originally Posted by *Devildog83*
> 
> I will add you. Welcome !!! Do you have any pics?


Thanks! Happy to be here. Here's a pic of the dude paired with his Z97 Gaming 5 buddy.

I've installed Afterburner and dragged the core and mem to the maximum allowed (1050/1500). I moved up from a GTX 460, and so far it feels great to be gaming at 1080p with higher settings than before. To be honest, this card is quite temporary; I'm going to see what AMD comes up with in the 3xx series and might move up then.


----------



## Devildog83

Quote:


> Originally Posted by *maximsilentfoot*
> 
> Thanks! Happy to be here. Here's a pic of the dude paired with his z97 gaming 5 buddy
> 
> I've installed afterburner and dragged the core and mem to the maximum allowed (1050/1500). I moved up from a gtx 460 and so far it feels great to be gaming at 1080p with higher settings than before. To be honest this card is quite temporary, I'm going to see what AMD comes up with in the 3XX series and might move up then.


Nice, I altered the specs. What CPU do you have in there?


----------



## maximsilentfoot

Quote:


> Originally Posted by *Devildog83*
> 
> Nice, I altered the specs. What CPU do you have in there?


4790K. Just picked up an EVO 212 cooler to go with it. Going to need to go through some guides to get it installed! I guess the air cooling forum here can be of help.


----------



## Devildog83

Quote:


> Originally Posted by *maximsilentfoot*
> 
> 4790k. just picked up a evo 212 cooler to go with it. going to need to go through some guides getting it installed! i guess the air cooling forum here can be of help


Shouldn't be that tough to install. Nice CPU. I just bought the 4670K. Ever think about at least an AIO for cooling?


----------



## NameUnknown

I'm looking at going Skylake if it has the performance boost when it comes out. At the same time, should I stay with my 280X, add another one, or go 3xx? Any thoughts?


----------



## Agent Smith1984

Quote:


> Originally Posted by *NameUnknown*
> 
> I'm looking at going Skylake if it has the performance boost when it comes out. At the same time should I stay with my 280X or upgrade with another or go 3xx? Any thoughts?


Skylake would push 2x 280X really well, but you may run into VRAM capacity limits by the time that rolls around.

I would say go ahead and grab a 300-series card, and build around it when you decide to upgrade the CPU.

Your 1090T (if overclocked) has about enough headroom for one last generation of graphics card before it sputters out.

I went from a 280X to a 290 (OC) with no bottlenecks, but I can see that CPU usage at 1080p is at a point where one more GPU upgrade would peg it out.


----------



## maximsilentfoot

Quote:


> Originally Posted by *Devildog83*
> 
> Should not be that tough to install. Nice CPU. I just bought the 4670k. Ever think about at least an AIO for cooling?


Eventually, yes, I'll probably go for an AIO. My Carbide 330R has space for a top-mounted radiator. I have this weird preference for incremental changes to my rig, so I'm going from stock to air, then to an AIO. Might be me wanting to compare the difference and experience each one. Or maybe I just prefer tinkering with it from time to time rather than setting it up and leaving it alone.









I also never expected the stock fan to be this loud under load, btw. Atrocious.


----------



## tabascosauz

Quote:


> Originally Posted by *maximsilentfoot*
> 
> Eventually yes prob go for AIO. My carbide 330r has space for a top mounted radiator. I have this weird preference of incremental changes to my rig so I'm going from stock to air then AIO. Might be me wanting to compare the difference and experiencing each one. Or maybe I just prefer tinkering with it from time to time rather than setup and leave it alone
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also never expected the stock fan to be this loud on load btw. Atrocious.


Pretty sure the Intel stock cooler design hasn't changed since Core 2. Plus, the 4790K runs hot. Damn... the entire room gets toasty when this thing hits full load.

How are your temps in that case? It appears that the 330R, being a silence-oriented case, doesn't do that well when it comes to thermals; maybe an AIO would be better, unless you've got some really strong fans to move air through the case, in which case you might enjoy a D15 or TC14PE.


----------



## Arkanon

Well, it had to happen sometime. Instead of upgrading my old rig's CPU and mobo, I had the chance to trade it in for a Clevo gaming notebook with a GTX 765M. Wasn't entirely convinced by that card, so guess what: tomorrow I'm trading that notebook in for one with a 7970M, which is essentially a power-optimised 7870/270X in MXM format. First thing on the menu is BIOS modding that card and overvolting to around 1.075 V, with which I hope to achieve around stock 270X clocks. Should be a fun little experiment.


----------



## Devildog83

Quote:


> Originally Posted by *tabascosauz*
> 
> Pretty sure Intel stock cooler design hasn't changed since Core 2. And plus the 4790K runs hot. Damn...the entire room gets toasty when this thing hits up load.
> 
> How are your temps in that case? It appears that the 330R, being a silence-oriented case, doesn't do that well when it comes to thermals; maybe an AIO would be better unless you've got some really strong fans to move air through the case, then maybe you might enjoy a D15 or TC14PE?


I heard the 4670K can get hot too, and that's why I am happy to have a fully dedicated loop right now just for the CPU. The G3258 doesn't even really need much. I had this little Pentium up to 4.3 GHz and never went above 55°C under any stress test. I could have taken the thing to 4.7/4.8 easy.


----------



## maximsilentfoot

Quote:


> Originally Posted by *tabascosauz*
> 
> Pretty sure Intel stock cooler design hasn't changed since Core 2. And plus the 4790K runs hot. Damn...the entire room gets toasty when this thing hits up load.
> 
> How are your temps in that case? It appears that the 330R, being a silence-oriented case, doesn't do that well when it comes to thermals; maybe an AIO would be better unless you've got some really strong fans to move air through the case, then maybe you might enjoy a D15 or TC14PE?


At idle my CPU hangs around 40°C and system around 35°C.

Under load the CPU goes up to 60/70°C and system around 45°C. The CPU fan gets annoyingly loud though!


----------



## tabascosauz

Quote:


> Originally Posted by *maximsilentfoot*
> 
> On idle my cpu hangs around 40 and system around 35.
> 
> On load cpu goes up to 60/70 and system around 45. CPU fan gets annoyingly loud tho!


Are you sure that's full load?...

My U9B only manages 84°C at full load in OCCT. I don't think we're speaking of the same thing here.


----------



## maximsilentfoot

Oh sorry... That temp was just me gaming or running Fire Strike. I don't have OCCT but will get it and let you know the temps!


----------



## maximsilentfoot

Quote:


> Originally Posted by *tabascosauz*
> 
> Are you sure that's full load?...
> 
> My U9B manages only 84°C on full load on OCCT. I don't think we're speaking of the same thing here.


Tried OCCT and hit 86°C! Might be installing the Evo 212 tonight.


----------



## tabascosauz

Quote:


> Originally Posted by *maximsilentfoot*
> 
> tried OCCT and hit 86! might be installing the evo 212 tonight


OCCT will push your temps higher if you let it run for more than 5 minutes. The 212 EVO will be a good choice for your 4790K.
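If you want to see that effect for yourself, log temps during the run and compare the first five minutes against the rest. A minimal Python sketch, assuming you've exported a CSV log (e.g. from HWiNFO); the `time_s` and `cpu_temp_c` column names are placeholders, not any real tool's schema:

```python
import csv
from io import StringIO

# Hypothetical monitoring log: seconds elapsed, CPU package temp in °C.
LOG = """time_s,cpu_temp_c
60,71
120,76
240,79
300,81
600,84
900,86
"""

def max_temp_by_phase(log_text, cutoff_s=300):
    """Return (max temp up to cutoff_s, max temp after it)."""
    rows = [(float(r["time_s"]), float(r["cpu_temp_c"]))
            for r in csv.DictReader(StringIO(log_text))]
    early = max(t for s, t in rows if s <= cutoff_s)
    late = max(t for s, t in rows if s > cutoff_s)
    return early, late

early, late = max_temp_by_phase(LOG)
print(f"max in first 5 min: {early}°C, after that: {late}°C")
```

With a real log, the gap between the two numbers shows how much the longer run matters before you call a cooler "good enough".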


----------



## b0uncyfr0

Hi guys, I'm currently debating grabbing a 280X in the next week or so, but I'm stuck between two cards: the Asus 280X DirectCU II TOP and MSI's 280X Gaming 3G. I know both cards perform about the same, with the MSI being a tad quieter. I want to know which one would be the better overclocker. In this thread, do the MSI 280s consistently beat the Asus ones, or vice versa? Any info would be greatly appreciated.


----------



## Klem

Quote:


> Originally Posted by *Arkanon*
> 
> Well, it had to happen sometime. Instead of upgrading my old rig's cpu and mobo i had the chance to trade it in for a clevo gaming notebook with GTX765M. Wasn't entirely convinced by that card, so guess what. Tomorrow i'm trading that notebook in for one with a 7970M, which is essentially a power optimised 7870/270x in MXM format. First thing on the menu is bios modding that card and overvolting to around 1.075v in which i hope to achieve around stock 270x clocks. Should be a fun little experiment.


Warning! We should all know that the user Arkanon is a liar and a cheater. He deceived me, and now he may also deceive other people. Be careful in dealing with him!


----------



## Agent Smith1984

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Hi guys, I'm currently debating grabbing a 280X in the next week or so, but I'm stuck between two cards: the Asus 280X DirectCU II TOP and MSI's 280X Gaming 3G. I know both cards perform about the same, with the MSI being a tad quieter. I want to know which one would be the better overclocker. In this thread, do the MSI 280s consistently beat the Asus ones, or vice versa? Any info would be greatly appreciated.


I personally think it's a bad time to buy 280X cards...

They have gone back up some in price, and what you are getting is essentially a good card, but one that has run its course.
I have watched them fall from the high 100s to the low 200s (this is USD, but still), and now they are all back in the mid 200s again...

I bought mine used last July for $125, then sold it 4 months later for $130...

I then bought my 290 Tri-X used for $160, when they were getting down to $200-220 new, and now they are back near $300 again.

The 280X is a great card, but I would avoid the Asus TOP if you do get one, because they have well-known VRAM issues...


----------



## M1kuTheAwesome

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *b0uncyfr0*
> 
> Hi guys, I'm currently debating grabbing a 280X in the next week or so, but I'm stuck between two cards: the Asus 280X DirectCU II TOP and MSI's 280X Gaming 3G. I know both cards perform about the same, with the MSI being a tad quieter. I want to know which one would be the better overclocker. In this thread, do the MSI 280s consistently beat the Asus ones, or vice versa? Any info would be greatly appreciated.
> 
> 
> 
> I personally think it's a bad time to buy 280x cards...
> 
> They have gone back up some in price, and what you are getting is essentially a good card, but it has ran it's course.
> I have watched them fall to high 100's, to low 200's, (of course this is USD, but still), and now they are all back in the mid 200's again....
> 
> I bought mine last July used for $125, then sold it 4 months later for $130....
> 
> I then bought my 290 tri-x used for $160, when they were getting down to $200-220 new, and now they are back near $300 again.
> 
> 280x is a great card, but I would avoid the asus top if you do get one because they have well known VRAM issues....

If the whispers are true, AMD's 300 series cards might launch sometime in April, so if you can put up with the upgrade itch, wait until then.
And yeah, I remember when the Internet was full of people whose Asus 280X cards had VRAM issues... They might have fixed it on newer cards, but if you want to avoid trouble, avoid the Asus TOP just in case.


----------



## tomytom99

Alrighty, I'm thinking of actually upgrading my GPU (*gasp goes my wallet*). I've got the 270X right now, but I'm thinking of getting a 290 on Newegg ($270 right now) to smooth out the bottleneck I might have with my other hardware (in the sig, whatta trooper). I'm not sure how much it'll be worth, but I do run 3x1 Eyefinity and I play plenty of GPU-intensive games. I'd like to get a view from all of you guys; I don't want to splurge the money without input. I'm planning on putting the old 270X in a different build, and a friend is buying my 7870 Black Edition for $100. I'm pretty sure about getting the 290, but I just want an opinion. (It's Sapphire's R9 290.)


----------



## Agent Smith1984

Quote:


> Originally Posted by *tomytom99*
> 
> Alrighty, I'm thinking of actually upgrading my GPU (*gasp goes my wallet*). I've got the 270x right now, but I'm actually thinking of getting a 290 on Newegg ($270 right now) as to smooth the bottleneck I might have with my other HW. (In the sig, whatta trooper). I'm not sure how much it'll be worth, but since I do the 3x1 eyefinity, and I do play all of those games that are GPU intensive. I'd like to get a view from all of you guys, I don't want to go splurge the money without input. I'm planning on putting the old 270x in a different build, and then a friend is buying my 7870 black edition for $100. I'm pretty sure about getting the 290, but I just want an opinion. (It's Sapphire's R9 290)


The 290 Tri-X is like $220 after rebate right now. Not a bad buy at all, and the cooling is great.


----------



## tomytom99

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The 290 Tri-x is like $220 after rebate right now. Not a bad buy at all, and the cooling is great


Not that I usually do rebates (ugh, the hassle), but I might end up doing it, just to make me feel better. I guess, in theory, spending a net $120 is a flawless grab for a 290. Knowing the cooling on my 270X Toxic, I'm excited to see the triple fans on that amazing cooler design.

Oh, another thing: do I need to get a DP MST hub for using multiple monitors on the DisplayPort, or would a standard "passive" splitter work with the R9 290s? I've been reading inconsistent information.
AMD: "Up to 6 displays using a DP MST"
Other forums: "Not needed with the Hawaii cards"
I'm trying to figure it out for when I need to use more than three monitors, since I'd like to try to get a 5x1 array going.
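For reference, you can roughly ballpark whether one DP 1.2 link can even feed an MST array before buying a hub. A back-of-the-envelope Python sketch; the 17.28 Gbit/s payload figure (HBR2, 4 lanes, after 8b/10b coding) and the 138.5 MHz CVT-RB pixel clock for 1080p60 are approximations, and 24 bpp is assumed:

```python
# Rough check: can one DisplayPort 1.2 link drive N 1080p60 panels via MST?
HBR2_GBPS = 17.28        # effective payload: 4 lanes x 5.4 Gbit/s x 8/10
PIXEL_CLOCK_MHZ = 138.5  # CVT reduced-blanking pixel clock for 1920x1080@60
BPP = 24                 # bits per pixel

def mst_fits(n_displays):
    """Return (required Gbit/s, whether it fits in one HBR2 link)."""
    need_gbps = n_displays * PIXEL_CLOCK_MHZ * 1e6 * BPP / 1e9
    return need_gbps, need_gbps <= HBR2_GBPS

for n in (3, 5, 6):
    need, ok = mst_fits(n)
    print(f"{n}x 1080p60 needs {need:.2f} Gbit/s -> {'fits' if ok else 'too much'}")
```

By that rough math, 5x1 at 1080p60 squeaks under the HBR2 budget while 6 displays do not, so for 5x1 the question is the card's output topology (how many streams it can source) rather than raw link bandwidth.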


----------



## tabascosauz

Quote:


> Originally Posted by *tomytom99*
> 
> Not that I do rebates (ugh the hassle), but I might end up doing it, just to make me feel better. I guess only in theory spending $120 is a flawless grab for a 290. Knowing the cooling on my 270x toxic, I'm exited to see the triple fans with the amazing cooler design.
> 
> Oh, another thing, do I need to get a DP MST for using multiple monitors on the DP, or would a standard "passive" splitter work with the r9 290's? I've been reading inconsistent information.
> AMD: "Up to 6 displays using a DP MST"
> Other Forums: "Not needed with the Hawaii cards"
> I'm trying to figure out for when I need to use more than three monitors. Since I'd like to try to get a 5x1 array going.


Do it do it do it do it

Seriously, my 280X cost me $319 after rebate. Do the rebate.


----------



## tomytom99

Quote:


> Originally Posted by *tabascosauz*
> 
> Do it do it do it do it
> 
> Seriously, my 280X cost me $319 after rebate. Do the rebate.


Alright, will do.
Anybody know about the display converter situation?


----------



## JackCY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 280x is a great card, but I would avoid the asus top if you do get one because they have well known VRAM issues....


What VRAM issues? The 1.6 V and some chips not handling it well, especially when not cooled? RMA, simple as that. No memory chips seem to be cooled on the DC2T. The Hynix ones are fine. Sure, having memory voltage unlocked would be better.

---

Dunno when the 300 series will be out; summer, probably. With the 380X being a reworked 290 or something, which is a shame, only the 390 will be a new card, that's all. So it might be worth a wait, since the 280X is a veeeeery old chip right now and no longer worth buying unless you are getting it second hand for cheap.


----------



## Agent Smith1984

Quote:


> Originally Posted by *JackCY*
> 
> What VRAM issues? The 1.6V and some chips not handling it well especially when not cooled? RMA, simple as that. No memory chips seem cooled on DC2T. The Hynix are fine. Sure having mem. volt. unlocked would be better.
> 
> ---
> 
> Dunno when 300 will be out, summer probably. With 380x being a reworked 290 or something, which is a shame and only 390 will be a new card, that's all. So might be worth a wait since 280x is a veeeeery old chip right now and not worth buying for a long time unless you are buying it second hand for cheap.


Well, there are people all over the place who sent 2, 3, even 4 cards back and still had the issues...
They finally changed the cards over to 1.5 V / 1500 MHz, I believe, but I'd imagine there are some V1 cards still floating around.

Mine actually did pretty well with the 14.7 beta driver though; the memory would hit a hair over 1800 MHz, and the core did 1250 @ 1.3 V.


----------



## diggiddi

I'm kinda stuck in the same dilemma: either get a 280X for CrossFire or pick up a 290X (both used).
The cost will be the same (approx. $250), 'cos I need to get a PCIe sound card for $100 (Sound Blaster Z, I'm looking at you) if I CrossFire. But it seems the Crysis 3 bench with CrossFire is much better than a single 290X, so that's what I'm leaning towards, unless someone can nudge me the other way.


----------



## tabascosauz

Quote:


> Originally Posted by *diggiddi*
> 
> I'm kinda stuck in the same dilemma either get a 280x for crossfire or pickup a 290x (both used).
> The cost will be the same (approx $250) cos I need to get a pcie sound card for $100( soundblaster z I'm looking at you) if I crossfire, but it seems Cry 3 bench with crossfire is much better than a single 290x, so thats what I'm leaning more towards, unless someone can nudge me the other way


Are you sure that temperatures will be acceptable in that case? Both 280X CFX and a 290X produce a lot of heat.


----------



## diggiddi

Quote:


> Originally Posted by *tabascosauz*
> 
> Are you sure that temperatures will be acceptable in that case? Both 280X CFX and a 290X produce a lot of heat.


Both sides of the case are off, so I don't anticipate heat being an issue.


----------



## JackCY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, there are people all over the place who sent 2, 3, even 4 cards back and still had the issues.....
> They finally changed the cards over to 1.5v 1500MHz I believe, but I'd imagine there are some V1 cards still floating around.
> 
> Mine actually did pretty good with the 14.7 beta driver though, and the memory would hit a hair over 1800MHz, and the core did 1250 @ 1.3v


:|
I have a V1; I'm not sure I ever saw user pictures of the V2. Even in shops most were V1, and now all the 280Xs are going out of shops. The V2 is triple-slot, with 1500 MHz / 6 GHz.

I run mine at 1750 MHz / 7 GHz to be on the safe side; it can do 1800 MHz / 7.2 GHz but it's moodier. It's an issue with one badly optimized game, and that game doesn't even push the GPU to max or anything, just 60-80%.
Stock it runs fine, I think. At least when I lower the OC it doesn't seem to have any issues.
Benches are useless; it ran the 3DMark torture test at higher clocks than are usable when gaming.
Quote:


> Originally Posted by *diggiddi*
> 
> I'm kinda stuck in the same dilemma either get a 280x for crossfire or pickup a 290x (both used).
> The cost will be the same (approx $250) cos I need to get a pcie sound card for $100( soundblaster z I'm looking at you) if I crossfire, but it seems Cry 3 bench with crossfire is much better than a single 290x, so thats what I'm leaning more towards, unless someone can nudge me the other way


Why do you need a sound card if you CF?
Isn't it better to steer clear of consumer-grade sound cards anyway, and get something proper when you're planning to make the investment?

Most recent mobos have the Realtek ALC1150, which is pretty much fine and renders those consumer-grade sound cards obsolete.

1 GPU > CF/SLI for equal money


----------



## diggiddi

Quote:


> Originally Posted by *JackCY*
> 
> Why do you need a soundcard if you CF?
> Isn't it better to steer clear from the consumer grade soundcards anyway? And get something proper when planning to make the investment?
> 
> Most recent Mobo have Realtek ALC1150, which is pretty much fine and renders those consumer grage soundcards obsolete.
> 
> 1 GPU > CF/SLI for equal money


Ummm, because onboard sound is not as good as a discrete sound card, and the 2nd card'll block the PCI slot, hence the need for a PCIe sound card.

No, at least not on the Sabertooth (ALC892). "Something proper" like what?

No, they do not render sound cards obsolete; I dunno where you are getting that idea from.

That's your opinion, but it still doesn't answer my question. Care to elaborate why, with respect to the specific title I'm looking at?


----------



## tabascosauz

Quote:


> Originally Posted by *diggiddi*
> 
> Ummm because the on board sound is not as good as a discrete soundcard, 2nd card'll block the PCI slot, hence need for pcie soundcard
> 
> No, at least not on the Sabertooth (ALC892),"something proper" like what?
> 
> No they do not render sound cards obsolete I dunno where you are getting that idea from
> 
> Thats your opinion, but it still doesn't answer my question, care to elaborate why, wrt to the specific title i'm looking at?


In a nicely designed circuit, the ALC1150 can provide a level of audio quality that most people cannot distinguish from that of a PCIe sound card.

Many Asus Z97 boards, despite flaunting the Crystal Sound 2 moniker, isolated paths, and EMI shielding, only have an ALC892.

Is Crysis 3 the only game you will be playing for the next 2 years? If not, a 290X is the better and more reliable experience, especially when AMD stops giving a crap about supporting R9 200 CF.


----------



## JackCY

The ALC892 is to be avoided, but can you hear the difference compared to other audio codecs?
The ALC1150 is often good enough that you won't hear a difference in terms of quality.
And I'm not sure the human ear can hear the difference, so it's mostly benchmark stuff.

The Focusrite Scarlett 2i2 is a popular one, for example.
I would say the prices for a proper "sound card" start around $200.
The bonus is that they can also look like this:

Spoiler: Akai EIE Pro

If a sound card is made by someone like Asus, Creative, etc., it's general consumer stuff on par with the ALC1150, or even an ALC1150 hidden behind some fancier name; Asus hides the ALC1150 on motherboards behind fancy names.
It very much depends on what you want to connect to your PC, at what quality, on what budget, whether you need an amplifier, ...
Replacing one of the latest ALCs with an Asus/Creative kind of PCIe card is IMHO a waste of money. It's not even isolated from the PC; it's still stuck inside it, picking up noise and other interference.
I'm not sure OC.net is the best place to ask about audio; try checking head-fi.org and other audiophile places.









Personally if I had to choose between 280x CF and 290/290x I would go for 290/290x even if audio wasn't in question.


----------



## tomytom99

My mobo has the ALC1150. By far some of the clearest audio I've heard from a motherboard. I love that my mobo's got those gold-plated connectors, so I can really tap the potential my Razer Tiamat 7.1s have. I never hear any popping, there's no interference, and it's just about as good as I've ever heard. I've never used a recent sound card, but I did use a Creative Audigy 2 a long time back.

I've noticed that ASUS boards do have those fancy names and crud to cover up the lowly ALC chips they use. Also, for some reason my motherboard has the ALC1150, but also has Creative software that controls it alongside the Realtek stuff.


----------



## Devildog83




Quote:


> Originally Posted by *tabascosauz*
> 
> When in a nicely-designed circuit, the ALC1150 can provide a level of audio quality that most people cannot distinguish from that of a PCIe soundcard.
> 
> Many Asus Z97 boards, despite flaunting the Crystal Sound 2 moniker, isolated paths, and EMI shielding, only have an ALC892.
> 
> Is Crysis 3 the only game you will be playing for the next 2 years? If not, a 290X is the better and more reliable experience, especially when AMD stops giving a crap about supporting R9 200 CF.






After five months, *tabascosauz* has finally been added. Sorry I missed it, sir. Shoulda given me a Gilligan whack a long time ago. A belated welcome to you.


----------



## doza

I sent my Vapor-X 7970 GHz Edition in for service because I had problems with loose display output connectors...
Now I've got a Vapor-X 280X, because the 7970 was out of production, and as soon as I installed it I did some OC, just to see how far the GPU can go.

First off, I checked the ASIC quality and it was 64.7%... then I saw the memory type, and it was Elpida.

I googled and saw that Elpida is the worst memory in the 280X lineup and can only go to 1600 MHz max :S

Right now the max memory clock that runs without artifacts is 1750 MHz in the Valley benchmark, but in games and Fire Strike it can go to 1775 MHz and stay stable :S

http://postimg.org/image/rngp445w5/full/

So the basic question is: is Elpida memory as bad an overclocker as people say?


----------



## Devildog83

Quote:


> Originally Posted by *tomytom99*
> 
> My mobo has the ALC1150. By far some of the clearest audio I've heard from a motherboard. I love how my mobo's got those gold plated connectors, so I can really grab the potential my Razer Tiamat 7.1's have. I never hear any popping, there's no interference, and it's just about as good as I've ever heard. I've never used a recent sound card, but I did use a Creative Audigy 2 a long time back.
> 
> I've noticed that ASUS boards do have those fancy names and crud to cover up the lowly sounding ALC chips they use. Also, for some reason my motherboard does have the ALC1150, but also has the Creative software that controls it alongside the Realtek stuff.


As far as I can tell from research the Surpreme FX on the Asus boards are not ACL chips. On the FX lll at least , the codec is incorporated into the board so it saves PCI-E lanes. I had a CHVFZ and it was the best sound I ever got out of a motherboard. Unless you use an external amp or a very expensive card it's really tough to beat the Surpreme FX. I don't know where the rumors come from but I have seen no evidence yet that it is a ACL chip. I think it's even Analog.

Not all of the new Z97 boards have ALC 892. The z97-a does and the z97-e but the Sabertooth and the Pro both have ALC1150 and the ROG and Pro Gamer have Surpreme FX. The lower price boards don't have as good audio but that's to be expected.


----------



## Devildog83

Quote:


> Originally Posted by *doza*
> 
> I sent my Vapor-X 7970 GHz Edition in for service because I had problems with loose display output connectors...
> Now I've got a Vapor-X 280X, because the 7970 was out of production, and as soon as I installed it I did some OC, just to see how far the GPU can go.
> 
> First off, I checked the ASIC quality and it was 64.7%... then I saw the memory type, and it was Elpida.
> 
> I googled and saw that Elpida is the worst memory in the 280X lineup and can only go to 1600 MHz max :S
> 
> Right now the max memory clock that runs without artifacts is 1750 MHz in the Valley benchmark, but in games and Fire Strike it can go to 1775 MHz and stay stable :S
> 
> http://postimg.org/image/rngp445w5/full/
> 
> So the basic question is: is Elpida memory as bad an overclocker as people say?


I guess it just depends on the chips themselves. I had Elpida on my 7870 and 270X and they rocked. The core gives the most improvement anyhow.


----------



## daffy.duck

I also have Elpida RAM, and the max I can squeeze out of it is 1550 MHz.


----------



## Devildog83

Quote:


> Originally Posted by *daffy.duck*
> 
> I also have Elpida ram and the max I can squeeze out of them is 1550MHz


I have heard a lot of stories about Elpida. I guess my success could be because Devil cards use 6 GHz RAM but stock clocks were about 4.8 GHz.


----------



## hypespazm

So I was originally an Nvidia 780 user, but recently I purchased an R9 280X and a 7970... How well do they scale, and roughly how much wattage do they use / how much heat do they dissipate?


----------



## gnemelf

Quote:


> Originally Posted by *hypespazm*
> 
> So I am an original 780 Nvidia user but recently I purchased a r9 280x and a 7970... How good do they scale, and roughly how much wattage do the use up/ heat do they dissipate?


Use the search function on the thread; there's lots of good info.


----------



## bigaza2151

Has anyone in here had experience throwing an Accelero on a 280X Twin Frozr model? I'm considering it, but I'm looking for a step-by-step kind of thing. A little hesitant to start dismantling my GPU; it's about the only thing I haven't messed with going on my 3rd build.


----------



## JackCY

Quote:


> Originally Posted by *Devildog83*
> 
> As far as I can tell from research the Surpreme FX on the Asus boards are not ACL chips. On the FX lll at least , the codec is incorporated into the board so it saves PCI-E lanes. I had a CHVFZ and it was the best sound I ever got out of a motherboard. Unless you use an external amp or a very expensive card it's really tough to beat the Surpreme FX. I don't know where the rumors come from but I have seen no evidence yet that it is a ACL chip. I think it's even Analog.
> 
> Not all of the new Z97 boards have ALC 892. The z97-a does and the z97-e but the Sabertooth and the Pro both have ALC1150 and the ROG and Pro Gamer have Surpreme FX. The lower price boards don't have as good audio but that's to be expected.


Except you find this in every other in depth review:
Quote:


> The Maximus VII Hero uses the SupremeFX 2014 audio hardware which is basically the same Realtek ALC1150 audio CODEC used for all current production ROG motherboards. The SupremeFX version of the implementation has the same isolated audio PCBs as the standard motherboards have. The Maximus VII Hero has a built in amplifier which is capable of detecting the impedance of connected headphones if connected to the front panel connector. ASUS refers to this feature as SenseAmp. The audio CODEC is shielded and uses dedicated audio capacitors.


Other brands also add amps, like the TI NE5532, etc., or even swappable amps (some Gigabyte mobos have that, I think).

---

bigaza: What gain do you expect from replacing a good air cooler with another good air cooler? Pics and reviews of the cooler, and videos, have a simple how-to if it's too complicated to grasp. I would see it more as wasting $70 and gaining nothing.
I think this kind of swap was only advantageous over the reference blower coolers on the 290/290X.


----------



## radier

*AMD Catalyst™ 15.3 Beta Driver for Windows®*

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## sage101

Quote:


> Originally Posted by *radier*
> 
> *AMD Catalyst™ 15.3 Beta Driver for Windows®*
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


Any performance update with this beta driver?


----------



## radier

First of all, learn one simple thing: DON'T QUOTE THE POST DIRECTLY ABOVE!

As for the driver, you can download it and test it if you want. You will find your answer then.


----------



## Sho Minamimoto

I have a magic XFX 280X DD.
57.9% ASIC
Stock voltage @ 1.2 V
1150 MHz core / 1550 MHz memory so far. I can't believe how high this thing can go. I'm testing it with 15-minute Furmark runs.


----------



## boot318

Quote:


> Originally Posted by *Sho Minamimoto*
> 
> I have a magic XFX 280X DD.
> 57.9% ASIC
> Stock voltage @ 1.2v
> 1150 MHz core 1550 MHz memory so far. I can't believe how high this thing can go. I'm testing it with 15 minute Furmark tests.


I hope you have a strong PSU, then. Using Furmark to test (mid- to high-end) AMD GPUs just wastes 100+ watts that no game will ever demand from a PSU.

Personally, after passing Furmark for 20 minutes, Tomb Raider crashed within 10 minutes of playing. lol


----------



## radier

I have found Unigine Valley to be a good stability test.
The scores are very consistent, so it is easy to tell when the GDDR5 error correction kicks in.
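That score-based approach can be automated: GDDR5 error correction doesn't crash the card, it silently retries transfers, so the benchmark score stops scaling (or drops) even as the memory clock rises. A minimal sketch, with made-up clock/score pairs standing in for real Valley runs:

```python
# Each pair: (memory clock in MHz, benchmark score from a run at that clock).
# The numbers are illustrative only, not real Valley results.
runs = [(1500, 2500), (1550, 2540), (1600, 2585), (1650, 2620),
        (1700, 2615), (1750, 2590)]  # score sags past 1650

def last_scaling_clock(runs, min_gain=10):
    """Return the highest clock that still improved the score by min_gain."""
    best = runs[0][0]
    for (c0, s0), (c1, s1) in zip(runs, runs[1:]):
        if s1 - s0 >= min_gain:
            best = c1
        else:
            break  # scaling stopped: error correction is likely eating the gain
    return best

print(last_scaling_clock(runs))
```

If the score flattens or drops while the clock keeps rising, you've likely crossed into error-correction territory even though nothing visibly failed.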


----------



## Sho Minamimoto

Currently at 1200/1550. No crashes during Furmark yet.

I'll play some video games and come back to it later.

Edit: GPU drivers crashed once. Might have to run Furmark longer, then.


----------



## diggiddi

How are the new drivers for you guys?


----------



## bigaza2151

Any improvement on battlefield 4 crappy framerate since last patch?


----------



## MiladEd

Hey guys, I was playing Call of Juarez: Gunslinger last night, with my card (Sapphire R9 280X Dual-X) OC'd to 1100 MHz core and 1600 MHz memory. I was getting some "screen blackouts": the screen would turn off for a fraction of a second and then turn back on. I thought it was a game bug, as I'd previously been playing Assassin's Creed Rogue at these clocks and it was fine. Then, suddenly, I got some weird artifacts on the screen; particles in different colors appeared. I quickly checked the temps in AMD Catalyst Control Center and they seemed fine (around 55°C), but since it took me a while to do so, they must have dropped a bit. Then I lowered the clock to 1050 MHz, exited the game, and rebooted. After the aforementioned problems didn't occur anymore and my temps seemed fine, I kept checking them using HWiNFO; the highest I got was 78°C.

What was the reason for those artifacts? Could my card be damaged, and should I be worried? Those problems didn't happen in any other game, but the only game I've played at that clock is AC Rogue, which isn't very demanding.

Thanks in advance.


----------



## radier

The card is unstable. Lower your OC clocks. During the black screen, the driver is resetting.

Sent from my GT-N7000


----------



## MiladEd

Quote:


> Originally Posted by *radier*
> 
> The card is unstable. Lower your OC clocks. During the black screen, the driver is resetting.
> 
> Sent from my GT-N7000


Oh, OK, thanks... I got FurMark to check for instability and find the highest stable OC.


----------



## JackCY

Quote:


> Originally Posted by *Sho Minamimoto*
> 
> Currently at 1200/1550. No crashes during Furmark yet.
> 
> I'll play some video games and come back to it later.
> 
> Edit: GPU drivers crashed once. Might have to Furmark longer then.


Furmark is a useless stability tester.

Games, especially the ones that are rather picky, are the best test. Even Fire Strike is useless.
Sometimes madVR can help, but mostly games do the trick. It's long and tedious to test, but then you can play and play... well, until it crashes or artifacts.

---

Beta drivers: no difference in Heaven 4.0. Games? That's about game updates and their optimizations for specific brands or cards, not really about drivers.


----------



## radier

I told you: Unigine Valley is a very reliable and quick way to test an OC.


----------



## aaronsta1

Quote:


> Originally Posted by *radier*
> 
> I told you. Unigine Valey is very reliable and quick way to test OC.


I think BF4 is best... I've had cards pass Unigine Valley for hours and not even heat up, then in BF4 after 20 minutes the textures start flashing and the computer blue-screens.


----------



## GoLDii3

Quote:


> Originally Posted by *aaronsta1*
> 
> i think BF4 is best.. ive had cards pass UV for hours and not even heat up. BF4 for 20 minutes textures start flashing and the computer blue screens.


BF4 is a pain. It's the type of game where you can be stable in various other games but not in BF4.


----------



## marto7

What is the lowest voltage for the 280X Toxic? And which column do I need to change in VBE7 - 6, or 0, or both?


----------



## JackCY

What does VBE offer that you can't do with a normal OC? Apart from risking bricking the card, losing UEFI support, etc.


----------



## marto7

I don't need UEFI. I don't even know how to enter it. What is a normal OC?


----------



## tabascosauz

Quote:


> Originally Posted by *marto7*
> 
> I don't need UEFI. I don't even know how to enter it. What is a normal OC?


What exactly are you even trying to do? Underclock? Overclock?

Without UEFI boot, graphics cards can run into problems (supposedly) with an OS installation that is UEFI. Is your Windows installation UEFI?


----------



## JackCY

Normal OC as in you OC the GPU via the drivers, a control panel or some tool: it's only applied when you want it to be, and the card boots at stock.
Changing the firmware/VBIOS, on the other hand, means the card boots with those settings too, which brings its own pitfalls, both when booting and when editing the firmware with a not-so-up-to-date VBE where not every field is safe to change on every card. If it were as simple as "I want this clock and this voltage for every performance level, including video decode and 2D", I would use it myself, but it doesn't work that way. Only some changes work on a 280X, and you have to know which ones work on your specific card and what you can set before the VBIOS goes nuts due to PowerPlay etc.

What do you gain by OCing with VBE instead of soft-applying the OC? Does VBE let you go beyond some limit of a soft OC, say higher voltage than 1.3 V, which is often already cooking the chip enough anyway?


----------



## Alfshizzle

Anyone know what the last version of GPU Tweak was that allowed over 1.3 V on the 280X TOP? Can't remember which it was, and I just lost all the data on my backup.. Annoying.


----------



## marto7

I'm undervolting to lower the temps of the card. I flash the BIOS because I used to do that with my old card and kept doing it with the new one. I only change the voltages the card uses under load; at boot the card runs the same voltage and clocks as stock. Only in games do the values change, and if the voltage is too low the game just freezes and I need to raise it.


----------



## tabascosauz

Quote:


> Originally Posted by *marto7*
> 
> I'm undervolting to lower the temps of the card. I flash the BIOS because I used to do that with my old card and kept doing it with the new one. I only change the voltages the card uses under load; at boot the card runs the same voltage and clocks as stock. Only in games do the values change, and if the voltage is too low the game just freezes and I need to raise it.


Why go through the hassle of flashing a BIOS and risking a brick? I run my 280X at 685 core/1400 mem whenever a game doesn't demand anything more, and set the voltage a bit lower, around 1.14 V. I don't even need to care whether the card is applying that voltage; I can set the fan to near-silent speeds and still enjoy comfortably low temperatures in the 60s.

If your temps are that high, you've got more urgent matters to tend to than worrying about underclocking through VBE. What kind of numbers are we looking at here?


----------



## Megagoth1702

Hello my friends!

I am about to get a 280X because, sadly, the memory chips on my great Gigabyte HD 7950 (1000/1250) fail occasionally and I need a replacement.

There are so many different 280X versions out there and I am very confused about what to buy. I want a card that is at least a BIT overclockable and not too loud. But I can't tell which fan layout is loud and which card lets me overclock more.

Can you guys help me out here?

Thanks a lot in advance!


----------



## Devildog83

Quote:


> Originally Posted by *Megagoth1702*
> 
> Hello my friends!
> 
> I am about to get a 280X because sadly the memory chips on my great Gigabyte HD 7950 (1000/1250) fail occasionally and I need a replacement.
> 
> There are so many different 280X versions out there and I am very confused on what to buy. I want a card that is at least a BIT overclockable and not too loud. But I can not tell what fan layout is loud and what card lets me overclock it more.
> 
> Can you guys help me out here?
> 
> Thanks a lot in advance!


Here is what I did: I upgraded from 7870/R9 270X CrossFire to a used R9 290. I just saw one now on eBay with a Kraken G10 mount and a Corsair H55 cooler for $250, and it includes the original Twin Frozr heatsink and fans. My 290 is very strong and I don't regret it one bit. I don't mean to push you away from a 280X, but it's not the most bang for the buck right now. Here are the cards. He is selling 2 of them.


----------



## tomytom99

Quote:


> Originally Posted by *Devildog83*
> 
> Here is what I did, I upgraded from X-Fire 7870/R9270x to a used R9 290. I just saw one now on E-Bay with a Kraken G10 mount and Corsair H55 cooler for $250 and it includes the original Twin Frozer heat sink and fans. My 290 is very strong and I don't regret it one bit. I don't mean to push you away from a 280x but it's not the most bang for the buck right now. Here are the cards. He is selling 2 of them.


Agreed, bang for buck is a big deal in GPUs. If it costs just a little more ($10-$35) to get better performance than the "lower" card, then just go for the extra power. You'll thank yourself when you're not saying, "Gee, why didn't I spend the extra money? Now I have to invest in a whole new set of GPUs." That's what I'm dealing with right now. :|


----------



## Megagoth1702

Quote:


> Originally Posted by *Devildog83*
> 
> Here is what I did, I upgraded from X-Fire 7870/R9270x to a used R9 290. I just saw one now on E-Bay with a Kraken G10 mount and Corsair H55 cooler for $250 and it includes the original Twin Frozer heat sink and fans. My 290 is very strong and I don't regret it one bit. I don't mean to push you away from a 280x but it's not the most bang for the buck right now. Here are the cards. He is selling 2 of them.


Quote:


> Originally Posted by *tomytom99*
> 
> Agreed, bang for buck is a big deal in GPU's. If it cost just a few ($10-$35) to get better performance than the "lower" card, then just go for the extra power. You'll thank yourself that you won't be saying "gee, why didn't I spend the extra money? Now I have to invest in a whole new set of GPU's." That's what I'm dealing with right now. :|


What should I go for then, guys? A 290X?

I live in Germany, so bang for the euro still matters here, but I wonder how much money to spend. Are there "bang for the buck" charts?


----------



## Spork13

Tom's Hardware does one every month or so. US prices though - makes me jealous.








http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html


----------



## LocoDiceGR

R9 280X / 280 & 270X / 270

They will support DX12, right?

Because I'm looking to buy one of these; I can't decide yet.


----------



## JackCY

GCN 1.0+ should.
Quote:


> DirectX 12 will be essentially supported on all Fermi and later Nvidia GPUs, on AMD's GCN-based chips and on Intel's Haswell and later processors' graphics units.


But unlike with Nvidia cards, you can use Mantle today instead of waiting a year from now for DX12, which MS only started working on after being prodded by AMD.


----------



## mak1skav

I just bought a Sapphire 280X Vapor-X and it's giving me a headache. Although I have tested it with Furmark and it runs stable for a long time, in games I have a very different experience: every time I run a game, either the game crashes after a little while and sends me to the desktop, or the whole system hangs and I get the message "Monitor going to sleep", forcing me to turn off my computer with the power switch.

I have tried clean installs of the drivers, 3-4 different versions from the newest to the oldest, but the problem remains. I guess there must be something wrong with my card, but I'd like to ask if anyone else has had the same experience and can help me.


----------



## tabascosauz

Quote:


> Originally Posted by *mak1skav*
> 
> I just bought a Sapphire 280X Vapor-X and it's giving me a headache. Although I have tested it with Furmark and it runs stable for a long time, in games I have a very different experience: every time I run a game, either the game crashes after a little while and sends me to the desktop, or the whole system hangs and I get the message "Monitor going to sleep", forcing me to turn off my computer with the power switch.
> 
> I have tried clean installs of the drivers, 3-4 different versions from the newest to the oldest, but the problem remains. I guess there must be something wrong with my card, but I'd like to ask if anyone else has had the same experience and can help me.


Clocks too high or voltage too low? Have you changed either one?

Did you have an Nvidia card before? If yes, I hope you used DDU before installing the 280X.


----------



## mak1skav

The card is running at stock clocks, and the voltage is locked so I can't change it. I have even tried running some games with the clocks underclocked to 900 MHz (from 1100) for the GPU and 1200 MHz (from 1500) for the RAM, but I still get the same problems.


----------



## tabascosauz

Quote:


> Originally Posted by *mak1skav*
> 
> The card is running at stock clocks, and the voltage is locked so I can't change it. I have even tried running some games with the clocks underclocked to 900 MHz (from 1100) for the GPU and 1200 MHz (from 1500) for the RAM, but I still get the same problems.


How are temps?

This is highly unusual; Furmark is incredibly demanding for AMD cards, causing them to generate copious amounts of heat and draw monstrous amounts of power. It's usually much more demanding than any game.

If your temps don't seem out of place, I think you'll have to go for an RMA. Sapphire's RMA is a pain in the ass.


----------



## Catscratch

Quote:


> Originally Posted by *mak1skav*
> 
> I just bought a Sapphire 280X Vapor-X and it's giving me a headache. Although I have tested it with Furmark and it runs stable for a long time, in games I have a very different experience: every time I run a game, either the game crashes after a little while and sends me to the desktop, or the whole system hangs and I get the message "Monitor going to sleep", forcing me to turn off my computer with the power switch.
> 
> I have tried clean installs of the drivers, 3-4 different versions from the newest to the oldest, but the problem remains. I guess there must be something wrong with my card, but I'd like to ask if anyone else has had the same experience and can help me.


Quote:


> Originally Posted by *tabascosauz*
> 
> Clocks too high or voltage too low? Have you changed either one?
> 
> Did you have an Nvidia card before? If yes, I hope you used DDU before installing the 280X.


Quote:


> Originally Posted by *mak1skav*
> 
> The card is running at stock clocks, and the voltage is locked so I can't change it. I have even tried running some games with the clocks underclocked to 900 MHz (from 1100) for the GPU and 1200 MHz (from 1500) for the RAM, but I still get the same problems.


Quote:


> Originally Posted by *tabascosauz*
> 
> How are temps?
> 
> This is highly unusual; Furmark is incredibly demanding for AMD cards, causing them to generate copious amounts of heat and draw monstrous amounts of power. It's usually much more demanding than any game.
> 
> If your temps don't seem out of place, I think you'll have to go for an RMA. Sapphire's RMA is a pain in the ass.


The card is either faulty or somehow just incompatible with your current rig (you'd have to try it in another machine to be sure).

I read a lot about the 280X before I bought mine:
http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2456&psn=&lid=1&leg=0
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202125&cm_re=sapphire_280x-_-14-202-125-_-Product

The Sapphire forums have a big 280X thread, and it seemed all the people suffering from black screens or crashes had boost models: Toxic, Vapor-X, Tri-X. I went for the non-boost version and never had a problem. I wonder if it's really about the boost feature or whether the cards are simply faulty.


----------



## JaredLaskey82

Quote:


> Originally Posted by *Catscratch*
> 
> The card is either faulty or somehow just incompatible with your current rig (you'd have to try it in another machine to be sure).
> 
> I read a lot about the 280X before I bought mine:
> http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2456&psn=&lid=1&leg=0
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202125&cm_re=sapphire_280x-_-14-202-125-_-Product
> 
> The Sapphire forums have a big 280X thread, and it seemed all the people suffering from black screens or crashes had boost models: Toxic, Vapor-X, Tri-X. I went for the non-boost version and never had a problem. I wonder if it's really about the boost feature or whether the cards are simply faulty.


I have never had a problem with my two Asus DCU2 280x cards


----------



## zappian

Hello guys.


----------



## KingCry

Quote:


> Originally Posted by *zappian*
> 
> Hello guys.


Took you long enough to come over to OCN


----------



## afaque

Hey guys! I made a tutorial on how to make Radeon GPUs fanless with VBE7, here on YouTube:

Be sure to check it out.. big thanks to the developer for making this software..









https://www.youtube.com/watch?v=e53CO6K8Xdo&list=UUNbEvt90Lvrs_v8aJJ2eGFw


----------



## Vinicius

Can anyone please upload the stock BIOS from the XFX R9 280 (non-X) Black Edition Double Dissipation (core 1000, mem 1300)?


----------



## JaredLaskey82

Quote:


> Originally Posted by *zappian*
> 
> Hello guys.


What scores are you getting on Firestrike extreme?


----------



## JaredLaskey82

Well I think I am done.

The two ASUS 280X DCU2 cards are running smooth and the whole computer is nice and silent.











Can do a spec list if anyone wants. But I can't be bothered making it if no one is interested.


----------



## tabascosauz

Quote:


> Originally Posted by *Vinicius*
> 
> can anyone please upload the stock bios from xfx r9 280 NON-X black double dissipation edition (core 1000 mem 1300)?


Have you checked TPU's VGA BIOS database?


----------



## Vinicius

Quote:


> Originally Posted by *tabascosauz*
> 
> Have you checked TPU's VGA BIOS database?


Yes, I have looked, and it's not there.


----------



## tabascosauz

Quote:


> Originally Posted by *Vinicius*
> 
> Yes, I have looked and its not there


Are you SURE?

Pretty sure the Black part refers to the extra customer incentives offered by XFX and maybe a slightly higher clock. You could just save your own BIOS right now with GPU-Z if you're still on the stock one.


----------



## Lyxchoklad

Hi. I have a pair of MSI R9 270X ITX cards in my build, which I blacked out, painting the logo flipper blue. My build was nominated for MOTM, but it's not looking good for me. http://www.overclock.net/t/1546811/ocn-mod-of-the-month-march-2015-amateur-class-vote-now


----------



## Vinicius

Quote:


> Originally Posted by *tabascosauz*
> 
> Are you SURE?
> 
> 
> 
> Pretty sure that the Black part refers to the extra customer incentives offered by XFX and maybe a slightly higher clock. You could just save your own BIOS right now in GPU-Z if you're still stock.


The BIOS for the Black Edition Double Dissipation has a 1000 MHz core clock and 1250 MHz memory, so it's not the one uploaded there.
I lost my stock BIOS backup.


----------



## tabascosauz

Quote:


> Originally Posted by *Vinicius*
> 
> The BIOS for the Black Edition Double Dissipation has a 1000 MHz core clock and 1250 MHz memory, so it's not the one uploaded there.
> I lost my stock BIOS backup.


Is it really that important? I don't think Catalyst limits will keep you from OCing to 1000MHz.

Were you trying to do VBE7 or something?


----------



## Traace

Quote:


> Originally Posted by *tabascosauz*
> 
> Is it really that important? I don't think Catalyst limits will keep you from OCing to 1000MHz.
> 
> Were you trying to do VBE7 or something?


A BIOS edit should work; if not, I can provide you mine.

Catalyst limits my R9 280 to 1200 MHz core. It's an Asus DirectCU II TOP. I let it stay at 1000 MHz; that's enough.

However, you can run these AMD Tahiti GPUs at a stable 1500 MHz memory clock.
(Vendor default: 1300 MHz here.)


----------



## burning chrome

Is there a non-triple-slot aftermarket cooler for the 280X? I'm getting quite annoyed by my TurboDuo's fans breaking.


----------



## Devildog83

Quote:


> Originally Posted by *burning chrome*
> 
> is there a non triple slot aftermarket cooler for the 280x? i'm getting quite annoyed by my turboduos fans breaking.


This one. You might also consider the AIO adapters from Corsair and NZXT if you have space for a 120mm rad:
http://www.newegg.com/Product/Product.aspx?Item=N82E16835426026&cm_re=gpu_cooler-_-35-426-026-_-Product

I am not sure it fits the 280 or 280X; it does fit the 270X/290/290X for sure. You would have to check on that.


----------



## gnemelf

There's another version with the same cooler, made specifically for the 280X.


----------



## gnemelf

Here's an updated shot (new mobo/proc, 2nd card, etc.)


----------



## Catscratch

Quote:


> Originally Posted by *JaredLaskey82*
> 
> I have never had a problem with my two Asus DCU2 280x cards


I know there are lotsa people with no issues with 280x from day one. Then there are people with the same exact rigs but lotsa problems.

Nice clean rig btw.


----------



## SmXme

Hi guys,
First of all, I'm French, so please excuse my bad English.
So, I have a little problem: I have two R9 280X cards in Crossfire, one Elpida (912-V277-057) and one Hynix (912-V277-067).
I have some problems in games (Assassin's Creed Unity, with some freezes, lags, etc.), so I decided to buy another Hynix card.
I found one, but the seller tells me the card is a 912-V277-068.
I didn't even know that model existed.
Can someone tell me the difference between my 912-V277-067 and his 912-V277-068?
Could I have issues in games or elsewhere with those two cards?
Thanks!


----------



## burning chrome

Quote:


> Originally Posted by *Devildog83*
> 
> This one. You might also consider the AIO adapters from corsair and NZXT if you have space for a 120mm rad
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835426026&cm_re=gpu_cooler-_-35-426-026-_-Product
> 
> I am not sure it fits the 280 or 280x. It does fit the 270x/290/290x for sure. You would have to check on that.


Quote:


> Originally Posted by *gnemelf*
> 
> Theres the other version made for the 280x same cooler just for the 280x


tyvm.


----------



## Devildog83

I really need to get this card under water, huh?


----------



## tabascosauz

Quote:


> Originally Posted by *Devildog83*
> 
> I really need to get this card under water Huh?


RIP ears

that Z97-pro looks great


----------



## Devildog83

Quote:


> Originally Posted by *tabascosauz*
> 
> RIP ears
> 
> that Z97-pro looks great


Thanks, now to figure out how to overclock with it.


----------



## b0uncyfr0

How good are the XFX 280X cards? I just acquired a Double D Black Edition and I'm gonna try to overclock it as much as I can without blowing it up. It will be housed in an FT02, so it should be sufficiently cooled. From the main page, I didn't see any XFX 280X reach 1200 MHz.


----------



## JaredLaskey82

Quote:


> Originally Posted by *Catscratch*
> 
> I know there are lotsa people with no issues with 280x from day one. Then there are people with the same exact rigs but lotsa problems.
> 
> Nice clean rig btw.


Why thank you.

I am all done now and enjoying PC bliss.


----------



## Catscratch

Quote:


> Originally Posted by *SmXme*
> 
> Hi guys,
> First of all, I'm French, so please excuse my bad English.
> So, I have a little problem: I have two R9 280X cards in Crossfire, one Elpida (912-V277-057) and one Hynix (912-V277-067).
> I have some problems in games (Assassin's Creed Unity, with some freezes, lags, etc.), so I decided to buy another Hynix card.
> I found one, but the seller tells me the card is a 912-V277-068.
> I didn't even know that model existed.
> Can someone tell me the difference between my 912-V277-067 and his 912-V277-068?
> Could I have issues in games or elsewhere with those two cards?
> Thanks!


Those are memory part numbers? You don't need to match memory numbers. Normally, you don't need to match memory brands (Elpida, Hynix) either. I think your problems with Assassin's Creed are down to the game itself. Did you search for Assassin's Creed problems on the internet? Do you have problems with other games?


----------



## Catscratch

Quote:


> Originally Posted by *JaredLaskey82*
> 
> Why thank you.
> 
> I am all done now and enjoying PC bliss.




Woah tidy. I must be the untidiest Virgo on the planet.


----------



## SmXme

Quote:


> Originally Posted by *Catscratch*
> 
> Those are memory serial numbers ? You don't need to match memory numbers. Normally, you don't need to match memory brands (elpida, hynix) either. I think your problems with Assassin's Creed is because of the game itself. Did you search for Assassin's Creed problems on internet ? Do you have problems with other games ?


Yeah, it happens in Far Cry 4 and others too, but without real graphics bugs in those.
Apparently, I have problems with MSAA 2x, 4x, etc. With FXAA there is nothing, and my games are pretty smooth (Shadow of Mordor maxed out, 60 fps avg).


----------



## JaredLaskey82

Quote:


> Originally Posted by *Catscratch*
> 
> 
> 
> Woah tidy. I must be the untidiest Virgo on the planet.


It just takes time to get everything looking tidy. A lot of planning went into it before getting it to the stage I have it now.


----------



## MiladEd

Has anybody here gotten GTA V? I'm curious to know how it runs on our hardware, especially the R9 280X.

Also, can somebody mirror the 15.4 Catalyst drivers, please? I can't access them from my country. I'd really appreciate it.


----------



## Phantasia

Curious to see how GTA V behaves with your girls


----------



## tomytom99

Quote:


> Originally Posted by *MiladEd*
> 
> Anybody here has gotten GTA V? I'm curious to know how it runs on our hardware. Specially R9 280X.
> 
> Also. Can somebody mirror the 15.4 catalyst drivers please? I can't access them from my country. I'd really appreciate it.


Windows 7 64bit drivers, right?

GTA 5 would not appreciate just a 270x and my 3x1 4372*1022 eyefinity.


----------



## MiladEd

Quote:


> Originally Posted by *tomytom99*
> 
> Windows 7 64bit drivers, right?
> 
> GTA 5 would not appreciate just a 270x and my 3x1 4372*1022 eyefinity.


Well, that's a really high resolution! I think only an R9 290 or better can run that...

About the drivers, a nice member already PMed me a mirror, but thanks!


----------



## mcmilk11

I have a PowerColor R9 270X PCS+; at the moment I'm playing BF4 and The Crew.

For BF4 I play on ultra settings. My temps are 74-78 under full load.

Is that normal?

Also, I want to know if any of you have a universal GPU waterblock.

Thanks.


----------



## b0uncyfr0

Could anyone enlighten me on how to get VSR support or some form of downsampling going on my 280X? I've tried the VSR batch files with 15.3; that didn't work. I've tried the reg hack; that didn't work. And finally, CRU gives me 1440p, but my monitor refuses to use it, saying it's "not supported", when I know for a fact it can do at least 3200x1800 at stock timings.


----------



## bichael

Quote:


> Originally Posted by *mcmilk11*
> 
> I have a PowerColor R9 270X PCS+; at the moment I'm playing BF4 and The Crew.
> 
> For BF4 I play on ultra settings. My temps are 74-78 under full load.
> 
> Is that normal?
> 
> Also, I want to know if any of you have a universal GPU waterblock.
> 
> Thanks.


I've got the exact same model of GPU. Those temps are not really anything to worry about; is it overclocked? My temps were around 70 at stock and 75 overclocked (maybe up near 80 in Metro: Last Light), and that's in a small case with probably less-than-ideal airflow. My old GTX 550 Ti went up to 90 even at stock but still ran fine.

I've since gone to a full watercooled loop, primarily so I could put the rad external and keep my computer in a semi-closed cupboard under the TV, but also for super low noise. It only got me a minor boost in OC on the card, as I don't think it's possible to raise the voltage.

I went with the semi-universal Alphacool GPX block and would definitely recommend it. A nice middle ground between a universal core-only block and full cover, and relatively good value, especially when you factor in cheap upgrades.
http://www.overclock.net/t/1539537/first-time-water-cool-external-rad-mitx-case#post_23499454

edit - just to note, on air I was running around 1150/1475 and on water I'm at 1175/1475, both backed down a bit from what it could run in Firestrike, for full game stability.

edit - sorry, I should check who I'm talking to; sorry for telling you the same thing twice on different threads! Hope all is going well with your upgrades.


----------



## Grzesiek1010

Hi all,

I have a big problem with my graphics card.

My full spec:

Intel Core i5 3570K 3.4 GHz
MSI Z77-GD55
Kingston HyperX DDR3 1600 MHz 8 GB dual channel
OCZ ModXStream 500 W
120 GB Kingston K3 SSD
500 GB Samsung HDD
and... an MSI R9 270X 2G Gaming, stock 1080/1400, and I want some OC.
I've read a lot of reviews and benchmarks and seen how they OC this card - 1200/1600! 1180/1550, etc. - with no BSODs and no artifacts.

First of all, sorry if I make mistakes in my English.

Every time I play, after 1-2 hours the screen shows artifacts (colorful squares) or the game simply freezes and shuts down.

I've tried a lot of guides from forums:

installed a lot of AMD CCC drivers: beta, Omega, etc. ...
flashed a new BIOS using ATI WinFlash and a bootable flash drive, raised the vcore a little with MSI Afterburner, and I don't know what else...
Always the same problem... I'm really disappointed.

All games are original:
Titanfall - artifacts
Watch Dogs - artifacts
Mortal Kombat X - artifacts
Evolve - BSOD
Battlefield - after 2-3 hours colorful figures and blur

Temperatures are always fine, max 75 C.
I've tried these MSI Afterburner settings:

core voltage +100
power +20
1200/1600

core voltage +0
power +20
1120/1400

and more variants - always artifacts... MY CARD JUST WON'T OC :/

My current BIOS: TV303MS.102

Any solution?
Maybe my MSI Afterburner settings are wrong?

My AMD CCC driver: 15.4 beta

PLEASE HELP SOMEONE, you're my last chance; if nobody can help I'll send the GPU in for warranty.


----------



## Mr-Dark

Quote:


> Originally Posted by *Grzesiek1010*
> 
> Hi all,
> 
> I have a big problem with my graphics card.
> 
> My full spec:
> 
> Intel Core i5 3570K 3.4 GHz
> MSI Z77-GD55
> Kingston HyperX DDR3 1600 MHz 8 GB dual channel
> OCZ ModXStream 500 W
> 120 GB Kingston K3 SSD
> 500 GB Samsung HDD
> and... an MSI R9 270X 2G Gaming, stock 1080/1400, and I want some OC.
> I've read a lot of reviews and benchmarks and seen how they OC this card - 1200/1600! 1180/1550, etc. - with no BSODs and no artifacts.
> 
> First of all, sorry if I make mistakes in my English.
> 
> Every time I play, after 1-2 hours the screen shows artifacts (colorful squares) or the game simply freezes and shuts down.
> 
> I've tried a lot of guides from forums:
> 
> installed a lot of AMD CCC drivers: beta, Omega, etc. ...
> flashed a new BIOS using ATI WinFlash and a bootable flash drive, raised the vcore a little with MSI Afterburner, and I don't know what else...
> Always the same problem... I'm really disappointed.
> 
> All games are original:
> Titanfall - artifacts
> Watch Dogs - artifacts
> Mortal Kombat X - artifacts
> Evolve - BSOD
> Battlefield - after 2-3 hours colorful figures and blur
> 
> Temperatures are always fine, max 75 C.
> I've tried these MSI Afterburner settings:
> 
> core voltage +100
> power +20
> 1200/1600
> 
> core voltage +0
> power +20
> 1120/1400
> 
> and more variants - always artifacts... MY CARD JUST WON'T OC :/
> 
> My current BIOS: TV303MS.102
> 
> Any solution?
> Maybe my MSI Afterburner settings are wrong?
> 
> My AMD CCC driver: 15.4 beta
> 
> PLEASE HELP SOMEONE, you're my last chance; if nobody can help I'll send the GPU in for warranty.


First of all, not all cards OC the same.

You need to clear all settings and start OCing the core to find the max stable OC without crashes or artifacts, then do the same with the memory.

Your OC is wrong right now: you're raising core and memory at the same time, so you can't tell whether it's the core or the memory that can't OC.

GL
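The one-variable-at-a-time approach above can be sketched as a simple sweep. `passes_test` below is only a placeholder for whatever real stress test you run at each bump (it is not any actual GPU API), and the failure thresholds are invented for the example:

```python
# Sweep one clock at a time in small bumps; stop at the first failure.
# passes_test() stands in for a real stress test -- no GPU is touched here.

def sweep(start, limit, step, passes_test):
    """Raise one clock from start toward limit; return the last passing value."""
    clock = start
    while clock + step <= limit and passes_test(clock + step):
        clock += step
    return clock

# Tune the core first with memory at stock, then the memory with core fixed.
core = sweep(1080, 1300, 10, lambda mhz: mhz <= 1200)   # fake: artifacts past 1200
mem  = sweep(1400, 1700, 25, lambda mhz: mhz <= 1450)   # fake: errors past 1450
print(core, mem)  # -> 1200 1450
```

Sweeping one variable at a time is exactly why this isolates the culprit: if the core sweep stalls at stock, the core is the limit; if it climbs fine but the memory sweep fails immediately, it's the memory.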


----------



## Devildog83

Quote:


> Originally Posted by *Mr-Dark*
> 
> First of all, not all cards OC the same.
> 
> You need to clear all settings and start OCing the core to find the max stable OC without crashes or artifacts, then do the same with the memory.
> 
> Your OC is wrong right now: you're raising core and memory at the same time, so you can't tell whether it's the core or the memory that can't OC.
> 
> GL


This^.

Also, what settings and resolution are you playing at? Take your time and do the OC right: at every bump, run the most demanding stress test or game you can to make sure it's stable, then bump some more.

Can you RMA a GPU nowadays because it doesn't overclock well? I could see it if it had issues at stock.


----------



## Grzesiek1010

Today I tried this in MSI Afterburner:
+100 vcore
-20% power
1200/1600

And...
Furmark: no crashes or artifacts in the stress test or benchmark.
Heaven benchmark on Extreme: no problem in the test.

I'll try it in games: MK X and Evolve.


----------



## Grzesiek1010

I think I found it!
OC:
1200/1450
Vcore +100
Power -20%

GTA V all high at 1080p runs great, no problem; the same with MK X and Evolve. Tomorrow I'll try adding +10 MHz to the memory and check.


----------



## Devildog83

Quote:


> Originally Posted by *Grzesiek1010*
> 
> I think I found it!
> OC:
> 1200/1450
> Vcore +100
> Power -20%
> 
> GTA V all high at 1080p runs great, no problem; the same with MK X and Evolve. Tomorrow I'll try adding +10 MHz to the memory and check.


Nice. If you are running the core that high, that's probably why the memory has to be lower than you might like. The core gives the best performance increase, though, so if it works, run with that. The main thing in games is stability and decent enough FPS to run smooth. What kind of FPS are you getting?


----------



## Grzesiek1010

I'll try adding some to the core.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Grzesiek1010*
> 
> I think I found it!
> OC:
> 1200/1450
> Vcore +100
> Power -20%
> 
> GTA V all high at 1080p runs great, no problem; the same with MK X and Evolve. Tomorrow I'll try adding +10 MHz to the memory and check.


You do mean +20% power, right?
-20 would throttle the card badly..... It'll appear to be stable, but that's because the clocks are actually much lower...


----------



## mutatedknutz

Guys, I wanted to ask: will all upcoming AMD R9 380-and-above cards have HBM, or will the HBM versions be expensive? I was going to move from my current R9 280 to a GTX 970 two months back, but decided to wait when the 3.5 GB issue showed up along with news of the upcoming cards. I just don't want the price to be more than a GTX 970.


----------



## GoLDii3

Quote:


> Originally Posted by *mutatedknutz*
> 
> Guys i wanted to ask if all upcoming amd r9 380 above cards will have hbm? or hbm versions of card will be expensive? I was going to get a gtx 970 from my current r9 280 2 months back, but decided to wait as 3.5gb issue showed up and news of upcoming cards. I just dont want the price to be more than gtx 970


I think I saw somewhere that the R9 380 and 380X will be rebranded R9 290 and R9 290X, so I think the cards with HBM will be the R9 390 and 390X. Either way, just wait; it's only a few months, with your R9 280 you can still do fine, and the price of a 970/290 can only get lower.


----------



## End3R

Quote:


> Originally Posted by *mutatedknutz*
> 
> Guys i wanted to ask if all upcoming amd r9 380 above cards will have hbm? or hbm versions of card will be expensive? I was going to get a gtx 970 from my current r9 280 2 months back, but decided to wait as 3.5gb issue showed up and news of upcoming cards. I just dont want the price to be more than gtx 970


The 3.5+0.5 VRAM issue with 970s is a non-issue IMO. The only people having problems are those forcing max AA settings above 1080p resolutions just to say they can. Hell, even my 270X can play all the big games like Shadow of Mordor and Far Cry 4 at smooth framerates @1080p, and I only have 2GB of VRAM. The only game that's made it crawl on high settings so far is AC Unity, and that's because it's intentionally unoptimized to sell more powerful cards.

As for your actual question though, I have no idea.


----------



## mutatedknutz

Yes, thanks guys. I'll maybe wait for the 300 series to release; I hope the HBM ones aren't very expensive like the GTX 980, or else I won't be able to get one. I want to switch because my friend is building a new budget PC, so I'll sell him my current R9 280 and get a new NVIDIA or AMD card; that's the plan as of now. Hope AMD don't delay until Q4.


----------



## Devildog83

Quote:


> Originally Posted by *mutatedknutz*
> 
> Yes thanks guys, ill may be wait for the 300 series to release, hope the hbm ones are not very expensive like gtx 980 or else i wont be able to get it. I wanted to switch cause my friend is building new budget pc so ill sell him my current r9 280 and get a new nvidia or amd series, thats the plan as of now. Hope amd dont delay till Q4


Here's my 2 cents - I had a 7870/270X Crossfire setup and the 7870 pooped out on me. I had to decide what to do. After searching and pondering, I went with a used 290 flashed to a 290X for $240. I could have waited a bit for the 970 and 980 to come out and spent more for no reason, but chose to get the best price-to-performance I could, and trust me, I did. A 970 is over $300 for less performance, and the 980 is over $500 for more performance. Even new, the 290 is still the best price-to-performance out there. As stated earlier, the 380s are just a reboot of the 290s, and the 390s will most likely cost as much as the 980s, and who knows if they will perform better.

If you want the newest gadget and/or plan to try running 4K with one card, then wait and see what they come out with. If you want a good beastly card that runs multiple monitors at, say, 1440p, then a good 290/290X is a real bargain. Heck, 2 of mine are still less than a 980 and would blow the b@!!s off a charging rhino at 60 paces.


----------



## mutatedknutz

Quote:


> Originally Posted by *Devildog83*
> 
> Here's my 2 cents - I had a 7870/270x X-fire set-up and the 7870 pooped out on me. I had to make a decision as to what to do. After searching and pondering I decided on a used 290 flashed to 290x for $240. I could have waited a bit for the 970 and 980 to come out and spent more for no reason but chose to get the best price to performance I could and trust me I did. A 970 is over $300 for less performance and the 980 is over $500 more performance. Even new the 290 is still the best price to performance out there. As stated earlier the 380's are just a reboot of the 290's and the 390's will most likely cost as much as the 980's and who know if they will perform better.
> 
> If you want the newest gadget and or plan to try and run 4K with one card then wait and see what they come out with. If you want a good beastly card that runs multiple monitors at say 1440p then a good 290/290x is a real bargain. Heck 2 of mine is still less than a 980 and would blow the [email protected]!!s off a charging Rhino at 60 paces.


Yes, I know; I'm a bang-for-buck guy too.
I got this R9 280 in October, and I live in India, where price drops don't amount to much. The R9 290A version costs 26000 INR (around $420) and the GTX 970 costs almost the same. But what I want is the latest card, because if I get an R9 290A and some features turn out not to be supported, I'll be disappointed.








And I play only at 1080p; both my monitor and HDTV are 1080p. I just want to use full settings in games and play at 60fps, that's it. And since you all mentioned the R9 380X will be a rebranded 290X, will there be no difference in performance? Will it be the same as 7950-to-280 and 7970-to-280X? Because that would kind of suck :| and I'll end up getting a 290A or GTX 970.
The current price of the R9 290X in India is 44000 INR ($710).


----------



## MerkageTurk

Sold the 780 Ti.

Just got an XFX 280X DD Black as a temporary card.

Waiting for the 390X.


----------



## Horsemama1956

Quote:


> Originally Posted by *mutatedknutz*
> 
> Yes thanks guys, ill may be wait for the 300 series to release, hope the hbm ones are not very expensive like gtx 980 or else i wont be able to get it. I wanted to switch cause my friend is building new budget pc so ill sell him my current r9 280 and get a new nvidia or amd series, thats the plan as of now. Hope amd dont delay till Q4


Best to just wait and see. There will be plenty of cards to choose from when the new ones come out, so why not see if the 390/X is worth it? If you go with the new cards, you really won't have to worry about upgrading for a while.


----------



## MerkageTurk

I am new to AMD and everything is so confusing.

The drivers: which one should I download, and what settings give the best performance?


----------



## zindir

Hello, I'm new to this forum and to AMD graphics cards.









I bought a Sapphire R9 280 Dual-X last week. The card works with no problem on the first BIOS. But after I press the Sapphire logo switch on the card to change to the second BIOS, I get no image (black screen) on the monitor; I can't even see the BIOS screen. Is my graphics card broken?

Asus P8Z68-V
8 GB Ram
2600k
Windows 7 64 bit Uefi


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> If you want the newest gadget and or plan to try and run 4K with one card then wait and see what they come out with. If you want a good beastly card that runs multiple monitors at say 1440p then a good 290/290x is a real bargain. *Heck 2 of mine is still less than a 980 and would blow the [email protected]!!s off a charging Rhino at 60 paces.*


Trust me, I know









Maxing Dragon Age at 5760x1080 is pretty nice.

And, I exist


----------



## mutatedknutz

Lol, as I said, this is India; graphics card prices don't drop quickly here.
The R9 290 is a little more expensive than the GTX 970, and the R9 290X is around $150 more than the GTX 970, so yeah. And it will stay that way even when the new cards come out, because sellers here only care about making huge profits.
I was wondering if the price of the new 380 and 380X will be in the GTX 970 range? The GTX 970 costs $400 here, and the R9 280X around $380.


----------



## Devildog83

Quote:


> Originally Posted by *mutatedknutz*
> 
> Lol as i said this is India, here graphic cards price dont drop really fast.
> r9 290 is little more expensive than gtx 970 here. And the r9 290x is around 150$ more than gtx 970, so yeah. And it will stay that way even when new cards come out. Cause its India they only know to make huge profits.
> I was wondering if the price of new 380 and 380x will be around gtx 970 range? cause gtx 970 costs 400$ here. And r9 280x around 380$


So is it just AMD/ATI cards that are more expensive in India? I guess not being able to buy from eBay must be a bummer. At those prices, my rig would have cost around $3,000.


----------



## MiladEd

Any GTA V players around? What are your settings for getting 60 fps at 1080p? With everything almost maxed out, I can get over 60 in less demanding scenes, but in crowded areas it drops to 30.


----------



## IronHands

I've been playing GTA V at 1920x1200 and my FPS seems very smooth, but I don't know what the actual number is. I have a 280X, but I play the game in windowed mode since I have 3 screens and I'm almost always doing something else, like having IE open and watching streams on Twitch. I assume my FPS would be much better than it is, but my 280X will not come out of 2D mode while I'm doing stuff in IE.


----------



## tabascosauz

Quote:


> Originally Posted by *IronHands*
> 
> I've been playing GTA V at 1920X1200 and my FPS seems to be very smooth but I do not know what my actual FPS is. I have a 280x but I play the game in window mode sense I have 3 screens and I'm mostly always doing something else like having IE open and watching streams on twitch. I assume my fps would be much better than what it is but my 280x will not come out of 2d mode if I'm doing stuff in IE.


FPS is "smooth" as in smooth at 60fps, or smooth at 20fps?

The general consensus is that the 280X can manage a pretty easy 60-80 frames at 1080p; some tests have the 280X comfortably delivering 60fps at 1440p.


----------



## MiladEd

Mine is quite smooth as well; it's just that I get some frame drops on occasion, in more demanding scenes.


----------



## mutatedknutz

Quote:


> Originally Posted by *Devildog83*
> 
> So it's just AMD/ATI cards that are more expensive in India? I guess not being able to buy from Ebay must be a bummer. At those prices my rig would have cost around $3,000.


Nope, not just AMD cards. OK, let me put it this way: suppose a card's market price is $400; after 6 months it'll still be around $400, and it won't drop for 1-2 years. They deliberately don't drop prices; there are 7950s more expensive than an R9 280X here.
The AMD R9 290X was around $700 here at launch and it's still about the same, just a little less.
And the GTX 970 launched around $450 here, so it'll stay that way.
It sucks; sellers here only want to make money. And Indian eBay cannot be trusted at all. I can't even get a card from the US or UK through cousins, because they'll charge duty tax at the airport.

So yeah, it's pretty expensive when it comes to computer products here.
And in India, many of the people who play games are still on onboard Intel graphics or an R9 260-range card. They don't go for pricey cards, plus they think graphics cards matter more than processors and RAM, so they don't even bother to upgrade and are happy with 30fps at low settings.
I've got some friends who play Dota at 15-30fps; I can't even think of Dota below 60fps.


----------



## Devildog83

Quote:


> Originally Posted by *mutatedknutz*
> 
> Nope not just amd cards. Ok let me put it this way, suppose a cards market price is 400$ so after 6 months itll be around 400$ only, and they wont drop for 1-2 years. They purposely dont drop the price, there are 7950 more expensive than a r9 280x here.
> So amd r9 290x was around 700$ here at launch, so its still the same, just little less.
> And gtx 970 launched around 450$ here, so itll stay that way.
> This sucks indians only want to make money. And indian ebay cannot be trusted at all. Cant even get card from us or uk from cousins, cause theyll include duty tax at the airport.
> 
> 
> 
> 
> 
> 
> 
> so yeah its pretty expensive when it comes to computer products here.
> And in india many of them who play games are still on onboard intel graphics or around r9 260 range card, they dont go for pricey cards, plus they only think that graphic cards are more important than processors and ram so they dont even bother to upgrade and are happy with 30fps at low settings too.
> Ive got some friends who play dota at 15-30fps, i cant even think of dota below 60fps.


You can't just pay a bit more shipping and buy from somewhere else?


----------



## tabascosauz

Quote:


> Originally Posted by *Devildog83*
> 
> You can't just pay a bit more shipping and buy from somewhere else?


I suppose it's like the situation in Canada. You can always opt for a product from an American retailer/e-tailer, but:

1. The exchange rate between CAD and USD makes it completely suicidal for your wallet to buy from south of the border. US products used to be much cheaper, then still somewhat cheaper despite the rising exchange rate, and now it just makes no sense to shop in the States.

2. It's also not like you can smuggle the goods into the country if you got a good price on them. You still have to pay duties and the crushing 13% tax once it comes over the border.


----------



## Devildog83

Quote:


> Originally Posted by *tabascosauz*
> 
> I suppose it's like the situation in Canada. You can always opt for a product from an American retailer/etailer, but
> 
> 1. The exchange rate between $CAD and $USD makes it completely suicidal for your wallet if you choose to buy a product from south of the border. It used to be that US products were much cheaper, then it was that they were less but still cheaper despite the rising exchange rate, and now it's just plain ******ed to go to the states to buy.
> 
> 2. It's also not like you can smuggle these goods into the country if you got a good price on them. Still gotta pay duties and the crushing 13% tax once it comes over the border.


I know; none of it makes sense. I sold a router on eBay to a guy in Taiwan and thought, why didn't he just get it from the factory down the street from him? It would probably have been much cheaper.


----------



## IronHands

I don't think it's 60 FPS smooth; my FPS would be better in just about every game I play, but I play in windowed mode while doing other things on other displays.


----------



## tomytom99

Hey, quick question.
Do you guys think I should keep my 270X in my main rig, or use 2x 6950s in it? I'm asking since a friend is buying the 7870 I have in another rig, so I'd put the 270X in there if the two 6950s would be better than the 270X. If not, I'd just put a single 6950 in that other rig.

In case you wonder, I have a 3x1 eyefinity (4368*1022), and I play mostly newer games, most having higher quality graphics.


----------



## neurotix

Quote:


> Originally Posted by *tomytom99*
> 
> Hey, quick question.
> Do you guys think that I should keep my 270x in my main rig, or use 2x 6950's in it? I'm asking since a friend is buying the 7870 I have in another rig, so I'd put the 270x in there if the 2 6950's would be better than the 270x. If not, then I'd just put a single 6950 in that other rig.
> 
> In case you wonder, I have a 3x1 eyefinity (4368*1022), and I play mostly newer games, most having higher quality graphics.


2 6950s would be faster than a single 270X (7870), except in games that support Mantle, probably.

For Eyefinity like that, and NEW games, you probably want a 290 minimum if you're going to use a single card.

My advice: sell both 6950s, sell the 270X, and you'd probably have enough money (or close enough) to get a used 290 with an aftermarket cooler (Tri-X etc.)

If selling things is not an option, then I'd suggest using the 2 6950s, although how much longer they'll be relevant even in Crossfire, on new games, is debatable. (They ARE 5 years old after all.)


----------



## crazymania88

When I set everything to high with no AA in GTA V, I get around 100 FPS with my Gigabyte R9 280X Windforce set to 1100MHz;
my res is 1680x1050, though.


----------



## Roboyto

Quote:


> Originally Posted by *tomytom99*
> 
> Hey, quick question.
> Do you guys think that I should keep my 270x in my main rig, or use 2x 6950's in it? I'm asking since a friend is buying the 7870 I have in another rig, so I'd put the 270x in there if the 2 6950's would be better than the 270x. If not, then I'd just put a single 6950 in that other rig.
> 
> In case you wonder, I have a 3x1 eyefinity (4368*1022), and I play mostly newer games, most having higher quality graphics.


Quote:


> Originally Posted by *neurotix*
> 
> 2 6950s would be faster than a single 270X (7870), except in games that support Mantle, probably.
> 
> For Eyefinity like that, and NEW games, you probably want a 290 minimum if you're going to use a single card.
> 
> *My advice: sell both 6950s, sell the 270X, and you'd probably have enough money (or close enough) to get a used 290 with an aftermarket cooler (Tri-X etc.)*
> 
> If selling things is not an option, then I'd suggest using the 2 6950s, although how much longer they'll be relevant even in Crossfire, on new games, is debatable. (They ARE 5 years old after all.)


I wouldn't go back a couple of GPU generations; that option is silly IMHO.

You have only one option in my eyes, especially when you say you're running eyefinity. That is sell everything as @neurotix has suggested and get a bigger, badder card.

Unless your sig isn't correct, the 2GB on your 270X isn't going to be enough for Eyefinity once you start pushing settings higher. At a minimum, a 7950/7970 or R9 280/280X is where you will want to be.

I've been running 5760*1080 on a single 290 for a little over a year and I have been very pleased with its performance. No, you can't max every setting out, but things can look very nice if you are willing to compromise on the more demanding settings.

You can see very detailed results of what a watercooled 290 can do here: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33460_20#post_23300417

290 VS 970? In case you were wondering: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320

You can grab one of the best air cooled 290(X)'s on NewEgg right now for $269 + $20 MIR; that is the PowerColor PCS+

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131549&cm_re=r9_290-_-14-131-549-_-Product

If you're willing to tinker a bit you can pick up a great used reference 290 from the marketplace for around $190 and add an AIO or go down the rabbit hole for full blown watercooling.

http://www.overclock.net/t/1533788/40x-amd-reference-r9-290s-xfx-sapphire-and-asus/0_20 Couldn't ask for a more reputable seller than here.

http://www.overclock.net/t/1550146/custom-r9-290-with-aio-cooler/0_20 Already modded with an AIO

http://www.overclock.net/t/1538259/4x-sapphire-r9-290-reference-unlocked-to-r9-290x-with-koolance-vid-ar290x-water-blocks/0_20 with waterblocks

http://www.overclock.net/t/1550487/wts-sapphire-vapor-x-r9-290/0_20 Vapor-X 290

Or a smoking deal on a 280X to eventually Xfire

http://www.overclock.net/t/1552604/gigabyte-gv-r928xoc-3gd-rev2-radeon-r9-280x-3gb/0_20


----------



## tomytom99

The setup would be temporary. I'll be using it until I get a 295x2 later this year. So as long as it's better than a single 270x I'll go for the 2x6950 setup.


----------



## [CyGnus]

A single R9 290 is enough; if not, get a second one later down the road.


----------



## Agent Smith1984

You said you have a 270x and a 7850 in another rig, and also have 2 6950's, right? Why not throw one 6950 in the backup rig, Crossfire the 270x and 7850 in your primary rig, and sell the other 6950? I mean, unless you need to recoup a certain amount of money or something....


----------



## tomytom99

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You said you have a 270x and a 7850 in another rig, and also have 2 6950's right? Why not throw one 6950 in the backup rig, crossfire the 270x and 7850 in your primary rig, and sell a single 6950? I mean, unless you are needing to recoop a certain amount of money or something....


*7870

The thing is that a friend is buying the 7870 from that rig ($100). So from what I understand, the best way to go is to use the 270X in the spare rig and the 6950s in the main? I won't have the 7870 once it's sold.


----------



## [CyGnus]

If I were you, I would put the 270X in the other PC, sell those 6950s, and get an R9 290. You'll be pretty satisfied with that, and with the money from the three cards you can afford a 290, so you won't spend any extra cash.


----------



## neurotix

Also, you've got to consider that the R9 295X2 has a pretty exotic liquid cooling solution. From what I've heard, it's made by Asetek, and they're cheap and can cause problems. If you did get a 295X2, you likely wouldn't be able to overclock it much, if at all. It's also a dual-GPU card, so it will most likely still run very hot. You may be able to overclock it without raising the voltage, but any kind of voltage overclock would probably be out of the question.

I would recommend two separate cards in Crossfire personally. I'm very happy with my setup.


----------



## MerkageTurk

Just returned my 280X for a 290, as the 280X had driver crashes every time I ran a game; £20 price difference.


----------



## Roboyto

Quote:


> Originally Posted by *tomytom99*
> 
> The setup would be temporary. I'll be using it until I get a 295x2 later this year. So as long as it's better than a single 270x I'll go for the 2x6950 setup.


Value of the 6950 will only be less the longer you wait.

And the availability of the 295X2 will be diminishing just as fast while you're waiting.

If you're planning to wait and spend that kind of coin, my suggestion would be to get a 390(X). They are supposed to be a 30-40% performance jump from the 290X, which is already double your 270X. Plus you don't have to concern yourself with possible Crossfire difficulties.

Quote:


> Originally Posted by *neurotix*
> 
> Also, you gotta consider that the R9 295x2 has a pretty exotic liquid cooling solution. From what I've heard, it's made by Asetek and they are cheap and can cause problems. IF you did get a 295x2 you likely wouldn't be able to overclock it much, if at all. It's also a dual GPU card so it will still run very hot, most likely. You may be able to overclock it without raising the voltage, but any kind of voltage overclock would probably be out of the question.
> 
> I would recommend two separate cards in Crossfire personally. I'm very happy with my setup.


Asetek is the manufacturer of many of the AIO liquid coolers that look like this:



If you consider companies like Corsair, NZXT, Antec, Intel, AMD, Zalman, Thermaltake and the like to be cheap, then your statement holds water... but the truth of the matter is that they are not cheap. The design is quite good, which is why so many other companies have slapped their names on it and used it.

I wouldn't suggest getting one with the hoses depicted above (Cooler Master still uses these for some reason), because they are a harder plastic and much more prone to cracking and leaking than the newer versions, which use a traditional rubber hose like this:



The 295X2 with its included cooling solution does pretty well, considering a single 120mm radiator is cooling two volcanic GPUs. You are correct that there isn't much room for overclocking, but with the setup he has described, the 295X2 will be more than sufficient for his needs, and overclocking wouldn't be necessary.

If he were to need to harness some more power from a 295X2 then he has two options:

The first would inevitably void his warranty, if purchased new, but it is the cheaper route. You could remove the single 120mm radiator and install a larger one to bring temperatures down and allow a bit of overclocking/volting.

Option 2 is creating a small loop for the GPU alone to utilize a full cover waterblock. Using a radiator with the pump built into it would be the easiest implementation so you don't have to worry about sticking a pump somewhere and you may even be able to skip a separate reservoir if you filled it up properly. A few more bucks for barbs and some reusable clamps and there you would have it.


----------



## neurotix

Just saying, separate cards are going to run cooler, and thus overclock better.

If you run a 295x2 stock it will still be pretty powerful, yeah, so maybe it doesn't matter.


----------



## Catscratch

Quote:


> Originally Posted by *MerkageTurk*
> 
> Just returned my 280x for a 290 as the 280x had driver crashes every time I ran a game., £20 price difference


Which brand and version of 280x gave you problems ?


----------



## MerkageTurk

XFX DD Black.

I got a 290X Vapor-X.


----------



## JackCY

Quote:


> Originally Posted by *MiladEd*
> 
> Any GTA V players around? What are your settings for getting 60 fps on 1080p? With everything almost maxed out I can get over 60 in less demanding scenes but in crowded areas it's down to 30.


Code:


<?xml version="1.0" encoding="UTF-8"?>

<Settings>
  <version value="27" />
  <configSource>SMC_AUTO</configSource>
  <graphics>
    <Tessellation value="3" />
    <LodScale value="1.000000" />
    <PedLodBias value="0.200000" />
    <VehicleLodBias value="0.000000" />
    <ShadowQuality value="2" />
    <ReflectionQuality value="2" />
    <ReflectionMSAA value="0" />
    <SSAO value="2" />
    <AnisotropicFiltering value="16" />
    <MSAA value="0" />
    <MSAAFragments value="0" />
    <MSAAQuality value="0" />
    <TextureQuality value="2" />
    <ParticleQuality value="2" />
    <WaterQuality value="2" />
    <GrassQuality value="1" />
    <ShaderQuality value="2" />
    <Shadow_SoftShadows value="5" />
    <UltraShadows_Enabled value="false" />
    <Shadow_ParticleShadows value="true" />
    <Shadow_Distance value="1.000000" />
    <Shadow_LongShadows value="true" />
    <Shadow_SplitZStart value="0.930000" />
    <Shadow_SplitZEnd value="0.890000" />
    <Shadow_aircraftExpWeight value="0.990000" />
    <Shadow_DisableScreenSizeCheck value="false" />
    <Reflection_MipBlur value="true" />
    <FXAA_Enabled value="true" />
    <TXAA_Enabled value="false" />
    <Lighting_FogVolumes value="true" />
    <Shader_SSA value="true" />
    <DX_Version value="2" />
    <CityDensity value="1.000000" />
    <PedVarietyMultiplier value="1.000000" />
    <VehicleVarietyMultiplier value="1.000000" />
    <PostFX value="2" />
    <DoF value="false" />
    <HdStreamingInFlight value="true" />
    <MaxLodScale value="0.200000" />
    <MotionBlurStrength value="0.000000" />
  </graphics>
  <system>
    <numBytesPerReplayBlock value="9000000" />
    <numReplayBlocks value="36" />
    <maxSizeOfStreamingReplay value="1024" />
    <maxFileStoreSize value="65536" />
  </system>

    <Audio3d value="false" />

    <AdapterIndex value="0" />
    <OutputIndex value="0" />
    <ScreenWidth value="1920" />
    <ScreenHeight value="1200" />
    <RefreshRate value="60" />
    <Windowed value="0" />
    <VSync value="0" />
    <Stereo value="0" />
    <Convergence value="0.100000" />
    <Separation value="0.000000" />
    <PauseOnFocusLoss value="1" />
    <AspectRatio value="0" />

  <VideoCardDescription>AMD Radeon R9 200 Series (AMD Radeon R9 280X || AMD Radeon HD 7900 Series)</VideoCardDescription>
</Settings>

Runs fine, minimum around 55fps; if you disable the extra distance I added later, it barely goes below 60fps.
No MSAA, only FXAA. I'm not an AA fan, but the FXAA didn't seem to blur things the way it does in other games or implementations, so I kept it. Mirror reflections are minimal in-game and have always looked like crap in GTA, but I think cranking reflections up will also crank up all the other surface reflections, which definitely don't need to be any higher than they are. Car mirrors never work right in GTA; bad design by R*.
The most demanding settings are MSAA, shadow quality, and grass, grass, grass. The minimums are always in places out of town with grass, trees, etc.
In town it happily goes to 70-80fps. The whole range is about 50-150fps, usually 60-80fps.
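Since the settings live in a plain XML file, they can be checked or diffed without launching the game. A minimal sketch with Python's standard ElementTree (the excerpt below is trimmed from the dump above; in a real script you would read the actual settings file from disk):

```python
import xml.etree.ElementTree as ET

# Trimmed excerpt of the settings.xml structure shown above.
SETTINGS_XML = """<?xml version="1.0" encoding="UTF-8"?>
<Settings>
  <graphics>
    <ShadowQuality value="2" />
    <GrassQuality value="1" />
    <MSAA value="0" />
    <FXAA_Enabled value="true" />
  </graphics>
</Settings>"""

def read_graphics_settings(xml_text):
    """Return the <graphics> child elements as a {name: value} dict."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.get("value") for child in root.find("graphics")}

settings = read_graphics_settings(SETTINGS_XML)
# The heaviest options called out above: MSAA, shadows, and grass.
print(settings["MSAA"])          # "0"  -> MSAA off, FXAA only
print(settings["GrassQuality"])  # "1"  -> grass turned down
```

Handy for comparing two configs side by side before a benchmark run.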

Built-in benchmark:

Code:


Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 45.432442, 127.351723, 75.124748
Pass 1, 40.677292, 104.182388, 65.693741
Pass 2, 29.637819, 127.077019, 75.118309
Pass 3, 28.573360, 146.205719, 77.810196
Pass 4, 27.457262, 118.649048, 72.829437

Time in milliseconds(ms). (Lower is better). Min, Max, Avg
Pass 0, 7.852269, 22.010704, 13.311193
Pass 1, 9.598552, 24.583740, 15.222151
Pass 2, 7.869244, 33.740673, 13.312334
Pass 3, 6.839678, 34.997635, 12.851787
Pass 4, 8.428218, 36.420238, 13.730712

Frames under 16ms (for 60fps): 
Pass 0: 609/698 frames (87.25%)
Pass 1: 422/615 frames (68.62%)
Pass 2: 652/698 frames (93.41%)
Pass 3: 666/724 frames (91.99%)
Pass 4: 7010/8348 frames (83.97%)

Frames under 33ms (for 30fps): 
Pass 0: 698/698 frames (100.00%)
Pass 1: 615/615 frames (100.00%)
Pass 2: 697/698 frames (99.86%)
Pass 3: 723/724 frames (99.86%)
Pass 4: 8346/8348 frames (99.98%)

Percentiles in ms for pass 0
50%,    13.00
75%,    14.00
80%,    15.00
85%,    15.00
90%,    16.00
91%,    16.00
92%,    16.00
93%,    16.00
94%,    16.00
95%,    17.00
96%,    17.00
97%,    17.00
98%,    17.00
99%,    19.00

Percentiles in ms for pass 1
50%,    15.00
75%,    16.00
80%,    16.00
85%,    16.00
90%,    16.00
91%,    17.00
92%,    17.00
93%,    17.00
94%,    17.00
95%,    17.00
96%,    17.00
97%,    18.00
98%,    19.00
99%,    22.00

Percentiles in ms for pass 2
50%,    13.00
75%,    15.00
80%,    15.00
85%,    15.00
90%,    15.00
91%,    15.00
92%,    15.00
93%,    15.00
94%,    16.00
95%,    16.00
96%,    16.00
97%,    16.00
98%,    17.00
99%,    17.00

Percentiles in ms for pass 3
50%,    12.00
75%,    14.00
80%,    14.00
85%,    14.00
90%,    15.00
91%,    15.00
92%,    15.00
93%,    16.00
94%,    16.00
95%,    16.00
96%,    16.00
97%,    16.00
98%,    17.00
99%,    17.00

Percentiles in ms for pass 4
50%,    13.00
75%,    15.00
80%,    15.00
85%,    16.00
90%,    16.00
91%,    17.00
92%,    17.00
93%,    17.00
94%,    17.00
95%,    17.00
96%,    17.00
97%,    18.00
98%,    18.00
99%,    19.00

=== SYSTEM ===
Windows 8.1 64-bit (6.2, Build 9200)
DX Feature Level: 11.0
Intel(R) Core(TM) i5-4690K CPU @ 3.50GHz (4 CPUs), ~3.5GHz
16384MB RAM
AMD Radeon R9 200 Series (AMD Radeon R9 280X || AMD Radeon HD 7900 Series), 3199MB, Driver Version 14.502.1014.0
Graphics Card Vendor Id 0x1002 with Device ID 0x6798

=== SETTINGS ===
Display: 1920x1200 (FullScreen) @ 60Hz VSync OFF
Tessellation: 3
LodScale: 1.000000
PedLodBias: 0.200000
VehicleLodBias: 0.000000
ShadowQuality: 2
ReflectionQuality: 2
ReflectionMSAA: 0
SSAO: 2
AnisotropicFiltering: 16
MSAA: 0
MSAAFragments: 0
MSAAQuality: 0
TextureQuality: 2
ParticleQuality: 2
WaterQuality: 2
GrassQuality: 1
ShaderQuality: 2
Shadow_SoftShadows: 5
UltraShadows_Enabled: false
Shadow_ParticleShadows: true
Shadow_Distance: 1.000000
Shadow_LongShadows: true
Shadow_SplitZStart: 0.930000
Shadow_SplitZEnd: 0.890000
Shadow_aircraftExpWeight: 0.990000
Shadow_DisableScreenSizeCheck: false
Reflection_MipBlur: true
FXAA_Enabled: true
TXAA_Enabled: false
Lighting_FogVolumes: true
Shader_SSA: true
DX_Version: 2
CityDensity: 1.000000
PedVarietyMultiplier: 1.000000
VehicleVarietyMultiplier: 1.000000
PostFX: 2
DoF: false
HdStreamingInFlight: true
MaxLodScale: 0.200000
MotionBlurStrength: 0.000000

CPU 4.6/4.2GHz, GPU 1.1GHz core / 6.8GHz effective memory.
1920x1200 has about 11% more pixels than 1080p, so the settings above should give you pretty much a 60fps minimum while playing. The benchmark also counts those strange drops to 30fps that don't show up otherwise; GTA is always loading something, and that slows it down when it does.
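Those "frames under 16ms" counts and percentile tables are straightforward to compute from raw frame times. A rough Python sketch of the same statistics (the sample frame times below are made up for illustration, not taken from the log above):

```python
# Frame-time statistics of the kind the GTA V benchmark reports:
# share of frames inside a latency budget, plus nearest-rank percentiles.
frame_times_ms = [12.0, 12.4, 13.1, 13.5, 14.0, 14.9, 15.2, 15.8, 16.8, 33.0]

def frames_under(times, budget_ms):
    """Fraction of frames rendered within the budget (16 ms ~ 60 fps)."""
    return sum(t < budget_ms for t in times) / len(times)

def percentile(times, pct):
    """Nearest-rank percentile of frame times, in ms."""
    ordered = sorted(times)
    k = max(0, round(pct / 100 * len(ordered)) - 1)
    return ordered[k]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
print(f"avg: {avg_fps:.1f} fps")
print(f"under 16 ms: {frames_under(frame_times_ms, 16):.0%}")       # 80%
print(f"99th percentile: {percentile(frame_times_ms, 99):.1f} ms")  # 33.0 ms
```

Nearest-rank is only one of several percentile conventions; the benchmark's exact method isn't documented, so small differences from its output are expected.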


----------



## psychok9

Hello guys, I have artifact problems in some "big" games (like GTA, Assassin's Creed, etc.).
Is there any stress tool for GPU memory?
I need to check for errors/artifacts over a long run (~8-10 hours).


----------



## amdblack

Hey, does anyone know of a full-cover water block that would fit a Radeon R9 270X Dual-X card?


----------



## hypespazm

So I have an R9 280X on a Pentium G3258 @ 4.3GHz. I was wondering if there are any custom BIOSes that make it run faster and allow some easy overclocking. My ASIC quality is around 60.


----------



## bichael

Quote:


> Originally Posted by *amdblack*
> 
> hey dose anyone know of a full cover water block that would fit a radeon r9 270x dual x card


I went with the Alphacool GPX for my 270X; they have a range that covers most models.

Not quite full cover, but on the other hand much better in terms of upgradability.


----------



## amdblack

I'll check that out thanks


----------



## Devildog83

Quote:


> Originally Posted by *hypespazm*
> 
> so I have a r9 280x on a pent-up g3528 @4.3 ghz I was wondering if there are any custom biosesthat make it run faster and allow some easy over clocking my asic on it is like 60


http://www.overclock.net/t/1500524/intel-pentium-g3258-performance-and-owners-club-now-with-gtx-970/3580#post_23858961


----------



## JackCY

Quote:


> Originally Posted by *psychok9*
> 
> Hello guys, I have an artifact problem in some "big" games (like GTA, Assassin's Creed, etc.).
> Is there any stress tool for the GPU's memory?
> I need to check for errors/artifacts over a long time (~8-10 hours).


None. It's always trial and error, watching for artifacts while using the programs that cause them. I can run all the tests and benches at higher frequencies, but the moment I play some more recent games for a long time, it will randomly give artifacts, mostly polygon ones.
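Since there's no dedicated tool, one workaround for psychok9's 8-10 hour requirement is to just loop whatever program *does* trigger the artifacts and watch for crashes. A minimal sketch; `./gpu_bench` and its flag are placeholders for whatever benchmark or game-benchmark binary you actually use, not a real tool:

```python
import subprocess
import sys
import time

def stress_loop(cmd, hours):
    """Re-run `cmd` back to back until `hours` have elapsed.

    Stops early and reports if a run exits non-zero (crash,
    driver reset, etc.). Returns the number of completed runs.
    """
    deadline = time.monotonic() + hours * 3600
    runs = 0
    while True:
        result = subprocess.run(cmd)
        runs += 1
        if result.returncode != 0:
            print(f"run {runs} exited with code {result.returncode} - possible instability")
            break
        if time.monotonic() >= deadline:
            break
    return runs

# Example: loop a hypothetical benchmark binary overnight.
# stress_loop(["./gpu_bench", "--fullscreen"], hours=8)
```

It won't catch visual artifacts that don't crash anything, of course; for those you still have to eyeball the output, which is exactly the trial-and-error problem.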
Quote:


> Originally Posted by *amdblack*
> 
> hey dose anyone know of a full cover water block that would fit a radeon r9 270x dual x card


A water block for a 270x? Seems a bit pointless and a waste of money.
For a 290/290x, sure, it would make sense, especially for the reference design, but for the less power-hungry cards it brings no real advantage.

---

Beware: the 15.4 Beta driver causes an almost 20fps drop, and not only in GTA V. No idea what they optimized, but it seems terribly slower.


Spoiler: 15.4 beta



Code:



Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 39.244007, 105.037529, 57.538452
Pass 1, 30.134722, 121.591911, 51.489807
Pass 2, 41.357235, 109.053589, 57.605476
Pass 3, 35.682331, 149.663879, 62.801540
Pass 4, 30.334021, 129.205765, 58.622036







Spoiler: 15.3 beta



Code:



Frames Per Second (Higher is better) Min, Max, Avg
Pass 0, 46.080048, 128.423874, 75.269272
Pass 1, 35.476334, 115.364250, 66.256042
Pass 2, 51.678375, 104.689911, 75.782303
Pass 3, 53.852169, 146.105698, 78.083130
Pass 4, 37.047596, 126.185379, 73.006073
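For what it's worth, averaging the five passes from each run above quantifies the regression (the per-pass averages are copied straight from the benchmark output):

```python
# Average fps per pass, copied from the two benchmark runs above.
fps_154 = [57.538452, 51.489807, 57.605476, 62.801540, 58.622036]  # 15.4 beta
fps_153 = [75.269272, 66.256042, 75.782303, 78.083130, 73.006073]  # 15.3 beta

mean_154 = sum(fps_154) / len(fps_154)
mean_153 = sum(fps_153) / len(fps_153)
drop = mean_153 - mean_154
print(f"15.4: {mean_154:.1f} avg fps, 15.3: {mean_153:.1f} avg fps, drop: {drop:.1f} fps")
# Roughly a 16 fps (~22%) average drop across the five passes on this run.
```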


----------



## GoLDii3

Quote:


> Originally Posted by *JackCY*
> 
> Beware: the 15.4 Beta driver causes an almost 20fps drop, and not only in GTA V. No idea what they optimized, but it seems terribly slower.
> 
> 
> Spoiler: 15.4 beta
> 
> 
> 
> Code:
> 
> 
> 
> Frames Per Second (Higher is better) Min, Max, Avg
> Pass 0, 39.244007, 105.037529, 57.538452
> Pass 1, 30.134722, 121.591911, 51.489807
> Pass 2, 41.357235, 109.053589, 57.605476
> Pass 3, 35.682331, 149.663879, 62.801540
> Pass 4, 30.334021, 129.205765, 58.622036
> 
> 
> 
> 
> 
> 
> 
> Spoiler: 15.3 beta
> 
> 
> 
> Code:
> 
> 
> 
> Frames Per Second (Higher is better) Min, Max, Avg
> Pass 0, 46.080048, 128.423874, 75.269272
> Pass 1, 35.476334, 115.364250, 66.256042
> Pass 2, 51.678375, 104.689911, 75.782303
> Pass 3, 53.852169, 146.105698, 78.083130
> Pass 4, 37.047596, 126.185379, 73.006073


I doubt they messed it up that badly; it's probably the game's fault.

AMD seems to have some kind of problem with V-Sync in GTA V: either it automatically sets to 59Hz, causing the frames to stay at 30fps, or, even if you select 60Hz, it doesn't apply, so you have to change to some other refresh rate and then switch back to 60Hz every time you play, otherwise the fps drops to 30 again.

Have you tried without V-Sync?


----------



## JackCY

I never use V-Sync due to the lag it causes (and you can't get fps above your sync frequency either); all my settings are in an earlier post. I'm fine with tearing until they make a proper GPU/monitor sync, after all the decades it was ignored. I don't think the performance was bad only in GTA V, but GTA V is the only thing I have numbers for, since they claimed to have optimized the driver for it and it has a simple built-in benchmark.
I'm back on 15.3, and it's not just GTA V that runs fine.


----------



## crazymania88

The last update of GTA V dropped my FPS (R9 280x).

If you remember my earlier posts, I said it was locked to 60 and never dropped below.

Now it does, and it's annoying.
A friend of mine with an Asus 280x claims he plays at 1080p, all Ultra, no AA, over 60FPS the whole time, which I find hard to believe (the "all the time" part).

But according to him (his specs are lower than mine when it comes to CPU, RAM the same, GPU mine Gigabyte, his Asus 280x),
in a spot where my fps is below 60 with mixed High/Very High settings at 1050p (GPU usage maxed, and mine is also faster at 1100MHz),
*his 1080p, all Very High, is still over 60. Mine was the same before the update, so does anyone know what changed?*

The thing is, my GPU is being utilized 99% at those very moments and is faster than his by 100MHz or so :S
It's almost like the game is using my GPU for something else after the update.

*OFFTOPIC:*
@JackCY

You know how to build a system, don't you?








Wanted to see what GPU you have, and found out:
Perfect mobo choice,
Perfect PSU choice (really, the 850 G2 is amazing, one of the best, its price even more amazing, and a 10-year warranty),
Perfect CPU choice for the mobo,
Future-proof memory size, which came in handy just now with GTA V, actually









Perfect GPU Model choice.

It is obvious you didn't have/want a huge amount of spare money to spend on a PC, and didn't go crazy on a single part. Perfectly balanced and picky.

Just the GPU brand was a gamble, it seems, because more than a few Asus 280x's are broken; I avoided the Asus 280x. I read a lot of confusing stuff about the Gigabyte 280x; it was because of the different revisions.
I still believe the PowerColor 280x was the second-best 280x to buy (which is unusual for PowerColor).

But if your GPU is one of the "good" Asus 280x's, then nothing matters after all.









I don't know why, but you made my day.
There was no Extreme4 here, but since it has 6 phases,
I would still probably get the Gigabyte Z97X Gaming 5.

But later I had a little regret: for the same price I could have traded 4 phases for a good soundcard,
with the Gigabyte Z97 G1.Sniper. It has 4 phases, but a better soundcard.
I am only giving 1.272V at most to the CPU, so it could have handled it; well, I miscalculated.

(I thought I would go at least 1.35V, and in the future I still may. And the good thing is, I can always get a soundcard, but I can't get phases.)

Finally I've seen a perfect build.
Well done.


----------



## marto7

Can the Accelero Xtreme Plus fit the 280x Toxic? I know the Accelero Xtreme IV is the one for this card, but I can get the Xtreme Plus for a good price, and I wonder if it will fit my card.


----------



## Kloud

Wow. As there are very few members in the 280 Section, I'd like for you to add me!


Spoiler: Proof






I have a Gigabyte WindForce AMD Radeon R9 280 OC 3GB GDDR5


----------



## Devildog83

*Kloud* has been added to the club, Welcome !!!


----------



## crazymania88

Quote:


> Originally Posted by *Kloud*
> 
> Wow. As there are very few members in the 280 Section, I'd like for you to add me!
> 
> 
> Spoiler: Proof
> 
> 
> 
> 
> 
> 
> I have a Gigabyte WindForce AMD Radeon R9 280 OC 3GB GDDR5


How are your temps? And what's your memory type?

Mine was Rev 2.0, if I remember right; it has Elpida and runs a little hotter than others, it seems.
69-70C under load, and I didn't touch the fan profile.

Rev 3.0 seems to be longer than Rev 2.0 and looks like it has a different cooler. I'll be so sad if it has 5 heatpipes instead of 3..

Edit:









Rev 3.0 has 6 heatpipes instead of 3, which should mean less noise and lower temps








from gigabyte rev3.0 280x page:

"Keeping the same 450W Cooling capacity using only 2 slots, the New WINDFORCE 3X 450W cooling system is equipped with two 8mm and four 6mm cooper heat-pipes , inclined fans, and GIGABYTE "Triangle Cool" technology. This not only provides an effective heat dissipation capacity but also enables higher performance due to a lower temperature."

From Gigabyte R9 280x Rev2.0 page:

"GIGABYTE introduces the latest exclusive "Triangle Cool" Technology to reach a better cooling performance. The latest patented technology combines fin with clip module in a special triangle shape. With the original anti-turbulence structure plus the new triangle cooling design, it enhances the efficiency of heat dissipation dramatically by minimizing the flow of turbulence between fans. Therefore, the "Triangle Cool" Technology provides a more efficient air flow for the cooling system."

Basically:
R9 280x Rev 3.0 -> two 8mm and four 6mm heatpipes.
R9 280x Rev 2.0 -> three 8mm heatpipes. This is not fair, because the 6-heatpipe cooler was already out before they released the 280x series








Anyway,
Rev 3.0 wasn't even out when I bought this one


----------



## Kloud

Quote:


> Originally Posted by *crazymania88*
> 
> How is your temps? and your memory type?
> 
> mine was Rev2 if I remember right, it has Elpida and hotter then others a little bit it seems.
> 69C-70c under load, didn't touch fan profile
> 
> Rev3.0 seems to be longer than Rev2.0 and looks like it has a different cooler, I'll get so sad if it has 5 pipes instaed of 3..
> 
> Edit:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Rev3.0 has 6 pipes, instead of 3 heatpipes, so it means less noise, lower temps
> 
> 
> 
> 
> 
> 
> 
> 
> from gigabyte rev3.0 280x page:
> 
> "Keeping the same 450W Cooling capacity using only 2 slots, the New WINDFORCE 3X 450W cooling system is equipped with two 8mm and four 6mm cooper heat-pipes , inclined fans, and GIGABYTE "Triangle Cool" technology. This not only provides an effective heat dissipation capacity but also enables higher performance due to a lower temperature."
> 
> From Gigabyte R9 280x Rev2.0 page:
> 
> "GIGABYTE introduces the latest exclusive "Triangle Cool" Technology to reach a better cooling performance. The latest patented technology combines fin with clip module in a special triangle shape. With the original anti-turbulence structure plus the new triangle cooling design, it enhances the efficiency of heat dissipation dramatically by minimizing the flow of turbulence between fans. Therefore, the "Triangle Cool" Technology provides a more efficient air flow for the cooling system."
> 
> Basically:
> R9 280x Rev3 -> 2 8mm, 4 6MM heatpipes.
> R9 280x Rev2.0 -> 3 8mm Heatpipes, this is not fair, because the 6 heatpipe cooler was already out before they've released 280x series
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Anyway,
> Rev3 wasn't even present whenI bought this one


It's running at 40 degrees Celsius while browsing (not gaming); then again, it's like 100 degrees in my room anyway, lol. I can't test the temps, as I'm waiting for my new CPU before I can even use the full capability of my GPU.


----------



## crazymania88

Is yours Rev 3.0?
It doesn't seem like it.


----------



## marto7

What is the spacing of the 280x Toxic's mounting holes? I need this to understand whether I can put the Accelero Xtreme Plus on my card, because I have a good offer for it. Sorry for my bad English. Thanks


----------



## Devildog83

Quote:


> Originally Posted by *marto7*
> 
> What is the spacing of the 280x Toxic's mounting holes? I need this to understand whether I can put the Accelero Xtreme Plus on my card, because I have a good offer for it. Sorry for my bad English. Thanks


Even if the holes fit, I don't think the heatsink will sit flush on it. I would spend a few extra bucks and go with the Accelero Xtreme IV, which should be sure to fit. Can I ask why you would want to go from the Toxic cooler to an Accelero? An AIO water cooler could be had for about the same price and would do a good job. It doesn't make sense to me to go from air to air.


----------



## marto7

The idea was that I can get the Xtreme Plus for 20 Euro, and the Accelero is better than the stock Toxic cooler. I have no problems, but I thought the deal was good. I will pass for now. In the future, if I have problems, I may go with water...


----------



## DizzlePro

Just installed a new MSI R9 280x, but it appears to be having an issue installing the latest beta drivers on Windows 8.

I've made a thread here with more detail:

http://www.overclock.net/t/1554628/msi-r9-280x-driver-issue-or-defective/0_50

I have no issues installing Catalyst 14.4 or below, but when I install 14.12 or 15.4 it doesn't install the display drivers.

This is on a fresh Windows 8 install.

Edit:

Turns out that Win 8 was the culprit.

You've got to be on 7 or 8.1 to install 15.4.


----------



## JackCY

Quote:


> Originally Posted by *DizzlePro*
> 
> Just installed a new MSI R9 280x, but it appears to be having an issue installing the latest beta drivers on Windows 8.
> 
> I've made a thread here with more detail:
> 
> http://www.overclock.net/t/1554628/msi-r9-280x-driver-issue-or-defective/0_50
> 
> I have no issues installing Catalyst 14.4 or below, but when I install 14.12 or 15.4 it doesn't install the display drivers.
> 
> This is on a fresh Windows 8 install.
> 
> Edit:
> 
> Turns out that Win 8 was the culprit.
> 
> You've got to be on 7 or 8.1 to install 15.4.


Windows 8 was superseded by 8.1 a long time ago. 15.4 is slower, much slower, for me than 15.3 or the non-beta driver.


----------



## tomytom99

Quote:


> Originally Posted by *JackCY*
> 
> 8 has been updated to 8.1 long time ago. 15.4 is slower, much slower for me than 15.3 or the non beta driver.


Yeah, since Microsoft actually dropped Windows 8 (not 8.1) support, developers aren't going to make software and drivers work on 8.
I've noticed that 15.4 is slower too, probably because it still has some bugs.


----------



## DizzlePro

Well, that's annoying.

The MSI Gaming 280x 3G doesn't allow voltage adjustment (tried Trixx, Afterburner, GPU Tweak).

Model 912-V277-067.

I even tried flashing this: http://www.techpowerup.com/vgabios/146769/msi-r9280x-3072-131009.html


Strange though, I've never seen an MSI card that doesn't support voltage adjustment within Afterburner.

Just gonna sell it & use my 7850 till the R9 380/X is released.


----------



## sage101

How well does the R9 270X run The Witcher 3? Has anyone here run the game with a 270X or 270?


----------



## radier

I do. It feels fine for me on High preset with SSAO, AA, blur/bloom off.

Taptaptap Mlais M52


----------



## sage101

Quote:


> Originally Posted by *radier*
> 
> I do. It feels fine for me on High preset with SSAO, AA, blur/bloom off.
> 
> Taptaptap Mlais M52


Thanks, what's the fps like? Nice OC on your 270X; mine only does 1200.


----------



## Carlos Danger

So I bought a SAPPHIRE DUAL-X 100373L Radeon R9 280 on sale at Newegg, and it has shipped, but I'm now having some doubts.
My questions are as follows:
I mainly play FPS like Arma 3, CS:GO, DayZ, etc.
How will this card do?
Should I just send it back and get a 290?
I can afford the difference, and I kinda bought the 280 on impulse.
Also, how does this card work in CrossFire?
A second card would also be an option.

I am running an 8150,
16 GB RAM,
990X board (ASUS M5A97 LE R1.0),
900 watt PSU,
Swiftech 240 cooler,
Phanteks Enthoo Luxe,
128 GB Mushkin SSD,
and a 1 TB Seagate spinner.

My current card is a 6950.

Thanks in advance.


----------



## MiladEd

Quote:


> Originally Posted by *Carlos Danger*
> 
> so i bought a SAPPHIRE DUAL-X 100373L Radeon R9 280 on sale @ new egg and it has shipped but im now having some doubts
> my questions are as follows
> i mainly play FPS like arma 3, csgo, Dayz, etc...
> how will this card do?
> should i just send it back and get a 290
> i can afford the difference and i kinda bought the 280 on impulse
> also how does this card work in crossfire
> a second card would also be an option
> 
> i am running a 8150
> 16 gb ram
> 990x board ASUS M5A97 LE R1.0
> 900 watt PS
> swiftec 240 cooler
> Phanteks Enthoo Luxe
> 128 mushkin ssd
> and a 1tb segate spinner
> 
> my current card is a 6950
> 
> thanks in advance


I think a 290 would be a better option, as it's about 20% more powerful than a 280. I have a 280X and it's a very good card, but nowhere near as good as a 290. A 290 is about as good as a 970 when there are no proprietary nVidia features (GameWorks, HairWorks, PhysX) at work.


----------



## neurotix

Erm, an 8150 will probably be a severe bottleneck for a 290.

It would most definitely bottleneck nearly anything in Crossfire/SLI.

I'm not saying to replace it, just don't expect to get the most out of the card with that CPU.

For a single card it will be okay, but Crossfire will see a 25% reduction in performance compared to Intel, possibly more.

(I used to run an 8350 @ 5GHz with two 290s: 95 FPS in Valley Extreme HD. I switched to Intel, and the Intel at stock got 110 FPS; overclocked it gets 130 FPS, and that's with the cards at the same clocks on both CPUs. All the other Futuremark benchmarks scored a great deal higher with Intel as well. So I'm not speaking out of my ass.)
Quote:


> Originally Posted by *MiladEd*
> 
> I think a 290 would be a better option, as it's about 20 % more powerful than a 280. I've a 280X and it's a very good card, but nowhere near as good as a 290. A 290 is about as good as a 970, if there are no proprietary nVidia features (gameworks, hairworks, physX) at work.


The 290 is actually more than 20% more powerful than a 280. It's better than a 7970/280X as well.

7970: http://hwbot.org/submission/2599500_neurotix_3dmark11___performance_radeon_hd_7970_11884_marks

290: http://hwbot.org/submission/2677513_neurotix_3dmark11___performance_radeon_r9_290_17186_marks

That's a 44% increase.

Of course, that's just 3dmark11 but I have other benches too.

7970:



290:



49.9% better in Valley Extreme HD (and both cards were overclocked heavily and run with my i7 at 4.5ghz)

Don't forget, the 290 has twice the ROPs and a substantially greater number of TMUs.

The 7970 has 32 ROPs and 128 TMUs...

The 290 has 64 ROPs and 160 TMUs.

This means at higher resolutions, and especially in some older games, the 290 performance can even be double (yes, double) the 7970. I've even gotten similar scores in some benchmarks with two 290s to quad 7970 systems. (3dmark11 Extreme, I can provide evidence if needed.)
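As a rough sanity check, the ROP doubling can be translated into theoretical pixel fill rate (ROPs × core clock). This is a back-of-the-envelope sketch, not a measured result; it assumes the reference clocks (925MHz for the 7970, up to 947MHz boost for the 290), and real gains only approach this when a game is actually ROP-bound:

```python
def pixel_fillrate_gpix(rops, core_mhz):
    """Theoretical pixel fill rate in gigapixels/s: ROPs * core clock."""
    return rops * core_mhz / 1000.0

hd7970 = pixel_fillrate_gpix(32, 925)  # reference HD 7970: 32 ROPs @ 925 MHz
r9_290 = pixel_fillrate_gpix(64, 947)  # reference R9 290: 64 ROPs @ 947 MHz boost
print(f"7970: {hd7970:.1f} GP/s, 290: {r9_290:.1f} GP/s -> {r9_290 / hd7970:.2f}x")
# ~29.6 vs ~60.6 GP/s: just over 2x the raw pixel throughput
```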

A 280 is a gimped 7970 to boot. Keep in mind it only has 112 TMUs and slightly fewer shaders. You can expect 7870 GHz levels of performance out of a stock 280. Even some overclocked 280s are neck and neck with a heavily OC'd 270X. I would say, if you can get more than 45 FPS out of a 280 in Valley Extreme HD, you are lucky.

Granted, the 280 and 270X are still good cards, but realistically the 290 is an amazing card (if power hungry). My setup is even beating SLI 970s; I have yet to see SLI 970s beat my Valley score in the Valley thread. They are getting 120 FPS with the cards at 1500MHz+, where I get 130 FPS at around 1150MHz. It doesn't matter how high you can clock them if they can't match that performance at those higher clocks... and clocking them that high erodes or eliminates the power consumption advantage too. So all the usual Nvidia claims are bull. The Titan X is a different story, however; I can't touch those...

I hope my experience helps you guys out. If you want more benches just ask.


----------



## tabascosauz

Quote:


> Originally Posted by *Carlos Danger*
> 
> so i bought a SAPPHIRE DUAL-X 100373L Radeon R9 280 on sale @ new egg and it has shipped but im now having some doubts
> my questions are as follows
> i mainly play FPS like arma 3, csgo, Dayz, etc...
> how will this card do?
> should i just send it back and get a 290
> i can afford the difference and i kinda bought the 280 on impulse
> also how does this card work in crossfire
> a second card would also be an option
> 
> i am running a 8150
> 16 gb ram
> 990x board ASUS M5A97 LE R1.0
> 900 watt PS
> swiftec 240 cooler
> Phanteks Enthoo Luxe
> 128 mushkin ssd
> and a 1tb segate spinner
> 
> my current card is a 6950
> 
> thanks in advance


Arma 3? On an FX-8150? GGGGGGGG

On a more serious note, you should just stop thinking about Crossfire. A single R9 290 would be quite a handful for an FX-8150 to keep up with. I'll shy away from the term "bottlenecking", as I don't quite think it'll come to that, but you could do a lot better than the FX-8150. On the bright side, the FX-8320E offers about as much single-threaded perf as the FX-8150 (read: not much), so Bulldozer isn't actually THAT far behind...

I don't know about CS:GO and its "Multi-Core Rendering" feature. Both my rigs do just fine, but neither of them rely on AMD on the CPU side (i3-4160 + R7 265, i7-4790K + R9 280X). How did you even manage to play Arma with Bulldozer?? It's horribly optimized, insanely CPU-bound but also demanding on the GPU.

The R9 290 is a healthy amount of power; most 280X can't even reach 290 stock performance when under mild OC - that's how wide the performance rift between Tahiti and Hawaii is. Go for it, but that FX-8150 needs to go as soon as Skylake or Zen hits the market. The R9 290 alone will alleviate issues, but it won't save you in Arma III.

A good explanation from @neurotix as always, but remember that the R9 280 is only a rebrand of the HD 7950; you'll start to feel the effects of cutting down Tahiti XT/XTX/XT2 (HD 7970 and R9 280X) into Tahiti Pro (HD 7950 and R9 280) not only in heat and power consumption but also in performance. A lot of people boast that the R9 280 easily overclocks into an R9 280X; at best, it's something to take with a grain of salt and I personally think it's BS. The R9 280X throws energy efficiency straight out the window (in some cases, the HD 7970 GHz was even worse than the R9 290X in efficiency) but it's every last drop of performance that Tahiti has to offer. The R9 280 doesn't offer that. And now that the R9 290/290X cards have been brought down to insanely low prices across North America, there's not a lot of reason to buy the R9 280X, let alone the R9 280.

Tahiti is so old. It's had its run, and then some. People cry all the time about how Pitcairn is ancient but still gets used in new mobile GPUs today; Tahiti is two months older than that relic. Go with the R9 290.


----------



## Carlos Danger

So I'm sending back the 280 today.
There should be some kind of deal on a 290 this weekend.
As far as the CPU goes, do you think an 8350 or even an 8320 would be better for gaming?
I really want to stay all AMD if possible.
The 9-series chips would require a new mobo, and I don't really want to do that.
Thanks


----------



## rdr09

Quote:


> Originally Posted by *Carlos Danger*
> 
> So im sending back the 280 today
> There should be some kind of deal on a 290 this weekend
> As far as the cpu goes do you think a 8350 oe even an 8320 would be better for gaming
> I really want to stay all amd if possible
> The 9 series chips would require a new mb and i dont really want to do that
> Thanks


290 and 8300 chip @ 4.5GHz.


----------



## neurotix

Quote:


> Originally Posted by *tabascosauz*
> 
> snip
> 
> A good explanation from @neurotix as always, but remember that the R9 280 is only a rebrand of the HD 7950; you'll start to feel the effects of cutting down Tahiti XT/XTX/XT2 (HD 7970 and R9 280X) into Tahiti Pro (HD 7950 and R9 280) not only in heat and power consumption but also in performance. A lot of people boast that the R9 280 easily overclocks into an R9 280X; at best, it's something to take with a grain of salt and I personally think it's BS. The R9 280X throws energy efficiency straight out the window (in some cases, the HD 7970 GHz was even worse than the R9 290X in efficiency) but it's every last drop of performance that Tahiti has to offer. The R9 280 doesn't offer that. And now that the R9 290/290X cards have been brought down to insanely low prices across North America, there's not a lot of reason to buy the R9 280X, let alone the R9 280.
> 
> Tahiti is so old. It's had its run, and then some. People cry all the time about how Pitcairn is ancient but still gets used in new mobile GPUs today; Tahiti is two months older than that relic. Go with the R9 290.


Thank you.

I too disagree that an overclocked 7950/280 can "match or outperform" a 7970/280X. Though my experience with this is limited to my 7970 versus a 7870 XT I had for benching and sold. The 7870 XT was for all intents and purposes almost identical to my 270X in performance, and both fall pretty short of my 7970, though the 7970 is on average only 8 FPS faster in Valley when both are overclocked.
Quote:


> Originally Posted by *Carlos Danger*
> 
> So im sending back the 280 today
> There should be some kind of deal on a 290 this weekend
> As far as the cpu goes do you think a 8350 oe even an 8320 would be better for gaming
> I really want to stay all amd if possible
> The 9 series chips would require a new mb and i dont really want to do that
> Thanks


An FX-8350 is only going to be about 15% faster than an FX-8150. The FX-8320 is basically the same as the FX-8350, just clocked lower (3.5GHz vs 4.0GHz). If you're overclocking anyway, both of them will likely do 4.5GHz or higher. My buddy even runs an 8320 at 5.2GHz under a massive water loop (2x 480 rads). So, go with the cheaper 8320.

The problem with this, however, is that 1) 8150 -> 8320 isn't much of an upgrade, and 2) to run the Piledriver CPUs at a level that is competitive with Intel (e.g. 5ghz) you need a high end motherboard and high end cooling to keep it cool, as it's a 220 watt chip. (Actually, more when overclocked.) For the same amount of money you'd spend on say, a Crosshair V Formula, 8320 and high end cooler (h100 class), you could have an i5 4690k, $100 motherboard and cheaper air cooler. The Intel would severely outperform the 5ghz 8320 even at stock. You would get more fps with the Intel at stock in almost every game.

If you have $400 to upgrade your motherboard, CPU and cooler, that's what I'd do.


----------



## radier

Quote:


> Originally Posted by *sage101*
> 
> Thanks what's the fps like? Nice OC on your 270X mine only does 1200.


45 FPS on average.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *radier*
> 
> 45 FPS on average.


That's what I'm getting as well, but with those turned on. I did turn off HairWorks completely, and in CCC I set a profile with 64x tessellation.


----------



## Velimere

Quote:


> Originally Posted by *neurotix*
> 
> I too disagree that an overclocked 7950/280 can "match or outperform" a 7970/280X.


Right, so here we have a chart of 1080p FurMark benchmark tests:

http://www.geeks3d.com/20120413/furmark-opengl-benchmark-scores-comparative-charts/

Please note the 7970 score of 3363 points below the 6970 CF score of 3505 points. Now, take a look at _my_ score:


Spoiler: Click for pic!



http://imgur.com/B9AdSVU



I'm sure chips with better ASIC Quality percentages than mine (which is rather low) can perform even better too.


----------



## neurotix

Quote:


> Originally Posted by *Velimere*
> 
> Right, so here we have a chart of 1080p FurMark benchmark tests:
> 
> http://www.geeks3d.com/20120413/furmark-opengl-benchmark-scores-comparative-charts/
> 
> Please note the 7970 score of 3363 points below the 6970 CF score of 3505 points. Now, take a look at _my_ score:
> 
> 
> Spoiler: Click for pic!
> 
> 
> 
> http://imgur.com/B9AdSVU
> 
> 
> 
> I'm sure chips with better ASIC Quality percentages than mine (which is rather low) can perform even better too.


1. Your GPU-Z window is covered so I can't even tell what card you have. You covered it with the ASIC window.

2. Who the heck uses Furmark benchmark seriously? It's not even DirectX11, it's OpenGL.

3. The difference between the 7950 and 7970 is slim, but it's there. If you can get your 7950 to stock 7970 Ghz levels in something like Unigine Valley Extreme HD preset, or Fire Strike, then I'll pay attention.


----------



## tabascosauz

Quote:


> Originally Posted by *Velimere*
> 
> -snip-


Your point being?...

No one uses Furmark except to gauge temperatures on AMD cards. GCN suffers horribly under Furmark.

ASIC quality in GPU-Z...I laughed. Do you seriously think that it is a legitimate gauge of the quality of your GPU? Go and take a look at the thousands of threads that have already debunked this.

Your GPU-Z window is also covered. Comparing the 7970 to 6970 CF? It's getting hard for me to take you seriously.


----------



## End3R

So long Team Red. My GTX 970 arrived today, and while I've enjoyed my time here, I must be going.

That being said, having experienced team red for myself, anytime I hear an nvidia fanboy saying they can't compare, I'll be sure to defend them and let people know they can hold their own. Honestly the only reason I even upgraded was the 970 came with witcher 3 and batman arkham knight for free, was hard to pass up a 4gb card for essentially $230.

I was really shocked at the size difference though. o.o


----------



## rdr09

Quote:


> Originally Posted by *End3R*
> 
> So long Team Red. My GTX 970 arrived today, and while I've enjoyed my time here, I must be going.
> 
> That being said, having experienced team red for myself, anytime I hear an nvidia fanboy saying they can't compare, I'll be sure to defend them and let people know they can hold their own. Honestly the only reason I even upgraded was the 970 came with witcher 3 and batman arkham knight for free, was hard to pass up a 4gb card for essentially $230.
> 
> I was really shocked at the size difference though. o.o
> 
> 
> Spoiler: Warning: Spoiler!


it's 3.5 + 0.5







Congrats.


----------



## crazymania88

Quote:


> Originally Posted by *rdr09*
> 
> it's 3.5 + 0.5
> 
> 
> 
> 
> 
> 
> 
> Congrats.


I'll probably switch to an R9 380X if something like that comes out,
or a 970 in a year or two.


----------



## End3R

Quote:


> Originally Posted by *rdr09*
> 
> it's 3.5 + 0.5
> 
> 
> 
> 
> 
> 
> 
> Congrats.


@1080p it's a non-issue; anyone who thinks that's a big deal might as well be saying that having 2 sticks of 4GB RAM is just 4+4 instead of 8.


----------



## JackCY

Quote:


> Originally Posted by *End3R*
> 
> @1080p it's a non-issue, anyone who thinks that's a big deal might as well be saying that having 2 sticks of 4gb ram is just 4+4 instead of 8.


I like my 8+8, and it runs at full bandwidth both of them.

---

At 1080p, 3GB is generally enough unless you push crazy ultra settings, in which case the core won't keep up anyway.
With new monitors finally going 4K, or with triple-screen or VR setups, the 970 having only 4GB can be an issue: it has tons of power to drive high pixel counts but potentially not enough memory to do so. That affects the 980 as well.
I'm already peaking at or even breaking 3GB, if I want to, in GTA V at 1200p with a 280x. I can't imagine running proper high-density textures in a modded game at higher resolutions; the 4GB would just not be enough.
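A rough way to see why 4GB gets tight: only part of the VRAM footprint scales with resolution (render targets do; most texture data doesn't). The 0.75GB split below is purely an illustrative assumption, not a measured number:

```python
def scaled_vram_gb(total_gb, res_dependent_gb, base_px, target_px):
    """Naive estimate: fixed assets stay constant, render targets scale with pixel count."""
    fixed = total_gb - res_dependent_gb
    return fixed + res_dependent_gb * (target_px / base_px)

px_1200p = 1920 * 1200
px_4k = 3840 * 2160
ratio = px_4k / px_1200p  # 3.6x the pixels

# Assumption for illustration only: of a 3 GB footprint at 1200p,
# ~0.75 GB is resolution-dependent buffers (framebuffers, G-buffer, post-FX).
estimate = scaled_vram_gb(3.0, 0.75, px_1200p, px_4k)
print(f"{ratio:.1f}x pixels -> roughly {estimate:.1f} GB at 4K")
```

Even under that conservative split, a 3GB footprint at 1200p lands near 5GB at 4K, which is the gist of the complaint about 4GB cards.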

Nvidia would have had to cut a smaller or bigger portion of the die's units to avoid the old memory trick they pulled off again. You can either be glad NV didn't cut more (they could have gotten even better yields from those chips and given you only 3.5GB), or glad that they cut less, giving you more compute units and a bit of spare memory.

NV or AMD or whoever else, I don't care; the best bang for the buck does it for me. I've had both, and a few others, in the past. But anything beats the second-place company in the market on the performance/money ratio. The AMD cards sold a hell of a lot, and with enough memory to still be good these days, they can be picked up second-hand for cheap. Try picking up a used NV 770: oh no, it only has 2GB VRAM, and the 4GB version is rare. Or a 780: oh no, only 3GB.
Look at the 290: 4GB, cheap second-hand, what more can you ask for? Don't need that much power and memory? A second-hand 280x, there you go; it still beats the 285 that replaced it, because the new card is chopped down, and it might even give a beating to the upcoming 380 if that turns out to be a 285 rebrand, as it seems.

NV cards are nice, but the pricing is outrageous ($400-$500 for a 970, haha, no thanks; that's EU pricing, none of your "oh, I just picked up a cheap 970 at Microcenter for $275" or some such), and they're always a tiny bit short on memory. Fast memory, more premium chips I guess, but less of it than I want for the power the card has.

Enjoy your card, which ever one you get or have.


----------



## End3R

Quote:


> Originally Posted by *JackCY*
> 
> NV cards are nice but the pricing is outrageous ($400-$500 for a 970, haha no thanks, EU pricing, none of your oh I just picked up a cheap 970 at microcenter for $275 or some such) and always a tiny bit short on the memory amount. Fast memory, more premium chips I guess but less of them than I want for the power the card has.
> 
> Enjoy your card, which ever one you get or have.


My 970 was $340 and came with both witcher 3, and the upcoming batman, so it was hard to turn it down for what is essentially $220.

I agree with you about the brand not mattering though, like I said I enjoyed my time with team red, and my 270x is still a beast - I plan on using it in a rig for my brother - I just couldn't pass up such a good deal.


----------



## MiladEd

Just a quick tip for better performance with borderline unrecognizable difference in quality for Witcher 3:

On my system (280X at 1100 MHz, 8320 at 4.4 GHz), I was running with everything on Ultra. I got acceptable framerates (mostly in the 30-40 range), but sometimes, especially in places like Crookback Bog, it would dip into the mid-20s, which is unbearable. So, I tried a few tweaks:

Turn foliage visibility down. I just set it to High instead of Ultra and, in the same location (Crookback Bog), I got a massive 10 FPS boost. If you look really closely, you can see a slight difference in distant areas, but it's not that visible.

Also, try setting Shadow Quality to High. Not much visual difference (more noticeable than foliage range, though) and it will give you a slight 5 FPS boost.

Also, foliage range naturally has very little effect in cities, but FPS tended to be high enough in those locations anyway.

And keep HairWorks off.

Now my FPS is about 35-55 mostly in the higher 40s.

Hope I helped!
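Those menu tweaks end up in the game's `user.settings` file (under `Documents\The Witcher 3\`). If you'd rather script them, here's a minimal sketch — note the key names (`FoliageDistanceScale`, `HairWorksLevel`) are assumptions from memory, so verify them against your own file and back it up first:

```python
# Sketch: apply the tweaks above to a Witcher 3-style INI settings file.
# Key names are assumptions -- the safest route is still to change the
# options in the in-game menu and let the game write the file itself.
from configparser import ConfigParser
from io import StringIO

def apply_tweaks(settings_text):
    cfg = ConfigParser()
    cfg.optionxform = str  # preserve key case as the game writes it
    cfg.read_string(settings_text)
    if not cfg.has_section("Rendering"):
        cfg.add_section("Rendering")
    cfg.set("Rendering", "FoliageDistanceScale", "1.0")  # "High"-ish; name assumed
    cfg.set("Rendering", "HairWorksLevel", "0")          # HairWorks off
    out = StringIO()
    cfg.write(out)
    return out.getvalue()

sample = "[Rendering]\nHairWorksLevel=2\n"
print(apply_tweaks(sample))
```

In practice you would read the real file instead of the `sample` string, and write the result back only after keeping a copy of the original.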


----------



## neurotix

A few other tips:

* Use 15.5 drivers if you can. These provide a very nice performance boost. However, on my setup, they cause black screens while playing video (youtube or MPEG in WMP) so I can't use them. With all the settings the same, I was getting 45 fps in the starting area with 14.12 and 55-60 fps with 15.5.

* Turn post processing options OFF. These are:

Motion blur
Blur
Anti-aliasing
Bloom
Sharpening
Depth of Field
Chromatic Aberration
Vignetting
Light Shafts
NVidia HairWorks
VSync

I turned all of them off except bloom, depth of field and light shafts and saw my fps on 14.12 go from 45 to 55-60 fps. (Keep in mind I run 2x 290s in Crossfire at 5760x1080.)

Turning all of these off might not be necessary if you run a single monitor. You can play with these options to see what they do. Use MSI Afterburner, PlayClaw, Fraps, etc. to monitor fps; see how the game looks with the various settings on or off and how they affect your framerate. I couldn't really tell a difference between having them on or off, except for the vignetting and chromatic aberration options, which seem to make the textures a little darker and sharper. (Stuff like grass looked like it had been outlined in ink or shaded; it's a really nice effect and worth keeping on.)

AMD itself even advises (yes, with the 15.5 drivers too) totally disabling anti-aliasing (AA), if you haven't seen that. This alone should get you 10+ fps. I believe it has to do with the game being developed on Nvidia hardware: the way it implements anti-aliasing runs very slowly on AMD cards and drivers.

Also, disabling tessellation in the drivers will get you a nice boost as well. Remember, AMD cards are bad at tessellation.

Finally, use SSAO instead of HBAO+, I believe HBAO+ is Nvidia specific (like Hairworks) and will chug on AMD cards.

Hope this helps.
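To compare settings objectively rather than by eye, the numbers from those monitoring tools can be reduced to an average and a "1% low". A minimal sketch, assuming a Fraps-style log of cumulative frame timestamps in milliseconds (adjust the parsing for whatever your tool actually writes):

```python
# Sketch: average fps and 1%-low fps from a list of cumulative
# frame timestamps in milliseconds (Fraps-style frametimes log).

def fps_stats(timestamps_ms):
    """Return (average fps, 1%-low fps) from cumulative frame timestamps."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    if not deltas:
        raise ValueError("need at least two timestamps")
    avg_fps = 1000.0 * len(deltas) / (timestamps_ms[-1] - timestamps_ms[0])
    worst = sorted(deltas, reverse=True)      # longest frames first
    slice_len = max(1, len(worst) // 100)     # worst 1% of frames
    low_1pct = 1000.0 / (sum(worst[:slice_len]) / slice_len)
    return avg_fps, low_1pct

# A perfectly steady 60 fps run is ~16.67 ms per frame:
avg, low = fps_stats([i * 16.67 for i in range(601)])  # both ~60 fps
```

Run it on a log captured with one setting on and one off, and the 1%-low tells you about stutter that an average alone hides.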


----------



## MiladEd

I wanted to ask a few questions about overclocking my Sapphire R9 280X Dual-X. I have the voltage set to 1256 mV, the core clock to 1125 MHz, and the memory clock to 1600 MHz, which gives me stable results. Any higher clock or lower voltage causes artifacts in GTA V, although other games (The Witcher 3, for instance) seem to run fine. How far can I go up on voltage before I risk damage?


----------



## Deicidium

Sometimes I'm having this problem; it's my first time experiencing it. When the bug hits, this is what happens:

the normal would be:

Look at the clocks. Why is it happening and how do I solve it?


----------



## JackCY

Quote:


> Originally Posted by *Deicidium*
> 
> Sometimes I'm having this problem. It's my first time experiencing this. When the bug hits, this will happen.
> 
> the normal would be
> 
> look at the clocks. Why it's happening and how to solve it?


Close Flash; as in, disable any application that is forcing the card into one of its other modes, like 2D.
I got this all the time with Flash, when the 280X would only clock to 970/1600.
Don't ask me why such a trivial thing as mode selection doesn't work properly.


----------



## GoLDii3

Quote:


> Originally Posted by *Deicidium*
> 
> Sometimes I'm having this problem. It's my first time experiencing this. When the bug hits, this will happen.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> the normal would be
> 
> 
> 
> 
> 
> look at the clocks. Why it's happening and how to solve it?


Looks like the card is throttling, which is a common problem with 280Xs.

Try setting your power limit to max and see what happens. Also disable ULPS.

If you can't fix it any other way, try looking at your BIOS. I've heard the boost feature of the 280/280X caused throttling; I think you can fix it by switching to the non-Boost BIOS if you have one, or by flashing a BIOS without boost.


----------



## JackCY

I don't know of a 280X that would throttle due to the power limit. Maybe some poorly cooled ones could, due to the temperature limit. Mine runs full clocks even with a 90% power limit.
From his PC specs it looks like he has a 270X anyway, and those look like 2D clocks to me.


----------



## Deicidium

Quote:


> Originally Posted by *JackCY*
> 
> Close Flash, as in disable any application that is forcing the card into one of the other modes, like 2D.
> I got this all the time with Flash when a 280x would only clock to 970/1600.
> Don't ask me why such a trivial thing as mod selection doesn't work properly.


How would I know what application uses Flash aside from the browser? And I don't think the content the browser had open when I launched that game included anything Flash-related.
Quote:


> Originally Posted by *GoLDii3*
> 
> Looks like the card is throttling,which is a normal problem with 280X's.
> 
> Try setting your powerlimit at max. and see what happens. Also disable ULPS.
> 
> If you can't fix it in no other way,try looking at your bios,i heard the boost feature of the 280/280X's caused throttling,i think you can fix it by using the non-Boost bios if you have one or flashing a bios without boost


I tried setting it to High Performance during that game and it did nothing. Then I disabled ULPS from MSI Afterburner and restarted my PC. It worked after that. I'll keep an eye on it this week.

By the way, what exactly is ULPS? I just read about disabling it somewhere while googling.

Lastly, I switched from the Balanced to the High Performance profile and disabled ULPS. Will my power consumption increase after doing this? I'm just concerned about my electric bill.


----------



## JackCY

Quote:


> Originally Posted by *Deicidium*
> 
> how would I know what application uses flash aside from the browser? And I think the browsers content when I opened that game does not have any flash related content.


Check running programs; if Flash is there, close/kill it. It's enough to have one YouTube tab loaded in Firefox when using Flash for YT.

ULPS (Ultra Low Power State) is a power-saving feature. Similarly, the Windows power profiles change PCIe link state power management, which affects sleep states — and those can kick in when the PCIe device/GPU is idle, not just during Windows sleep, much like the similar setting that exists for SATA.

I'm not sure what those low clocks correspond to on your GPU; it looks like some 2D state for a middle load. Maybe they don't go higher on the 270X.
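For reference, tools like MSI Afterburner toggle ULPS through a registry value. Done by hand it would look roughly like this .reg fragment — the class GUID is the standard display-adapter class, but the numbered subkey (0000, 0001, ...) varies per system, so find the one whose DriverDesc matches your Radeon before importing anything (a sketch, not a guaranteed recipe):

```
Windows Registry Editor Version 5.00

; "0000" is an assumption -- pick the subkey that matches your Radeon.
; EnableUlps = 0 disables ULPS for that adapter.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Export the key first so you can restore it if anything misbehaves.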


----------



## MiladEd

A question about overclocking the 280X. Mine is currently set to an 1130 MHz core clock and it's stable. However, if I try a higher clock like 1150 MHz and increase the voltage accordingly, it will start artifacting no matter how much more voltage I give it — even more so at higher voltages, strangely.

My theory is that at higher voltages the VRMs start overheating and can't supply the GPU with a high enough voltage, so it starts artifacting. Could that be the case?

My card is the Sapphire Dual-X version, BTW. Oh, and my full-load temp is 76 C at 1150 and 75 C at 1130.


----------



## JackCY

Quote:


> Originally Posted by *MiladEd*
> 
> Question about overlclocking 280X. Mine is currently set to 1130 MHz core clock, and it's stable. However, If I try a higher clock like 1150 MHz, and increase the voltages accordingly, no matter how much more volts I'm giving it, it will start artifacting. Even more so at higher voltages-strangely.
> 
> My theory is that at higher voltages the VRMs start overheating and they can't provide GPU with a high enough voltage, and it starts artifacting. Can that be the case?


No. Unless your VRM cooling is awful, but then the VRMs would burn instead.

All chips have a limit.
I can run mine up to 1120 on stock voltage, but the more I push it past 1070, the more weird behavior/artifacts I may encounter in games. Benchmarking is pointless for finding the instability, except Heaven with specific AA settings. Every chip hits a wall where no amount of extra voltage will make it clock higher, at least not without sub-zero cooling.
Overall, AMD GPUs are among the worst overclockers compared to how much extra you can milk out of Nvidia and Intel chips. Some are lucky to get up to +10%, but that's about it. No wonder AMD GPUs used to have base clocks below 1GHz; they simply don't clock high.


----------



## MiladEd

Quote:


> Originally Posted by *JackCY*
> 
> No. Unless your VRM cooling is awful but then the VRMs would burn instead.
> 
> All chips have some limit.
> I can run mine up to 1120, on stock voltage but the more I push it over 1070 the more I may encounter some weird behavior/artifact in games. Benchmarking is pointless except Heaven with specific AA to discover the instability. All chips have a wall and no matter how much voltage you feed it it won't clock higher, at least not without sub-zero cooling.
> Overall AMD GPUs are one of the worst overclockers compared to how much extra you can milk out of Nvidia and Intel chips. Some are lucky to get up to +10% but that's about it. No wonder AMD GPUs used to have base clock below 1GHz, they simply don't clock high.


Okay, thanks. Seems like 1130 is my limit then.


----------



## ZealotKi11er

I don't know if this is a known issue, but I can't get BF4 to run well with Mantle on the R9 270. In DX11 at 1080p the CPU bottleneck is huge. Once I go Mantle, the fps is not stable and it lags a lot, dropping to single digits. In DX11 I get 35-45 fps; with Mantle I see 60-65 fps and then super low fps.


----------



## tabascosauz

Is anyone else having these artifacting issues in GTA V on the 280X that cannot be fixed? I don't have a video of my own, but here's one showcasing an identical issue: https://www.youtube.com/watch?v=tKwZcDq9xDw

It's not the small squares in the corner type, so lowering settings doesn't help me. It used to be (2 weeks ago) that it would get worse and eventually I would have to quit the game because the entire screen would be flashing; now, it is a bit milder but it's still very annoying when I'm flying by on the freeway and a small chunk of the city maybe 500m x 500m will be artifacting like crazy as I drive past. And no, it doesn't only happen when I'm driving.

I've tried lowering/removing my OC, even going below stock, but to the best of my knowledge this doesn't fix anything. Stock profile actually fares better than some of the lower-than-stock values (1070 core/1550 mem/1.2V) but doesn't remove the artifacting completely. For a short time, I might have been running GTA V with some very high temperatures (90-95 C) because I forgot to check the temps; that didn't carry on for a long time and I have since slapped the X31/G10 back on, so temperatures are sitting at a comfy 55 during gaming.

Could my 280X be permanently damaged from the week during which I played at 90-95 degrees? I want to get the R9 Nano when it comes out but I need the 280X to last me until then...


----------



## radier

Turn off AA.

Taptaptap Mlais M52


----------



## tabascosauz

Quote:


> Originally Posted by *radier*
> 
> Turn off AA.
> 
> Taptaptap Mlais M52


This isn't even remotely helpful. MSAA is off, and turning FXAA off doesn't fix anything, it only makes things look horrendous.


----------



## MiladEd

I get the squares with MSAA on, but with FXAA on it doesn't happen.


----------



## DizzlePro

I get these graphical glitches on my 280X, tested with 15.4 and 15.5 (haven't tried 15.6).

Disabling MSAA removes the artifacts in the bottom right.


----------



## JackCY

The drivers supposedly optimized for specific games are rather disappointing.

I posted settings and benches in the GTA V thread comparing driver performance on the 280X DC2T.
15.3 worked best for me; I didn't try older ones, and the newer ones lost a lot of performance.

As far as artifacts go, I believe it's simply because the card is poorly made, nothing to do with drivers really. In my case that would be high VRAM voltage and a poor-quality chip. Nonetheless, I ran GTA V pretty much artifact-free with a slight OC, while I get occasional issues elsewhere with it.


----------



## Agent Smith1984

Guys, you are all just seeing the typical 280X "shard-ifacts".

It's been a problem with those cards for a long time.

It can happen in some games, not others.

Try the latest BIOS if that hasn't been mentioned/tried so far.

Also try downclocking the VRAM.

Really sad to see this still going on.

The Asus DCUII is the worst of the bunch, with the Matrix and Toxic in the mix also.

Edit:

14.7 beta driver was the closest thing to relieving the issues, but that driver will sadly not include any optimizations for newer titles.

I got rid of the card before the release of 14.9+ betas.

Good luck!


----------



## BradleyKZN

Finally got a PowerColor R9 270 TurboDuo GPU and it's surprisingly quick! I'm very impressed with it!


----------



## JackCY

Quote:


> Originally Posted by *BradleyKZN*
> 
> Finally got a powercolor R9 270 turboduo GPU and its surprisingly quick! Im very impressed with it!


For 720p maybe XD


----------



## bichael

Quote:


> Originally Posted by *BradleyKZN*
> 
> Finally got a powercolor R9 270 turboduo GPU and its surprisingly quick! Im very impressed with it!


Have you tried overclocking it yet? I've heard the 270 can reach 270X levels, though I'm not sure how typical that is. My PowerColor 270X runs around 1160/1475 and I have to admit I've been very happy with the performance at 1080p. Just shy of a 7000 GPU score in Fire Strike (6938), though that was with a higher OC which didn't prove game-stable.

Does make me wonder what games everyone else is playing that need such expensive cards.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyKZN*
> 
> Finally got a powercolor R9 270 turboduo GPU and its surprisingly quick! Im very impressed with it!


Can you run 3DMark stock and OC. OC does not seem to make a difference for me.


----------



## Horsemama1956

Quote:


> Originally Posted by *bichael*
> 
> Have you tried overclocking it yet? I've heard the 270 can reach 270x levels though not sure how typical that is. My powercolor 270x runs around 1160/1475 and I have to admit I've been very happy with the performance at 1080p. Just shy of 7000 gpu score in Firestrike (6938) though that was with a higher OC which didn't prove game stable.
> 
> Does make me wonder what games everyone else is playing which needs such expensive cards


The cards with only one 6-pin can be pretty crappy overclockers. I've tried a few that barely got over 1GHz, whereas quite a few 270Xs can do 1100+ on stock voltage.


----------



## tabascosauz

Quote:


> Originally Posted by *bichael*
> 
> Have you tried overclocking it yet? I've heard the 270 can reach 270x levels though not sure how typical that is. My powercolor 270x runs around 1160/1475 and I have to admit I've been very happy with the performance at 1080p. Just shy of 7000 gpu score in Firestrike (6938) though that was with a higher OC which didn't prove game stable.
> 
> Does make me wonder what games everyone else is playing which needs such expensive cards


1160MHz is not "270X levels". It most certainly is not. The reference 270X comes in at 1GHz base with a 1050 boost. Getting over the gigahertz barrier would already be "270X levels" for an R9 270, and a lot of those 270s can do it. Expecting 1160MHz out of Pitcairn while only supplying it with about 150W max is a pretty unrealistic expectation.

Have you played GTA V on Very High settings yet? That game, in all its anti-AMD optimization, will give you the answer to your wonders.


----------



## BradleyKZN

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can you run 3DMark stock and OC. OC does not seem to make a difference for me.


I will try get ahold of 3D mark and see what I can do


----------



## JackCY

Quote:


> Originally Posted by *bichael*
> 
> Does make me wonder what games everyone else is playing which needs such expensive cards


Pretty much any modern 3D game from the last couple of years. If you want to stay over 60fps to keep lag minimal, a 280X won't run max settings; sometimes near-max works, sometimes it's in the middle, depending on how brutal the graphics settings in a particular game are and what it was optimized for.
Nvidia and AMD have different amounts of resources on their GPUs, and that causes one or the other to lose when something is optimized for its rival, as it bottlenecks differently. Overall, though, I've always found Nvidia cards to be more about raw computing power, whereas AMD, being more unusual/unconventional, blasts through some aspects of the workload but crawls in others.

1200p: a 280X is OK unless you expect ultra settings
1600p: 390X
4K: 980 Ti and up — sorry Fiji, but at equal price the 980 Ti is better

Of course, if you don't mind the 30fps laggy, choppy, film-like experience, move it all down by one notch.


----------



## bichael

Quote:


> Originally Posted by *tabascosauz*
> 
> 1160MHz is not "270X levels". It is most certainly not. The 270X reference comes in at 1GHz - 1050 boost. Getting over the Gigahertz barrier would already be "270X levels" for a R9 270, and a lot of those 270s can do it. Expecting 1160MHz out of Pitcairn while only supplying it with about 150W max is a pretty unrealistic expectation.
> 
> Have you played GTA V on Very High settings yet? That game in all its anti-AMD optimization will give you the answer to your wonders


I was just wondering how a PowerColor 270 overclocks compared to a PowerColor 270X, so I showed my OC for comparison; I wasn't trying to suggest that's the 270X level a 270 would reach.

Haven't got/tried GTA V yet. I think Shadow of Mordor is probably the most demanding newish game I have, and I'm just starting it, so I haven't played around with the settings much yet. I'm guessing 2GB VRAM together with a dual core will probably mean medium textures, though at least it still seems to look okay.
Quote:


> Originally Posted by *JackCY*
> 
> Pretty much any modern 3D game from last couple years. If you want to stay over 60fps to have minimal lag, 280x won't run max settings, but sometimes near max works, sometimes it's in the middle, depending on how brutal the graphics settings in particular game are and what it was optimized for.
> Nvidia and AMD have different number of resources on their GPU and that causes one or the other to lose if something is optimized for one as it will bottleneck the other. Although overall I always found Nvidia cards to be more of a raw computing power, where as AMD being more "unusual"/unconventional blasts in some but crawls in other aspects of the GPU.
> 
> 1200p 280x being ok unless you expect ultra settings
> 1600p 390x
> 4k 980 Ti and more, sorry Fiji but at equal price 980 Ti is better
> 
> Of course if you don't mind the 30fps laggy choppy film like experience, move it all by one notch.


My "Does make me wonder what games everyone else is playing which needs such expensive cards" comment was obviously slightly tongue in cheek. People will have different standards etc., and I'm certainly not saying there's anything wrong with wanting max settings at 60fps+. Just that the 270/270X can generally deliver high settings at, say, 50fps at 1080p, which is still pretty playable, for me at least. Of course, I still wish I had a more powerful GPU though.


----------



## Jay1ty0

Hey guys, received my Sapphire Vapor-X R9 280X today!

(Got it for 180€ new; the cheapest we had here in Portugal was 250€+.)

Finally upgraded from my old GTX 260.

I ran Heaven Benchmark 4.0 (Tessellation extreme, Quality Ultra, 8xAA, 1080p) and got a score of 774. I don't really care about benchmarks, games only, but I'm asking if that's normal so I can get an idea whether the card is performing as it should.
Cheers!


----------



## Agent Smith1984

Quote:


> Originally Posted by *Jay1ty0*
> 
> Hey guys, received my Sapphire Vapor-X R9 280x today
> 
> (got it for 180€ new, the cheapest we had here on Portugal was 250€+)
> 
> Finally upgraded from my old GTX260
> 
> I ran Heaven benchmark 4.0
> Tesselation extreme, Quality Ultra, 8XAA, 1080p and got a score of 774 (I don't really care for benchmarks, games only), but I'm only asking if it's normal so I can have an idea if the card is performing as it should.
> Cheers!


Wow, a 260 to a 280X? That's a drastic upgrade!


----------



## Jay1ty0

Yes, it's a strange feeling to be able to run games at max settings.

I'd had that card since 2009, and only upgraded because I was missing out on a ton of DX11-only titles.
Hopefully this card will last 6 years too ahaha


----------



## JackCY

I upgraded from an NV Go7300 / C2D 2x1.6GHz / 2GB (2006) to an AMD 280X / 4690K 4x4.5GHz / 16GB (2014). XD
That's what I call drastic.


----------



## Phantasia

Quote:


> Originally Posted by *Jay1ty0*
> 
> Hey guys, received my Sapphire Vapor-X R9 280x today
> 
> (got it for 180€ new, the cheapest we had here on Portugal was 250€+)
> 
> Finally upgraded from my old GTX260
> 
> I ran Heaven benchmark 4.0
> Tesselation extreme, Quality Ultra, 8XAA, 1080p and got a score of 774 (I don't really care for benchmarks, games only), but I'm only asking if it's normal so I can have an idea if the card is performing as it should.
> Cheers!


Nice upgrade and find!

Might I ask where you got it at that price? I'm from Portugal as well and have been thinking of either doing CrossFire with my 280X or moving up.

Enjoy it!


----------



## Jay1ty0

Thank you, I was extremely lucky.
I was after a 4GB G1 R9 380; the store I contacted at the time decided to bump the price from 250 to 270 after emailing me that it wasn't in stock.

I canceled the order, went to another store (GlobalData), and they had 25% off on this card, but it was the last one. I believe they've removed it from their website now.


----------



## tabascosauz

Quote:


> Originally Posted by *Jay1ty0*
> 
> Hey guys, received my Sapphire Vapor-X R9 280x today
> 
> (got it for 180€ new, the cheapest we had here on Portugal was 250€+)
> 
> Finally upgraded from my old GTX260
> 
> I ran Heaven benchmark 4.0
> Tesselation extreme, Quality Ultra, 8XAA, 1080p and got a score of 774 (I don't really care for benchmarks, games only), but I'm only asking if it's normal so I can have an idea if the card is performing as it should.
> Cheers!


Tri-X version? congrats on the card.


----------



## neurotix

Wow, GTX 260 to Vapor-X 280X, congrats!

Excellent choice in card and cooler.


----------



## Jay1ty0

Regular Vapor-X, not Tri-X.

Thank you all! It was really a great deal; I don't believe I'm ever going to be this lucky again ahaha.
Haven't really stressed the card yet, but I managed to play Shadow of Mordor at 1100/1600 with +30% power limit. Not sure if the +30% is safe or if the overclock is stable at all, but these cards don't OC that much, right?
I'm not that concerned in that department, because it runs everything I throw at it really smoothly, and since I need to format anyway, a clean install with just a few games on the HDD might improve performance a little bit.

Edit:
I've also read that the BIOS this card ships with (015.041.xxxxx) is quite troublesome, and that 015.039 is "better".
Will it void my warranty if I change the BIOS?
Are the improvements really that big?
Thanks in advance


----------



## GoLDii3

Quote:


> Originally Posted by *Jay1ty0*
> 
> Regular Vapor-X not Tri-X
> 
> Thank you all! It was really a great deal, I don't believe I'm ever going to be this lucky again ahaha
> Haven't really stressed the card yet, but I managed to play Shadow of Mordor with 1100/1600, with +30%, not sure if the +30% is safe or if the overclock is stable at all, but these cards don't OC that much, right?
> I'm not that concerned in that department, because it runs everything I throw at it really smooth, and since I need to format maybe a clean install with just a few games on the hdd will increase the performance a little bit
> 
> Edit:
> also I've read that the bios that this card has
> 015.041.xxxxx
> is quite troublesome, and that the 015.039 is "better".
> *Will it void my warranty if I change the bios?*
> Are the improvements that better?
> Thanks in advance


No, just back up the original BIOS and flash it back if you ever need to RMA.
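For the command-line route, the commonly used ATIFlash tool follows roughly this pattern — the adapter index and file names here are placeholders, so double-check against your card and your version of the tool before flashing anything:

```
:: list adapters and note your card's index
atiflash -i

:: save the current BIOS of adapter 0 before anything else
atiflash -s 0 backup.rom

:: program adapter 0 with the new BIOS
atiflash -p 0 new.rom
```

Keep `backup.rom` somewhere safe; that's the file you'd flash back before an RMA.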


----------



## JackCY

The ASUS 280X DC2T has 015.041.xxxxx as an official VBIOS update, flashed via the ASUS tool.
I would be wary of some 3rd-party tools and of VBIOSes downloaded from god knows where, originally made for who knows what.

I observed no difference after updating from 039 to 041.


----------



## Jay1ty0

I believe the new 015.041 BIOS makes the card throttle if the VRM temperature reaches 72 C, which the previous 039 bypassed. Still, I haven't surpassed the 60 C mark, so I guess it's all good.


----------



## JackCY

This is more an issue of the card's hardware design than of the VBIOS version. I get no throttling even at only a 90% power limit; with the stock 100%, never any thermal throttle, ever, even with a quiet fan and temperatures beyond 80 C on the core and VRMs. Throttling is power dependent, not temperature dependent — at least until some crazy-high core temp like 95 C, at which point, sure, the core might slow down to try to save itself from certain destruction.


----------



## Jay1ty0

Understood, thank you very much!


----------



## mrmariosgta4

Hello guys,

I have the Sapphire Radeon R9 280 3GB Dual-X Boost OC, I have just overclocked it, and I want to ask you two questions:

1. Am I good with these settings?
2. Am I safe, for my whole system and GPU, with the core voltage at 1.25 V? The stock core voltage of the GPU is 1.123 V.

Max temp: 70-71 C

The rest of my PC:

Motherboard: Asus M4A87TD EVO
CPU: AMD Phenom II 955 @ 3.6 GHz (C3) at 1.36 V core, with CPU/NB frequency at 2200 MHz and CPU/NB voltage at 1.175 V
RAM: 8 GB DDR3 1600 MHz
HDD 1: 320 GB SATA 3
SSD: Samsung 830 128GB SATA 3
Case: Thermaltake Commander MS-I
PSU: Thermaltake Smart SE 730W

I won't have problems, right?

Also, I want to say that I tested it with Heaven Benchmark 4.0 for 15 minutes with no problems.

Thank you and have a nice day


----------



## buttface420

I have two of those Dual-X 280Xs but I actually never overclocked them... maybe I'll steal your settings.


----------



## mrmariosgta4

@buttface420

Hahaha, do it man, you will get awesome performance. Change your GPU core voltage from TriXX.

EDIT: I have just tested again with these settings:

I tried to push the core clock higher, to 1120, but got some artifacts, so I returned it to 1100.
I think I will stick with these settings.

GPU core voltage: 1.25 V
Stable max temp: 72 C (3 laps of Heaven Benchmark 4.0, about 15 minutes)

I think I am safe with these, right?


----------



## JackCY

Play games; Heaven and Fire Strike are pretty worthless for testing stability.

280 != 280X — those settings would even be a downgrade on a 280X.


----------



## mrmariosgta4

Of course I will test it with games like GTA V and many others.

Are my temperatures and GPU core voltage safe for my whole PC?

GPU temp: 72 C, GPU core voltage: 1.25 V
PSU: Thermaltake Smart SE 730 W


----------



## JackCY

The stock VBIOS and tools won't let you go too high anyway.


----------



## murzyn

Hey guys, I got a Gigabyte 280X and I have a problem with OC. Anything higher than 1100 MHz on the core clock = artifacts.
I don't know if you guys use some special settings in MSI Afterburner, but I've tried every power limit option and even modded the BIOS to unlock 1.3 V on the card.
Temps are fine, everything is fine at stock, but whenever I OC I get problems.

I don't know what to do...


----------



## GoLDii3

Has anyone on Windows 10 tried DX12 with the new 15.7 drivers, or is it for GCN 1.1 only?


----------



## neurotix

Quote:


> Originally Posted by *murzyn*
> 
> Hey guys i got 280x Gigabyte and i have problem with OC. Everything higher than 1100mhz on core clock = artifacts.
> I dont know do u guys use some special settings on msi afterburner or what but i tried every option on power limit and also i even modded bios to unlock 1.3V on card.
> Temps are fine, everything is fine on stock but when i OC anything i got problem
> 
> Idk what to do ...


This just means you probably have an exceptionally poor card when it comes to overclocking. It's the same for CPUs and GPUs: sometimes, no matter how much voltage you give them, they won't go past a certain clock. What's your ASIC %?

You can, however, try lowering the memory clock and then see if you can reach a higher core clock. Core clock increases give far more performance than memory clock increases, so reduce your memory overclock and try again.


----------



## Casey Ryback

Quote:


> Originally Posted by *murzyn*
> 
> Hey guys i got 280x Gigabyte and i have problem with OC. Everything higher than 1100mhz on core clock = artifacts.
> I dont know do u guys use some special settings on msi afterburner or what but i tried every option on power limit and also i even modded bios to unlock 1.3V on card.
> Temps are fine, everything is fine on stock but when i OC anything i got problem
> 
> Idk what to do ...


You just have an average chip; Gigabyte has already clocked the card quite high for that architecture.

The 7970 was originally released at 925 MHz on the core. The results aren't really surprising.


----------



## tabascosauz

Quote:


> Originally Posted by *Casey Ryback*
> 
> You just have an average chip, Gigabyte have already OC'd the card quite high for that architecture.
> 
> The 7970 was originally released with about 950mhz on the cores. The results aren't really surprising.


The 7970 was released sub-GHz. However, most 280Xs seem to be based on a revised Tahiti XT (not sure if XT2 or XTX), so it would make sense for them to outperform the OG 7970.

On the other hand, the "reference" 280X base clock is supposed to be sub-900MHz, so 1100MHz is already pretty impressive. My guess is that some of the OC-oriented 280Xs came with leftover Tahiti XTs intended for the 7970 GHz?


----------



## Agent Smith1984

On my son's box right now, and about to upgrade to 15.7....

Everyone having good luck with the 15.7 drivers on tahiti?


----------



## radier

It is the best driver.

Taptaptap Mlais M52


----------



## sage101

Quote:


> Originally Posted by *radier*
> 
> Is the best driver.
> 
> Taptaptap Mlais M52


Hey radier, I noticed we've got very similar rigs; can you tell me your OC settings for the 2500K? I can't seem to get a stable OC over 4.5GHz.


----------



## jason387

I have the R9 285. Are there any members here with the same GPU?


----------



## diggiddi

Quote:


> Originally Posted by *radier*
> 
> Is the best driver.
> 
> *Taptaptap Mlais M52*


Whaat!


----------



## jason387

Quote:


> Originally Posted by *diggiddi*
> 
> Whaat!


Can you post your best Fire Strike or 3DMark 11 score with a 7950/280 or a 7970/280X?


----------



## diggiddi

Quote:


> Originally Posted by *jason387*
> 
> Can you post your best firestrike or 3D mark 11 score with an 7950/280 or a 7970/280X?


Will do later on, right now shuteye is calling


----------



## jason387

Quote:


> Originally Posted by *diggiddi*
> 
> Will do later on, right now shuteye is calling


Thanks


----------



## diggiddi

Quote:


> Originally Posted by *jason387*
> 
> Thanks


http://www.3dmark.com/3dm11/10045479

P9449
GFX 10579
Physics 7285
Combined 6976

Note: latest driver (15.7), latest version of 3DMark 11.
GPU 1165/1300, +20 power limit, stock voltage.
CPU stock.
RAM 16 GB 2400 MHz, 11-13-13-31, 1.65 V.


----------



## jason387

Quote:


> Originally Posted by *diggiddi*
> 
> http://www.3dmark.com/3dm11/10045479
> 
> P9449
> GFX 10579
> Physics 7285
> Combined 6976
> 
> Note: latest Driver 15.7, latest version of 3Dmark 11,
> GPU 1165/1300 +20 power limit Stock voltage,
> Cpu stock ,
> Ram 16gb 2400mhz 11-13-13-31 1.65v


Thanks. Based on what you sent me, it appears AMD did make changes to the R9 285, and the R9 285 is closer to the R9 280X than it is to the R9 280.
I'm using 15.7 as well.
Stock score (965/1400 MHz): http://www.3dmark.com/3dm11/10040912
Overclocked score (1160/1500 MHz): http://www.3dmark.com/3dm11/10033012


----------



## JackCY

280X, 15.7 vs. 15.3 beta (the betas between those were bad, bad, bad, i.e. slow for single GPU):
Quote:


> Unigine Heaven Benchmark 4.0
> *FPS:
> 48.5
> Score:
> 1221
> Min FPS:
> 24.5
> Max FPS:
> 91.7*
> 
> System
> Platform:
> Windows NT 6.2 (build 9200) 64bit
> CPU model:
> Intel(R) Core(TM) i5-4690K CPU @ 3.50GHz (3498MHz) x4
> GPU model:
> AMD Radeon R9 200 Series *14.502.1014.0* (3072MB) x1
> 
> Settings
> Render:
> Direct3D11
> Mode:
> 1920x1200 4xAA fullscreen
> Preset
> Custom
> Quality
> Ultra
> Tessellation:
> Normal


Quote:


> Unigine Heaven Benchmark 4.0
> *FPS:
> 48.5
> Score:
> 1221
> Min FPS:
> 24.7
> Max FPS:
> 91.8*
> 
> System
> Platform:
> Windows NT 6.2 (build 9200) 64bit
> CPU model:
> Intel(R) Core(TM) i5-4690K CPU @ 3.50GHz (3498MHz) x4
> GPU model:
> AMD Radeon R9 200 Series *15.200.1046.0* (3072MB) x1
> 
> Settings
> Render:
> Direct3D11
> Mode:
> 1920x1200 4xAA fullscreen
> Preset
> Custom
> Quality
> Ultra
> Tessellation:
> Normal


Total War Attila:
*15.3: 57.1fps
15.7: 58.6fps*

Minimal improvement, but hey, at least unlike the recent CF-optimized betas it doesn't tank the fps by 10.

And actually something has been done for the majority of single-GPU users.
Virtual Super Resolution (VSR) works, but besides screenshots it's IMHO useless, as it blurs everything, especially text, even in games. That's what you always get when downscaling or upscaling an image. Maybe it would work OK if you changed the DPI by the same ratio, but games don't have that, "ever". So it's mostly for screenshots and better AA.
FRTC seems to work; hurray for finally having one place to limit fps for any game instead of adding that extra line to game configs, or so I hope. I'll see how well it does compared to some in-game fps limiters. Hmm, no luck in this instance: FRTC is just the same as when a game runs without its own fps limiter, input lag :/
Otherwise FRTC works where I tried it; Fraps shows fps limited to the value I set, 82-83 fps when 83 is set.
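For what it's worth, the stutter-at-the-cap behavior reported later in the thread makes sense if FRTC uses simple sleep-based pacing: a frame that overruns its budget can't be compensated, and the lost time is never clawed back. A minimal sketch of that kind of limiter (illustrative only, not AMD's actual implementation):

```python
import time

def frame_budget(fps_cap: float) -> float:
    """Time allowed per frame at a given cap, in seconds."""
    return 1.0 / fps_cap

def pace_frame(frame_start: float, fps_cap: float) -> None:
    """Sleep away whatever is left of this frame's time budget.

    A frame that already overran its budget is not delayed further,
    but nothing recovers the lost time, hence visible stutter at the cap.
    """
    remaining = frame_budget(fps_cap) - (time.perf_counter() - frame_start)
    if remaining > 0:
        time.sleep(remaining)

# An 83 fps cap gives each frame a budget of roughly 12.05 ms,
# which matches the 82-83 fps Fraps readings mentioned above.
print(round(frame_budget(83) * 1000, 2))
```

A driver-level cap like this also sleeps after the game has already sampled input for the frame, which is one plausible reason it feels laggier than an in-game limiter that waits before input sampling.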


----------



## Jay1ty0

Hey guys, I've tried raising the voltage through MSI Afterburner, but to no avail.
I have a Sapphire Vapor-X R9 280X.
Max clocks I can push: 1145/1620 at 1.094 V (tried unlocking and raising the voltage, and nothing).
With luck, at 1.2 V I can get to 1200 MHz.

Edit: Solved with TriXX!
Going to try to reach 1.2 GHz :v

Edit 2:
1200/1700 seems to be my sweet spot at 1.150 V.
Higher voltage makes my VRM reach 72°C and throttle the card; still, I'm quite happy with what I've achieved!


----------



## JackCY

1145 at 1.094 V? Wow.

After I reinstalled the drivers and chose to install CCC again, ASUS GPU Tweak doesn't apply settings on system start, because CCC starts after it and overrides its settings. Again.

Afterburner has various voltage controls, but none worked for me as it should. You pretty much have to force the voltage via the various options it has so that it applies; the problem is it may force that voltage all the time, not only when it's needed in 3D.
Sapphire TriXX should work for you. Again, unless you have a conflict with CCC overriding your settings on each restart.

---

15.7 is nice, but today after I started TWA, all the clouds (transparent textures) cause artifacts, all the time.
The yellow/green weird shooting texture artifacts are hard to catch on screenshots, but they do show up on video capture.
Here is another one that was steady and could be captured on a screenshot; that cloud is simply messed up.

This happens at any clock and voltage...
Even at 850 MHz core, 5 GHz memory. Even with voltage down to 1.1 V.

It sure looks like a driver issue, and the card is also running at 99% usage when these things happen; dunno why, it usually doesn't run at 99% unless there are tons of those clouds. These clouds are not good in TWA and load the GPU a lot. At least I have some footage for an RMA if I decide to finally go that route.
These cards were never made to work, I'm afraid. Asus 280X DC2T.

Something loaded wrong, had other issues elsewhere too and elevated usage with MadVR. Strange driver.


----------



## Jay1ty0

Damn, maybe a fresh Windows install and 15.7? But it's too much work for one game only...

After a couple of Heaven 4.0 and Shadow of Mordor benchmark runs, I had to lower to 1200/1650 and bump the volts to 1.2 V.
No thermal throttling; I'm happy with these results.


----------



## Agent Smith1984

Quote:


> Originally Posted by *JackCY*
> 
> 1145 at 1.094V? wow
> 
> After I reinstalled drivers and chose to install CCC again, Asus Tweak doesn't apply settings on system start because CCC starts after it and overrides it's settings, again
> 
> 
> 
> 
> 
> 
> 
> 
> Afterburner has various voltage controls but none worked for me as it should. Pretty much you have to force the voltage via various options it has so it applies it, problem is it may force that voltage all the time not only when it is needed in 3D.
> Sapphire trixx should work for you. Again unless you have a conflict with CCC overriding your settings each restart.
> 
> ---
> 
> 15.7 nice, but today after I start TWA, all the time all the clouds = transparent textures, cause artifacts.
> Hard to catch the yellow, green, weird texture, shooting artifacts on screens but they do show on video capture.
> This is another that was steady and could be captured on screenshot. Simply that cloud is messed up.
> 
> 
> 
> This happens at any clock and voltage...
> Even at 850MHz core, 5GHz memory. Even with voltage down to 1.1V.
> 
> Sure looks like driver issue, and the card is also running 99% usage which is when these things happen, dunno why, usually it doesn't run 99% unless there are tons of those clouds. These clouds are not good in TWA, load the GPU a lot. At least I have some footage for RMA if it I decide to go that route finally.
> These cards were never made to work I'm afraid. Asus 280x DC2T.
> 
> 
> Something loaded wrong, had other issues elsewhere too and elevated usage with MadVR. Strange driver.


Hate to say it bro, but you'd better move on to another GPU.
Those cards have been problematic since day one!


----------



## JackCY

Yeah, but one only finds out after getting one.

This chip has been around for years, yet hardly anyone warns about it. Not that other GPU chips and cards didn't have their own issues; sometimes picking a GPU is like walking through a minefield.

OBS crashes when I click on the encoding tab, probably doesn't like the 15.7 drivers.

After a restart things seem to work fine, no artifacts except the OBS exits when trying to configure it.

Problem is, there really isn't a replacement card for the 280X.
The 285/380 are newer but slower.
The 960 is slower.

The 970 is 1.5x more expensive, as is the 390.
The 290 was always an expensive step up over the 280X, and even now, if still found in a shop, it's at 390 price levels.

There is nothing like a more powerful 960 Ti or 380X that would sit in the middle of the mainstream. Either get a crappy cheap 960/380 or spend a lot more for a 970/390. Nothing in between. And up from there it goes even higher, but nothing in the middle range where the 770 and 7970/280X were.

If I RMA it, assuming it can be done, there is no close replacement really. How do they handle that? Plus, they'll put it through a test and find no artifacts, because it's not something this card does 100% or even 50% of the time; just some bad luck of drivers being moody.


----------



## jason387

Quote:


> Originally Posted by *JackCY*
> 
> Yeah but one only finds out after getting one
> 
> 
> 
> 
> 
> 
> 
> 
> This chip has been around for years yet barely people warn about it. Not that other GPU chips and cards didn't have their own issues, sometimes it's like a minefield picking a GPU.
> 
> OBS crashes when I click on the encoding tab, probably doesn't like the 15.7 drivers.
> 
> After a restart things seem to work fine, no artifacts except the OBS exits when trying to configure it.
> 
> Problem is there really isn't a replacement card for 280x.
> 285/380 are newer but slower.
> 960 is slower.
> 
> 970 is 1.5x more expensive as is 390.
> 290 was always an expensive step up over 280x and even now if still found in a shop it's at the price levels of 390.
> 
> There is nothing like a more powerful 960Ti or 380x that would sit in the middle mainstream. Either get a crappy cheap 960/380 or spend a lot more for 970/390. Nothing in between. And up from there it goes even higher, but nothing in the middle range where 770 and 7970/280x were.
> 
> If I RMA it, assume it can be done, there is no close replacement really. How do they handle it? Plus they put through a test and find out no artifacts because it's not thing this card does 100% or even 50% of the time, just some bad luck of drivers being moody.


Truth is, a 285/380 isn't slower. That's a grave misconception.


----------



## Jay1ty0

Yup, I was going to get an R9 380 4GB, the G1 version. It was 260€, but I bought this 280X because I found a huge deal (180€); otherwise I would've gone the 380 route. Those new architectures sure are tempting, and with time, newer drivers/DX12 and better support, I believe it will surpass the 280X. The gap is closing really fast!


----------



## jason387

Quote:


> Originally Posted by *Jay1ty0*
> 
> Yup, I was going to get a R9 380 4Gb, G1 version. It was 260€, but I only bought this 280x because I found a huge deal (180€), otherwise I would've gone the 380 route. Those new architectures sure are tempting, and with time, newer drivers/dx12 and better support I believe it will surpass the 280x, the gap is bridging really fast!


Yeah it is in fact. There is a thread open here for discussion on it actually- http://www.overclock.net/t/1564304/r9-280-vs-r9-285


----------



## JackCY

At stock? 285/380 vs. 280X? I don't think so. Close, but not there in performance. They cut down the Tahiti chip and added color compression, better tessellation, FreeSync: improvements, yes, some in performance, but it didn't bring it to full Tahiti level.
Sure, 280 vs. 285/380 is another matter, which is what that thread is about, but not the 280X. Plus, you have to get the more expensive 4GB version of the 380, otherwise it's going to be VRAM starved even at 1080p with 2GB.
The 285/380 is equal to the 280.

---

VCE seems to be removed in 15.7.

Found out via OBS that people can actually open the encoding tab, but VCE cannot be selected; it's not supported/reported by the driver. Dunno if it's all cards or only some.
I don't see the VCE-based codec I had installed either; it's gone and doesn't show up in the list in Dxtory anymore.

Messed up VCE :/


----------



## radier

Nope.

Taptaptap Mlais M52


----------



## jason387

Quote:


> Originally Posted by *JackCY*
> 
> At stock? 285/380 vs 280x? Don't think so. Close but not there in performance. They cut down the Tahiti chip and added color compression, tessellation, freesync, improvements yes, some in performance but it didn't bring it to a full Tahiti level.
> Sure 280 vs 285/380 is another matter which the thread is about but not 280x. Plus one has to get the more expensive 4GB version of 380 otherwise it's gonna be VRAM starved even at 1080p with 2GB.
> 285/380 is equal to 280.


The 380 4GB is going to be priced at $200.


----------



## jason387

Quote:


> Originally Posted by *JackCY*
> 
> At stock? 285/380 vs 280x? Don't think so. Close but not there in performance. They cut down the Tahiti chip and added color compression, tessellation, freesync, improvements yes, some in performance but it didn't bring it to a full Tahiti level.
> Sure 280 vs 285/380 is another matter which the thread is about but not 280x. Plus one has to get the more expensive 4GB version of 380 otherwise it's gonna be VRAM starved even at 1080p with 2GB.
> 285/380 is equal to 280.


My overclocked 285-
http://www.3dmark.com/3dm11/10033012

http://www.3dmark.com/3dm/7693268


----------



## SabbathHB

My overclocked 280x for comparison.

http://www.3dmark.com/fs/3970038


----------



## Agent Smith1984

My old 280x at a mild 1150/1700
http://www.3dmark.com/fs/2915511

Then at my max clock of 1260/1800
http://www.3dmark.com/fs/3176256


----------



## jason387

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My old 280x at a mild 1150/1700
> http://www.3dmark.com/fs/2915511
> 
> Then at my max clock of 1260/1800
> http://www.3dmark.com/fs/3176256


The 285 does come quite close though.


----------



## JackCY

ASUS 280x DC2T
*Graphics score, Firestrike:*
7882 - 970/1600MHz
*8589 - 1070/1600MHz stock clocks*
9182 - 1140/1800MHz
9464 - 1170/1800MHz
9695 - 1210/1800MHz
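A quick way to sanity-check a scaling list like that is to compare each step's relative score gain to its relative clock gain. A throwaway calculation over the numbers above, nothing more:

```python
# Firestrike graphics score vs. core clock (MHz), from the list above
results = [(970, 7882), (1070, 8589), (1140, 9182), (1170, 9464), (1210, 9695)]

# For each adjacent pair of results, print the relative clock and score gains
for (c0, s0), (c1, s1) in zip(results, results[1:]):
    clock_gain = (c1 - c0) / c0
    score_gain = (s1 - s0) / s0
    print(f"{c0} -> {c1} MHz: clock +{clock_gain:.1%}, score +{score_gain:.1%}")
```

The last step gains only about 2.4% score for about 3.4% more core clock, noticeably worse scaling than the earlier steps.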


----------



## jason387

Quote:


> Originally Posted by *JackCY*
> 
> ASUS 280x DC2T
> *Graphics score, Firestrike:*
> 7882 - 970/1600MHz
> *8589 - 1070/1600MHz stock clocks*
> 9182 - 1140/1800MHz
> 9464 - 1170/1800MHz
> 9695 - 1210/1800MHz


The last score at 1210 looks rather low.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jason387*
> 
> The 285 does come quite close though.


Sure does, especially considering the 256 fewer shaders and the lower memory bandwidth.

If the R9 380 4GB card were closer to $199 I'd be hunting one for my wife's project, but I'll instead probably score a 290....

Unless of course they decide to launch a very compellingly priced 380x with the unlocked 2048 tonga, and 4GB?


----------



## jason387

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sure does, especially considering the 256 less shaders, and the slower memory bandwidth.
> 
> If the R9 380 4GB card were closer to $199 I'd be hunting one for my wife's project, but I'll instead probably score a 290....
> 
> Unless of course they decide to launch a very compellingly priced 380x with the unlocked 2048 tonga, and 4GB?


I'm having a feeling they will.


----------



## Agent Smith1984

Quote:


> Originally Posted by *jason387*
> 
> I'm having a feeling they will.


Yeah, I get the feeling they are waiting to get all the 290 stock off their hands that they can.

Then they'll release a 380X that's slower and costs $260... shifting everything up instead of down?

With the way all this rebranding breaks down these days, we don't get new GPUs and cheaper old ones.

We get high-dollar top-tier GPUs, and comparably priced old ones.


----------



## JackCY

We need a Fury Micro: a 50% cut-down Fiji, right at the 2048 shader count.


----------



## tabascosauz

Quote:


> Originally Posted by *JackCY*
> 
> We need a Fury Micro a 50% cut down Fiji
> 
> 
> 
> 
> 
> 
> 
> Right on the 2048 shader count.


No need for a Fury Micro if Tonga XT indeed exists. People over at AMD seem to have gone temporarily brain-dead again by making Tonga XT exclusive to the M295X and therefore Apple. Not that it should really offer a huge improvement over Tonga Pro, since it's still 256-bit GDDR5 and would rely heavily on delta color compression.


----------



## neurotix

I'd tend to agree- I wouldn't get a new R9 380, though the 4GB RAM is nice, if I were in the market for a card around $200 I'd definitely grab a used 290.

You can also get used 7870s for around $100.

Imo the R9 380 needs to be priced around $150 and the R7 370 around $100 for the level of performance they offer. Tonga (which is just a reworked Tahiti) and a weakened Pitcairn (the R7 370 is a 7850) are old chips; this is all 2012 tech, and they're asking too much for them new. The R7 370 (7850) still $150-$170 new? I'll take an eBay 7870 over that.

Never had any problems with used cards either. I bought a 7870 XT from Terrere here and it was great, no problems. R7 265 on Ebay for $100, 3 days old, with the box and all packaging. ($150 on Newegg). 7770 for $60. I think used is the way to go right now.

Warranty I really don't care about because if it's not over $200 it's not a big deal and I can buy another xD


----------



## JackCY

Exactly, but new buyers, especially the not tech-savvy, don't know the differences. They just see names and prices, with no idea they're actually buying a couple-of-years-old product that's just been rebranded.

Hence rebranding and stalling passes as a strategy for companies: using marketing to generate profit whether or not they have any innovation, and resources that could be used for new products. Development is expensive, but I think it's mostly been stalling tactics for the last few years overall. Focus and money have shifted to mobile devices: low-power chips that are getting more and more power hungry, bigger batteries, bigger "phones".

Nvidia: Maxwell.
AMD: Fiji.
Intel: HW, BW finally, and soon SL.
That's it, a summary of the last few years.
AMD hasn't been in good shape for a long time, and their portfolio overall is the way it is: outdated. Mostly they spawn consoles nowadays.


----------



## jason387

Quote:


> Originally Posted by *JackCY*
> 
> Exactly, but new buyers especially not tech savvy don't know the differences. They just see names and prices. No idea they are actually buying a couple years old product just rebranded
> 
> 
> 
> 
> 
> 
> 
> 
> Hence it passes as a strategy for companies to rebrand and stall. Using marketing to generate profit whether they have or not any innovation and resources that could be used for new products. Development is expensive but even I think it's just stalling tactics for the last few years overall. Focus and money shifted to mobile devices, low power chips that are getting more and more power hungry, bigger batteries, bigger "phones".
> 
> Nvidia, Maxwell.
> AMD, Fiji.
> Intel, HW, BW finally and soon SL.
> That's it a summary of the last few years.
> AMD was never in a good shape for a long time and their portfolio overall is the way it is, outdated. Mostly they spawn consoles nowadays.


I guess their main focus is the Fury and Fury X. Both seem rather decent. Performance may increase further when voltage control becomes accessible. Most users who are not very informed actually buy a GPU more for its novelty features than raw performance. What I've learnt is that raw power is good, but optimization on your side is what really counts.


----------



## kaspar737

Hey, I'm thinking about getting a used 280x but I've read about artifacting not on just one manufacturer but basically all of them. Can anyone explain why it is such a common issue on all 280X?


----------



## Agent Smith1984

Quote:


> Originally Posted by *kaspar737*
> 
> Hey, I'm thinking about getting a used 280x but I've read about artifacting not on just one manufacturer but basically all of them. Can anyone explain why it is such a common issue on all 280X?


To my knowledge it was primarily the Asus DCU cards, and some Sapphire Toxics in the mix too.

Most of the other cards performed as they should, and many were using their respective 7950/70 PCB prior to the rebrand.

Some cards had upgraded Hynix memory IC's which were then either overclocked too high, or overvolted causing issues and artifacts.

I would say avoid Asus at all cost, and if getting Sapphire, be wary of the Toxic.....


----------



## kaspar737

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To my knowledge it was primarily the Asus DCU cards, and some Sapphire Toxics in the mix too.
> 
> Most of the other cards performed as they should, and many were using their respective 7950/70 PCB prior to the rebrand.
> 
> Some cards had upgraded Hynix memory IC's which were then either overclocked too high, or overvolted causing issues and artifacts.
> 
> I would say avoid Asus at all cost, and if getting Sapphire, be wary of the Toxic.....


Thanks. Any cards you can recommend without a doubt?


----------



## Agent Smith1984

Quote:


> Originally Posted by *kaspar737*
> 
> Thanks. Any cards you can recommend without a doubt?


The XFX DD's and the Sapphire Vapor-X cards are both good choices to my knowledge.


----------



## kaspar737

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The XFX DD's and the Sapphire Vapor-X cards are both good choices to my knowledge.


All I currently found here was MSI and Club3D, any idea about those? I know MSI's VRM overheated but some claim it was taken care of with a BIOS update.


----------



## Catscratch

Quote:


> Originally Posted by *kaspar737*
> 
> Hey, I'm thinking about getting a used 280x but I've read about artifacting not on just one manufacturer but basically all of them. Can anyone explain why it is such a common issue on all 280X?


Quote:


> Originally Posted by *Agent Smith1984*
> 
> To my knowledge it was primarily the Asus DCU cards, and some Sapphire Toxics in the mix too.
> 
> Most of the other cards performed as they should, and many were using their respective 7950/70 PCB prior to the rebrand.
> 
> Some cards had upgraded Hynix memory IC's which were then either overclocked too high, or overvolted causing issues and artifacts.
> 
> I would say avoid Asus at all cost, and if getting Sapphire, be wary of the Toxic.....


Quote:


> Originally Posted by *kaspar737*
> 
> Thanks. Any cards you can recommend without a doubt?


Quote:


> Originally Posted by *Agent Smith1984*
> 
> The XFX DD's and the Sapphire Vapor-X cards are both good choices to my knowledge.


Avoid any Asus DCU 280X and the old Sapphire 280X Toxic. I suggest non-boost models.

I wish they had priced the 380 the same as the 280X, and the upcoming 380X where the plain 380 is now. As it stands, my only logical upgrade is loads of $$$ away from my 280X purchase.


----------



## neurotix

Quote:


> Originally Posted by *jason387*
> 
> I guess their main focus is the Fury and Fury X. Both seem to be rather decent. Performance may increase further when voltage becomes accessible. Most users who are not very informed actually buy a GPU more for their novelty features than raw performance. From what I've learnt is that raw power is good but optimization on your side is what really counts.


Fury X, they need to just make one without the stupid AIO cooler. I think this was a bad idea. I'm sure the board partners could make coolers that could keep them under 70C at stock. I love Crossfire and since I got it, I'm not looking back. However, mounting two 120mm rads in my case would be awful. And I have NO idea how you could even do quadfire with these things.

Fury is much better, and seems to offer pretty great performance, but it's priced too high.
Quote:


> Originally Posted by *kaspar737*
> 
> Hey, I'm thinking about getting a used 280x but I've read about artifacting not on just one manufacturer but basically all of them. Can anyone explain why it is such a common issue on all 280X?


I have a 7970 and it doesn't artifact.

Many people here have 280X, some with huge overclocks, and they don't artifact.

I don't know where you're getting your information, but you basically heard wrong.


----------



## Catscratch

Quote:


> Originally Posted by *neurotix*
> 
> Fury X, they need to just make one without the stupid AIO cooler. I think this was a bad idea. I'm sure the board partners could make coolers that could keep them under 70C at stock. I love Crossfire and since I got it, I'm not looking back. However, mounting two 120mm rads in my case would be awful. And I have NO idea how you could even do quadfire with these things.
> 
> Fury is much better, and seems to offer pretty great performance, but it's priced too high.
> I have a 7970 and it doesn't artifact.
> 
> Many people here have 280X, some with huge overclocks, and they don't artifact.
> 
> I don't know where you're getting your information, but you basically heard wrong.


Have you been to the Sapphire forums? Judging by the sample size, it's a big problem. There were people hunting for a non-boost BIOS, also for the 79xx. I had read about it beforehand; that's why I went with the non-boost Tri-X. I wanted the Asus DCUII, but the safest route was more appealing.


----------



## neurotix

I haven't.

I'll take a look at it later.

This is the first I've heard of it.

I know there were dud 7970s (Matrix Platinum) but afaik the 280x were fine.

It's funny I'm only hearing of this now, the 280x has been out for a while yet this is the first I'm hearing of any such issue. I know a lot of people here happy with their 280x.

I always buy Sapphire and I've not had many issues with them buying new cards from them.

If you buy some bargain bin crap like Powercolor Turboduo or Club3D, then that's probably the problem.

Buy a well reviewed card from a major manufacturer and I doubt you'll have these issues, and if buying new, you should be able to RMA it with Newegg or, after 30 days, with the company itself.

I mean, if you don't find issues with it within the first 30 days Newegg RMA period (or whatever other store), then it's probably fine. You should be able to figure it out pretty quick.

I have to wonder how many of these kinds of problems are caused by "user error", that is, too high of an overclock, using the card in the middle of a friggen desert, poor case airflow, dusty components etc. Too many variables to list here. But the main point is that people don't know what they're doing.

Of course, if you're really concerned about this, buy Nvidia. Seems like every month someone comes in this thread with the exact same question.


----------



## MiladEd

My 280X artifacts above 1130 MHz, regardless of voltage. It actually tends to artifact more if the voltage is higher.


----------



## neurotix

Yes, well, that's called the silicon lottery. Some chips overclock low, no matter the voltage. Every chip will top out at a certain frequency. Fact of the matter is, 1100mhz+ is not a bad overclock for Tahiti. The other fact is, if the card operates at the manufacturer's stock settings, then there is nothing wrong with the card and they can't be held responsible...overclocks are not guaranteed.

I've been in possession of 3 7970s and ALL have done 1200mhz no problem.

Really, if you're buying or interested in AMD GPUs, don't expect them to overclock like Maxwell/Nvidia. 10-20% is normal, and has been for many generations. The difference is that usually 1) AMD cards are generally cheaper and 2) at the same clocks they are faster in some instances than Nvidia cards in their tier (my 1100mhz 290s getting 10 more fps than SLI 970s at 1500mhz in Valley.) See also 960 vs the AMD cards it competes with.


----------



## jason387

So far, from what I've seen, the overclock scaling on Maxwell isn't that good. They may overclock like crazy, but the gains aren't as good as they should be for such a crazy, beastly overclock.


----------



## Phantasia

My XFX DD Black Edition used to artifact, especially in desktop mode or when watching YouTube videos.

I had to RMA it after some time...

The old one was v3.8; the new one I received after the RMA was v6.3.

Not a single artifacting issue, and it overclocks reasonably well. Voltage control is locked to my knowledge; I never tried to dig further into it.

Hope it helps.


----------



## gnemelf

Now see, I'm rocking two PowerColor 280X cards that I've had no issues with since they first released, when they were originally $299 at launch, and then when PowerColor redid them with the black PCB. They both work great and overclock just fine.


----------



## JackCY

FRTC: not working on DX9, and probably not on anything lower either.
VSR is moody too, especially when switching between a VSR fullscreen program and the native-res desktop; it messes up the window layout. Also, getting the VSR resolution to stick in some games is impossible: it resets to native after a game restart.


----------



## masteratarms

I've got a Kraken G10 and it's working really well on my EAH6850. I purchased a second-hand PowerColor 280X with a noisy fan and I'm going to use the Kraken on that.
http://www.overclock.net/t/1487012/official-nzxt-kraken-g10-owners-club/3870#post_24170430

It's a reference card with a 7970 PCB and 8+6-pin power connectors. What is the best BIOS? One with a +50 power limit? And what is the advantage of a non-boost BIOS? Should I make my own using VBE7?


----------



## diggiddi

A boost BIOS allows for a higher overclock (= better, right?), and the stock BIOS can act as a spare:
it's there so you don't brick your card with a modified BIOS; just switch back to it and carry on.


----------



## Mussels

Jumping over here from TechPowerUp, where we've got a bunch of threads about Gigabyte 280X cards being faulty. Personally, I'm on my fifth card in a month, as they keep sending me faulty cards under warranty.
The only thing I've found in common is that most of the people (not everyone listed their specs) are running socket 1155 i5/i7s, usually 2x00 chips (i5 2400 + i7 2600 here). I've been wondering if it's because these CPUs limit PCI-E 3.0 boards down to 2.0, and something weird in the board design doesn't like that?

The cards will randomly crash to a grey screen doing simple things like YouTube or idling in Chrome. The PCI-E bandwidth test in GPU-Z is the quickest way to make them crash, and Heaven in DX11 also causes crashes; it almost never completes a single run.

Two test systems showing the same symptoms:

i7 2600
16GB 1600MHz
sapphire 7970 3GB / gigabyte 280x rev 2.0
windows 8.1/10
corsair HX1000
asrock Z68 extreme3 gen3
HDMI -> sony 46" HDTV

i5 2400
16GB DDR3 1333Mhz
6870 crossfire/280x
windows 8.1/10
corsair TX750
gigabyte ga-z68A-D3-B3
DVI - samsung 24"

both systems give me grey screen of death, flickering lines, and after a few crashes in windows i get artifacts/issues in BIOS and loading windows, to the point the cards wont even load anymore.

I'm getting yet another replacement card on Monday. Anyone got suggestions on what to do to avoid this garbage? A non-boost BIOS?


----------



## JackCY

CF? Have fun, that's never-ending issues.

W10 isn't released and supported yet, it's a preview; again, have fun debugging and getting working drivers for your hardware.

Start simple and narrow it down. I wonder who bothers sending you replacements over and over.

---

FRTC makes Heaven stutter about every second when it's at the limiter, dropping it 10 fps below the limit each time. Even when running at only half the fps limit, the limiter gets hit on some frames and causes stutter yet again. ALT+TAB disabled FRTC; I can't check whether the program returned to exclusive fullscreen or not, but it did seem to do the usual blink.
FRTC and VSR seem plagued with issues.


----------



## murzyn

Quote:


> Originally Posted by *neurotix*
> 
> This just means you probably have an exceptionally poor card when it comes to overclocking. It's the same for CPUs and GPUs really, sometimes no matter how much voltage you give them, they won't go past a certain clock. What's your ASIC %?
> 
> You can, however, try lowering the memory clocks and then see if you can reach a higher core clock. Higher core clocks give many times more performance than memory clocks. Reduce your memory overclock and then try again.


My ASIC % is 62.5%. Is that good? IDK.
About lowering the memory clocks, I'm gonna check that out tomorrow.


----------



## tabascosauz

Quote:


> Originally Posted by *murzyn*
> 
> My ASIC % is 62.5%. Is that good? IDK.
> About lowering the memory clocks, I'm gonna check that out tomorrow.


Hasn't it already been confirmed by Sapphire and AMD that lower ASIC has more potential than higher ASIC, not the other way around? I can't find the specific thread, but Sapphire's statement said so.


----------



## neurotix

62.5% is pretty bad. That would explain not doing more than 1150MHz.

And tabasco, it pretty much depends on the cooling. Lower ASIC does higher clocks on water and LN2; higher ASIC does better clocks on air.

It doesn't always work this way, of course, and low or high ASIC doesn't always affect how well a card can overclock. (E.g. there are cards that will do high clocks on air with low ASIC, and cards that will do poor clocks on air with high ASIC.)

So ultimately, it doesn't mean all that much, because there are exceptions. However, in my experience owning and benching 10+ Radeon cards over the last few years, the ones with higher ASIC generally overclocked better. (And in this instance, I had a 7970 in my possession with around 62% ASIC too, and it overclocked like crap.)


----------



## Bartouille

Got my new 280. I'll just leave it here LOL


----------



## neurotix

Ouch, that might be the lowest I've ever seen...

Doesn't mean it won't overclock though.


----------



## masteratarms

Does anyone know where I can get a replacement 90mm fan for the TurboDuo? I am going to Kraken G10 it but I'd like to have a working stock cooler too. http://www.powercolor.com/download_jpg.asp?dirs=Eflyer&ids=R9-280XTD


----------



## Horsemama1956

Grabbed a Club3D 280X RoyalQueen for $130 on Kijiji. Wanted to grab a newer card this week instead, but the prices in Canada are insane and I don't care enough. 960s are $300 after tax, 2GB 380s around $300-350. The Furys are like $850 lol.


----------



## JackCY

Quote:


> Originally Posted by *Horsemama1956*
> 
> Grabbed a Club3D 280X RoyalQueen for $130 on Kijiji. Wanted to grab a newer card this week instead, but the prices in Canada are insane and I don't care enough. 960s are $300 after tax, 2GB 380s around $300-350. The Furys are like $850 lol.


In CAD it seems the same as US prices in USD.

Replacement fans for GPUs are all over ebay and aliexpress.


----------



## DiceAir

My ASIC quality is almost 60%, but I'm still unable to overclock above the 1100MHz stock clocks. I was wondering about modifying the BIOS to get a power limit of 50%; I can only go up to 20%. Will this help, or am I wasting my time? My TDP in the BIOS is 221W. I'm uploading my BIOS now so maybe someone can have a look.

I'm running 2 R9 280X RoyalKings.

https://www.dropbox.com/s/lip0yggpny6mgy9/Tahiti.rom?dl=0


----------



## Dhoulmagus

Quote:


> Originally Posted by *DiceAir*
> 
> My ASIC quality is almost 60%, but I'm still unable to overclock above the 1100MHz stock clocks. I was wondering about modifying the BIOS to get a power limit of 50%; I can only go up to 20%. Will this help, or am I wasting my time? My TDP in the BIOS is 221W. I'm uploading my BIOS now so maybe someone can have a look.
> 
> I'm running 2 R9 280X RoyalKings.
> 
> https://www.dropbox.com/s/lip0yggpny6mgy9/Tahiti.rom?dl=0


I'm really not sure. I don't see what extending your power limit beyond 20% is realistically going to do for you unless you're talking about massive voltages, though I could be wrong. I also believe that can be done with a small change to the CCC registry entries, no need for a custom BIOS; again, I'm not sure. If you aren't running FurMark and putting the card on a suicide mission, it should be able to overclock at +20% like most of us use.

The conversation about asic quality prompted me to test mine out:

I'm running an Asus 280X Direct CU II
GPU-Z Reports an ASIC quality of 60.3%

I'm currently running an overclock of 1180 core, 1650 memory, power limit set to 20%. Edit: Stock voltage

It's rock-solid stable after a few hours of testing for me. I've run multiple passes of Heaven (DX11, Ultra, Extreme tess, 8x AA, 1920x1080) with no artifacts, and shockingly, after at least 5 benches at the current clocks the card reached an absolute max temp of 60C @ 72% fan (on air, stock TIM).

Just in case anybody is curious (Heaven results): all done with a stock 4790K.
Stock (1070/1600) scores 948
1125/1600 score: 986
1160/1600 score: 1006
1180/1600 score: 1020
1180/1650 score: 1026 -- Typical evidence that GPU frequency is everything on these.

Haven't tested any higher, pretty sure it will crash at 1200. I'll go up to 1300mV and see what this specific one can do when I have some more time.

Just sharing this because the ASIC quality is quite "low" but my stock-voltage overclock seems a bit above average (I believe). If anything, I would guess ASIC quality may come into play as you overvolt and have more or less leakage, but it's still too hazy to define.

Just checked my other PC running a 7950... ASIC quality 79.4%. I guess since that's technically a 280, I'll see what it can handle as well.
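For anyone who wants to sanity-check the "GPU frequency is everything" observation, here's a rough Python sketch comparing score gain to clock gain; the runs list is just the Heaven numbers copied from this post, nothing else is measured:

```python
# Rough scaling check on the Heaven numbers above: score gain vs. core
# clock gain, with memory fixed at 1600. Runs copied from this post.
runs = [(1070, 948), (1125, 986), (1160, 1006), (1180, 1020)]

base_clock, base_score = runs[0]
for clock, score in runs[1:]:
    clock_gain = (clock - base_clock) / base_clock * 100
    score_gain = (score - base_score) / base_score * 100
    print(f"{clock} MHz: +{clock_gain:.1f}% clock -> +{score_gain:.1f}% score "
          f"(ratio {score_gain / clock_gain:.2f})")
```

The ratio comes out around 0.7, so the core clock gains don't translate one-for-one into score, but they're clearly where most of the performance is.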


----------



## DiceAir

Quote:


> Originally Posted by *Serious_Don*
> 
> I'm really not sure, I don't see what extending your power limit beyond 20% is realistically going to do for you unless you're talking about massive voltages, I could be wrong. I also believe that can be done with a little change to CCC registry entries, no need for a custom bios, again I'm not sure. If you aren't running furmark and running the card on a suicide mission it should be able to overclock on +20% like most of us use.
> 
> The conversation about asic quality prompted me to test mine out:
> 
> I'm running an Asus 280X Direct CU II
> GPU-Z Reports an ASIC quality of 60.3%
> 
> I'm currently running an overclock of 1180 core, 1650 memory, power limit set to 20%. Edit: Stock voltage
> 
> It's rock-solid stable after a few hours of testing for me. I've run multiple passes of Heaven (DX11, Ultra, Extreme tess, 8x AA, 1920x1080) with no artifacts, and shockingly, after at least 5 benches at the current clocks the card reached an absolute max temp of 60C @ 72% fan (on air, stock TIM).
> 
> Just in case anybody is curious (Heaven results): all done with a stock 4790K.
> Stock (1070/1600) scores 948
> 1125/1600 score: 986
> 1160/1600 score: 1006
> 1180/1600 score: 1020
> 1180/1650 score: 1026 -- Typical evidence that GPU frequency is everything on these.
> 
> Haven't tested any higher, pretty sure it will crash at 1200. I'll go up to 1300mV and see what this specific one can do when I have some more time.
> 
> Just sharing this because the ASIC quality is quite "low" but my stock-voltage overclock seems a bit above average (I believe). If anything, I would guess ASIC quality may come into play as you overvolt and have more or less leakage, but it's still too hazy to define.
> 
> Just checked my other PC running a 7950... ASIC quality 79.4%. I guess since that's technically a 280, I'll see what it can handle as well.


Maybe I've reached my max clocks. Whenever I overclock, the card either artifacts or crashes. Funny thing is I tested both cards, and I must say Club3D sucks when it comes to overclocking. I learned that the highest stock clocks don't always indicate how a card will overclock. Maybe it's just an unlucky batch of cards, or my specific setup.


----------



## Dhoulmagus

Quote:


> Originally Posted by *DiceAir*
> 
> Maybe I've reached my max clocks. Whenever I overclock, the card either artifacts or crashes. Funny thing is I tested both cards, and I must say Club3D sucks when it comes to overclocking. I learned that the highest stock clocks don't always indicate how a card will overclock. Maybe it's just an unlucky batch of cards, or my specific setup.


I have no experience with that brand. I just took a look at the reviews on Newegg, and they all rate them very poorly for serious thermal (heat) issues and crashing problems. Have you monitored your temps under heavy stress (Heaven benchmark, etc.)? I was going to ask if you've tried bumping up the voltage a bit to see what happens, but if yours are anything like the reviewers' I would suspect they're blazing hot and extra volts would just be bad news. Looks like they may just be poor cards for overclocking.


----------



## DiceAir

Quote:


> Originally Posted by *Serious_Don*
> 
> I have no experience with that brand. I just took a look at the reviews on Newegg, and they all rate them very poorly for serious thermal (heat) issues and crashing problems. Have you monitored your temps under heavy stress (Heaven benchmark, etc.)? I was going to ask if you've tried bumping up the voltage a bit to see what happens, but if yours are anything like the reviewers' I would suspect they're blazing hot and extra volts would just be bad news. Looks like they may just be poor cards for overclocking.


I tried bumping up the voltage and the temps aren't that bad; it only goes to like 83C at 100% fan speed. Remember, I'm running 2 of them in CrossFire. Bumping up the voltage while having the power limit at 20% doesn't help; I just can't seem to overclock this card. At least price-wise it was one of the cheaper cards I could buy, so I can't really complain.

I was also wondering if it's because they use a reference board, so the components aren't that high quality and the board can only handle so much.


----------



## murzyn

Quote:


> Originally Posted by *DiceAir*
> 
> I tried bumping up the voltage and the temps aren't that bad; it only goes to like 83C at 100% fan speed. Remember, I'm running 2 of them in CrossFire. Bumping up the voltage while having the power limit at 20% doesn't help; I just can't seem to overclock this card. At least price-wise it was one of the cheaper cards I could buy, so I can't really complain.
> 
> I was also wondering if it's because they use a reference board, so the components aren't that high quality and the board can only handle so much.


You are not alone bro








I saw people here who had 1250MHz on the core without any problem, and here I am with artifacts/crashes/BSODs at 1150+ even at 1.3V and with underclocked memory...


----------



## Jay1ty0

Didn't know about that ASIC stuff.
GPU-Z reports 77.7%; the max I've tried overclocking (stable) was 1200/1680 at 1.2V.
1.3V is "safe", right?
Being in an overclocking forum, I'm bringing great disgrace to my family for not pushing more clocks.


----------



## Dhoulmagus

Quote:


> Originally Posted by *Jay1ty0*
> 
> Didn't know about that ASIC stuff.
> GPU-Z reports 77.7%; the max I've tried overclocking (stable) was 1200/1680 at 1.2V.
> 1.3V is "safe", right?
> Being in an overclocking forum, I'm bringing great disgrace to my family for not pushing more clocks.


1200/1680 is a hell of a boost.

1.3V depends, and "safe" is a bit arbitrary when it comes to overvolting; transistors age and wear out. Your number one enemy is heat: if you stay nice and cool, you can assume the card will still have a nice long life, but there's never a guarantee. What are your temps like under full load at 1.2V? Typically 1.3 is the highest you'll want to run these at on air (in a case with adequate airflow); water can take you a bit higher. It all depends on your temps. Bump her up 25mV at a time, push your core as far as it will go each time, see where your temps settle under stress, and keep an eye on your VRM temps in GPU-Z too. I prefer to just leave the Heaven benchmark looping to gauge my stress temps; it should lock you in at 99% load without putting the card on a FurMark suicide run.

As for what temps are safe, I would suspect under 85C should be fine; these GPUs throttle at 95, I believe. My personal preference is to stay below 70C, but you can safely go beyond that; I'm just supremely anal about my temps and sacrifice a bit of performance to stay nice and cool.

Edit: I also personally like to set the memory frequency to default and see how far I can push the GPU first; performance gains are a bit wider on the GPU vs. the memory.
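If it helps, the "25mV at a time" routine above boils down to a simple step plan. This little Python sketch just generates the voltages you'd try; the 1200mV start and 1300mV air ceiling are the figures from this post, and nothing here touches the card:

```python
# Hedged sketch of the stepping routine described above: list the voltage
# steps between stock and a chosen air-cooling ceiling. At each step you'd
# push the core until unstable, then check temps before moving on.
def voltage_steps(start_mv=1200, ceiling_mv=1300, step_mv=25):
    steps = []
    v = start_mv
    while v <= ceiling_mv:
        steps.append(v)
        v += step_mv
    return steps

print(voltage_steps())  # [1200, 1225, 1250, 1275, 1300]
```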


----------



## Bartouille

Quote:


> Originally Posted by *murzyn*
> 
> You are not alone bro
> 
> 
> 
> 
> 
> 
> 
> 
> I saw people here who had 1250MHz on the core without any problem, and here I am with artifacts/crashes/BSODs at 1150+ even at 1.3V and with underclocked memory...


Just letting you know, with boost cards in Afterburner you have to go into the settings and tell AB your card is a boost card. Otherwise you will see the non-boost voltage, and even though you can move the slider, the voltage will never get applied. Just to make sure, you can go into GPU-Z and monitor the voltage; run the render test in GPU-Z. With 1.3V set you should be getting >1.25V actual voltage.


----------



## Jay1ty0

During Heaven 4.0 (I ran it like 8 times straight), the max I saw was probably 65ºC with a custom fan profile.
The problem is the VRM temps; they hit like 75ºC, and with the 015.041.xxxx BIOS these cards throttle over 72ºC VRM temp, so it seems.
I could probably reach higher clocks (for the sake of speculation, let's say 1300MHz), but it wouldn't be usable for gaming due to the VRM temperatures being high :/
The previous 015.039. bypassed the throttling, but from what I've read you can't flash back to old BIOSes, and I'm afraid of bricking this card.


----------



## tonyjones

I just bought the 270X to run triple monitors, all different ones, and now I find out one of the monitors has to use DisplayPort on the 270X to be able to run all three lol, so odd.


----------



## murzyn

Quote:


> Originally Posted by *Bartouille*
> 
> Just letting you know, with boost cards in Afterburner you have to go into the settings and tell AB your card is a boost card. Otherwise you will see the non-boost voltage, and even though you can move the slider, the voltage will never get applied. Just to make sure, you can go into GPU-Z and monitor the voltage; run the render test in GPU-Z. With 1.3V set you should be getting >1.25V actual voltage.


I edited the BIOS to get 1.3V under stress; I'm not doing anything with voltage in MSI Afterburner.


----------



## DiceAir

Quote:


> Originally Posted by *murzyn*
> 
> You are not alone bro
> 
> 
> 
> 
> 
> 
> 
> 
> I saw people here who had 1250MHz on the core without any problem, and here I am with artifacts/crashes/BSODs at 1150+ even at 1.3V and with underclocked memory...


OK, just tried again. I played about 4 rounds of BF4 on high settings at 2560x1440 @ 96Hz and it wasn't crashing whatsoever. I had my core at 1140MHz and memory at 1540MHz, so I will leave it at that until it crashes, and if it doesn't, I'm fine.


----------



## Dhoulmagus

Quote:


> Originally Posted by *Jay1ty0*
> 
> During Heaven 4.0 (I ran it like 8 times straight), the max I saw was probably 65ºC with a custom fan profile.
> The problem is the VRM temps; they hit like 75ºC, and with the 015.041.xxxx BIOS these cards throttle over 72ºC VRM temp, so it seems.
> I could probably reach higher clocks (for the sake of speculation, let's say 1300MHz), but it wouldn't be usable for gaming due to the VRM temperatures being high :/
> The previous 015.039. bypassed the throttling, but from what I've read you can't flash back to old BIOSes, and I'm afraid of bricking this card.


I've never heard of a card that can't revert to a previous BIOS. What's probably happening is that people are flashing a BIOS that was meant for another PCB revision without verifying that it matches their GPU's hash code (on a sticker on the PCB); a mismatched BIOS will cause a no-POST.

Do you have a 280X? Your sig says 380X, and I can't even find a 380X on the market yet!

The VRM on my 280X seems to stabilize at 55C during Heaven runs; that's quite a gap.







Any chance you can squeeze in an extra fan to move air around the VRM area and cool that sucker off? The more fans the merrier; I added two fans to the bottom of my case and two higher-powered ones to the front, and my GPU temps are fantastic as a result.


----------



## Dhoulmagus

Quote:


> Originally Posted by *tonyjones*
> 
> I just bought the 270x to run triple monitors, all different ones and now I find out one of the monitors have to use display port on the 270x to be able to run all three lol so odd


You mean for Eyefinity? It's been that way for a long time; just grab an active DP to DVI adapter and you're golden. Yeah, it is kind of silly though; I had to buy one when I had CFX 5770s.


----------



## Jay1ty0

Damn, that's a really huge typo in my sig! It's a 280X for sure!








This NZXT doesn't allow any more fans without crazy ghetto mods







Maybe better-airflow case fans would help, but that's still a big investment.
Still, I'm not sure where I would point a new fan on the cooler; even with the fans at 100%, while the core temps go down like crazy, the VRM temps aren't that affected. It's also very hot this time of year here in Portugal; hopefully in the winter, if I move the case near a window, things will change.








What BIOS "editor" would you recommend for reverting the BIOS?

Edit: Ran Heaven 4.0
Score: 1032
Clocks: 1200/1680
Fan fixed at 60%; max GPU temp 70ºC, max VRM 77ºC


----------



## dfg555

I was wondering what software to use to edit BIOS files. This BIOS http://www.techpowerup.com/vgabios/152468/sapphire-r9270x-2048-131209.html crashes for me after the "Starting Windows" screen has finished; a BSOD shows up pointing to an AMD driver file that crashed. I have tried reverting to older drivers by uninstalling with DDU, but it still crashes. I flashed my BIOS through DOS; now if only I could edit the factory BIOS file to set an increased power limit and voltage configuration. I'm currently reverted back to my factory BIOS.
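One hedged sanity check you can run on an edited ROM before flashing, assuming the file is a standard legacy PCI option ROM image (VBIOS files normally are, and editors usually recompute the checksum for you; this only verifies it):

```python
# Hedged sketch: sanity-check a VBIOS file before flashing. Legacy PCI
# option ROMs start with the 0x55 0xAA signature, state their length at
# offset 2 (in 512-byte blocks), and the bytes of that image must sum
# to 0 mod 256. A failing check means the file was corrupted or edited
# without the checksum byte being fixed up.
def check_vbios(data: bytes) -> bool:
    if len(data) < 3 or data[0] != 0x55 or data[1] != 0xAA:
        return False                 # missing option-ROM signature
    image_len = data[2] * 512        # declared image size
    if image_len > len(data):
        return False                 # truncated file
    return sum(data[:image_len]) % 256 == 0

# Usage: check_vbios(open("Tahiti.rom", "rb").read())
```

It won't tell you the BIOS matches your PCB revision, only that the image isn't obviously mangled.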


----------



## Mussels

Quote:


> Originally Posted by *Bartouille*
> 
> Just letting you know, with boost cards in Afterburner you have to go into the settings and tell AB your card is a boost card. Otherwise you will see the non-boost voltage, and even though you can move the slider, the voltage will never get applied. Just to make sure, you can go into GPU-Z and monitor the voltage; run the render test in GPU-Z. With 1.3V set you should be getting >1.25V actual voltage.


How do you do that (the boost setting)? It might help my troubleshooting when my 280X comes back from warranty (again) this week.


----------



## Bartouille

Quote:


> Originally Posted by *Mussels*
> 
> How do you do that (the boost setting)? It might help my troubleshooting when my 280X comes back from warranty (again) this week.


----------



## Mussels

Quote:


> Originally Posted by *Bartouille*


The text on that is different for my 7970, asking if it's a GHz Edition, which explains why I couldn't find it; it must vary between cards.

Thanks that might help a lot.


----------



## Bartouille

Quote:


> Originally Posted by *neurotix*
> 
> Ouch, that might be the lowest I've ever seen...
> 
> Doesn't mean it won't overclock though.


I bought two of them. This was the first one I tried; the second one also has an extremely low ASIC (53%). They overclock exactly the same. The voltage scaling is nearly perfect:
1100MHz @ 1.2V (stock)
1200MHz @ 1.3V
1250MHz @ 1.35V

Since these cards have such low ASIC, I'm not very scared to push a lot of voltage on them; they can take it just fine.
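Quick Python sketch showing just how linear that scaling is, in MHz gained per volt; the clock/voltage pairs are copied straight from this post:

```python
# How linear is the scaling? MHz gained per volt between each step
# (clock in MHz, core voltage in V, copied from the post above).
points = [(1100, 1.20), (1200, 1.30), (1250, 1.35)]
for (c0, v0), (c1, v1) in zip(points, points[1:]):
    print(f"{c0} -> {c1} MHz: {(c1 - c0) / (v1 - v0):.0f} MHz per volt")
```

Both intervals come out at roughly 1000 MHz per volt, which is what "nearly perfect" scaling looks like.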


----------



## murzyn

Can you guys tell me how I can choose the proper BIOS for my GPU? My stock BIOS was 015.038.000.003.003435; should I update it to 015.041.000.003.000000?


----------



## radier

Ask fairy...



----------



## Jay1ty0

Nope, update to 015.039.xxxx if you can. 015.041. makes the graphics card throttle over 72ºC VRM temperatures.


----------



## Catscratch

I've never updated a graphics card BIOS before; my non-boost card seems to have the 015.046 BIOS. I tried to upload it to the GPU-Z database, but it says it's already uploaded; yet on the website, nothing shows up when searching 015.046. Probably the same exact BIOS with a different number. This was like 5 months ago; I haven't checked recently.

Then again, I never checked if the GPU is throttling. I can't tell in WoW; I should check while playing Witcher 3.


----------



## Mussels

My 4th 280X is back from warranty today... wish me luck XD


----------



## JackCY

Quote:


> Originally Posted by *Jay1ty0*
> 
> Nope, update to 015.039.xxxx if you can. 015.041. makes the graphics card throttle over 72ºC VRM temperatures.


You mean reference 280x?


----------



## Catscratch

Quote:


> Originally Posted by *Mussels*
> 
> my 4th 280x is back from warranty today... wish me luck XD


Good luck. If you keep going, they may have to give you an R9 380.







I don't think they'll have 280Xs anymore. BTW, could you fill out your rig details and include them in your signature?


----------



## Mussels

Quote:


> Originally Posted by *Catscratch*
> 
> Good luck. If you keep going, they may have to give you R9 380
> 
> 
> 
> 
> 
> 
> 
> I don't think they'll have 280x anymore. Btw, could you fill out rig details and include it in your signature ?


I cleaned out my old ones; it will take a bit to add the updated specs. Without that modded BIOS from here, I don't think the card would work at all and I'd be stuck in RMA hell.


----------



## Catscratch

Quote:


> Originally Posted by *Mussels*
> 
> I cleaned out my old ones; it will take a bit to add the updated specs. Without that modded BIOS from here, I don't think the card would work at all and I'd be stuck in RMA hell.


You gotta love the community; otherwise there wouldn't be solutions like that. I've never had a card that was DOA or problematic before. However, I always think these cards have issues related to power: power states, or generally a bad power feed. I'm always suspicious of power, because I remember the days when you needed a power plant for an ATI 2900 XT.

I remember the days when you had to look for "continuous" power, not "max", on PSUs, and calculate the amps for a rig.

Believe it or not, I still remember the slight lag of having my 1090T on CnQ; at full speed I'd notice the snappier response. (And it caused some drivers to BSOD.)

So when I read about problems with electronics, since most now have power states to conserve power, I always think first of inadequate power regulation. Second is the usual mobo vs. graphics card incompatibility, including PCI-E version. There was only one 100% incompatibility incident (the cards or mobos had specific warnings on them), but that's well behind us from the 2.0 vs. 3.0 era.


----------



## Mussels

Quote:


> Originally Posted by *Catscratch*
> 
> You gotta love the community; otherwise there wouldn't be solutions like that. I've never had a card that was DOA or problematic before. However, I always think these cards have issues related to power: power states, or generally a bad power feed. I'm always suspicious of power, because I remember the days when you needed a power plant for an ATI 2900 XT.
> 
> I remember the days when you had to look for "continuous" power, not "max", on PSUs, and calculate the amps for a rig.
> 
> Believe it or not, I still remember the slight lag of having my 1090T on CnQ; at full speed I'd notice the snappier response. (And it caused some drivers to BSOD.)
> 
> So when I read about problems with electronics, since most now have power states to conserve power, I always think first of inadequate power regulation. Second is the usual mobo vs. graphics card incompatibility, including PCI-E version. There was only one 100% incompatibility incident (the cards or mobos had specific warnings on them), but that's well behind us from the 2.0 vs. 3.0 era.


The particular card I've got (Gigabyte WindForce) had a good reputation that soured super fast. Like many 280Xs, they seemed to be overclocked beyond what they could handle; it's like they worked on the original drivers, but as time went on the drivers used more of the GPU and the cards simply couldn't handle it.

Now I've just got an issue with MPC-HC freezing videos if I pause them, but I think that's a Catalyst 15.7/Win10/CrossFire bug and not a hardware issue.


----------



## JackCY

I get MPC-HC with madVR crashing here and there when opening a video, especially a new one while the old one is playing. Or it starts and shows "play", but the whole thing has stopped. Something is broken in MPC-HC/madVR/LAV, or a combination of them all. I might test it when I'm not playing, with some super-stable GPU clocks, but I doubt that will help; if the GPU is unstable it will crash/restart the driver while playing, not when opening a video file.
It was present before 15.7. No 2-way, no Win10.


----------



## Mussels

Quote:


> Originally Posted by *JackCY*
> 
> I get MPC-HC with madVR crashing here and there when opening a video, especially a new one while the old one is playing. Or it starts and shows "play", but the whole thing has stopped. Something is broken in MPC-HC/madVR/LAV, or a combination of them all. I might test it when I'm not playing, with some super-stable GPU clocks, but I doubt that will help; if the GPU is unstable it will crash/restart the driver while playing, not when opening a video file.
> It was present before 15.7. No 2-way, no Win10.


No madVR here; plain MPC-HC and LAV has the issue.

No problem without CrossFire, but with CrossFire the image just freezes if it's left paused.

Also noticed that the second GPU powers on as soon as MPC-HC is opened, regardless of video playback; telling CCC not to use CrossFire for it doesn't help. Wish I knew how to stop that.


----------



## jason387

Is the power consumption tab in GPU-Z accurate?


----------



## JackCY

Accurate to what degree? Even on CPUs it's not accurate. Measuring amperage and power is more complex than measuring voltage, especially at the currents a CPU/GPU draws.


----------



## cruisx

So I just bought an R9 280X Toxic Edition from Kijiji, brand new in box, and it appears to be showing artifacts when gaming. It seems that many users have had this problem, after searching around for a bit.

1) Does anyone know of a BIOS I can flash that would fix the artifact issues on an R9 280X Toxic?

2) Also, does anyone know if Sapphire can do an RMA without the original receipt? The person I bought it from said it was gifted to him but he never got around to using it.

Wish I had known about the issue before I bought it; I can't even sell the card to anyone now, knowing it has issues =(


----------



## Tacoboy

Quote:


> Originally Posted by *cruisx*
> 
> So I just bought an R9 280X Toxic Edition from Kijiji, brand new in box, and it appears to be showing artifacts when gaming. It seems that many users have had this problem, after searching around for a bit.
> 1) Does anyone know of a BIOS I can flash that would fix the artifact issues on an R9 280X Toxic?
> 2) Also, does anyone know if Sapphire can do an RMA without the original receipt? The person I bought it from said it was gifted to him but he never got around to using it.
> Wish I had known about the issue before I bought it; I can't even sell the card to anyone now, knowing it has issues =(


Any chance the person who sold it to you may not have been 100% truthful, such as knowing the graphics card had issues?

Chances are Sapphire will not help you (no receipt).

Guess you might need to buy another R9 280X.


----------



## jason387

Quote:


> Originally Posted by *JackCY*
> 
> Accurate to what degree? Even on CPUs it's not accurate. Measuring amperage and power is more complex than measuring voltage. Especially at the draws CPU/GPU have.


GPU-Z gives a certain power consumption value for the GPU. Does yours give you a true picture?


----------



## JackCY

Quote:


> Originally Posted by *cruisx*
> 
> So I just bought an R9 280X Toxic Edition from Kijiji, brand new in box, and it appears to be showing artifacts when gaming. It seems that many users have had this problem, after searching around for a bit.
> 
> 1) Does anyone know of a BIOS I can flash that would fix the artifact issues on an R9 280X Toxic?
> 
> 2) Also, does anyone know if Sapphire can do an RMA without the original receipt? The person I bought it from said it was gifted to him but he never got around to using it.
> 
> Wish I had known about the issue before I bought it; I can't even sell the card to anyone now, knowing it has issues =(


Kijiji what?

1) No, it's not a VBIOS issue. It's either a faulty chip whose faulty sections were not disabled (and the chip downgraded for use in a 280; the 280X has a full, uncut chip, and they simply let too many through to satisfy demand), or sometimes it's the VRAM being volted too high or run too tight.
2) No, not without a receipt, at least in the EU.
Quote:


> Originally Posted by *jason387*
> 
> GPU-Z gives a certain power consumption value for the GPU. Does yours give you a true picture?


I don't get a power reading in GPU-Z; I get more readings in HWiNFO.
Proper power measurements are done during some reviews, either by estimating (Guru3D) or by actually measuring the PEG slot and 12V feeds (Tom's HW).

I rarely ever see my power in HWiNFO go above 120W; usually the max is around 100W. I bet that is pretty unrealistic for a 280X.
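A rough, purely illustrative sketch of why a software readout can sit near 100W while the board draws far more: many sensors only report core VRM output (VDDC x current) and miss the memory rail, fan, and conversion losses. Every number below is an assumption for illustration, not a measurement:

```python
# Hedged sketch of why software "power" readings can undershoot: the
# sensor often reports only core VRM output, ignoring other rails and
# conversion losses. All figures here are illustrative assumptions.
vddc = 1.2           # core voltage in V (assumed)
core_current = 85.0  # amps at the core VRM output (assumed)
core_power = vddc * core_current             # what the sensor would show

vrm_efficiency = 0.88                        # typical buck-converter figure
other_rails = 35.0                           # VRAM, fan, PLLs, etc. (assumed)
board_draw = core_power / vrm_efficiency + other_rails

print(f"sensor: {core_power:.0f} W, board draw closer to {board_draw:.0f} W")
```

So a ~100W sensor reading and a ~150W (or more) real board draw aren't contradictory; only a PEG-plus-12V measurement like Tom's does catches the whole card.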


----------



## tabascosauz

Quote:


> Originally Posted by *cruisx*
> 
> So I just bought an R9 280X Toxic Edition from Kijiji, brand new in box, and it appears to be showing artifacts when gaming. It seems that many users have had this problem, after searching around for a bit.
> 
> 1) Does anyone know of a BIOS I can flash that would fix the artifact issues on an R9 280X Toxic?
> 
> 2) Also, does anyone know if Sapphire can do an RMA without the original receipt? The person I bought it from said it was gifted to him but he never got around to using it.
> 
> Wish I had known about the issue before I bought it; I can't even sell the card to anyone now, knowing it has issues =(


I'm going to go with crappy VRAM. R9 cards are not known for memory overclockability, seeing as most use Elpida chips, so a "factory OC" on the VRAM is a recipe for disaster. For example, my 280X is under x31 with no voltage increase but still artifacts occasionally in GTA V; the VRAM is probably the culprit.

Well, there's nothing we can do about it. The seller was probably not truthful, and Sapphire has the most stringent/ridiculous RMA policies among AMD board partners. You could try downclocking it, and maybe go for a BIOS edit and an undervolt, but it's up to you.


----------



## Tacoboy

Quote:


> Originally Posted by *cruisx*
> 
> So I just bought an R9 280X Toxic Edition from Kijiji, brand new in box, and it appears to be showing artifacts when gaming. It seems that many users have had this problem, after searching around for a bit.
> 
> 1) Does anyone know of a BIOS I can flash that would fix the artifact issues on an R9 280X Toxic?
> 
> 2) Also, does anyone know if Sapphire can do an RMA without the original receipt? The person I bought it from said it was gifted to him but he never got around to using it.
> 
> Wish I had known about the issue before I bought it; I can't even sell the card to anyone now, knowing it has issues =(


Did you disable the motherboard's on-board graphics in the BIOS?

Is that a Yamakasi DS270 monitor you have?


----------



## Phantasia

I'm at a crossroads...

I have my XFX DD BE 280X running fine, but I was pondering getting a 390, a GTX 970, or even going the cheapest route: getting another 280X used and crossfiring both.

How do you reckon a CrossFire of 280Xs would fare against the two alternatives? Would it scale well for 1080p gaming? I mean for Witcher 3 and GTA V, etc...


----------



## flopper

Quote:


> Originally Posted by *Phantasia*
> 
> I'm at a crossroads...
> 
> I have my XFX DD BE 280X running fine, but I was pondering getting a 390, a GTX 970, or even going the cheapest route: getting another 280X used and crossfiring both.
> 
> How do you reckon a CrossFire of two 280Xs would fare against the two alternatives? Would it scale well for 1080p gaming? I mean for Witcher 3, GTA V, etc...


Avoid CrossFire/SLI at all costs.
If you need more FPS, go for a single, more expensive card.

A single card offers peace of mind: it always works and never has issues when games come out.
As a gamer I just want stuff to work; I tried CrossFire and I'm never going back.
A single card gives the best gaming experience, and isn't that what we all want?

For example: doubling the FPS of a single 380 costs 400 more euros to go to a Fury/980 Ti.
I'd rather lower a few settings or buy an enthusiast card before going CrossFire.


----------



## JackCY

The 280X has no problem with GTA V @ 1200p.
Sure, Hawaii and GM204 have more power, but bought new they cost ridiculously much instead of being a replacement for the 280X/700 class of cards.

I would rather sell a 280x and buy a 970 than get another 280x to CF.


----------



## jason387

GTA 5 is very well optimized. Apart from shadows and MSAA, I can max it out with a single R9 285 at 1440p. FPS is between 40 and 60 depending on the scene.


----------



## Phantasia

Gotcha thanks for the replies.

I will prolly just keep it like this for now; the new cards really don't appeal much to me given the difference they'd make over the 280X for the price of them new... That's the only reason I considered another used 280X for CrossFire, or even a 7970.

Maybe I can get a good deal on a used 970 or even a 290X; that would be worth considering in my opinion.

thanks once more.


----------



## jason387

I'll be getting another R9 285 in a couple of days. Will let you know how crossfire works out for me


----------



## Phantasia

Let me know how it goes, I might be asking you for some numbers


----------



## mutatedknutz

I have a small doubt: will DX12 improve performance on a major or minor scale? That would be the only reason I would switch to Windows 10.


----------



## GoLDii3

Quote:


> Originally Posted by *mutatedknutz*
> 
> I have a small doubt: will DX12 improve performance on a major or minor scale? That would be the only reason I would switch to Windows 10.


Probably nothing major. Still, Windows 10 is not the boogeyman; for me it's better than 7 and it gives you a performance boost, so why not?


----------



## masteratarms

Are VRMs best cooled using a thermal pad, i.e. no thermal paste? Would a bare heatsink create a short, since aluminium is conductive and the VRMs are metal? I looked at an X1900 XT VRM and now I remember why I left the original thermal pad on: some of the capacitors sit flush with the top of the VRM and they would short.


----------



## jason387

Quote:


> Originally Posted by *mutatedknutz*
> 
> I have a small doubt: will DX12 improve performance on a major or minor scale? That would be the only reason I would switch to Windows 10.


That's something we can only say when DX12 games are released.


----------



## JackCY

Minor; mostly it reduces the performance hit of draw calls, i.e. feeding the GPU. In some cases you can feed it more, getting closer to full utilization and removing a CPU limit. Plus there are APIs other than DX12 that do the same, just not everyone uses them. MS was busy doing 2D graphics for Windows and left 3D graphics for games stagnant for years until AMD pushed it themselves first. MS finished their 2D graphics upgrades for Windows, so, provoked by AMD, they jumped on the train and are trying to make a more up-to-date DX.

Many games are Nvidia supported/sponsored/whatever, so you won't see Mantle in those. Small developers don't have the resources to implement multiple APIs either, so you get DX11 and that's it. OGL seems mostly gone for now on Windows.


----------



## tabascosauz

I've found that the combination of Windows 10 and the new 15.7.1 drivers has actually provided a pretty significant boost in framerate and frame consistency in Far Cry 4. As opposed to around 40 fps earlier, my game runs at about 45-60 fps now and is much smoother than before. This is on my R7 265; I haven't had time to try it with my 280X since my 280X rig is still on Win 8.1. Probably something to do with more efficient CPU work in games.


----------



## Mussels

well my gigabyte 280x was artifacting in 3D, so i asked them to send a different card to me.

getting a 290 (non X) 4GB windforce back, so hopefully the days of screwed up 280x cards are behind me XD


----------



## MiladEd

And a nice upgrade too! Congrats, and I'm jealous!


----------



## JackCY

Who is processing your warranty? US or EU?


----------



## Mussels

Quote:


> Originally Posted by *MiladEd*
> 
> And a nice upgrade too! Congrats, and I'm jealous!


i was expecting a reference card or a rev 3.0 or something, but i guess they decided this was easiest. now i've got hundreds of pages of 290x overclocking, modding and tweaking to read up on XD

Quote:


> Originally Posted by *JackCY*
> 
> Who is processing your warranty? US or EU?


AU, I'm from the deep south


----------



## JackCY

So the manufacturer/authorized service center or seller?


----------



## Mussels

Quote:


> Originally Posted by *JackCY*
> 
> So the manufacturer/authorized service center or seller?


Gigabyte AU directly. There was a warranty card in the box with their email, and they've been great.

Contacting gigabyte through their website got me someone else who was completely useless and refused warranty.


----------



## JackCY

Yeah, it's hit or miss depending on who's processing the request. They've been unbelievably cooperative with yours.


----------



## Wickedtt

Hey everyone, figured this would be a good spot to ask this. I got 2 Toxic 270Xs a few days ago. I have them overclocked to 1250/1600 at the moment. They work overclocked in every situation except 3DMark Fire Strike, where the first card will go back to stock 1100/1500 but the second will stay overclocked. Need help: I've reloaded drivers a few times and I'm on 15.7.1 at the moment. I've clean-installed the drivers every time with DDU. Any help from someone who has had this issue? Thanks for reading.


----------



## bichael

Nice overclock! Wish mine would go that high...

Although with only one card, I had issues with Fire Strike sometimes running at stock clocks. I tried various means of overclocking (e.g. Afterburner, Catalyst, etc.), which sometimes seemed to work, but in the end I think it was a result of having a browser open. Not sure why that would be, but give it a go with all browsers closed if you haven't already.


----------



## bonami2

Well, I have a 7950, but it's the same thing ahahh.

Anyone having strange bugs in Eyefinity with an AMD card? (Windows 10)

Sometimes my screen flickers and beeps when I move the volume.


----------



## dfg555

I was wondering if anyone else can change their voltage in MSI Afterburner? I have a Dual-X 270X. I already have voltage unlocking checked and have tried the EULA modification.


----------



## MiladEd

You need to use the Trixx software for Sapphire cards. Download it from the Sapphire website.


----------



## JackCY

AB only understands reference and MSI cards. For the rest it's one big hack of forcing a constant voltage, if that even works, and hoping it forces the right voltage to the right level.
It's best to use the tool that was made for your GPU, as that is likely the only one that will understand it and be able to control its specific voltage controllers.


----------



## LzbeL

Guys, does the 280X have native DirectX 12 support? (I mean 12, not 12.1)


----------



## JackCY

Pretty much all modern cards support DirectX 12. Of course, support for the special feature sets varies per model. If it runs Mantle, it will run DX12.


----------



## milcs

I have just bought a refurbished Sapphire R9 280X Tri-X OC graphics card. I have been playing with it a bit and trying to find a suitable OC for it.

I have a few questions about it:

1- What program would you guys recommend for controlling clocks, voltages, and fans, and making sure everything is running smoothly?

2- I am, at the moment, using Trixx. I can control just about everything, but the voltage seems to be stuck at 1.2V base, 1.256V boost. Do you guys know if it is possible to control the voltage on my card (or is it locked)?

3- If anyone has a similar graphics card, what are normal temps for it (idle and load)?

4- For owners as well... Is it normal that this Tri-X cooling system gives off a bit of a high-pitched screeching noise?

5- I have previously owned NVIDIA graphics cards and never an ATI. I have to say that the Catalyst software looks quite bad compared to Nvidia's GeForce Experience. So, my question is... is the AMD gaming experience worth it? Or is it safe to uninstall Raptr (or whatever the program is called)?

Cheers everyone!


----------



## neurotix

Quote:


> Originally Posted by *milcs*
> 
> I have just bought a refurbished Sapphire R9 280X Tri-X OC graphics card. I have been playing with it a bit and trying to find a suitable OC for it.
> 
> I have a few questions about it:
> 
> 1- What program would you guys recommend for controlling clocks, voltages, and fans, and making sure everything is running smoothly?
> 
> 2- I am, at the moment, using Trixx. I can control just about everything, but the voltage seems to be stuck at 1.2V base, 1.256V boost. Do you guys know if it is possible to control the voltage on my card (or is it locked)?
> 
> 3- If anyone has a similar graphics card, what are normal temps for it (idle and load)?
> 
> 4- For owners as well... Is it normal that this Tri-X cooling system gives off a bit of a high-pitched screeching noise?
> 
> 5- I have previously owned NVIDIA graphics cards and never an ATI. I have to say that the Catalyst software looks quite bad compared to Nvidia's GeForce Experience. So, my question is... is the AMD gaming experience worth it? Or is it safe to uninstall Raptr (or whatever the program is called)?
> 
> Cheers everyone!


1. You can use any program to control voltage and fans. MSI Afterburner or Sapphire Trixx should both work the same. JackCY is wrong, you can use Afterburner with Sapphire cards. (I should know. I have 6 of them.) Trixx is preferred though, especially because in Crossfire it lets you set clocks individually for each card. There's other utilities too like ASUS GPU Tweak. It's a matter of preference.

2. I've had a lot of Sapphire cards and none of them have been voltage locked. Going all the way back to a Sapphire 4670. There's no reason it should be locked, but it might be. Make sure you're using the newest Catalyst drivers, and the newest version of Trixx. Try that and get back to me.

3. Normal temps for a 280X/7970 should be 35°C idle and around 60°C load with 100% fan. It's all basically dependent on your fan speed. You can lower the fan speed, but if you exceed 80°C, that's a problem. In my experience with my 7970, anything over 70°C for prolonged periods of time and I would get driver crashes or artifacts in games/flashing textures. I would not be surprised if this is people's problem here with all the "artifacting 280X" reports. They are probably overheating. People, just set your fans to 100% and game with headphones on full blast. Your card will thank you.

4. If it's this one here then yes, that can happen. The cooler and the card are great, but the fans can have issues. I had to RMA my 290 Tri-X cards because of incredibly loud fan rattling/screeching noises. I got Vapor-X cards back from RMA and they don't have that problem, but the fans look to be the same, so it's something specific to the card. I'm willing to bet it's not coil whine but fan vibration. There are videos on YouTube about taking the card apart, taking the fans off, and putting rubber spacers on top of the screw hole, between the screw and the bracket. If you're adventurous you can try this. If not, and you consider it a really big problem, RMA the card.

5. I don't use Raptr. I uninstall it. Next time, when you install drivers, at the very beginning when the Catalyst interface loads you can select "Express" or "Custom". Pick custom and uncheck "AMD Gaming Evolved App". This way, you won't get Raptr again. The program might be useful for streaming/video recording and I think it might be native now. I think that's what most people use it for.


----------



## JackCY

You can use AB with any card, but it is made to work with reference designs and MSI cards; the rest is a lottery.


----------



## LzbeL

Guys... Gigabyte 280X OC Rev 3.0... 7 months of use. 156€. Is it a good buy?


----------



## Horsemama1956

Seems kind of high? I got my 280x about a month ago in mint condition, unregistered with everything untouched in the retail box for $130 Canadian(around 90€). Around here you could get a 290 for that much.


----------



## bonami2

Quote:


> Originally Posted by *Horsemama1956*
> 
> Seems kind of high? I got my 280x about a month ago in mint condition, unregistered with everything untouched in the retail box for $130 Canadian(around 90€). Around here you could get a 290 for that much.


$110 Canadian for a 7950 Windforce in perfect condition too; about $150 for a 280X around me.


----------



## JackCY

In Europe, yes, that's acceptable. I've seen some go lower in auctions, though those sold outside auctions go for that much or more.
The US and other places obviously have very different electronics prices, since they lack the 20%+ taxes/customs etc.; their products are much cheaper new in shops as well, hence their resale value is also lower.


----------



## hatchet_warrior

Quote:


> Originally Posted by *LzbeL*
> 
> Guys... Gigabyte 280X OC Rev 3.0... 7 months of use. 156€. Is it a good buy?


If you can, get the serial number from the seller (you can't do anything nefarious with it, so they should be OK). You can go to rma.gigabyte.com, make an account, and check the status of the warranty. Gigabyte's warranty is 3 years and it is attached to the serial, not the buyer. That would tell you if it really was only used for 7 months. It also tells you how much longer you have on the warranty. If you're getting almost 2.5 years out of the warranty, then it is totally worth it for that price. If you don't want to make an account, you can PM me the serial and I'll check it for you.


----------



## bonami2

Yeah, and some Gigabyte cards are voltage locked; the 7950 I have is, while others are not.


----------



## DizzlePro

Dropped the memory clock from 1500 to 1450MHz and the artifacts are gone in GTA V.

And to make up for the lack of MSAA, I use VSR @ 2560x1440.

I get 45-60 fps with maxed-out settings (no MSAA, grass on normal).


----------



## jason387

Quote:


> Originally Posted by *DizzlePro*
> 
> Dropped the memory clock from 1500 to 1450MHz and the artifacts are gone in GTA V.
> 
> And to make up for the lack of MSAA, I use VSR @ 2560x1440.
> 
> I get 45-60 fps with maxed-out settings (no MSAA, grass on normal).


Elpida memory?


----------



## DizzlePro

Quote:


> Originally Posted by *jason387*
> 
> Elpida memory?


Hynix


----------



## masteratarms

The MSI has a mid-plate; I didn't realise that it also had artifacting at basically stock clocks. I have an Asus DC2T 3GD5 and it has bare memory: 1600mV with no heatsinks on the VRAM at stock. I got it second-hand and have added a Kraken G10 and some heatsinks http://www.tigerdirect.com/applications/searchtools/item-details.asp?EdpNo=178124&Sku=T925-1022&csid=ITD&recordsPerPage=10&body=REVIEWS#CustomerReviewsBlock hacksawed and sanded to fit.



Just a quick OC with stock volts: 51°C GPU temp and similar VRM temps.


----------



## HyeVltg3

[No idea how it double posted during the edit.]


----------



## HyeVltg3

May as well ask around instead of rummaging through Google hits.

I had a dying GTX 670 before all this and finally decided to go Red on a sheer whim. Grabbed the 280X as it was the closest AMD equivalent of the 670 (too broke atm to get anything "new").
I read a ton of great reviews on the 280X; too bad none of them mentioned the huge artifacting problem. I went from a dying video card to a faulty one. Could not return it to the seller.

The first 280X ran perfectly fine from AMD CCC 15.0(?) until I looked again and 15.7 was released. On <15.7 I was getting minor artifacts, which I remedied by lowering the core clock from 1070 to 1050 and the memory from 6400 to 6300 (yes I know, those are effective numbers; at the time I did not know the tools show 4x the "actual" clock. I know now after seeing others' GPU-Z screenshots, thanks =D, learned something new). That kept the artifacts down until I updated to 15.7 and something broke: I could not play 3D games anymore, the card would freeze/lock up, the monitor would go black or white or just show lines, and the only fix was a hard reset.
Further lowered the clocks, nope. Got help with modding the BIOS, still no go, decided to RMA.
Looked at ASUS RMA; the card was still under warranty, so I sent it off a few weeks back.
Took a picture of its S/N, as I've learnt over the years with components that sometimes they fix yours and sometimes they just give you another.

Got the 280X back 1.5 weeks ago. Different S/N! Damn.
It started artifacting last night, first in Sims 4, and now it's artifacting just typing this up in Chrome.

Wanted to ask if I should try out that BIOS update from Asus or something else? Never mind, it says my BIOS is up to date.
What is everyone using to lower the clocks? GPU Tweak? It seems to give me a lot of pauses, or it never loads on startup. Is there no alternative?
Anyone else using GPU Tweak II?

Asus R9280X-DC2T-3GD5

(and yep, artifacting ain't a word)
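For anyone else tripped up by the clock numbers in this post: GPU-Z and most overclocking tools report GDDR5's "effective" data rate, which is four times the actual memory command clock. A minimal sketch of the conversion, just for illustration:

```python
# GDDR5 is quad-pumped: the "effective" rate most tools report is
# 4x the actual memory command clock.

def to_effective(actual_mhz):
    """Actual command clock (MHz) -> effective data rate (MHz)."""
    return actual_mhz * 4

def to_actual(effective_mhz):
    """Effective data rate (MHz) -> actual command clock (MHz)."""
    return effective_mhz / 4

# The 280X's stock 1500 MHz memory clock shows up as 6000 MHz effective,
# so dropping "6400 to 6300" really moved the clock from 1600 to 1575 MHz.
print(to_effective(1500))  # 6000
print(to_actual(6300))     # 1575.0
```

So a "small" 100 MHz drop in the effective number is only a 25 MHz change at the actual clock.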


----------



## Bartouille

Quote:


> Originally Posted by *HyeVltg3*
> 
> May as well ask around instead of rummaging through Google hits.
> 
> I had a dying GTX 670 before all this and finally decided to go Red on a sheer whim. Grabbed the 280X as it was the closest AMD equivalent of the 670 (too broke atm to get anything "new").
> I read a ton of great reviews on the 280X; too bad none of them mentioned the huge artifacting problem. I went from a dying video card to a faulty one. Could not return it to the seller.
> 
> The first 280X ran perfectly fine from AMD CCC 15.0(?) until I looked again and 15.7 was released. On <15.7 I was getting minor artifacts, which I remedied by lowering the core clock from 1070 to 1050 and the memory from 6400 to 6300 (yes I know, those are effective numbers; at the time I did not know the tools show 4x the "actual" clock. I know now after seeing others' GPU-Z screenshots, thanks =D, learned something new). That kept the artifacts down until I updated to 15.7 and something broke: I could not play 3D games anymore, the card would freeze/lock up, the monitor would go black or white or just show lines, and the only fix was a hard reset.
> Further lowered the clocks, nope. Got help with modding the BIOS, still no go, decided to RMA.
> Looked at ASUS RMA; the card was still under warranty, so I sent it off a few weeks back.
> Took a picture of its S/N, as I've learnt over the years with components that sometimes they fix yours and sometimes they just give you another.
> 
> Got the 280X back 1.5 weeks ago. Different S/N! Damn.
> It started artifacting last night, first in Sims 4, and now it's artifacting just typing this up in Chrome.
> 
> Wanted to ask if I should try out that BIOS update from Asus or something else? Never mind, it says my BIOS is up to date.
> What is everyone using to lower the clocks? GPU Tweak? It seems to give me a lot of pauses, or it never loads on startup. Is there no alternative?
> Anyone else using GPU Tweak II?
> 
> Asus R9280X-DC2T-3GD5
> 
> (and yep, artifacting ain't a word)


Yeah, you can use GPU Tweak or MSI Afterburner. But since all you want to do is lower clocks, you can simply use CCC Overdrive.

No surprise your graphics card is artifacting, honestly... the memory is rated for 1500MHz and ASUS decided to run it at 1600MHz.


----------



## HyeVltg3

Quote:


> Originally Posted by *Bartouille*
> 
> Yea you can use gpu tweak or msi afterburner. But since all you want to do is lower clocks then you can simply use CCC overdrive.
> 
> No surprise your graphics card is artifacting honestly... Memory is rated for 1500MHz and ASUS decided to run it at 1600MHz.


While I was waiting on a reply, I lowered the memory clock to 1450, like the poster above who mentioned he was able to run GTA V with VSR smoothly.

Surprisingly it's working so far: no artifacts playing The Forest, and max temps have been 69-70°C on max settings.
Using AB.

Is CCC Overdrive better for AMD cards?
Should I bump the core from 1070 to 1100?


----------



## Bartouille

Quote:


> Originally Posted by *HyeVltg3*
> 
> While I was waiting on a reply, I lowered the memory clock to 1450, like the poster above who mentioned he was able to run GTA V with VSR smoothly.
> 
> Surprisingly it's working so far: no artifacts playing The Forest, and max temps have been 69-70°C on max settings.
> Using AB.
> 
> Is CCC Overdrive better for AMD cards?
> Should I bump the core from 1070 to 1100?


You can probably run it at 1500MHz since that's what the memory is rated for.

CCC Overdrive does the same thing as GPU Tweak or any other overclocking software if all you want to do is change clocks, plus you don't need any extra software since it's built into CCC. The difference is that overclocking software lets you modify voltages, allowing you to overclock even further.

And yes, you can try 1100. No guarantee it's going to be stable, though.


----------



## JackCY

Those Hynix VRAMs will run even 1800MHz at the 1.6V they are set to.

The problem is lack of VRAM cooling and a faulty/unstable core or memory controller in the core.
Underclocking doesn't help much; if your VRAMs are overheating, you need to lower the VRAM voltage, which can be done but is a pain to get into. Just RMA it.
The newer Asus 280Xs run 1.5V and 1500MHz; that's the v2.


----------



## masteratarms

Quote:


> Originally Posted by *JackCY*
> 
> Those Hynix VRAMs will run even 1800MHz at the 1.6V they are set to.
> 
> The problem is lack of VRAM cooling and a faulty/unstable core or memory controller in the core.
> Underclocking doesn't help much; if your VRAMs are overheating, you need to lower the VRAM voltage, which can be done but is a pain to get into. Just RMA it.
> The newer Asus 280Xs run 1.5V and 1500MHz; that's the v2.


How do you lower the VRAM voltage? I've not found a BIOS on techpowerup! that lowers it from 1.6V to 1.5V.

I made some notes on my BIOS testing. I went to techpowerup and started at the top BIOS (out of 14 compatible Asus 280X BIOSes); this is what I wrote:

1500U = Asus.R9280X.3072.131119_2.rom. 1250mV GPU + tweaks. 1500 VRAM set; not able to downclock with Afterburner. 1600mV VRAM; slight corruption at 1140/1500, DX error at 1150/1500.
Best so far.

1500R = Asus.R9280X.3072.130828_1.rom. 1250mV GPU + tweaks. 1375 VRAM set; Afterburner 1600mV VRAM. CTD @ 1140 on restarting the game (1130), massive artifacts.

1500S = Asus.R9280X.3072.131119.rom. 1250mV GPU + tweaks. 1375 VRAM set. Looks like a keeper: 1140/1500, no problems in BF4.

I found Sapphire Trixx was the only way I could downclock the RAM, and even then only to 1475MHz. Also found that if it's artifacting you need to reboot before lowering memory clocks, because if you don't reboot it will continue artifacting even after you lower the clocks. I edited the BIOSes, as you can see above, to 1375MHz; I used VBE7.0.0.7b.exe from techpowerup! to do so.

Edit: reading the above post again, I would note that setting the BIOS memory clock to 1300MHz, even at 1.6V, did cure the artifacting without additional cooling. I did that as a test to determine whether it was simply heat or a more fundamental problem before spending the time to mod heatsinks and add the G10 bracket.


----------



## Roboyto

Quote:


> Originally Posted by *HyeVltg3*
> 
> While I was waiting on a reply, I lowered Mem clocks to 1450 like the poster above mentioned he was able to run GTAV at vsr smoothly.
> 
> Surprisingly working so far, no artifacts, playing The Forest, max temps have been 69-70c on max settings.
> Using AB.
> 
> Is CCC overdrive better for AMD cards?
> Should I bump Core from 1070 to 1100?


Did you properly uninstall all previous drivers from Nvidia? This can cause all kinds of strange problems.

Download DDU and allow it to boot into safe mode. Uninstall all nvidia and AMD drivers so you can start from scratch.

http://www.guru3d.com/files-get/display-driver-uninstaller-download,20.html

Then download and install CCleaner and clean the registry. Reboot, clean the registry again, and reboot once more. You will probably want to remove CCleaner after you're done because it is now another one of those annoying programs that 'monitors' your computer and is always popping up in your face. Aside from that new annoyance, its registry cleaner has never let me down.

Re-install the latest drivers from AMD.

Uninstall/Re-install the GPU and make sure PCI-E slot is clear of any dust/debris. Make sure all other cable connections are good. Sometimes the simplest things can fix problems like this.

If you have Overdrive enabled in CCC *and* you're using a 3rd party overclocking software then you are asking for issues. Disable Overdrive and only use Afterburner, Trixx, GPU Tweak, etc.

If the card's memory clock is set to 1600 by default from the manufacturer, then the card should run at that speed. If the card has Hynix memory, judging by some other posts, then it typically has some overclock headroom, if not a fair amount. Going from 1500 (6GHz) to 1600 (6.4GHz) is a modest memory overclock by most standards. My R9 290, for example, has a 1250 (5GHz) default clock speed and is capable of running at 1700 (6.8GHz) without issue.

If none of the things mentioned above fix your problem so the card can run at its stock settings, then you may have a faulty card. If you purchased it new, you may consider an RMA.

Before you go down that road, though, there is another trick that may work. MSI Afterburner may have a setting called AUX Voltage. For the R9 290(X) cards this adjusts RAM voltage, and it can sometimes help with RAM overclock stability. I am not certain you can do this on a 280X, but it is worth a shot.

Your absolute last resort could be a fresh windows installation. This may sound extreme, but I have had it fix hardware issues for me more than once. If you're on Windows 8/10, then a system refresh could be a worthwhile option.

If you aren't on Win8/10 and you don't want to do a fresh install you can try something called Windows SysPrep. This will remove all hardware drivers and have windows go through the preliminary driver installation phase without altering any programs or other settings. It is quite handy for upgrading an old computer to newer hardware without having to reinstall EVERYTHING.

http://www.sevenforums.com/tutorials/135077-windows-7-installation-transfer-new-computer.html
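As a footnote to the clock figures quoted above, the MHz/GHz pairs (1500 = 6GHz, 1250 = 5GHz, etc.) are just GDDR5's 4x multiplier, and they can be sanity-checked against peak memory bandwidth. A minimal sketch, assuming the 280X's 384-bit memory bus:

```python
# Peak GDDR5 bandwidth = effective data rate (4x command clock) x bus width in bytes.

def peak_bandwidth_gbs(mem_clock_mhz, bus_width_bits=384):
    effective_mt_s = mem_clock_mhz * 4                       # GDDR5 is quad-pumped
    return effective_mt_s * (bus_width_bits // 8) / 1000.0   # MB/s -> GB/s

print(peak_bandwidth_gbs(1500))  # 288.0 GB/s at the stock 1500 MHz (6 GHz)
print(peak_bandwidth_gbs(1600))  # 307.2 GB/s at the 1600 MHz factory OC
```

In other words, the 1500-to-1600 factory OC discussed here is worth roughly 19 GB/s of theoretical bandwidth, which is why it is tempting for board partners despite running the chips past their rated clock.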


----------



## JackCY

masteratarms: As I said, it's a pain to do, if it can even be done via a VBIOS update, which I suspect it may not be. You're down to hardware modifications, which as far as I can find no one has photographed yet; at least I don't have one. It's rumored that Asus does this when they service the cards, or used to, on some of the cards they process. You could also try comparing v1 vs v2 and figuring out the HW changes needed, but that may be impossible if the parts aren't labelled, and you'd need a very high-res picture to do it.

For an Asus card I use the Asus tweak tool, which is the only software that fully understands the card and can change 2D and 3D clocks; everything except VRAM voltage, since that is locked and only adjustable on the Matrix or something.
I don't have any problem running the card at any clock, both core and VRAM, higher or lower. Want to drop it to 4.8GHz? No problem. Want to boost it to 7.2GHz? No problem. Of course, artifacts are more common the higher the VRAM clock is, depending on how moody the running application is. Some applications run fine at 7.2; some artifact even at only 6.

Once I'm sure they don't have any spare 280Xs, I'm going to try sending it back and getting something else as a replacement.


----------



## BruceB

Hey guys!
My Powercolor 280X's clock seems to be stuck at 880MHz max, which means the card isn't boosting like it should.
Is this a known problem? Are there any known causes?


----------



## masteratarms

@JackCY, the Asus 280X DC2T (DCIIT) 3GD5 has a CHL8228G, which is common on a lot of other GPUs and is supported by MSI Afterburner. However, the card has to be specifically added to Afterburner's database (its ASIC # is 3006) so that the program knows how to control it. I tried posting in the 3rd-party hardware database thread but got no response, and since I can change the GPU voltage via BIOS mod I left it at that. If any card with the CHL8228G VRM controller can do the software volt mod, then the Asus version can too. I was using GPU Tweak on my 6850 Voltage Tweak, which gave me a little voltage bump, but the CHL8228G shouldn't be restricted to that program. The 8228G was a selling point for me, so I'm disappointed it is still an issue to overcome. My Asetek 570LX is definitely capable of cooling the card down, no probs there.

@BruceB, the obvious cause is the GPU running outside its temperature/power limit range. The boost only kicks in when the GPU is within those limits; usually this results in choppy gameplay due to the GPU switching core speeds, but I've not heard of a card consistently not boosting. There is a fix: mod the BIOS with VBE7 from techpowerup!. Of course, first check that the card isn't throttling due to being outside its target parameters.


----------



## nikitasd

A friend gave me an XFX Radeon R9 290X 4GB Double Dissipation and I installed it. I found that my GPU runs at 50°C idle; I don't know about gaming yet. Is that OK or do I have a problem?


----------



## BruceB

Quote:


> Originally Posted by *masteratarms*
> 
> @BruceB, the obvious cause is the GPU running outside its temperature/power limit range. The boost only kicks in when the GPU is within those limits; usually this results in choppy gameplay due to the GPU switching core speeds, but I've not heard of a card consistently not boosting. There is a fix: mod the BIOS with VBE7 from techpowerup!. Of course, first check that the card isn't throttling due to being outside its target parameters.


Thanks for the reply; throttling came to my mind too, but the card's temps are ~65°C under load and everything about it is stock.
Sometimes it does boost, but it seems to be on a per-boot basis... maybe a problem with the Win10 drivers?


----------



## HyeVltg3

Quote:


> Originally Posted by *nikitasd*
> 
> A friend gave me an XFX Radeon R9 290X 4GB Double Dissipation and I installed it. I found that my GPU runs at 50°C idle; I don't know about gaming yet. Is that OK or do I have a problem?


Sounds about normal; my 280X runs between 36-45°C at idle depending on the weather (the AC vents don't seem to cool the room with the PC very well), but I've heard the 290 series runs hotter.

What are your ambient temps? (room temp where the PC is located)
Could it be that there's a lot of dust on the fins? Take a closer look at the card and see if there's a lot of dust or debris; you'll need to can-air it to clear it all up.

Hopefully it's not poor thermal paste; replacing that usually voids the warranty.


----------



## BruceB

Quote:


> Originally Posted by *masteratarms*
> 
> @BruceB, the obvious cause is the GPU running outside its temperature/power limit range. The boost only kicks in when the GPU is within those limits; usually this results in choppy gameplay due to the GPU switching core speeds, but I've not heard of a card consistently not boosting. There is a fix: mod the BIOS with VBE7 from techpowerup!. Of course, first check that the card isn't throttling due to being outside its target parameters.


Quote:


> Originally Posted by *BruceB*
> 
> Thanks for the reply; throttling came to my mind too, but the card's temps are ~65°C under load and everything about it is stock.
> Sometimes it does boost, but it seems to be on a per-boot basis... maybe a problem with the Win10 drivers?


I've been playing around with it all day and it seems I've found the cause of the issue I was having: if there's no monitor detected (either switched off or unplugged) on boot, then the card's base clock seems to max out at 880 MHz (i.e. no boost); if I turn my monitor on _before_ my PC, then it behaves normally and boosts to 1030 MHz. Try it yourself and tell us what happens!

The strange thing is: when I turn the PC on first (forcing no boost on the GPU), the PC uses ~75W _less_ power at idle (~150W vs. ~75W). What in the world could cause this?


----------



## masteratarms

Quote:


> Originally Posted by *BruceB*
> 
> I've been playing around with it all day and it seems I've found the cause of the issue I was having: if there's no monitor detected (either switched off or unplugged) on boot, then the card's base clock seems to max out at 880 MHz (i.e. no boost); if I turn my monitor on _before_ my PC, then it behaves normally and boosts to 1030 MHz. Try it yourself and tell us what happens!
> 
> The strange thing is: when I turn the PC on first (forcing no boost on the GPU), the PC uses ~75W _less_ power at idle (~150W vs. ~75W). What in the world could cause this?


Sounds like a question for the manufacturer, I have not seen this issue discussed before.


----------



## BruceB

Quote:


> Originally Posted by *masteratarms*
> 
> Sounds like a question for the manufacturer, I have not seen this issue discussed before.


OK, thanks for the reply. I'll drop PowerColor an email and post back with their response.


----------



## Nassenoff

Hi,

Does anybody with Gigabyte R9 280X Windforce 3X rev 2.0 experience know if it is possible to get a full-cover waterblock?
Is there a full-cover waterblock for the HD7970 that will fit?

I have also posted a separate thread about this:
http://www.overclock.net/t/1571358/gigabyte-r9-280x-winforce-3x-rev-2-0-full-cover-waterblock


----------



## masteratarms

@Nassenoff

https://shop.ekwb.com/ek-fc7970

Click the compatibility list and compare the pictures of the Gigabyte 7970 boards to see if they are the same as a picture of your 280X.

Actually, doing a search on the 280X, it doesn't look like EK have made a full-cover block for it. They said "no plans" to make a full cover for rev. 1, so rev. 3 would be a stretch:
http://configurator.ekwb.com/step1_complist?gpu_gpus=1212

It might be more cost-effective/practical to stick with a custom loop, assuming you already water-cool your CPU; if not, consider this:
http://www.overclock.net/t/1487012/official-nzxt-kraken-g10-owners-club

I've done it on my Asus 280X DC2T 3GD5; no camera at the moment, but here are the results:


----------



## Nassenoff

Quote:


> Originally Posted by *masteratarms*
> 
> @Nassenoff
> 
> https://shop.ekwb.com/ek-fc7970
> 
> Click the compatibility list and compare the pictures of the Gigabyte 7970 boards to see if they are the same as a picture of your 280X.
> 
> Actually, doing a search on the 280X, it doesn't look like EK have made a full-cover block for it. They said "no plans" to make a full cover for rev. 1, so rev. 3 would be a stretch:
> http://configurator.ekwb.com/step1_complist?gpu_gpus=1212


I thought rev 1-3 referred to the BIOS; aren't the visual design and hardware only rev 01 and 02?

I have done a comparison and I'm seeing a pattern: it seems they only offer waterblocks as long as the custom card has the standard PCB layout.
The XFX does seem quite close to the standard PCB, though, when you compare the VRM locations.



Quote:


> Originally Posted by *masteratarms*
> 
> @Nassenoff
> 
> It might be more cost-effective/practical to stick with a custom loop, assuming you already water-cool your CPU; if not, consider this:
> http://www.overclock.net/t/1487012/official-nzxt-kraken-g10-owners-club


I currently use a closed pre-filled loop on the CPU.
I want to do a custom watercooling build in my BitFenix mITX, and I'm going to change the side panel to clear plexiglass.
With the glass installed there will be minimal airflow.
I'll probably wait on the build and change the card first.


----------



## gnemelf

Well, even with the components being in close to the same places, certain parts of the card may come into contact with the heatsink, which would short out the card.


----------



## bichael

Quote:


> Originally Posted by *Nassenoff*
> 
> Hi,
> 
> Does anybody with Gigabyte R9 280X Windforce 3X rev 2.0 experience know if it is possible to get a full-cover waterblock?
> Is there a full-cover waterblock for the HD7970 that will fit?
> 
> I have also posted a separate thread about this:
> http://www.overclock.net/t/1571358/gigabyte-r9-280x-winforce-3x-rev-2-0-full-cover-waterblock


Alphacool GPX









http://www.alphacool.com/product_info.php/info/p1396_Alphacool-NexXxoS-GPX---ATI-R9-280X-M02---mit-Backplate---.html


----------



## Nassenoff

Quote:


> Originally Posted by *bichael*
> 
> Alphacool GPX
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.alphacool.com/product_info.php/info/p1396_Alphacool-NexXxoS-GPX---ATI-R9-280X-M02---mit-Backplate---.html


Thanks!
But I was hoping for an acrylic clear top since the GPU installs vertically and is on display in my case.


----------



## gnemelf

Quote:


> Originally Posted by *Nassenoff*
> 
> Thanks!
> But I was hoping for an acrylic clear top since the GPU installs vertically and is on display in my case.


Well, for a non-reference card that's not gonna happen, at least not a full-cover block.


----------



## Nassenoff

I'll probably swap the card for a used R9 290 or something then.


----------



## bichael

Speaking of used r9 290s...

I'm considering upgrading from a 270x to a 290 so just wanted to check if there was anything I should be looking out for. I will be watercooling it with one of the Alphacool GPX blocks (or a fullcover if I see any good deals).

Second hand they seem good value at the moment, not sure how worried I should be about its prior life though? Tempted to just grab the cheapest I can find on Taobao (which is around £130).

Currently have been looking at a reference card, MSI gaming card, or powercolor Turbo/PCS+. Pretty sure all have reference length pcbs which I believe is about 267mm (and the max I would ever get in my case).

Will be using a 450W sfx psu but my research suggests I should be okay with no OC (it has 36A/432W on 12V rail). If I upgrade my cpu down the line I may jump to a 600W then.

note - decided on a 290 as not sure a 380 is a big enough step up to be worthwhile and a 390 (or 970) would be over a S$400 limit meaning I would pay tax bringing into Singapore.

Cheers


----------



## neurotix

Quote:


> Originally Posted by *bichael*
> 
> Speaking of used r9 290s...
> 
> I'm considering upgrading from a 270x to a 290 so just wanted to check if there was anything I should be looking out for. I will be watercooling it with one of the Alphacool GPX blocks (or a fullcover if I see any good deals).
> 
> Second hand they seem good value at the moment, not sure how worried I should be about its prior life though? Tempted to just grab the cheapest I can find on Taobao (which is around £130).
> 
> Currently have been looking at a reference card, MSI gaming card, or powercolor Turbo/PCS+. Pretty sure all have reference length pcbs which I believe is about 267mm (and the max I would ever get in my case).
> 
> Will be using a 450W sfx psu but my research suggests I should be okay with no OC (it has 36A/432W on 12V rail). If I upgrade my cpu down the line I may jump to a 600W then.
> 
> note - decided on a 290 as not sure a 380 is a big enough step up to be worthwhile and a 390 (or 970) would be over a S$400 limit meaning I would pay tax bringing into Singapore.
> 
> Cheers


I'd go with the Powercolor PCS+. However, I'm not sure if it's reference or not for a full cover block. The cooler is one of the best, comparable to a Sapphire Tri-X but beaten by the Vapor-X. You could consider the Sapphire Tri-X 290 since it is a totally reference board.

Keep in mind the PCS+ is basically a 3 slot card with the cooler on.

I guess if you're water cooling it, you could do the MSI Gaming too. The problem with those is leaking fan bearing grease all over the place. I'm not sure if there's any of these left that do this. It was a pretty big story when they first came out. Some guy even had a mining rig with like 20 of them and over half dumped bearing grease all over his stuff.

They are still incredibly good value, and imo only a fool would buy the 390. Nothing has changed between the 290 and 390; they are literally identical except for a newer BIOS, and I believe looser RAM timings in the BIOS to allow for faster-clocked RAM. I've had 4 290s and ALL have done 1500MHz RAM, which I believe is stock for the 390 now. So chances are you can simply mildly overclock your 290 to match the 390. Your chances are probably much greater if you buy a custom card (PCS+, MSI, etc.)

Performance wise, a single 290 outperforms 270X Crossfire when both are OC'ed and has double the RAM to boot. (4 times as much if you get a 390). So, going from a 270X to a 290 you get double everything- shaders, ROPs, TMUs and VRAM.

If your sig rig is the one it's going in, well, I imagine that G3258 (4.8GHz) under heavy load might already be taxing a 450W PSU. I wouldn't add a 290 to that PSU unless you reset your CPU to stock. The best way to know for sure is to run an x264 test on the CPU with a Kill A Watt meter attached and see what kind of wattage you're drawing, keeping in mind that an OC'd 290 can draw 300W; even stock at 1GHz it can take 250W. (My i7 and two 290s peak at 850W at the wall in the latest, most demanding stuff.)
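The back-of-the-envelope method described above (measure wall draw, add the new GPU's worst-case delta, compare against the 12V rail) can be sketched in a few lines. This is purely illustrative: the 250W wall reading, 36A rail and ~120W GPU delta echo figures posted in this thread, but the 0.88 AC-to-DC efficiency is an assumed round number, not a measurement of anyone's PSU.

```python
def psu_headroom(wall_watts_measured, psu_efficiency, gpu_delta_watts, rail_12v_watts):
    """Estimate remaining 12V headroom after a GPU upgrade.

    wall_watts_measured: kill-a-watt reading under combined CPU+GPU load
    psu_efficiency: assumed AC->DC conversion efficiency (e.g. 0.88)
    gpu_delta_watts: extra draw of the new GPU vs the old one
    rail_12v_watts: the PSU's rated 12V capacity (amps x 12)
    """
    dc_load_now = wall_watts_measured * psu_efficiency   # DC side of the wall reading
    projected = dc_load_now + gpu_delta_watts            # add the upgrade's worst case
    return rail_12v_watts - projected                    # positive = margin left

# e.g. 250W at the wall, a 36A rail (36 * 12 = 432W), +120W for an OC'd 290
print(round(psu_headroom(250, 0.88, 120, 36 * 12)))  # -> 92
```

A positive result only says the rail is not overloaded on paper; transient spikes and capacitor aging are why people still leave a healthy margin.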


----------



## MiladEd

Hey guys. I've got a Sapphire R9 280X Dual-X. I've got a rather high OC on it (1100/1600 MHz) and have been running it for a while with no artifacts, but since it gets hot, I've set up a rather aggressive fan curve on it. Recently, at lower RPMs, I've noticed one of the fans making a ticking noise. I opened the case and one of the fans seems to be turning a bit wobbly. At higher RPMs the noise and wobbliness are gone.

Should I be worried?


----------



## Catscratch

Quote:


> Originally Posted by *MiladEd*
> 
> Hey guys. I've got a Sapphire R9 280X Dual-X. I've got a rather high OC on it (1100/1600 MHz) and have been running it for a while with no artifacts, but since it gets hot, I've set up a rather aggressive fan curve on it. Recently, at lower RPMs, I've noticed one of the fans making a ticking noise. I opened the case and one of the fans seems to be turning a bit wobbly. At higher RPMs the noise and wobbliness are gone.
> 
> Should I be worried?


The stock fans on the cooler are very mediocre quality, so it's not surprising they are giving out with an aggressive fan profile. The same thing happened to me on my Sapphire 6850, so I wired the 12cm fan from a Thermalright HR-03 GT cooler to the original fan header so the card could utilize it, until the card died.

How old is your card? You can replace the fans yourself, but you have to check how they are wired together.


----------



## GoLDii3

Quote:


> Originally Posted by *MiladEd*
> 
> Hey guys. I've got a Sapphire R9 280X Dual-X. I've got a rather high OC on it (1100/1600 MHz) and have been running it for a while with no artifacts, but since it gets hot, I've set up a rather aggressive fan curve on it. Recently, at lower RPMs, I've noticed one of the fans making a ticking noise. I opened the case and one of the fans seems to be turning a bit wobbly. At higher RPMs the noise and wobbliness are gone.
> 
> Should I be worried?


How loud is the noise?

I RMA'd a 7870 for a rather annoying fan noise. That was on Amazon though; I don't know if any other retailer would have accepted it.

Truth is, the Dual-X cooler sucks. Rattling or ticking noises are common. I've had 3 Dual-X cards (7870, 7950 and 7970) and the fans always made some extra noise.


----------



## MiladEd

Quote:


> Originally Posted by *Catscratch*
> 
> The stock fans on the cooler are very mediocre quality, so it's not surprising they are giving out with an aggressive fan profile. The same thing happened to me on my Sapphire 6850, so I wired the 12cm fan from a Thermalright HR-03 GT cooler to the original fan header so the card could utilize it, until the card died.
> 
> How old is your card? You can replace the fans yourself, but you have to check how they are wired together.


I've had it since December 2014. Planning to get a 1440p 144 Hz FreeSync Monitor rather soon and I'll upgrade my GPU to a Fury/Fury X after doing so.
Quote:


> Originally Posted by *GoLDii3*
> 
> How loud is the noise?
> 
> I RMA'd a 7870 for a rather annoying fan noise. That was on Amazon though; I don't know if any other retailer would have accepted it.
> 
> Truth is, the Dual-X cooler sucks. Rattling or ticking noises are common. I've had 3 Dual-X cards (7870, 7950 and 7970) and the fans always made some extra noise.


Pretty loud. It goes like "tick tick tick tick tick tick tick tick", but only at low RPMs. At higher RPMs (before it gets so high that the motor noise surpasses everything else) it makes a low crackling noise, but it's not too noticeable.


----------



## [CyGnus]

I have an Asus TOP 280X and both fans died, so I did the little red mod, and guess what: I never looked back. From now on I only use AIO coolers on VGAs.









Full load temp of 61°C at 1150/1600; these are my 24/7 settings.


----------



## MiladEd

Quote:


> Originally Posted by *[CyGnus]*
> 
> I have an Asus TOP 280X and both fans died, so I did the little red mod, and guess what: I never looked back. From now on I only use AIO coolers on VGAs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Full load temp of 61°C at 1150/1600; these are my 24/7 settings.
> 
> 
> 


That... is pretty ghetto looking, lol. How are your temps?

Also, I noticed your radiator is not mounted in the case... I plan to buy a Fury/Fury X. The Fury is too long (my case takes VGAs up to 29 cm and both the Tri-X and Strix versions are 30 cm), and if I opt for the Fury X, I wouldn't have any space left to mount its radiator, since my CPU's AIO is taking the only spot I had left... Is it safe to just put the radiator somewhere, provided it can be secured with zip ties or something?


----------



## [CyGnus]

My temps are great: idle is around 30-32°C, and in games they go to 55-60°C. No noise whatsoever, and a great overclock.








The radiator can go wherever you can fit it, as long as it has a good air source; zip ties are an overclocker's best friend.


----------



## buttface420

Quote:


> Originally Posted by *[CyGnus]*
> 
> I have an Asus TOP 280X and both fans died, so I did the little red mod, and guess what: I never looked back. From now on I only use AIO coolers on VGAs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Full load temp of 61°C at 1150/1600; these are my 24/7 settings.


lol, that's awesome... I might do something like this with my Dual-X 280X.


----------



## nixon61

Guys, help me find the BIOS for a Sapphire R9 280X Tri-X OC. I can't find it on TechPowerUp; they only have the Sapphire Vapor-X R9 280X OC, which is a different card with lower clocks. I've been searching all day with no luck. Does anyone have it? I just don't know where else to look...


----------



## pbiernik

Hey guys!

What is the best BIOS for the Sapphire 280X Dual-X?



280xoriginal.zip 41k .zip file


Can I flash a Vapor-X or another model's BIOS onto my GPU?

Thank you!!!


----------



## buttface420

Quote:


> Originally Posted by *pbiernik*
> 
> Hey guys!
> 
> What is the best BIOS for the Sapphire 280X Dual-X?
> 
> 
> 
> 280xoriginal.zip 41k .zip file
> 
> 
> Can I flash a Vapor-X or another model's BIOS onto my GPU?
> 
> Thank you!!!


lol, never mind, I see you have the same BIOS as me, but your GPU clock is 1020 and mine is 1050.


----------



## pbiernik

Quote:


> Originally Posted by *buttface420*
> 
> lol, never mind, I see you have the same BIOS as me, but your GPU clock is 1020 and mine is 1050.


Hehe.
I can reach 1200 MHz on the core if I raise the voltage to 1.3V.


----------



## MiladEd

Quote:


> Originally Posted by *pbiernik*
> 
> Hehe.
> I can reach 1200 MHz on the core if I raise the voltage to 1.3V.


Lucky... mine artifacts at anything over 1100 MHz regardless of the voltages.


----------



## pbiernik

Quote:


> Originally Posted by *MiladEd*
> 
> Lucky... mine artifacts at anything over 1100 MHz regardless of the voltages.


Try reducing the GPU memory speed.


----------



## MiladEd

Quote:


> Originally Posted by *pbiernik*
> 
> Try reducing the GPU memory speed.


I did. Didn't help.


----------



## RIPmyPC

RIPmyPC

- Asus r9 270
- OC @ 1150/1525 STABLE
- 80°c Max (Benchmark)
- http://www.hostingpics.net/viewer.php?id=538323OCr927011501525.png
- http://www.hostingpics.net/viewer.php?id=666234OCR9270GPUZ.png

- Do I need to change my power limit, or is it good?
|----> After having trouble with my power limit, I set it to the max (+20%). I will re-edit my post if I change it or do something else.


----------



## matagyula

Hey fellas, I might be late to the party, but I just made the move from an HD 7870 XT to a Sapphire R9 280X Vapor-X (this model). I bought it used, for 160€, with 3 months of warranty left. Everything is fine so far, no artifacting and no stability issues (then again, I've only had the card for ~26 hours), but the temps worry me a bit. Under load it gets all the way up to around 83°C, which is a tad much for my liking, plus at that point the fans are unbearably loud. All this at stock clock speeds (1070/1550, Hynix memory modules).

Have any of you guys fitted their card with a Gelid Icy Vision-A? Gelid webpage for the product. It's the cooler I've used for almost two years on my 7870 XT. The product page does say the R9 280X is supported, but I'd be happy to hear your personal experience with this particular combo.


----------



## pbiernik

Quote:


> Originally Posted by *matagyula*
> 
> Hey fellas, I might be late to the party, but I just made the move from an HD 7870 XT to a Sapphire R9 280X Vapor-X (this model). I bought it used, for 160€, with 3 months of warranty left. Everything is fine so far, no artifacting and no stability issues (then again, I've only had the card for ~26 hours), but the temps worry me a bit. Under load it gets all the way up to around 83°C, which is a tad much for my liking, plus at that point the fans are unbearably loud. All this at stock clock speeds (1070/1550, Hynix memory modules).
> 
> Have any of you guys fitted their card with a Gelid Icy Vision-A? Gelid webpage for the product. It's the cooler I've used for almost two years on my 7870 XT. The product page does say the R9 280X is supported, but I'd be happy to hear your personal experience with this particular combo.


Check this: http://linustechtips.com/main/topic/10418-how-to-change-your-gpus-thermal-paste-with-pictures/

Good luck!


----------



## cmpxchg8b

Blasphemy! R9 280 and Titan Black in the same build.


----------



## masteratarms

Using the GeForce as a PhysX donkey?


----------



## bichael

Quote:


> Originally Posted by *neurotix*
> 
> I'd go with the Powercolor PCS+. However, I'm not sure if it's reference or not for a full cover block. The cooler is one of the best, comparable to a Sapphire Tri-X but beaten by the Vapor-X. You could consider the Sapphire Tri-X 290 since it is a totally reference board.
> 
> Keep in mind the PCS+ is basically a 3 slot card with the cooler on.
> 
> I guess if you're water cooling it, you could do the MSI Gaming too. The problem with those is leaking fan bearing grease all over the place. I'm not sure if there's any of these left that do this. It was a pretty big story when they first came out. Some guy even had a mining rig with like 20 of them and over half dumped bearing grease all over his stuff.
> 
> They are still incredibly good value, and imo only a fool would buy the 390. Nothing has changed between the 290 and 390; they are literally identical except for a newer BIOS, and I believe looser RAM timings in the BIOS to allow for faster-clocked RAM. I've had 4 290s and ALL have done 1500MHz RAM, which I believe is stock for the 390 now. So chances are you can simply mildly overclock your 290 to match the 390. Your chances are probably much greater if you buy a custom card (PCS+, MSI, etc.)
> 
> Performance wise, a single 290 outperforms 270X Crossfire when both are OC'ed and has double the RAM to boot. (4 times as much if you get a 390). So, going from a 270X to a 290 you get double everything- shaders, ROPs, TMUs and VRAM.
> 
> If your sig rig is the one it's going in, well, I imagine that G3258 (4.8GHz) under heavy load might already be taxing a 450W PSU. I wouldn't add a 290 to that PSU unless you reset your CPU to stock. The best way to know for sure is to run an x264 test on the CPU with a Kill A Watt meter attached and see what kind of wattage you're drawing, keeping in mind that an OC'd 290 can draw 300W; even stock at 1GHz it can take 250W. (My i7 and two 290s peak at 850W at the wall in the latest, most demanding stuff.)


Thanks for the advice, I think the PCS+ looks like a good option. It seems the board design is improved vs reference and I can still get a GPX block for it. It also has a good air cooler; although it would be too big for my current case, if I ever go back to air cooling I would probably change case at the same time anyway.

I had been meaning to get a plug-in power meter for a while, so I took your advice and got one to check before upgrading. My G3258 at 4.7GHz is at 1.28 Vcore (I got lucky!) and my 270X has locked voltage with an overclock to about 1160. Small pump and four fans, plus a couple of 2.5" disks. Some quick tests show max measured wattage at the wall of:
Idle: around 50W
CPU loaded with the XTU stress test: around 120W
Firestrike: around 230W
XTU + Valley together: around 250W

So given a potential TDP difference between the 270X and 290 of around 120W, it seems I should be okay on the 450W PSU.

The only thing holding me back now is that I can't seem to find my Dremel, which I need for a case mod to squeeze the card in...


----------



## cmpxchg8b

Quote:


> Originally Posted by *masteratarms*
> 
> Using the GeForce as a PhysX donkey?


I've just thrown all the spare cards I had lying around unused into this box. The TB actually does all the 3D work, and the 280 is only there to drive more displays (no 3D).


----------



## diggiddi

Quote:


> Originally Posted by *cmpxchg8b*
> 
> I've just thrown all the spare cards I had lying around unused into this box. The TB actually does all the 3D work, and the 280 is only there to drive more displays (no 3D).


In my case a 7950 is doing 2D and the 290X does the heavy lifting.


----------



## matagyula

Thanks for the guide!

I replaced the thermal paste today, and so far I am very satisfied with the results (I might keep the stock cooler on). While running Fire Strike the max temperature reached was 66°C, with the fans reaching 40% of their maximum speed, hooray! I'll update this post once I get to play some more CS:GO.

EDIT - well, it's not all as nice as I hoped it would be. That's two CS:GO matches:


----------



## techjesse

i7 5960X on LN2 @ 5.0GHz







R9 280X QuadFire Water Cooled Vantage P85790 http://www.3dmark.com/3dmv/5338233



BOOM


----------



## neurotix

Quote:


> Originally Posted by *techjesse*
> 
> I7 5960X Ln2 @ 5.0GHz
> 
> 
> 
> 
> 
> 
> 
> R9 280X QuadFire Water Cooled Vantage P85790 http://www.3dmark.com/3dmv/5338233


That's sick.


----------



## techjesse




----------



## hatchet_warrior

I just RMA'd a Gigabyte 280X OC ver 2.0 due to some serious artifacts and flickering textures. http://1drv.ms/1JChGEP

When it was causing issues, it was only ever in games. Titanfall gave me issues from day 1 of owning it, but nothing else did at first. MWO started to later, but that seems to be an issue with a poorly made game. I started playing Thief recently and it had major issues, which led me to RMA the card. Benchmarking utilities like Fire Strike, Heaven, and Valley never had any issues.

I picked up a used 260X to use in the meantime and had no problems. After a few weeks I got a box back from Gigabyte with a note that says "no problem found." So far I have not had any issues, but I am wondering if they are going to arise later and be worse. I'm thinking they might have just reapplied the thermal paste, but the card never got that hot (max around 75°C).

Anyone else have an experience like this? I just don't want this card to blow up on me one day because I missed something.


----------



## matagyula

@hatchet_warrior - I think I've had a similar issue with my previous R9 280X, see this video.

I too RMA'd the card, but because the manufacturer provided no response within 30 days, I received money instead of a new card. I used part of that money to buy a used R9 280X, and this one is good.

I came to the conclusion that the artifacts were caused by one or more faulty memory modules. The reasoning: older/less demanding games never utilized more than 2GB of VRAM and so did not produce artifacts, but GTA V with maxed-out settings eats up all the VRAM, the faulty module(s) get used, and artifacts start popping up. I could reproduce the artifacts in any other game by applying the same logic: more VRAM utilization = artifacting. For example, I could set ARMA III to render at 200% resolution scaling, and with added MSAA the artifacts appeared almost instantly. Of course this is just my theory and I never received confirmation, plus I have no idea exactly how the VRAM chips get filled up.

edit: the card I RMA'd wasn't overheating, either. In fact it was one of the coolest-running GPUs I've ever had (wouldn't even hit 70°C under load)... too bad it was faulty.
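The "artifacts only above a certain fill level" theory can be illustrated with a toy model. Everything here is assumed for illustration: a 3GB card modeled as 12 chips of 256MB each, filled strictly sequentially, with one hypothetical faulty module at index 9. Real memory controllers interleave allocations across chips, so this only captures the thread's reasoning, not actual GDDR5 behavior.

```python
CHIP_MB = 256     # assumed capacity per GDDR5 module (3GB / 12 chips)
NUM_CHIPS = 12
FAULTY = {9}      # hypothetical index of a bad module

def chips_touched(vram_used_mb):
    """Indices of modules a strictly sequential fill would reach."""
    full = min(NUM_CHIPS, -(-vram_used_mb // CHIP_MB))  # ceiling division
    return set(range(full))

def may_artifact(vram_used_mb):
    """True once the fill level reaches a faulty module."""
    return bool(chips_touched(vram_used_mb) & FAULTY)

# An older game staying under 2GB never reaches chip 9...
print(may_artifact(1800))   # False
# ...while a maxed-out game filling ~2.9GB does.
print(may_artifact(2900))   # True
```

In this model, any workload below ~2.3GB never touches the bad chip, which matches the observation that only VRAM-heavy settings triggered the artifacts.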


----------



## neurotix

I have to wonder if the faulty RAM on these things that causes problems like this for some people is Elpida or Hynix.

Didn't the 290 have these issues when it was new and it was because of Elpida memory? That was probably worse because people were getting black screens and couldn't even game at all!

Why don't they just stop using garbage, crappy Elpida on everything? Does anyone really ever want Elpida RAM over Hynix RAM?









Really glad my 290 Tri-X cards, and 290 Vapor-X have guaranteed Hynix RAM.

I do believe most of the other cards I own, including my 270X and 7970, have Elpida RAM though. No issues with it so far. Sorry to you guys for getting junk cards.


----------



## masteratarms

It's Hynix RAM on my DC2T. I got it second-hand and was lucky, because the first one I got didn't boot. I'm sure the seller was bitcoin mining, but that is irrelevant. The problem isn't Elpida or Hynix; it's that manufacturers were running the RAM above specification without making sure it was adequately cooled. Mine has the RAM voltage at 1.6V (and I don't know how to lower it to the stock 1.5V), RAM sinks and a Kraken G10. It has been fine in any game except Company of Heroes: the fans are set to ramp up when the core gets hot, but CoH was not that stressful and I still saw artifacts. I've played indie games with no problem. I set the BIOS default to 1375MHz RAM speed, down from 1600MHz, but use Afterburner to set 1500MHz. The RAM absolutely cannot cope with 1.6V at 1600MHz when it's rated for 1.5V at 1500MHz. Yes, there are people clocking to 1700MHz, but it's not guaranteed.


----------



## hatchet_warrior

Quote:


> Originally Posted by *matagyula*
> 
> I came to the conclusion that the artifacts were caused by one or more faulty memory modules. The reasoning: older/less demanding games never utilized more than 2GB of VRAM and so did not produce artifacts, but GTA V with maxed-out settings eats up all the VRAM, the faulty module(s) get used, and artifacts start popping up. I could reproduce the artifacts in any other game by applying the same logic: more VRAM utilization = artifacting. For example, I could set ARMA III to render at 200% resolution scaling, and with added MSAA the artifacts appeared almost instantly. Of course this is just my theory and I never received confirmation, plus I have no idea exactly how the VRAM chips get filled up.
> 
> edit: the card I RMA'd wasn't overheating, either. In fact it was one of the coolest-running GPUs I've ever had (wouldn't even hit 70°C under load)... too bad it was faulty.


That is a very good point. I never thought about checking how "full" my RAM got. Perhaps it is just a single module that isn't correctly connected to the heatsink. If I have time this weekend I will look into this a lot more. Might be an excuse to go full H2O.









Quote:


> Originally Posted by *neurotix*
> 
> I have to wonder if the faulty RAM on these things that causes problems like this for some people is Elpida or Hynix.
> 
> Didn't the 290 have these issues when it was new and it was because of Elpida memory? That was probably worse because people were getting black screens and couldn't even game at all!
> 
> Why don't they just stop using garbage, crappy Elpida on everything? Does anyone really ever want Elpida RAM over Hynix RAM?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really glad my 290 Tri-X cards, and 290 Vapor-X have guaranteed Hynix RAM.
> 
> I do believe most of the other cards I own, including my 270X and 7970, have Elpida RAM though. No issues with it so far. Sorry to you guys for getting junk cards.


There are 3 types of RAM on these cards: low-end Hynix, Elpida, and high-end Hynix, in that order from worst to best. Elpida is really not that bad overall; it can't OC as much as the good Hynix, but it isn't bad at stock clocks.


----------



## MiladEd

So I've had this card for exactly 10 months today. It's the Dual-X version from Sapphire.

I've had it overclocked since day one, but for the past 4 months I've been running it at 1100/1600 MHz with an overvolt to 1243 mV (my previous OC didn't require an overvolt). Before settling on this clock, I experimented with different clocks and voltages to find this spot, and any clock over 1100 MHz proved unstable regardless of the voltage.

I played lots of Witcher 3 at this clock and the highest temp I'd gotten was 78°C. But today I tried playing it again after a while, and it reached 84°C! I even tried running with the case open and it didn't really help...

I have to add: one of the cooler's fans was running a bit wobbly and making a ticking noise, so I had to do a simple mod to give it a bit more clearance to stop the noise. It helped, but today I've noticed it seems to run a little slower than before...

After I saw the high temp, I lowered the clock to 1080/1500 MHz, which had proved stable at stock volts, and temps didn't go higher than 71°C.

Apart from the faulty fan, what could be the reason for this high temp? Is the card reaching the end of its life?


----------



## buttface420

Thermal paste dried out? I re-applied mine every 6 months or so because temps would rise.


----------



## Tobe404

What's the best way to clean a GPU without damaging it?
My 280X looks really dusty in my new system and the temps don't seem as good.


----------



## MiladEd

Quote:


> Originally Posted by *Tobe404*
> 
> What's the best way to clean a GPU without damaging it?
> My 280X looks really dusty in my new system and the temps don't seem as good.


Can of compressed air or an air blower.
Quote:


> Originally Posted by *buttface420*
> 
> Thermal paste dried out? I re-applied mine every 6 months or so because temps would rise.


I think I'll get two 12×12 cm PWM fans, replace the originals with them, and reapply the TIM in the process.
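For anyone wiring replacement PWM fans to software control, the "aggressive fan curve" people keep mentioning is just a piecewise-linear map from GPU temperature to fan duty cycle. A minimal sketch of the interpolation (the breakpoints below are made-up illustrative values, not Sapphire's defaults):

```python
# (temperature °C, fan duty %) breakpoints, sorted by temperature
CURVE = [(40, 25), (60, 45), (75, 80), (85, 100)]

def fan_duty(temp_c):
    """Linearly interpolate fan duty between the curve's breakpoints."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]          # floor: minimum duty below the curve
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]         # ceiling: max out at the top
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(50))   # halfway between 25% and 45% -> 35.0
```

Tools like Afterburner do essentially this; making the slope steeper between 60°C and 75°C is what makes a curve "aggressive", at the cost of the low-RPM ticking this thread keeps running into.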


----------



## GoLDii3

Does anyone have an idea of when the R9 280X should be upgraded, future-proofing-wise?


----------



## masteratarms

The next generation of AMD/Nvidia cards is getting a die shrink, and both manufacturers will use HBM; expected in 2016. Will the die shrink filter down to lower-tier cards? Probably not, but I've just bought a second-hand 280X and it's plenty fast for 1080p at the moment.


----------



## MiladEd

As I said before, the fans on my Sapphire R9 280X Dual-X were dying, so I replaced them!

See full thread here!


----------



## lostsurfer

Need help, guys. I obtained an XFX R9 280X DD from a buddy who said he'd had nothing but trouble with it but never RMA'd it, and that I could have it. Sweet. He told me that after about 30 minutes of gameplay it would artifact and crash the game.

Got it home, removed the heatsink and fan, cleaned thoroughly, and reapplied thermal paste. I could tell the factory job was horrid: there was literally no thermal compound on the GPU die itself; it was all around the GPU socket. So I figured that was it.

Fired it up and away she went... well, sort of. Games started crashing, the system would hard-reboot, and the fans would spin up and die down. The clocks were all off, so I flashed the latest BIOS. After the flash it ran better and longer, but still hard-rebooted. I thought, eh, it's this cheap 575 W PSU. Bought a modular Corsair 750, a new case because it was on sale, and a 4-fan setup. My highest temp under load is now 59 C, and some games (BF4) ran great with no issues. Then I fired up the Star Wars beta and it kept crashing. Did some more research, installed Afterburner, and set constant voltage, since I noticed in the GPU-Z log that the voltage was fluctuating. Well, now the beta works fine and BF4 gets stuck... lol. Any ideas? This obviously isn't going to be my most stable build, but if my son and I can game on it in the living room, bonus...











----------



## masteratarms

Lower the memory clocks if it's showing graphical artifacts. Increase the power target; that can be done from Catalyst Control Center, but editing the BIOS with VBE 7.0.0.7b gives more leeway. It could help if the fluctuating voltage is due to the power target, which usually drops clocks. Also, if it has a boost clock, you can make the boost and 3D clocks the same (I have).


----------



## PhillyB

Morning everyone. I have an overclocking question for the gurus here.

I have an XFX DD 280X and am trying to overclock it with Afterburner. I unlocked the voltage slider and it's set at 1.2 V. I've tried a few clocks up to 1170 MHz and Kombustor doesn't crash, but after about 15-30 seconds the clocks start fluctuating.

System Specs:
-XFX DD 280X
-tried GPU clocks between 1090 and 1170 MHz
-memory at 1600 MHz
-using an Alphacool NexXxoS 280X M04 block
-water temps topped out at 30 C
-GPU temp was 43 C
-VRM1 86 C
-VRM2 ~70 C (I think; I'm away from that computer now)

The clock fluctuations start roughly as VRM1 tops out around 86 C.

Any advice to get a stable clock frequency?

Thanks for the help.


----------



## GoLDii3

Quote:


> Originally Posted by *PhillyB*
> 
> Morning everyone. Have a overclocking question for the gurus here.
> 
> I have a XFX DD 280x and am trying to overclock it with afterburner. I unlocked the voltage slider and it is set at 1.2. I have tried a few clocks up to 1170 and Kombuster doesnt crash, but after about 15-30 seconds, the clocks start fluctuating.
> 
> System Specs:
> -XFX DD 280x
> -tried gpu clocks between 1090 and 1170
> -memory at 1600
> -using an alphacool NexXxos 280x m04 block
> -water temps topped out at 30c.
> -gpu temp was 43c
> -vrm1 86c
> -vrm2 ~70c (i think, I'm away from that computer now)
> 
> The clock fluctuations happen roughly as vrm1 tops out around 86c
> 
> Any advice to get a stable clock frequency?
> 
> Thanks for the help.


Either it's throttling or it's the boost's fault. Why don't you try using VBE7 to remove the boost from the BIOS?


----------



## PhillyB

I was poking around and realized I didn't change the Power Limit percentage. Does that matter?


----------



## masteratarms

@PhillyB

http://www.alphacool.com/product_info.php/info/p1453_Alphacool-NexXxoS-GPX---ATI-R9-280X-M04---mit-Backplate---Black.html?language=en&XTCsid=klocs29n16d6nm1tncud1n8la6

Looking at the user review there, his VRM temps were not as high as yours, though I think he was using a 270X. Are you sure you attached the cooler with thermal pads/paste on the VRMs? As I understand it, the product is a universal GPU block with a custom heatsink for the rest of the card, and you only need to replace the second part when changing graphics cards.

If the VRM cooling is the best it can be then look at my post above yours regarding power target and boost clocks.


----------



## bichael

As the VRM is partly passively cooled with the GPX blocks, you could try pointing a desk fan at it; assuming that lowers the VRM temp, see if you get the same result.


----------



## lostsurfer

Quote:


> Originally Posted by *bichael*
> 
> As the vrm is partly passively cooled with the gpx blocks you could try pointing a desk fan at it, and assuming that helps lower vrm temp see if you get same result?


I'm going to give this a shot. I downclocked to 850 from 1000 and played BF4 for about 20 minutes before I had to turn in for the night. I'm going to tinker a little more; it's been forever since I've done any overclocking with video cards, so it's a relearning experience for me...


----------



## lostsurfer

Quote:


> Originally Posted by *masteratarms*
> 
> Lower memory clocks if its got graphical artifacts. Increase power target, it can be done from catalyst control center but also from editing bios with VBE7.0.0.7b can give more leeway. Could help if the fluctuating voltage is due to power target, usually it drops clocks. Also if it does have boost clock, u can make boost and 3d clocks the same (I have).


If I increase the power target, what should I start with, and in what increments should I increase it? Also, I have constant voltage enabled in Afterburner; would that be an issue, and can I just adjust the power target from Afterburner? If my thinking is right: leave everything at stock (1000/1500), increase the power target by, say, 20 (or your recommendation), with constant voltage on or off? Then, when/if I find the sweet spot, make the change permanent with a BIOS edit? Thanks for the guidance, guys!


----------



## PhillyB

Well, got home and was able to try some things. Looks like the Power Limit % was the key to getting it to stay at the set clocks, though the VRMs definitely got hot.

Core Voltage was 1.144v
Power Limit % +10
Core Clock 1110
Memory Clock 1600


----------



## MiladEd

Sooo, my R9 280X is... well... showing some worrying signs. I had originally OCed it to 1150 MHz with a hefty overvolt. It showed artifacts, so I lowered it to 1130 MHz. It was fine for a while, then showed artifacts, so I lowered it to 1120. Same thing, so down to 1110. Artifacts again, so down to 1100 MHz. That was fine for a very long time, but today I saw artifacts while playing GTA V.

Oh, and the temps were fine; it was only at 65 C when it artifacted in GTA V.

Is the thing nearing the end of its life?


----------



## tomytom99

Quote:


> Originally Posted by *MiladEd*
> 
> Sooo my R9 280X is... well... showing some worrying signs. I had originally OCed it to 1150 MHz with a hefty OV. Showed artifacts, lowered to 1130 MHz. It was fine for a while, then showed artifacts, lowered it to 1120. Same thing, lowered it to 1110. Then again artifacts were shown, lowered it to 1100 MHz. It was fine for a very long time, but today I saw artifacts while playing GTA V.
> 
> Oh, and the temps... they were fine, it was only at 65 C when it artifacted in GTA V.
> 
> Is the thing nearing it's life's end?


A friend of mine had a similar issue with his 7870. It can usually be fixed by going under the stock clock, disappointing as that is. You can avoid further degradation by running at the lowest frequency you need and keeping the card as cool as possible.
Another possibility is that you have an overclocked monitor, which sometimes causes artifacts.


----------



## GoLDii3

Quote:


> Originally Posted by *MiladEd*
> 
> Sooo my R9 280X is... well... showing some worrying signs. I had originally OCed it to 1150 MHz with a hefty OV. Showed artifacts, lowered to 1130 MHz. It was fine for a while, then showed artifacts, lowered it to 1120. Same thing, lowered it to 1110. Then again artifacts were shown, lowered it to 1100 MHz. It was fine for a very long time, but today I saw artifacts while playing GTA V.
> 
> Oh, and the temps... they were fine, it was only at 65 C when it artifacted in GTA V.
> 
> Is the thing nearing it's life's end?


I also get artifacts in GTA V. It happens only in that game, so I think it's not the card's fault. Nothing big, just a few flashes here and there.

By the way, if you have the Dual-X version it may just be not enough voltage; those cards have an insane amount of voltage droop.


----------



## MiladEd

Quote:


> Originally Posted by *tomytom99*
> 
> A friend of mine had a similar issue with his 7870. It can usually be fixed by going under the stock clock, even though that is disappointing to do. You can avoid further degrading by running at the lowest frequency you need, and keeping it cold as ice.
> Another issue could be that you have an overclocked monitor, of which sometimes causes artifacts.


No overclocked monitor. I've lowered the clocks to stock values for now and will run it like this for a while to see what happens. The bad thing is I've lost at least 7 FPS in The Witcher 3; the overclock really helped my framerate in that game.
Quote:


> Originally Posted by *GoLDii3*
> 
> I also artifact on GTA V. It happens only in that game,i think it's not the card fault. Of course nothing big,just a few flashes here and there.
> 
> By the way,if you have the Dual-X version it may just be not enough voltage - Those cards have an insane amount of voltage drop.


I have the Dual-X version. I'd set it to 1.243 V and it was fine for a long time, like 4 months, and that was during summer, when ambient temps were already pretty high.


----------



## Phantasia

Got a quick question and hope for some guidance from you guys.

I won an EVGA X99 FTW motherboard...

It got me thinking.

Sell it and use the money to upgrade the 280X to something else. What, that's the question... a used 290/290X? A used 970?

Alternative:

Keep the 280X and upgrade my Xeon E3-1230 v3 to an i7-5820K and DDR4...

EDIT:

I could then also try to sell the Xeon, the ASRock Z87 Extreme4, and the DDR3...


----------



## GoLDii3

Quote:


> Originally Posted by *Phantasia*
> 
> Got a quick question and hope for some guidance from you guys.
> 
> I won an EVGA X99 FTW motherboard...
> It got me thinking.
> 
> Sell it and use the money to upgrade the 280x to something else. What, that's the question... A used 290/290x? Used 970?
> 
> Alternative:
> Keep the 280x and upgrade my xeon e3 1230v3 to an i7 5820k and dd4...
> EDIT:
> I can then try to sell the Xeon, the Asrock z87 extreme4 and the ddr3 as well...


Sell the motherboard and upgrade to whatever you'd like, since it's free. Or keep the money and wait for the new 2016 GPUs.

No use upgrading to X99 if you only plan to game.


----------



## ARacoma9999

Hey guys, just wanted to join up if I could. I recently got an XFX R9 280 DD edition. I'm going to link my invoice from Newegg, since it's 1 AM and I don't feel like using my lame photo skillz right now, hahaha. If an actual photo of the card in action is needed, just let me know. Thanks a million!


----------



## masteratarms

Quote:


> Originally Posted by *MiladEd*
> 
> Sooo my R9 280X is... well... showing some worrying signs. I had originally OCed it to 1150 MHz with a hefty OV. Showed artifacts, lowered to 1130 MHz. It was fine for a while, then showed artifacts, lowered it to 1120. Same thing, lowered it to 1110. Then again artifacts were shown, lowered it to 1100 MHz. It was fine for a very long time, but today I saw artifacts while playing GTA V.
> 
> Oh, and the temps... they were fine, it was only at 65 C when it artifacted in GTA V.
> 
> Is the thing nearing it's life's end?


Artifacts are caused by the memory running past its limit. Lowering the GPU clock will lower overall PCB temps, but it's smarter to lower the memory clocks. My Asus 280X DC2T, which I bought second-hand, was artifacting very badly at 1600 MHz / 1.6 V; I've got it at 1500 MHz / 1.6 V with a Kraken G10, RAM sinks, and a 200 mm fan blowing on it. The memory is bad and it's RMA-able, but since I already RMA'd a second-hand card from the same seller and he got no refund/replacement, I'll just keep it, as I paid I believe £40 less than the going rate for a pukka card.


----------



## MiladEd

I'd been running the memory at 1600 MHz since I got the card, so I'm not sure it's to blame. Also, lowering core clocks had previously fixed my artifacting, so...

But I'll try it. I'll run the memory at the original 1500 MHz and the core at 1100 MHz and see what happens.


----------



## brenzor

Hey guys, I'm late to the party! As much as I have a love/hate relationship with my Asus TOP 280X, you can count me in this club









But seriously tho, Asus, plz fix.


----------



## Tobe404

Is there a UEFI version of the Gigabyte 280X Rev 2 (F60/Elpida) BIOS anywhere? I've had a look around, but I'm buggered if I can find one. Thanks.


----------



## aaronsta1

Quote:


> Originally Posted by *MiladEd*
> 
> Sooo my R9 280X is... well... showing some worrying signs. I had originally OCed it to 1150 MHz with a hefty OV. Showed artifacts, lowered to 1130 MHz. It was fine for a while, then showed artifacts, lowered it to 1120. Same thing, lowered it to 1110. Then again artifacts were shown, lowered it to 1100 MHz. It was fine for a very long time, but today I saw artifacts while playing GTA V.
> 
> Oh, and the temps... they were fine, it was only at 65 C when it artifacted in GTA V.
> 
> Is the thing nearing it's life's end?


Are the GTA V artifacts the textures being stretched?

I had this issue too; I had to bump the voltage up one step.

I don't think it's really the cards, though. I think it's the drivers or the game itself.


----------



## MiladEd

Quote:


> Originally Posted by *aaronsta1*
> 
> is the gta5 artifacts the textures being stretched?
> 
> i had this issue too.. i had to bump the volt up one step.
> 
> i dont think its really the cards tho.. i think its the drivers or the game itself..


I had that too, previously, but this time it was a bunch of green squares somewhere on the screen. They only flashed for a fraction of a second.


----------



## tomytom99

This may seem a tad uncalled for, but do you know if the 270X supports one of those DP MST hubs? I had an epiphany a few days ago and realized that I could do a 5x1 Eyefinity setup if I can use an MST hub. I'd use my 24" screen in the middle and put two 19" 5:4 panels on either side.

It would be interesting to see if the 270X, with the helping hand of a 7870, could manage a 6880*1024 Eyefinity array.
I'm asking so that I don't have to experiment to find out if it's even possible, as I don't want to deal with returning stuff.


----------



## MiladEd

So, after reducing the memory clock to 1500 MHz, I seem to have fixed the artifacting. The core clock is 1100 MHz, BTW. Temps dropped too, by about 3 degrees; now I see a max of 68 C in extended gameplay at full load (Witcher 3).

I have another question. I can find a used 280X for a decent price around here. Is it worth it to CrossFire it with my current one? I have an 850 W PSU, so I have enough power for them. My problem is, my secondary PCIe slot is only x4. How much will that reduce performance?


----------



## diggiddi

From what I understand, it will be a stutterfest.


----------



## tomytom99

Quote:


> Originally Posted by *diggiddi*
> 
> From what I understand it will be a stutterfest


Kinda what I figured when I was brainstorming. (My friend's single 7870 shoots itself with just three 1440p screens.)

As long as I can do it, I'd be happy. I'd then be able to keep the arrangement once I upgrade to a better GPU.

Right now I'm temporarily running two 6950s, because some cheapo bearings went really bad on my 270X Toxic. It'll be in Sapphire's shop for a new cooler soon. I'm no longer an avid Sapphire fan, as they just redid their website... well, most of it. The support section is still archaic. They also still require me to send the card in for ANY service; they won't just send me new fans or a new fan assembly.


----------



## xutnubu

Been a while since I posted here. Last time I was asking if I should replace the TIM on my 280x.

After months of having the tube of Gelid GC Extreme sitting in my drawer, I figured, why not? My card was reaching 82 C, so why not check it out.

This is what I found when I opened it (sorry for the potato quality):



I think the image speaks for itself.

Long story short, I managed to take off about 13 C with the new application.

So yes, it might be worth it for some.


----------



## Devildog83

Quote:


> Originally Posted by *ARacoma9999*
> 
> 
> 
> Hey guys, just wanted to come join up if I could. Just recently got an XFX R9 280 DD edition, I'm gonna link my invoice from Newegg since it's 1 AM and I don't feel like using my lame photo skillz right now hahaha. If an actual photo of the card is needed in action, just lemme know. Thanks a million!


Whenever you get some time, add a photo and the clocks you're running via a GPU-Z screenshot, and I'll add you to the list.
Congrats on your purchase.


----------



## lightsout

Hey guys, I know most folks don't run 270s in this club, but I had a question. I've been running the same drivers for over a year because they're patched for OCing my monitor and I didn't want to fuss with that anymore.

Will I see any gains by updating the drivers for these 270s? I know the 290s have seen nice gains, but I'm not sure about the smaller cards.


----------



## Derek129

Never hurts to try; you can always roll back. Whenever I put my Asus 270 back in my PC because my 390 sh-its the bed, I always install the latest driver/beta and it runs flawlessly, cool and quiet.


----------



## lightsout

Quote:


> Originally Posted by *Derek129*
> 
> Never hurts to try, you can always roll back. Whenever I put my asus 270 back in my pc because my 390 sh-its the bed I always install the latest driver/beta and it runs flawlessly and cool and quiet.


Thank you sir I may give it a go.


----------



## Exxlir

Could any of the experts who have a Sapphire R9 280X Toxic please assist? I have the card sitting in boost mode, but I've never overclocked a graphics card before. Can someone explain how to go about this, what clocks they've managed to reach stably, and whether it's worth OCing for the performance boost or not?

Thanks


----------



## MiladEd

Quote:


> Originally Posted by *Exxlir*
> 
> Any of the experts that have a r9 280x toxic sapphire card please assist, I have the card sitting on boost mode but I have never overclocked a graphics card before can someone explain how to go about this and also what clocks theve manage to reach at stable and if it is worth o'cing for a performance boost or not,
> 
> Thanks


First, get Sapphire TRIXX software, for voltage control.

Overclocking the card is as easy as moving a bunch of sliders up and down. Start from your card's default core clock, then go up in 10 MHz steps. Keep an eye out for temps and artifacts while gaming. If you see artifacts, bump up the voltage a bit. Keep at this until you reach the thermal limit. For the 280X, the highest safe temperature is said to be 85 C, but it's wiser to keep it about 10 C lower than that if you want the card to last. Also, Tahiti cores (280, 280X, 7950, 7970, etc.) mostly lose overclocking stability at temps above 70 C, or so I've heard. Turning the power limit slider up increases power usage and heat output, but it prevents the card from throttling. Keep it lower if you have a smaller PSU or are concerned about temps.

I have a Sapphire R9 280X Dual-X. Mine's not a very good overclocker; the highest core clock I've been able to reach is 1100 MHz. Anything higher, regardless of the voltage, artifacts in GTA V.

Remember, some games may run just fine with a certain OC while others cause problems. My card does just fine at 1150 MHz in The Witcher 3, while GTA V, although a much less demanding game, artifacts to hell even at lower clocks.

The default core clock on my 280X is 1020 MHz. OCing does help with performance; I've gained about 6 FPS in The Witcher 3 with my 80 MHz OC.

Good luck and happy overclocking.
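The step-up procedure described above can be sketched as a simple loop. This is a generic illustration, not a real OC tool: `is_stable` is a hypothetical stand-in for whatever stability check you actually use (a gaming session or benchmark run watched for artifacts at that clock):

```python
# Generic sketch of the step-up overclocking procedure: raise the core
# clock in small steps and keep the last clock that passed the check.
# is_stable is a hypothetical callback standing in for a real-world
# stability test (gaming/benchmarking while watching for artifacts).

def find_stable_clock(start_mhz, max_mhz, step_mhz, is_stable):
    """Return the highest clock that passed is_stable, or None."""
    best = None
    clock = start_mhz
    while clock <= max_mhz:
        if not is_stable(clock):
            break          # first failure: stop and keep the previous clock
        best = clock
        clock += step_mhz
    return best

# Example: a card that artifacts above 1100 MHz, like the Dual-X above
stable = find_stable_clock(1020, 1200, 10, lambda mhz: mhz <= 1100)
print(stable)  # -> 1100
```

In practice each stability check is hours of gameplay, which is why people walk up in small 10 MHz increments instead of jumping straight to an ambitious clock.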


----------



## Mercennarius

Has anyone had screen "stuttering" issues with their R9 280X before? It looks like the screen moves up and down really quickly, and it seems to happen most when opening certain windows or scrolling in a window. Not my computer, but it looks like this:
https://www.youtube.com/watch?v=pYMGRjr5bVQ

I've reinstalled the drivers and tried both the beta and most recent WHQL drivers with no success.


----------



## rdr09

Quote:


> Originally Posted by *Mercennarius*
> 
> Has anyone had screen "studdering" issues with there R9 280X before? Looks like the screen moves up and down real quick and seems to happen most when opening certain windows or scrolling in a window. Not my computer but looks like this:
> https://www.youtube.com/watch?v=pYMGRjr5bVQ
> 
> I've reinstalled the drivers and tried both the beta and most recent WHQL drivers with no success.


If it happens when watching YT videos, try disabling Hardware Acceleration (HA): right-click the video > uncheck HA. Or go to the browser's advanced settings and disable it there.

edit: More info . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread


----------



## Gumbi

Quote:


> Originally Posted by *JackCY*
> 
> No. Unless your VRM cooling is awful but then the VRMs would burn instead.
> 
> All chips have some limit.
> I can run mine up to 1120, on stock voltage but the more I push it over 1070 the more I may encounter some weird behavior/artifact in games. Benchmarking is pointless except Heaven with specific AA to discover the instability. All chips have a wall and no matter how much voltage you feed it it won't clock higher, at least not without sub-zero cooling.
> Overall AMD GPUs are one of the worst overclockers compared to how much extra you can milk out of Nvidia and Intel chips. Some are lucky to get up to +10% but that's about it. No wonder AMD GPUs used to have base clock below 1GHz, they simply don't clock high.


You can't compare architectures like that. Also, 290s/390s overclock very well.


----------



## Mercennarius

Quote:


> Originally Posted by *rdr09*
> 
> if it happens when watching YT video . . . try disabling Hardware Acceleration (HA). Rightclick video>uncheck HA. Or, go to advance settings of the browser and disable it there.
> 
> edit: More info . . .
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread


It doesn't happen just when watching videos; sometimes it happens in a normal browser window or a game.


----------



## Exxlir

Quote:


> Originally Posted by *MiladEd*
> 
> First, get Sapphire TRIXX software, for voltage control.
> 
> Overclocking the card is as easy as turning a bunch of sliders up and down. Start from the default core clock of your card, then go up. with 10 MHz steps. Keep an eye out for temps, and artifacts while gaming. If you see artifacts, bump up the voltages a bit. Keep on this until you reach the thermal limit. For 280X, highest temperature is said to be 85 C, but it's wiser to keep is about 10 C lower than that if you want your card to have longevity. Also, Tahiti cores (280, 280X, 7950, 7970 etc) mostly lose overclocking at temps higher than 70 C, or so I've heard. Turning the Power limit slider up increases power usage, and heat output, but it prevent card form throttling. Keep it lower if you have a smaller PSU or are concerned about temps.
> 
> I've a Sapphire R9 280X Dual-X. Mine's not a very good overclcocker, as the highest core clock I've been able to reach was 1100 MHz. Any thing higher, regardless of the voltages, artifacts in GTA V.
> 
> Remember, some games may run just fine with a certain OC while others may cause problems. My cards does just fine with 1150 MHz in The Witcher 3, while GTA V, although it is a much less demanding game, artifacts to hell, even at lower clocks.
> 
> The default core clock on my 280X is 1020 MHz. OCing does quite help with performance, I've achieved about 6 frames in The Witcher 3 with my 80 MHz OC.
> 
> Good luck and happy overclocking.


Very useful information there, mate, thank you! I'll have a go and see what happens. I just bought a new PSU, an EVGA G2 650 W, because my old one, a 750 W Cooler Master, got shorted. The G2 is much better, though. I'm currently getting around 60-80 FPS in most of the games I play, but I'm thinking an OC might be needed in the near future.


----------



## MiladEd

Quote:


> Originally Posted by *Exxlir*
> 
> very useful information there mate thank you! I'll have a go and see what happens just bought a new psu, evga g2 650w because my old one got shorted which was a 750w coolermaster G2 is much better though, currently getting around 60-80 fps in most the games I play but thinking in the near future a o'c might be needed


Glad to be of help.

Yeah, the G2 is a very good product; it's a Tier 1 PSU, which means it doesn't get any better than that. 650 W is plenty for a single 280X. Seems like you almost have the same CPU as me too; I have an FX-8320 OCed to 4.5 GHz. I think 650 W would be plenty for an OCed FX-8350 and a 280X. I personally have a High Power Astro GD 850 W PSU, which is enough for dual 280Xs, but since my mobo's second PCIe slot is rated at x4, I can't really CF, sadly.

Well, good luck again with your OC if you decide to go through with it. Also, feel free to PM me with any more questions; I visit the site almost every day.


----------



## Exxlir

I have a question about my overclocked Sapphire R9 280X Toxic.

When I'm running a game under load, I'm pretty sure I hear either the GPU or the CPU (I know the CPU can make a high-pitched noise as it works) making a high-pitched noise, as if a hard drive arm were scratching metal, and it's making me very paranoid. This is why I replaced the PSU recently: I heard roughly the same noise coming from the GPU power connectors. Has anyone got any idea if this is a normal high-pitched noise as the GPU works?


----------



## diggiddi

Possibly coil whine


----------



## Exxlir

Quote:


> Originally Posted by *diggiddi*
> 
> Possibly coil whine


I thought so myself, but then I listened very closely and I don't think it is coil whine, hmmmm


----------



## aaronsta1

Quote:


> Originally Posted by *Mercennarius*
> 
> Doesn't happen with just watching videos, sometimes just happens in a normal browser, window, or game.


Yep, it happens... it does it with both my 7870 XT and my 270X.


----------



## Exxlir

Can anyone tell me if this is normal? When I open the side panel of my tower, with the GPU sitting in front of me, the GPU gets very loud at the power connectors with a buzzing noise, but only when a game or something graphically demanding is running. I've asked this previously and people suggested coil whine, but I'm not so sure. I think the card is drawing power and faulting somewhere, and that's why I can hear the buzzing, gritty noise. I may be completely wrong and just plain paranoid, but I can't settle on it being coil whine: whether VSync is off or on, it's the exact same low-frequency buzzing, as if one of the fans had a bit of dust trapped on it. But it doesn't; I thoroughly checked and made sure it wasn't dust.

I'd personally like to know whether such noise is normal or not.

Thanks


----------



## Roboyto

Quote:


> Originally Posted by *Exxlir*
> 
> can any one tell me if its normal when i open the side panel from my tower have the gpu sitting in front of my and only when a game or a high demand of graphics is needed the gpu at the power connectors seems to get very loud with a buzzing noise i have asked this previously and people suggested coil whine but I'm not so sure i think the cards drawing power and some where its faulting and that's why i can hear the buzzing gritty noise i may be completely wrong and just plain paranoid but i just cant sit with the fact its coil whine when i switch off vsync its the same noise and when vsync is switched on its the exact same noise again just low frequency buzzing as if one of the fans has a bit of dust trapped on it but it doesn't i thoroughly checked and made sure it wasn't dust..
> 
> i personally would like to know if its normal for such noise or not.
> 
> thanks


If it changes pitch when you enable VSync, odds are it's coil whine.

For your fan-noise suspicions, just use a finger to briefly stop each fan and see if the noise goes away.

To further investigate coil whine as the culprit, run something that produces very high frame rates; 3DMark benches like Cloud Gate or Sky Diver come to mind. Listen for the noise there; with the high frame rates it should be a higher-pitched sound.

If it is coil whine, altering clocks, voltages, and power settings with an OC utility will sometimes change the tone/frequency of the sound or reduce it.

Coil whine is a common enough thing. Sometimes cards exhibit none at all under any conditions; other times it only shows up when clocks/volts/power settings are changed, or only in extremely high frame rate scenarios, like a game's loading screen.

If it can't be remedied, you can try to RMA the card if it's still under warranty.


----------



## lightsout

How do you guys overclock your crossfired cards?

With my 270s the only thing that works reliably is AMD OverDrive; most other apps don't adjust the second card.

Asus GPU Tweak even gives me voltage control, which AB does not, but when I open a bench the second card sits at 300 MHz.

Yesterday it took me like 30 minutes to get the second card out of 2D clocks; a driver reinstall didn't even do it.


----------



## diggiddi

I monitor with two instances of GPU-Z and MSI AB.
Make sure CF is enabled in CCC.
In AB, make sure unified monitoring is on, and add 20 MHz to the core to wake it up from 2D speeds.


----------



## lightsout

Quote:


> Originally Posted by *diggiddi*
> 
> I monitor with two instances of GPUZ and MSI AB
> Make sure CF is enabled in CCC
> In AB make sure unified monitoring is on, add 20hz more to core to wake it up from 2d speeds


Thanks for the feedback.

When I set AB not to synchronize clocks between cards (not the setting you mentioned) and then select card #2, I cannot change the clocks on card 2; all the sliders are greyed out. Not sure why that is. I think I may swap the cards and update the drivers just to see what happens.

The cards are identical too, so it's odd.


----------



## diggiddi

The clocks need to be synchronized.


----------



## lightsout

Quote:


> Originally Posted by *diggiddi*
> 
> Need clocks to be snychronized


Well, that's the thing: even when they're set to sync, AB only changes the first card. So whether I have them set to sync or not, only the first card changes. I have ULPS disabled through the registry.

The only way that seems to reliably change the second GPU is through OverDrive.


----------



## Exxlir

Quote:


> Originally Posted by *Roboyto*
> 
> If it changes pitch when you enable VSYNC odds are it is coil whine.
> 
> For your fan noise suspicions just use your finger to temporarily stop each fan and see if the noise goes away or not.
> 
> To further investigate the possibility of coil whine being the culprit you can run something that gives very high frame rates. 3DMark benches like Cloudgate or Sky Diver come to mind. Take a listen to the noise there if it's present. With the high frames it should be a highher pitched noise.
> 
> If it is coil whine sometimes altering clocks, voltages, and power settings with an OC utility will change the tone/frequency of the sound or reduce it.
> 
> Coil whine is a common enough thing to happen. Sometimes cards exhibit none at all under any conditions; other times it only comes out when clocks/volts/power settings are changed, or it could happen only under extremely high frame rate scenarios, like a loading screen for a game.
> 
> If it can't be remedied you can try to RMA your card if it's still under warranty.


Appreciate the answer you have given me. The only time the noise happens is when a game or movie is loaded and the GPU needs to be used. I personally don't believe it's coil whine, because why would a high pitched noise come from two VGA PSU connectors? I did think it might be coil whine, but the fans are not making any noises; I checked three times and there's no dust trapped in the card or the fans. Yet when the card loads anything it makes a noise like dust hitting fans, and it comes from both VGA power connectors from the PSU. So it got me wondering: the card is working OK and not getting any hotter than usual, but it's becoming very loud for no reason. Maybe I'm paranoid. As I said before, maybe the noise I'm hearing is the card drawing power from the PSU and it's normal, but then why would you hear it above two Noctua F12s, a rad, 2 MegaFlow Cooler Master fans, and three GPU fans all spinning at the same time? That's why I'm worrying; I can hear it over all those fans, and it doesn't seem normal to me.


----------



## Roboyto

Quote:


> Originally Posted by *Exxlir*
> 
> Appreciate the answer you have given me. The only time the noise happens is when a game or movie is loaded and the GPU needs to be used. I personally don't believe it's coil whine, because *why would a high pitched noise come from two VGA PSU connectors?* I did think it might be coil whine, but the fans are not making any noises; I checked three times and there's no dust trapped in the card or the fans. Yet when the card loads anything it makes a noise like dust hitting fans, and it comes from both VGA power connectors from the PSU. So it got me wondering: the card is working OK and not getting any hotter than usual, but it's becoming very loud for no reason. Maybe I'm paranoid. As I said before, maybe the noise I'm hearing is the card drawing power from the PSU and it's normal, but then why would you hear it above two Noctua F12s, a rad, 2 MegaFlow Cooler Master fans, and three GPU fans all spinning at the same time? That's why I'm worrying; I can hear it over all those fans, and it doesn't seem normal to me.


If it only happens when the card is loaded, then it is most likely coil whine coming from somewhere.

Is the noise coming from the connectors at the card, or at the PSU? It's possible the PSU is emitting the noise.

I would suggest altering clocks, voltages, and/or power settings with an OC utility as this can change the tone/frequency of the sound or at least reduce it.

Have you tried using different PCIe power cables, if you have extras for your current PSU? It's not wise to mix/match cables from different PSUs. This may sound silly, but it doesn't hurt to try.

How about trying different PCIe power ports on the PSU? If your rig sig is correct then you are using the EVGA 650 G2, and depending on which variant it can have more than two VGA/PCIe power ports.

To answer your question as to why the noise would be coming from/near the power connectors: the area with the yellow box around it in the photo below is the heatsink covering the VRMs for the R9 280X Toxic. The green boxes are around the chokes. The VRMs, or voltage regulator modules, are what convert the power from the PSU to the amounts the card requires. The chokes are supposed to reduce/eliminate the noise that you are hearing, but you may have a faulty card. If your card is still under warranty you may want to contact Sapphire and see what they can do for you.


----------



## ARacoma9999

Quote:


> Originally Posted by *Devildog83*
> 
> When ever you get some time add a photo and what clocks you are running at via a GPUZ screen shot and I will add you to the list.
> Congrats on your purchase.


Here's GPUZ:



Annnd then here's the card in my rig, I don't wanna take it out and then bamboozle myself at trying to get it back in with my loop in before it hahaha


----------



## Devildog83

Quote:


> Originally Posted by *ARacoma9999*
> 
> Here's GPUZ:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Annnd then here's the card in my rig, I don't wanna take it out and then bamboozle myself at trying to get it back in with my loop in before it hahaha


You have been added. I don't remember if it was a 280 or 280X, but I noticed core clocks at 975 so I assumed it was a 280 and put it under there. If it's a 280X let me know and I will change it.


----------



## ARacoma9999

Quote:


> Originally Posted by *Devildog83*
> 
> You have been added. I don't remember if it was a 280 or 280X, but I noticed core clocks at 975 so I assumed it was a 280 and put it under there. If it's a 280X let me know and I will change it.


Your attention to detail makes my impress feels increase. You're spot on about it rofl


----------



## barkeater

Quote:


> Originally Posted by *Exxlir*
> 
> Can anyone tell me if it's normal that, when I open the side panel of my tower with the GPU sitting in front of me, and only when a game or a high graphics demand is running, the GPU seems to get very loud at the power connectors with a buzzing noise? I have asked this previously and people suggested coil whine, but I'm not so sure. I think the card is drawing power and somewhere it's faulting, and that's why I can hear the buzzing, gritty noise. I may be completely wrong and just plain paranoid, but I just can't sit with the idea that it's coil whine: when I switch off VSync it's the same noise, and when VSync is switched on it's the exact same noise again, just a low frequency buzzing, as if one of the fans has a bit of dust trapped on it. But it doesn't; I thoroughly checked and made sure it wasn't dust.
> 
> i personally would like to know if its normal for such noise or not.
> 
> thanks


It could also be something as simple as a fan bearing needing lube. It may not be coil whine at all (which is not the same as fan noise, but is often mistaken for it). I would suggest removing the card and taking the cover off to get better access to the fans. It's very easy to add a minute drop of oil to the fan bearing (search YouTube on this if you've never done it). Be careful not to add too much, as it will spatter all over the fans, shroud, etc. Come to think of it, I think this is something I need to do as well.


----------



## ARacoma9999

Hey guys, I'd like to get suggestions about replacing the TIM on my XFX 280: specifically, taking apart the card, and whether the difficulty of getting to the TIM is worth it for possibly better temps. I have a syringe of Arctic's MX-2 and I was gonna use it for my CPU when I switch to rigid tubing. Should I do my GPU as well? As far as taking GPUs apart, I've done it about 4 or 5 times so far; I just wasn't sure if XFX's cooling system is set up differently from the general layout of GPU coolers.


----------



## Scrimstar

Will the R9 285 or the R9 280X have better performance after being overclocked?


----------



## MiladEd

Quote:


> Originally Posted by *Scrimstar*
> 
> will the R9 285 or R9 280x have better performance after being overclocked?


Hard to say, but the 280X is the better performer overall, and it clocks well. However, the more efficient nature of the 285/380 makes it a better choice IMO. If I were you, though, I'd wait for the 380X, which is coming out in a few weeks and will have better performance than the 280X plus all the features of the 285/380.


----------



## GoLDii3

Quote:


> Originally Posted by *Scrimstar*
> 
> will the R9 285 or R9 280x have better performance after being overclocked?


A 280X with an overclock will beat a 285.

The 285 only has the advantage in tessellation-heavy games like The Witcher 3. Most 280Xs can easily hit 1150 MHz, and at that clock they are more or less like a GTX 780.

My card has terrible voltage regulation and does 1130-1150 MHz easily at "1.30V", because the regulation is crap and has a lot of voltage drop.

With a decent card like the Sapphire Toxic or DCUII, you can also start aiming for 1200 MHz. A 280X clocked at that, dayum.


----------



## Scrimstar

I am looking at this card right now. I have the Dual-X 270X running at 1150 MHz, but I can get a Sapphire Dual-X 280X/285 for $130 at my local store; I'm probably not gonna spend $250 for FreeSync capabilities.

http://www.amazon.com/gp/product/B00FLMKNE0/ref=s9_simh_gw_g147_i4_r?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=desktop-1&pf_rd_r=099VHG2P1BT0Y508Y4EQ&pf_rd_t=36701&pf_rd_p=2079475242&pf_rd_i=desktop


----------



## frankenstein406

Figured I would ask in here before starting a thread. My 280X Dual-X was having screen loss, black screens, restarts on boot, and some other weird stuff when plugged into the first DVI port. When I plug the DVI cord into the second port all the issues stop. Is this likely a faulty card?


----------



## Megadrone

getting this error, what does it mean?


----------



## diggiddi

Driver crash


----------



## bamaredwingsfan

I'm running a Sapphire 280X currently, and I'd like to bump up gaming performance. At the moment my CPU is the Athlon X4 860K. My question is, if I find another 280X, will that jump performance up a good deal, or should I sell the 280X I have and get something newer? Am I about to start hitting the CPU's limit if I get a second card? I'd like to get something a little more power efficient, so if I do add a second card it won't tax my Corsair 850 PSU. Any advice is appreciated.

Sent from my XT1575 using Tapatalk


----------



## rdr09

Quote:


> Originally Posted by *bamaredwingsfan*
> 
> I'm running a Sapphire 280X currently, and I'd like to bump up gaming performance. At the moment my CPU is the Athlon X4 860K. My question is, if I find another 280X, will that jump performance up a good deal, or should I sell the 280X I have and get something newer? Am I about to start hitting the CPU's limit if I get a second card? I'd like to get something a little more power efficient, so if I do add a second card it won't tax my Corsair 850 PSU. Any advice is appreciated.
> 
> Sent from my XT1575 using Tapatalk


I'd recommend you really, and I mean really, find out what's holding back your gaming performance. When you play your games, monitor your usages (CPU and GPU) using Afterburner. Really simple, and it will tell you the truth.

I know when I crossfired 2 7900-series cards, they were bottlenecked in some games by an i7 Sandy at 4.5 with HT off (basically an i5).
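To make that usage check concrete, here is a minimal sketch of how you might read the logged numbers; the 95/90/80 thresholds are illustrative rules of thumb, not values from Afterburner or any other tool:

```python
# Sketch of interpreting an Afterburner-style usage log; the thresholds
# below are illustrative guesses, not official numbers.
def classify_bottleneck(cpu_usage, gpu_usage):
    """Guess what limits frame rate from per-sample utilization percentages."""
    avg_cpu = sum(cpu_usage) / len(cpu_usage)
    avg_gpu = sum(gpu_usage) / len(gpu_usage)
    if avg_gpu >= 95:
        return "gpu-bound"   # GPU pegged: a faster card (or a second one) helps
    if avg_cpu >= 90 and avg_gpu < 80:
        return "cpu-bound"   # GPU starved: upgrade the CPU first
    return "unclear"         # engine cap, vsync, or a mixed load

print(classify_bottleneck([55, 60, 58], [97, 98, 96]))  # -> gpu-bound
```

In the gpu-bound case a second 280X can pay off; in the cpu-bound case it mostly won't.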


----------



## hatchet_warrior

Quote:


> Originally Posted by *bamaredwingsfan*
> 
> I'm running a Sapphire 280X currently, and I'd like to bump up gaming performance. At the moment my CPU is the Athlon X4 860K. My question is, if I find another 280X, will that jump performance up a good deal, or should I sell the 280X I have and get something newer? Am I about to start hitting the CPU's limit if I get a second card? I'd like to get something a little more power efficient, so if I do add a second card it won't tax my Corsair 850 PSU. Any advice is appreciated.
> 
> Sent from my XT1575 using Tapatalk


The first question would be, what games do you play?

Some games are GPU dependent, some are CPU dependent, and others are just poorly optimized. While a second 280X should give you good improvements, it might not be the best "bang for your buck".


----------



## Mercennarius

Quote:


> Originally Posted by *bamaredwingsfan*
> 
> I'm running a Sapphire 280X currently, and I'd like to bump up gaming performance. At the moment my CPU is the Athlon X4 860K. My question is, if I find another 280X, will that jump performance up a good deal, or should I sell the 280X I have and get something newer? Am I about to start hitting the CPU's limit if I get a second card? I'd like to get something a little more power efficient, so if I do add a second card it won't tax my Corsair 850 PSU. Any advice is appreciated.
> 
> Sent from my XT1575 using Tapatalk


I was just in your shoes trying to decide the best route to go, and after much research decided upgrading to a 290X was a better overall option than a 280X in CrossFire. Sell your 280X and buy a 290X is my recommendation. At that point your CPU will be your limitation.


----------



## Roboyto

Quote:


> Originally Posted by *bamaredwingsfan*
> 
> I'm running a Sapphire 280X currently, and I'd like to bump up gaming performance. At the moment my CPU is the Athlon X4 860K. My question is, if I find another 280X, will that jump performance up a good deal, or should I sell the 280X I have and get something newer? Am I about to start hitting the CPU's limit if I get a second card? I'd like to get something a little more power efficient, so if I do add a second card it won't tax my Corsair 850 PSU. Any advice is appreciated.
> 
> Sent from my XT1575 using Tapatalk


Odds are you will be OK with the 850 W PSU even if you add another 280X. I know people are running dual-290 systems, with bigger CPUs, on 750/850 W PSUs.

@hatchet_warrior brings up a good point: what games do you play, *and* at what resolution? Lots of titles may not be optimized for dual GPUs, so there could be little to no gain from a second card. If you were going to make an upgrade, I would say get a better CPU/motherboard. I don't know much about the new Athlon chips, but I know they're not made for high-demand gaming. If you want dual GPUs you should move to the AM3+ platform, or Intel, since even the i3 is a better gaming CPU than your Athlon. On AM3+ the FX-6300 is a great deal for better performance, or the 8320(e) gets you 8-core benefits with lower power/heat. If you have a Micro Center near you, you can get a decent ASRock 970 motherboard with an 8320E for about $130 out the door, last time I checked.

If you wanted to stick to a single GPU, a 290X is a good buy currently, but their stock is depleting quickly. One big GPU is nearly a guarantee of better performance across any benchmark/game, compared to the potential issues with two cards.

A decent review that pitted the 290X against 280X and others:

http://www.tweaktown.com/reviews/6555/his-iceq-r9-290x-ipower-hybrid-iturbo-4gb-oc-ed-video-card-review/index.html

This review does have the hybrid water-cooled version of the 290X, but its default clocks of 1100/1500 are possible with a good air-cooled version of the 290X.
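For a rough sanity check on the 850 W question, you can add up approximate TDPs. The figures below are commonly published board/chip powers (roughly 250 W for an R9 280X, 290 W for a 290X, 95 W for the Athlon 860K) plus a flat allowance for the rest of the system, so treat the result as a ballpark only:

```python
# Ballpark PSU check; TDP_W holds approximate published board/chip powers,
# and overhead_w is a flat guess for motherboard, RAM, drives, and fans.
TDP_W = {"r9_280x": 250, "r9_290x": 290, "athlon_860k": 95}

def system_draw(gpus, cpu, overhead_w=100):
    """Sum GPU and CPU TDPs plus the flat system allowance, in watts."""
    return sum(TDP_W[g] for g in gpus) + TDP_W[cpu] + overhead_w

print(system_draw(["r9_280x", "r9_280x"], "athlon_860k"))  # -> 695
```

About 695 W for dual 280Xs is under an 850 W unit, though overclocks and transient spikes eat into that margin.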


----------



## Megadrone

What's a driver crash? Are you sure it's not the card? It happens only in one game.


----------



## bamaredwingsfan

The games I play are Fallout 4, Star Wars: The Old Republic, Marvel Heroes 2015, and hopefully Mass Effect 4 and Star Citizen when they come out. My current plan is to run this cheaper AMD setup until the new socket and CPUs come out in 2016. Then I will decide from there if I want the new Intel or AMD CPU and MB.


----------



## Roboyto

Quote:


> Originally Posted by *bamaredwingsfan*
> 
> The games I play are Fallout 4, Star Wars: The Old Republic, Marvel Heroes 2015, and hopefully Mass Effect 4 and Star Citizen when they come out. My current plan is to run this cheaper AMD setup until the new socket and CPUs come out in 2016. Then I will decide from there if I want the new Intel or AMD CPU and MB.


That was going to be my other suggestion. There will also be next-gen GPUs hitting the market, which will undoubtedly drive prices down on current models. Waiting a little while will bring good things.


----------



## hatchet_warrior

Quote:


> Originally Posted by *bamaredwingsfan*
> 
> The games I play are Fallout 4, Star Wars: The Old Republic, Marvel Heroes 2015, and hopefully Mass Effect 4 and Star Citizen when they come out. My current plan is to run this cheaper AMD setup until the new socket and CPUs come out in 2016. Then I will decide from there if I want the new Intel or AMD CPU and MB.


Stay true to that plan then. I am in a similar situation: I want a new FreeSync monitor, which would mean a new GPU, but right now it's just not a good time to upgrade if you already have a decent setup. Hold out for the next generation of HBM graphics cards and AMD's new chips in 2016.


----------



## Exxlir

Quote:


> Originally Posted by *Roboyto*
> 
> If it only happens when the card is loaded, then it is most likely coil whine coming from somewhere.
> 
> Is the noise coming from the connectors at the card, or at the PSU? It's possible the PSU is emitting the noise.
> 
> I would suggest altering clocks, voltages, and/or power settings with an OC utility as this can change the tone/frequency of the sound or at least reduce it.
> 
> Have you tried using different PCIe power cables, if you have extras for your current PSU? It's not wise to mix/match cables from different PSUs. This may sound silly, but it doesn't hurt to try.
> 
> How about trying different PCIe power ports on the PSU? If your rig sig is correct then you are using the EVGA 650 G2, and depending on which variant it can have more than two VGA/PCIe power ports.
> 
> To answer your question as to why the noise would be coming from/near the power connectors: the area with the yellow box around it in the photo below is the heatsink covering the VRMs for the R9 280X Toxic. The green boxes are around the chokes. The VRMs, or voltage regulator modules, are what convert the power from the PSU to the amounts the card requires. The chokes are supposed to reduce/eliminate the noise that you are hearing, but you may have a faulty card. If your card is still under warranty you may want to contact Sapphire and see what they can do for you.


Thank you for your answer. Since the last time I posted I still get the noise, and it is 100% coming from the VGA connectors plugged into the GPU end, not the PSU.

Not sure why this is happening, but given the picture and explanation you've shown, I will continue to monitor the GPU. I have continued to use it for 2 weeks now and nothing significant has happened; in the meantime the noise is ongoing but hasn't got any louder or quieter. I think it might be as simple as coil whine, but time will tell if further things go wrong in the future.


----------



## MicroCat

Thermal Dilemma.

Helping a friend build a low-cost gaming system. We managed to get a used Giga 280X OC Rev 2 for $80. Good deal. However, this card spent some time toiling in the mines and the stock fans are done working for the bitcoin man. I'm not a fan of small fans anyhow, so I've ghetto'd a pair of 120mm AC F12s onto the cooler.

Highest temp in benches is 69°C at 1100/1500. I also tried 3x 80mm fans and peaked at 74°C.

I've scoured the first couple hundred pages of this thread to find a definitive answer to my questions about cooler remount requirements.

My question is: are these temps decent, or do I need to remount the cooler with new TIM, new pads for the memory, and new pads for the VRM sink?

If so, what thickness for the VRM pads, 0.5mm or 1mm? Is it worth going Fujipoly 11 W/m-K crazy? (Spending 25% of the card cost on thermal pads is a violation of the cheapskate principle.)

Thanks!


----------



## hatchet_warrior

70°C is a pretty good temp; I know people who would love that. I believe the max safe temp is 90-95°C on that card. I have the same one and rarely hit 70°C. They have a pretty damn good heatsink.


----------



## joeyxl

Hey guys, how's it going?

I know I'm a little late to join the club, but I've had an R9 280 in my rig for about a year now and wanted to join! Here are the specs:

Model: R9 280
Manufacturer: Club 3D
Series: Royal King
Core: 960MHz
RAM: 3GB of [email protected]

Here's a pic of the card. Sorry for the ugly air cooler, liquid cooling is going into this rig (missed my previous water loop).


----------



## Devildog83

Quote:


> Originally Posted by *joeyxl*
> 
> Hey guys, how's it going?
> 
> I know I'm a little late to join the club, but I've had an R9 280 in my rig for about a year now and wanted to join! Here are the specs:
> 
> Model: R9 280
> Manufacturer: Club 3D
> Series: Royal King
> Core: 960MHz
> RAM: 3GB of [email protected]
> 
> Here's a pic of the card. Sorry for the ugly air cooler, liquid cooling is going into this rig (missed my previous water loop).


You are in! Welcome!!! Nice little budget rig.

I do have one question though: you spent almost as much on your keyboard as you did on the CPU and motherboard. Was there a reason for that?


----------



## joeyxl

Quote:


> Originally Posted by *Devildog83*
> 
> I do have one question though: you spent almost as much on your keyboard as you did on the CPU and motherboard. Was there a reason for that?


The keyboard I have is amazing. I have had it for 5 years now and not one thing is wrong with it.


----------



## MiladEd

Hey! I wanted to update my clocks. I've further overclocked the core to 1125 MHz (fully stable, no artifacts whatsoever) and set the memory clock to 1500 MHz. 1600 MHz was very troublesome: high temps, artifacts, and little performance gain.


----------



## F3ERS 2 ASH3S

How is everyone doing with the new Crimson Drivers?


----------



## Devildog83

Quote:


> Originally Posted by *joeyxl*
> 
> The keyboard I have is amazing. I have had it for 5 years now and not one thing is wrong with it.


That makes sense now.


----------



## Devildog83

Quote:


> Originally Posted by *MiladEd*
> 
> Hey! I wanted to update my clocks. I've further overclocked the core to 1125 MHz (fully stable, no artifacts whatsoever) and set the memory clock to 1500 MHz. 1600 MHz was very troublesome: high temps, artifacts, and little performance gain.


Updated. You will find that you get more performance from a core overclock than from a memory overclock anyhow. A slightly lower memory clock with a higher core is always better and more stable than trying to push the memory. Good job.


----------



## MiladEd

Quote:


> Originally Posted by *Devildog83*
> 
> Updated. You will find that you get more performance from a core overclock than from a memory overclock anyhow. A slightly lower memory clock with a higher core is always better and more stable than trying to push the memory. Good job.


Thanks! The temp difference is massive. Previously I was at 1100 MHz core and 1600 MHz VRAM with load temps exceeding 79°C. Now, with 1125 MHz core and 1500 MHz VRAM, it hardly goes any higher than 72°C. I've also increased the voltages quite a bit.


----------



## Scrimstar

Idk if this is the right place to ask, but which 380X out of these two is better, cooling- and overclock-wise? I know some cards don't allow voltage adjustment.

http://www.amazon.com/dp/B017WMD8ZM/ref=psdc_284822_t1_B017WMD1VI

http://www.amazon.com/dp/B017WMD1VI/ref=psdc_284822_t2_B017WMD0ZA


----------



## diggiddi

I'd go with the Sapphire


----------



## MiladEd

Quote:


> Originally Posted by *Scrimstar*
> 
> Idk if this is the right place to ask, but which 380X out of these two is better, cooling- and overclock-wise? I know some cards don't allow voltage adjustment.
> 
> http://www.amazon.com/dp/B017WMD8ZM/ref=psdc_284822_t1_B017WMD1VI
> 
> http://www.amazon.com/dp/B017WMD1VI/ref=psdc_284822_t2_B017WMD0ZA


I read a review of the 380X which showed it had been overclocked to 1200 MHz (or something near that) without any voltage increase. It does allow voltage control with Sapphire's TriXX software, so it may go even higher.

That being said, the 380X and 280X are very, very close; the 380X sometimes beats the 280X and vice versa. The 380X does have some new and better features like FreeSync, and uses slightly less power, though not much.

Anyway, I love my 280X, and will not upgrade until Arctic Islands, maybe not even then.


----------



## NeoReaper

Hey, a question for all you R9 270X owners:
I have been having trouble finding a BIOS with unlocked voltage, because every BIOS I flash from the GPU database supports Elpida memory (which I think is causing the issue when I go to install drivers).
If any of you R9 270X members have a card with Hynix memory and unlocked voltage (not just a power limiter), can you save your BIOS and upload it for me, please?


----------



## MasterBillyQuizBoy

Can any XFX owners out there explain what the CDFC, EDFC, CDFR, etc. naming conventions are?

I have a CDFC, but there are also some others on the site that seem exactly the same. Some are obvious, like having an extra 2GB of memory, but I'm stumped on the rest. The only difference I can tell between mine and this other card is that one is "Standard" class and the other is "Warrior" class. Whatever that means, they both have the same clock speeds.

What do these four letter codes stand for?

I have the CDFC R9-270x


----------



## $ilent

Hi all

Can anyone tell me if it's possible to flash a 280 to a 280X? If so, which GPUs can do this, or can they all be flashed?

Thanks


----------



## GoLDii3

Quote:


> Originally Posted by *$ilent*
> 
> Hi all
> 
> Can anyone tell me if it's possible to flash a 280 to a 280X? If so, which GPUs can do this, or can they all be flashed?
> 
> Thanks


No, you can't. The closest thing you can get is a 7950 with a 7970 PCB, but you can't do anything about the shaders.


----------



## AVATARAT

Hello,
While playing a game today I got a BSOD, and after the restart I saw this in my Crimson OC menu:

Before that, Crimson's OC slider went up to 1200 core speed; now I can go to 1260 and it works, but I'm scared to push higher.

The card has the original BIOS and I have never modded it.


----------



## MiladEd

I used to get BSOD with insufficient voltages.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *AVATARAT*
> 
> Hello,
> While playing a game today I got a BSOD, and after the restart I saw this in my Crimson OC menu:
> 
> Before that, Crimson's OC slider went up to 1200 core speed; now I can go to 1260 and it works, but I'm scared to push higher.
> 
> The card has the original BIOS and I have never modded it.


What version of Crimson are you running? The first release had a bug.


----------



## AVATARAT

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> What version of Crimson are you running? The first release had a bug.


Yes, I was on the first release. I updated and now all is OK.

Thanks.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *AVATARAT*
> 
> Quote:
> 
> 
> 
> Originally Posted by *F3ERS 2 ASH3S*
> 
> What version of Crimson are you running? The first release had a bug.
> 
> 
> 
> Yes, I was on the first release. I updated and now all is OK.
> 
> Thanks.

Fantastic!


----------



## N2Gaming

Subbed. Just got an AXR9 380 and I'm currently uninstalling the Nvidia drivers.


----------



## MiladEd

So, I've a question. My monitor has a setting for a 75 Hz refresh rate, but only at 4:3 resolutions. I figured out you can force a custom resolution and refresh rate with Radeon Settings, so I forced 1080p @ 75 Hz, which seems to be working fine so far. I tried playing some GTA V to see how much smoother it feels, and it does feel quite a bit smoother (my 280X manages to deliver 75 FPS pretty well, actually). What was more interesting is that, even though I still get frame drops into the 50s, they too look smoother at the 75 Hz refresh rate. How come? FYI, VSync is off but the framerate is capped at 75 FPS.
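One plausible explanation for the smoother dips, sketched as a toy model (illustrative numbers, not a measurement of any particular setup): with a frame cap, each finished frame is shown at the next screen refresh, so on-screen frame times get quantized to the refresh interval, and the 13.3 ms interval at 75 Hz quantizes less coarsely than the 16.7 ms interval at 60 Hz:

```python
import math

# Toy model: a finished frame is displayed at the next refresh boundary,
# so displayed frame times are multiples of the refresh interval.
def display_gaps(fps, hz, frames=12):
    """On-screen intervals (ms) after snapping ideal frame times up to
    the next refresh boundary."""
    refresh = 1000.0 / hz
    shown = [math.ceil(i * 1000.0 / fps / refresh) * refresh
             for i in range(1, frames)]
    return [b - a for a, b in zip(shown, shown[1:])]

def judder(fps, hz):
    """Spread between the longest and shortest displayed frame time."""
    gaps = display_gaps(fps, hz)
    return max(gaps) - min(gaps)

# A 50 FPS dip is paced more evenly on the 75 Hz grid than on the 60 Hz one.
print(judder(50, 60), judder(50, 75))
```

The smaller spread at 75 Hz means the occasional long frame sticks out less, which matches the "drops look smoother" impression.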


----------



## Flisker_new

Hey guys,

Has anyone here tried to disable the green power LEDs on an R9 280X? I'm building a white/black/red rig and the green LEDs are terrible there.









Is it possible to turn them off with some software? Or do I need to mod the BIOS? Is that even possible?

Any info would be awesome











It's an Asus PCB.


----------



## radier

Use black tape









Taptaptap Mlais M52


----------



## Flisker_new

Quote:


> Originally Posted by *radier*
> 
> Use black tape
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Taptaptap Mlais M52


Tried that; it's an absolutely terrible solution:

- it doesn't stick well and doesn't cover the LEDs completely
- it just looks terrible having black tape on a graphics card board


----------



## Harrywang

Ever since a couple of months ago (3-4 months I would say) my HIS R9 280X's idle temperature has increased by 10 degrees. Has anyone else noticed this? Is it just my card? Load temperatures have also increased a little. It's at stock settings; I used the BIOS switch and it's still the same. I cleaned my fans and heatsink a bit, no go. I also got a new case with very good airflow and nothing has changed in terms of temp. It used to idle at 30 degrees; now it's 40 constantly. My CPU still idles at 25-30.

Could this be a thermal paste problem? That's all I can think of. Maybe drivers too. Before, I was using 14.x drivers, which were fine, but I don't remember if switching to 15.x+ increased the temps.


----------



## BruceB

Quote:


> Originally Posted by *Harrywang*
> 
> Ever since a couple of months ago (3-4 months I would say) my HIS R9 280X's idle temperature has increased by 10 degrees. Has anyone else noticed this? Is it just my card? Load temperatures have also increased a little. It's at stock settings; I used the BIOS switch and it's still the same. I cleaned my fans and heatsink a bit, no go. I also got a new case with very good airflow and nothing has changed in terms of temp. It used to idle at 30 degrees; now it's 40 constantly. My CPU still idles at 25-30.
> 
> Could this be a thermal paste problem? That's all I can think of. Maybe drivers too. Before, I was using 14.x drivers, which were fine, but I don't remember if switching to 15.x+ increased the temps.


Problems with thermal paste normally show themselves immediately; maybe a new driver has changed your fan profile? If you're using the _Crimson_ driver you need to make sure you're using the _Beta_; problems with fan profiles have been reported with V1.0.


----------



## MasterBillyQuizBoy

I am currently troubleshooting two XFX R9 270s in CrossFire.

I also just realized my board's 2nd PCIe slot only runs at x4 speed.

How much am I really losing with that second slot being x4 and not x16?
The specs list the 2nd slot as "1 x PCIe 2.0 x16 (x4 mode, black)".

It is an Asus M5A880V EVO. When I had the setup working I saw gains in 3DMark that seemed in line for the system.


----------



## diggiddi

Quote:


> Originally Posted by *MasterBillyQuizBoy*
> 
> I am currently troubleshooting two XFX R9-270's in Crossfire.
> 
> I also just realized my boards 2nd PCIE slot is only X4 speed.
> 
> How much am I really losing on that second slot only a X4 and not X16?
> Specs list the 2nd slot as "1 x PCIe 2.0 x16 (x4 mode, black) "
> 
> It is an Asus M5A880V EVO. When I had the setup working I saw gains in 3D Mark that seemed in line for system.


http://www.overclock.net/t/967533/crossfire-x16-x4-vs-x8-x8-comparison
http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa


----------



## MasterBillyQuizBoy

Quote:


> Originally Posted by *diggiddi*
> 
> http://www.overclock.net/t/967533/crossfire-x16-x4-vs-x8-x8-comparison
> http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa


Interesting, thanks. I'm not sure what those specs mean, though; it's listed both as x16 and x4, with a color thrown in too. Dunno if paying $100 for a dual-x16 board will be worth it, as long as the frames stay above 60, the limit of my TV.
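On the spec wording: "1 x PCIe 2.0 x16 (x4 mode, black)" means the black slot is physically x16-sized but wired for only 4 lanes. As a back-of-envelope on what that costs, PCIe 2.0 carries about 500 MB/s of usable bandwidth per lane per direction after 8b/10b encoding (the helper below is a sketch of that arithmetic):

```python
# Approximate usable per-direction bandwidth of a PCIe link. PCIe 1.x/2.0
# use 8b/10b encoding (250/500 MB/s per lane); 3.0 uses 128b/130b (~985 MB/s).
def pcie_bw_gbs(gen, lanes):
    """Rough usable bandwidth in GB/s for a given PCIe generation and width."""
    per_lane_mbs = {1: 250, 2: 500, 3: 985}[gen]
    return per_lane_mbs * lanes / 1000.0

print(pcie_bw_gbs(2, 16))  # primary slot: 8.0 GB/s
print(pcie_bw_gbs(2, 4))   # the "x16 (x4 mode)" slot: 2.0 GB/s
```

So the second card has a quarter of the link bandwidth; the reviews linked above suggest how much that actually matters in games at 1080p.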


----------



## arr-e

Hi, just sharing my latest OC:

MSI R9 270X 2GB Twin Frozr IV

GPU Clock: 1220MHz
Mem. Clock: 1600MHz
Core voltage: +50mV
Power Limit: +20%

3DMark Fire Strike

Max temp: 56°C / Idle: 29°C


----------



## radier

Use Far Cry 3/Crysis 3 for stability testing. Unigine Valley and Catzilla are good too.

Taptaptap Mlais M52


----------



## Zellgadus

Hello everyone, I have 2x MSI R9 280X in a CrossFire setup. I usually run them stock, but I'd like to try overclocking them soon since I have them both water cooled now thanks to Kraken G10s and H55s. Now, I have a few questions for other R9 280X owners. Does anyone have issues with Windows 10 and the new Radeon Crimson software?

I'll explain what I mean by issues. Many, but not all, of my games have become unplayable if I play them in fullscreen. For example, Deus Ex: Human Revolution works in fullscreen, same for Star Wars Battlefront and BF4. But titles like Tales of Zestiria and Skyrim don't work at all anymore; those two games crash before they even finish launching. Fairy Fencer gets a massive FPS drop in fullscreen. League of Legends doesn't work in fullscreen, and the same goes for other games. Except for Zestiria and Skyrim, all the others work in borderless mode, which makes it OK.

Now I'm wondering if anyone else has issues with either Windows 10 or the new Radeon Crimson drivers. If you're an R9 280X owner, have you had any issues with fullscreen gaming?

This is my computer http://ca.pcpartpicker.com/p/PQJy6h .

Thanks for any feedback.


----------



## aaronsta1

Quote:


> Originally Posted by *Zellgadus*
> 
> Hello everyone, I have 2x MSI R9 280X in a crossfire setup. I usually run them stock But I'd like to try overclocking them soon since I have them both water cooled now thanks to Kraken G10s and H55s. Now, I have a few questions for any other R9280X owners. Does anyone have issues with the Windows 10 and the new Radeon Crimson software?
> 
> I'll explain what I mean by issues. A lot of my games have become unplayable if I play them at full screen. Some of them is the keyword. For example, Deus Ex human Revolution works in full screen, same for Star wars Battlefront and BF4. But titles like Tales of Zestiria and Skyrim don't even work anymore at all. Those 2 games can't even launch before crashing.Fairy Fencer gets a massive FPS drop in Fullscreen. League of Legends doesn't work in fullscreen and same for other games. Now except for Zestiria and Skyrim all the others work in borderless mode which makes it ok.
> 
> Now I'm wondering if anyone else has any issues either with windows 10 or the new Radeon Crimson drivers. If you're an R9 280X owner have you had any issues with fullscreen gaming?
> 
> This is my computer http://ca.pcpartpicker.com/p/PQJy6h .
> 
> Thanks for any feedback.


Hmm, Skyrim crashing before it starts... you don't happen to have Razer Cortex installed?


----------



## FireFoxII

Hi all, and sorry for my English...

I have a Sapphire R9 270, and overclocked I reach 1000/1450 from the original 945/1400.

I can't overvolt the card because the voltage seems to be locked...

Now I want to try a 270X BIOS, also from Sapphire... there are 1070/1400, 1100/1450, and 1150/1500 versions, depending on the revision of the card.

Could this help me raise my limit? Does the 270X BIOS apply a higher voltage to reach those frequencies?

Thanks for the help


----------



## GoLDii3

Quote:


> Originally Posted by *FireFoxII*
> 
> Hi all and sorry for my english...
> 
> I'm having a Sapphire r9 270 and in overclock I reach 1000/1450 from original 945/1400
> 
> I can't overvolt video card because seems to be locked...
> 
> Now I want to try bios of 270x always from sapphire... There are 1070/1400 - 1100/1450 - 1150/1500 depending on version o video card
> 
> This can help me to increase my limit? Bios of 270x integrate a greater volt to reach these frequency?
> 
> Thanks for help


The R9 270 and R9 270X are not the same chip. You can't flash the R9 270X BIOS. You're going to brick your 270.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *FireFoxII*
> 
> Hi all and sorry for my english...
> 
> I'm having a Sapphire r9 270 and in overclock I reach 1000/1450 from original 945/1400
> 
> I can't overvolt video card because seems to be locked...
> 
> Now I want to try bios of 270x always from sapphire... There are 1070/1400 - 1100/1450 - 1150/1500 depending on version o video card
> 
> This can help me to increase my limit? Bios of 270x integrate a greater volt to reach these frequency?
> 
> Thanks for help


Even if you were able to flash it and the card still worked (99% unlikely), you would see no benefit. Not only would the cores still be locked, reducing performance, the clocks would not really change either. Your limiting factor is that you are unable to add voltage. There are two possible reasons for that: either voltage control is restricted in the 270's BIOS (again, very unlikely), or an IC that would allow voltage adjustment is missing. The latter could be due to different board revisions, or the manufacturer simply not wanting you to have that option.

For example, I own 2 XFX DD 280Xs.
One has an IC that allows me to adjust the voltage; the other is 2 revisions off and no longer has it.

I can flash either BIOS to either card and it makes no difference.


----------



## FireFoxII

Thanks... I believed the R9 270 and 270X were the same chip with different memory... Now it's clear...

And another question...

On TechPowerUp I can see that there is a newer BIOS for my card...

Mine is from 2013-11-05 and the latest is from 2014-02-14... Do you think it's worth upgrading?


----------



## c0V3Ro

Hey mates,
Anyone else with an XFX R9 280X Double Black whose GPU clock drops to 850 in 3DMark?
Already tried different settings in Crimson 15.30.1025.1001 and MSI AB 4.02.
http://www.3dmark.com/fs/6967365
3DMark reports the default clocks, 1080/1550, but the GPU-Z log records 850/1500 at 1.114 V VDDC.


----------



## radier

It's a common bug in the AMD drivers. In the next update the Futuremark team will introduce a workaround for this issue; right now it's available in the beta on Steam.

Taptaptap Mlais M52


----------



## nidzakv

Is anyone using the Asus 280X V2 tri-slot version? If so,
I would like some feedback on the minimum fan speed in RPM,
and whether the card is audible at idle.
Thanks in advance...


----------



## c0V3Ro

Thanks
Quote:


> Originally Posted by *radier*
> 
> It's common bug in AMD drivers. In the next update from futuremark team they will introduce workaround for this issue. Right now it's available in beta on Steam.
> 
> Taptaptap Mlais M52


Thanks.
But you mean the paid version?
Already installed the latest demo and patch 4.4.1.
Happy new year for all!
Cheers!


----------



## Harrywang

HIS R9 280X IceQ Turbo Boost

Able to get it to:

1170 core
1600 memory
+20% power limit
1.269 V core voltage

71-72°C max in the Unigine Heaven benchmark

Seems like kind of a ****ty OC but oh well..


----------



## FireFoxII

What is the power limit? I see it and can raise it to 20%... Is it a type of overvolting?


----------



## GoLDii3

Quote:


> Originally Posted by *FireFoxII*
> 
> What is power limit? I see it and I can reach 20%... It's a type of overvolting?


It raises the card's TDP (in watts) so it does not throttle when it exceeds the BIOS TDP limit.

For example, if you have a card with a 250 W TDP but you clock it at, let's say, 1.25 V and it exceeds 250 W, the card will throttle the core and memory frequencies to stay within the 250 W limit. Setting the power limit to +20% lets the card use 20% more power.
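The explanation above boils down to simple arithmetic, which can be sketched as follows. The numbers are illustrative only; AMD's actual PowerTune behavior is more involved than a hard on/off threshold:

```python
# Sketch of how a power-limit slider changes the board power cap.
# Illustrative model only, not AMD's actual PowerTune algorithm.

def power_cap_watts(tdp_w: float, power_limit_pct: float) -> float:
    """Board power cap after applying a +/- percent power limit."""
    return tdp_w * (1 + power_limit_pct / 100)

def will_throttle(draw_w: float, tdp_w: float, power_limit_pct: float) -> bool:
    """True if the card must drop clocks to stay under its cap."""
    return draw_w > power_cap_watts(tdp_w, power_limit_pct)

print(power_cap_watts(250, 20))    # 250 W TDP with +20% gives a 300 W cap
print(will_throttle(280, 250, 0))  # 280 W draw exceeds a stock 250 W cap
print(will_throttle(280, 250, 20)) # but fits under the raised 300 W cap
```

This is why heavily overvolted overclocks often need the power-limit slider maxed: otherwise the card silently drops clocks under load even though it never crashes.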


----------



## hatchet_warrior

So a little update on my RMA. I got my 280x back with a note saying they could not reproduce any errors. So that was $15 in shipping, $50 for a temp GPU, and 2 weeks wasted. It's been 4 months and I'm starting to experience major issues. I get the spiking and flickering textures that were shown in the earlier GTA V video, but now I'm getting weird black or yellow checkered blocks like this http://1drv.ms/1S0GXly

So far I have tested system RAM, and reverted my CPU to stock speeds. I have been through 3 different versions of drivers, with clean installs, and upgrades. I am planning on sending it back again, but want to make sure that they do something about it. I'm not looking for anything special, I just want the card to work as it should out of the box. I have attempted to share video, and screenshots of the issue, but I doubt anyone actually looks at those. Is there anything I can say or do to make sure I don't get my card back again?


----------



## c0V3Ro

Couldn't get an XFX Double Black through 3DMark Vantage at the default clocks, 1080/1550, or play BF4, even with the BIOS modded to 1.3 V and +30%. Almost sure I got a lemon, huh?


----------



## Bartouille

Quote:


> Originally Posted by *c0V3Ro*
> 
> Couldn't make a Xfx double black pass through 3dmark vantage with default clocks 1080/1550 or play BF4. Even with bios modded to 1.3V and 30%. Almost sure i got a lemon, huh?


Back the memory down to 1500 MHz, because that's what it's rated for by the manufacturer (not XFX; I mean Hynix, Elpida, or Samsung, depending on what's on your card). If that fixes the instability, it still means your card can't run the factory OC, which you can RMA it for, but running the memory 50 MHz lower is not the end of the world either. If the problem persists after lowering the memory, you should RMA. I've never seen a card unable to do 1080 MHz at 1.3 V, especially a 280X rather than a 280... That's why I think it's memory related.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Bartouille*
> 
> Quote:
> 
> > Originally Posted by *c0V3Ro*
> > 
> > Couldn't make a Xfx double black pass through 3dmark vantage with default clocks 1080/1550 or play BF4. Even with bios modded to 1.3V and 30%. Almost sure i got a lemon, huh?
> 
> Back the memory down to 1500mhz because that's what it's rated for by the manufacturer (not xfx, I'm talking hynix, elpida or samsung depending what you got on your card). If it fixes the instability, it still means your card can't run the factory oc which you can RMA it for, but running memory 50mhz lower is not the end of the world either. If the problem still persists after lowering mem, you should rma. Never seen a card not being able to do 1080mhz at 1.3v, especially a 280X not a 280... That's why I think it's memory related.

Memory related, most likely. However, I'd say go back to the stock BIOS and see how it goes, and check the drivers first...

But that is the stock rating.. I would also check the sticker to make sure you got the Black Edition.

The last letters of the product number will tell you, just before the ref number.


----------



## c0V3Ro

Thanks Bartouille and F3ERS 2 ASH3S








Clocks got stuck at 500/150, 0.950 V VDDC.
Uninstalled MSI AB and reinstalled the drivers; it went back to default. Weird.
Reflashed the OEM BIOS.
Dropped the VRAM to 1500; the first 3DMark11 run went bad, so I raised OverDrive to +20%.
Tuning in Crimson since MSI AB doesn't unlock overvolting.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *c0V3Ro*
> 
> Thanks Bartouille and F3ERS 2 ASH3S
> 
> 
> 
> 
> 
> 
> 
> 
> Clocks got stucked at 500/150, 0.950vddc.
> Uninstalled Msi AB and reinstalled drivers. Got back to default. Weird.
> Reflashed the OEM BIOS.
> Droped the Vram to 1500, 1st 3dmark11 gone bad. Raise overdrive to 20%.
> Tuning on Crimson since Msi AB doesn't unlock overvolt.


You should be good with the stock 1550 on the VRAM now


----------



## c0V3Ro

Still couldn't find out why the clocks stay at 850/1500 (99% GPU load, 1.144 V VDDC in the GPU-Z log) while running Fire Strike.

GPU-ZSensorLog-firestrike.txt 41k .txt file

This driver is kind of fuzzy, huh?
The score gets worse in Unigine Heaven DX11 running at 1030/1500/+30% compared to 1000/1500/+20%.

The card can't run this bench at default clocks either.
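A GPU-Z sensor log like the attached one can be scanned programmatically for those stuck-at-850 stretches. A minimal sketch, assuming the log is comma-separated with a header row; the column name used here ("GPU Core Clock [MHz]") varies between GPU-Z versions, so check your own file's header and adjust:

```python
# Quick sketch for spotting core-clock drops in a GPU-Z sensor log.
# Assumes a comma-separated log with a header row; the clock column
# name differs between GPU-Z versions, so pass the one your log uses.
import csv
import io

def core_clock_drops(log_text: str, clock_col: str, expected_mhz: float) -> int:
    """Count samples where the core clock sits below the expected value."""
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    drops = 0
    for row in reader:
        try:
            clock = float(row[clock_col])
        except (KeyError, ValueError):
            continue  # skip malformed rows or missing columns
        if clock < expected_mhz:
            drops += 1
    return drops

# Synthetic three-sample log in the assumed format:
sample = """Date, GPU Core Clock [MHz], GPU Load [%]
2016-01-02 10:00:01, 1080.0, 99
2016-01-02 10:00:02, 850.0, 99
2016-01-02 10:00:03, 850.0, 99
"""
print(core_clock_drops(sample, "GPU Core Clock [MHz]", 1080))
```

Counting how many samples sit below the expected clock while GPU load is pegged at 99% is a quick way to tell a throttling/driver issue apart from a one-off dip.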


----------



## Bartouille

Quote:


> Originally Posted by *c0V3Ro*
> 
> Still couldn't find why clocks stay at 850/1500, 99% gpu load, vddc 1.144 at GPU-z Log running FireStrike.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-ZSensorLog-firestrike.txt 41k .txt file
> 
> 
> This driver is kind of fuzzy, huh?
> Score gets worse in Unigine Heaven DX11 running at 1030/1500/+30% compared to 1000/1500/+20%
> 
> 
> 
> 
> 
> 
> 
> 
> Card can't run this bench at default clocks, also.


There are driver issues with Fire Strike. My card doesn't boost in that benchmark either, although it's watercooled, never exceeds 50°C, and is well within power limits.


----------



## Dhoulmagus

Anybody else experiencing constant black screens and/or crashes while gaming since upgrading to Crimson? It started on 15.11 for me with random black screens in CS:GO; I updated to 15.12 yesterday and now it's constant, crashing multiple games within minutes.

Single 280X, no 3rd-party applications running (Afterburner), all overclocks reverted, 24-hour memtest with no errors, CPU stress tests with no problems (Prime, AIDA, h.264), no temp issues at all. Super frustrating.

Does Crimson not even come with an uninstaller? I'm probably going to do another clean install just to go back to 15.7 Catalyst with a clean conscience.


----------



## GoLDii3

Quote:


> Originally Posted by *Serious_Don*
> 
> Anybody else experiencing constant black screens and or crashes while gaming since upgrading to Crimson? It started on 15.11 for me with random black screens on CS: GO, updated to 15.12 yesterday and now its constant and crashing multiple games within minutes.
> 
> Single 280x, no 3rd party applications running (afterburner), all overclocks reverted, 24 hour memtest no errors, CPU stress tests no problems (Prime, AIDA, h.264) no temp issues at all, super frustrating.
> 
> Does Crimson not even have an uninstaller with it? I'm probably going to do another clean install just go go back to 15.7 catalyst with a clean conscience.


No.


----------



## c0V3Ro

Quote:


> Originally Posted by *Bartouille*
> 
> There are drivers issues with Firestrike. My card doesn't boost on that benchmark either although it's watercooled and never exceeds 50c and well within power limits.


Got a crash (freeze, red screen) in BF4 at 1080/1500/+20%.
Despite those clocks and the driver issues, the card runs pretty cool, even with the stock cooler.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *c0V3Ro*
> 
> Quote:
> 
> > Originally Posted by *Bartouille*
> > 
> > There are drivers issues with Firestrike. My card doesn't boost on that benchmark either although it's watercooled and never exceeds 50c and well within power limits.
> 
> Got a crash, freeze red screen, on BF4, 1080/1500/20%.
> Regarding those clocks and driver issues the card runs pretty cold even with stock cooler.

You sure it's the video card and not a RAM issue? Or a bad OC on the processor?


----------



## Bartouille

Quote:


> Originally Posted by *c0V3Ro*
> 
> Got a crash, freeze red screen, on BF4, 1080/1500/20%.
> Regarding those clocks and driver issues the card runs pretty cold even with stock cooler.


Your card is unstable. Is your card the XFX BE (Black Edition) model, or did you just flash the BE BIOS onto a non-BE model? If it's really a BE card then you should RMA. If not, then don't expect it to run stable at 1080 MHz. Either way, in the meantime you can downclock to 1000/1500, which is the reference 280X clock; all cards should be able to do that.


----------



## Roboyto

Quote:


> Originally Posted by *Serious_Don*
> 
> Anybody else experiencing constant black screens and or crashes while gaming since upgrading to Crimson? It started on 15.11 for me with random black screens on CS: GO, updated to 15.12 yesterday and now its constant and crashing multiple games within minutes.
> 
> Single 280x, no 3rd party applications running (afterburner), all overclocks reverted, 24 hour memtest no errors, CPU stress tests no problems (Prime, AIDA, h.264) no temp issues at all, super frustrating.
> 
> Does Crimson not even have an uninstaller with it? I'm probably going to do another clean install just go go back to 15.7 catalyst with a clean conscience.


I haven't had any issues with Crimson on either PC with Win 10 and my 290s. I have heard that some people have had issues and it took more than one install to get it working without problems. Run Display Driver Uninstaller from safe mode, then use a registry cleaner like CCleaner, reboot after the registry cleaning, re-download the driver package, and give it another shot.


----------



## c0V3Ro

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> You sure its the vid card and not a ram issue? or bad OC on the processor?


Pretty sure the CPU OC is solid.
If the card's clocks are dropped, it runs fine.
The card is a DB; I checked the sticker and the box.

Will consider RMAing it


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *c0V3Ro*
> 
> Quote:
> 
> > Originally Posted by *F3ERS 2 ASH3S*
> > 
> > You sure its the vid card and not a ram issue? or bad OC on the processor?
> 
> Pretty sure CPU OC is solid.
> If card clocks are dropped it runs fine.
> Card is a DB, checked sticker and box.
> 
> Will consider RMA it
poor card...


----------



## Harrywang

To get a stable overclock, is it best to run Unigine on the Extreme preset or a custom one?


----------



## karadoulis

XFX R9 280X Black Edition
1080 - 1550


----------



## lostsurfer

Quote:


> Originally Posted by *karadoulis*
> 
> 
> 
> XFX R9 280X Black Edition
> 1080 - 1550


What voltage sir?


----------



## karadoulis

Quote:


> Originally Posted by *lostsurfer*
> 
> What voltage sir?


I'll answer your question when I get back home, since I'm at work now. As far as I remember, I am using the stock BIOS voltage.


----------



## karadoulis

Quote:


> Originally Posted by *lostsurfer*
> 
> What voltage sir?


12V: 12.19V
VDDC: 0.844V - 1.188V (while gaming)
MVDDC: 1.500V

I am pretty sure one of those is the one you need!!


----------



## pagulhan

MSI 280X Gaming, 1190 - 1550 at 1.281 V (no time to check how high the memory can go... for now)
pagulhan


----------



## lostsurfer

Thanks! Yea my card had some issues I had to lower the stock volts on it but haven't gotten around to clocking it yet, but I feel it's coming!


----------



## lostsurfer

Quote:


> Originally Posted by *pagulhan*
> 
> 
> MSI 280x Gaming 1190 - 1550 1.281v (no time to check how high the memories can go... for now)
> pagulhan


What are your temps and ASIC, if you don't mind me asking?


----------



## karadoulis

Well, I also have issues with it, and I am trying different BIOSes to find the one that actually works. I recently discovered that it had a DD BIOS and not a Black Edition one!!

Even the one from XFXSupport didn't work, and now I am waiting for them to send me the stock one.

Hope it works!

Did you try a different BIOS? You shouldn't need to lower the clocks in the first place.. :/


----------



## pagulhan

Quote:


> Originally Posted by *lostsurfer*
> 
> What's your temps and asic, if you don't mind me asking?


I can't remember the ASIC; I'll update this post later. I believe it's not even 70%; more like 68%. Temps are below 80°C. The fan speed is set to 92%, but there's not a huge difference between 60 and 90%, so I don't really care; MSI used a quality fan. I think I can go even lower on the voltage and keep it stable. A funny thing: at a basic 1.2 V it's stable at 1130, at 1.25 V it's 1155, and 1.281 V for 1190, but I believe I can go lower than 1.281 V for 1190 (at first I did 1.3 V for 1200, then 1.3 V for 1190, and finally 1.281 V for 1190), so temps will change.
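Those three voltage/clock points also show why temps climb so fast at the top end: dynamic power scales roughly with V² × f. A back-of-the-envelope sketch (the quadratic-voltage model is an approximation; real cards also have static leakage that it ignores):

```python
# Rough dynamic-power estimate for the three operating points above.
# Dynamic power scales roughly with V^2 * f; relative numbers only.

def relative_power(v: float, mhz: float, v0: float = 1.2, mhz0: float = 1130) -> float:
    """Estimated power relative to the 1.2 V / 1130 MHz baseline."""
    return (v / v0) ** 2 * (mhz / mhz0)

for v, mhz in [(1.2, 1130), (1.25, 1155), (1.281, 1190)]:
    print(f"{v:.3f} V @ {mhz} MHz -> {relative_power(v, mhz):.2f}x power")
```

By that estimate the 1.281 V / 1190 MHz point draws roughly 20% more power than the 1.2 V / 1130 MHz one for about 5% more clock, which is why shaving even a little voltage at the top noticeably changes temps.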


----------



## Egads

Hi All,

HIS IceQ 280x owner. Does anybody know if and which 300 series cards could crossfire with the 280x?

Thanks


----------



## Egads

never mind, found my answer...I guess not


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Egads*
> 
> never mind, found my answer...I guess not


Correct. The 280X is a Tahiti GPU and the 380X is Tonga; they are different revisions of the GCN architecture.


----------



## SabbathHB

HIS IceQ-X Turbo Boost 280X

GPU clock @ 1235
MEM clock @ 1655
Board limit @ +20
VDDC @ 1368 mV
MVDDC @ 1556


----------



## Bartouille

Very nice paint job.


----------



## Venomrider

I'm curious to know if anyone has tried, or successfully managed, flashing their R9 270X with an R9 280X BIOS. I don't know if this would be possible without bricking the card. Thanks


----------



## MiladEd

Quote:


> Originally Posted by *SabbathHB*
> 
> HIS Iceq-X Turbo Boost 280X
> 
> GPU clock @1235
> MEM clock @1655
> board limit @+20
> VDDC @1368mv
> MVDDC @1556
> 
> 
> Spoiler: Warning: Spoiler!


Holy crap, that's a huge OC! How are your temps looking?

I can only do 1150 MHz core and 1500 MHz mem with a VDDC of 1300 mV before I get unstable (artifacts and all), and even that only in some games; for most games I just run 1125 MHz!
Quote:


> Originally Posted by *Venomrider*
> 
> I'm curious to know if anyone has tried or has successfully flashed their R9 270x bios to a R9 280x bios, I don't know if this would be possible without bricking your card. Thanks


It's not possible. The 280X is a completely different chip: the 270X is Pitcairn (IIRC) but the 280X is Tahiti.


----------



## Harrywang

Quote:


> Originally Posted by *SabbathHB*
> 
> HIS Iceq-X Turbo Boost 280X
> 
> GPU clock @1235
> MEM clock @1655
> board limit @+20
> VDDC @1368mv
> MVDDC @1556


How did you get past 1300 mV core voltage?

What a coincidence, because I was going to make a post about the exact same card.

Recently I got a new PSU and was able to get a stable OC of 1200 core and 1600 memory at 1.3 V. It wouldn't crash at all in Unigine Heaven on the Extreme preset; I even tried 1210, but I was getting too many artifacts.

It's weird, because with my old PSU I was only able to get it stable at 1160 core at 1.269 V. Anything higher and it would crash.


----------



## SabbathHB

Quote:


> Originally Posted by *MiladEd*
> 
> Holy crap, that's a huge OC! How are your temps looking?
> 
> I can't only do 1150 MHz core and 1500 MHz mem with VDDC of 1300 before I get unstable (aftefacts and all) and even that on some games, most games I just run 1125 MHz!
> It's not possible. 280X is a completely different architecture. 270X is Pictrain (IIRC) but 280X is Tahiti.


She was in the low 80s in Fire Strike. Those were just benching clocks; I run it at the stock HIS clocks normally. But I can't say enough about the HIS cards and their cooling. I ran 2x HIS 6850s in CrossFire before I went to the 280X and they were great cards as well. HIS doesn't get nearly enough credit, imo.
Quote:


> Originally Posted by *Harrywang*
> 
> How did you get past 1300 core voltage?
> 
> what a coincidence because I was going to make a post about the exact same card.
> 
> Recently I got a new PSU and was able to get a stable OC of 1200 core and 1600 memory on 1.3v. It wouldn't crash at all in ungine heaven extreme preset and I even tried 1210 but I was getting to much artifacts.
> 
> It's weird because with my old PSU I was only able to get it stable at 1160 core at 1.269 volts. Anything higher and it would crash.


I used the HIS OC tool; iTurbo, I believe it's called.


----------



## Harrywang

I never knew that temps were so important when OCing graphics cards. In my old case/airflow setup I could only get a stable OC of 1170/1600 MHz at 1.269 V. It would always crash above a 1170 core clock.

After upgrading some parts and doing a full reinstall of Windows, this card has TOTALLY changed. I honestly don't really know what happened. I'm pretty stable playing my games (WoW, CS:GO, SC2, etc.). This is the highest overclock I've gotten so far in Unigine. I'm very stable at 1200/1600 when gaming, but wanted to see how far I can get. Going to play at these clocks for a couple of days to see how stable I really am. Really weird, TBH.

Max temp was ~75°C. It's usually around 69-70°C; it only goes up to 75°C in the last image.

Edit: Forgot you couldn't see my volts. I'm running at the max of 1.3 V, +20% power limit.


----------



## H1vemind

Hey, just a quick question: is it possible to unlock the voltage on a Gigabyte R9 270X (2GB WF3), and what kind of results could I expect from it on air?


----------



## GoLDii3

Quote:


> Originally Posted by *H1vemind*
> 
> Hey just a quick question, is it possible to unlock voltage on a gigabyte r9 270x (2gb wf3) and what kinds of results could I expect from it on air?


Gigabyte cards usually have the voltage locked in hardware, I think. Anyway, you can try with TriXX or Afterburner.

I had a 7870 (the 270X before the rebrand) that could do 1200 MHz core at the stock voltage of 1.20 V, I think. It was a good card, going up to 1300 core / 1400 memory.

So it really depends on your card, but you can for sure expect at least 50 MHz.


----------



## xutnubu

What's the recommended memory clock for the 280X? I've heard higher is not always better, because GDDR5 has error detection and will silently retry bad transfers.

My memory clocks really high without artifacts, around 1850 MHz (Elpida chips, btw).
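For context on what those clocks mean in bandwidth terms: overclocking tools show the GDDR5 command clock, and the effective data rate is 4x that on the 280X's 384-bit bus. The theoretical numbers can be sketched as follows, though because GDDR5's error detection silently retries failed transfers, the raw figure at very high clocks may not translate into real performance:

```python
# Theoretical GDDR5 bandwidth from the clock shown in OC tools.
# Tools like Afterburner display the command clock; the effective
# data rate is 4x that. The R9 280X has a 384-bit memory bus.

def vram_bandwidth_gb_s(tool_clock_mhz: float, bus_bits: int = 384) -> float:
    """Theoretical GDDR5 bandwidth in GB/s for a given tool-reported clock."""
    effective_mt_s = tool_clock_mhz * 4          # GDDR5 transfers 4x per clock
    return effective_mt_s * bus_bits / 8 / 1000  # bits -> bytes, MB/s -> GB/s

print(vram_bandwidth_gb_s(1500))  # stock 280X memory clock
print(vram_bandwidth_gb_s(1850))  # the 1850 MHz overclock, if it held up
```

So 1850 MHz would be worth about 23% more raw bandwidth than stock, but only if the chips actually sustain it without retries, which is why benchmarking the score rather than just watching for artifacts is the right test.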


----------



## radier

Use Unigine Valley to check if higher mem clock gives you higher score.

Taptaptap Mlais M52


----------



## neurotix

Hi guys, it's been a long time, I have some updates. What's up Devildog, Roboyto?

I sold my two R9 290 Vapor-X to get as much money as I can for them used, before Polaris comes out. I also sold the "Orange PC/Frankenputer" in my sig to my younger brother. I should have enough to get two $500 tier cards when they come out. So, I'm placing my bets on Polaris. If it really sucks compared to Nvidia I might just go Nvidia this time, but since I use 3 monitors I'd prefer to stay AMD because Eyefinity is better supported than Nvidia Surround.

So, in the meantime I need a card to use. My Sapphire R9 270X Vapor-X (golden card, 84% ASIC, does 1300mhz) is in Big Red now.

However, the only other card I have left is a 4670, the first modern GPU I bought, and that means the wife's computer is lacking a decent card (Blue in my sig).

So, I bought a Sapphire R9 380X Nitro new because I figure I need something a little more powerful for my Eyefinity, it's also newer and has twice the VRAM of my 270X. It should be here this week.

I'm wondering if this club includes the 380X/380/370 or if there is a different club for those cards?

I can take some benches and post them when I get the 380X if you guys want.

I'm not really playing any PC games that require the two 290s anyway. I have a ton of consoles and handhelds I've been playing. A backlog of probably 100 games I've bought. The last game I really got into was Dragon Age Inquisition and I'm done with it. Witcher 3 doesn't appeal to me. I don't play fps's. So, there aren't many computer games that appeal to me now since I mostly like RPGs.

Hope everything is going well for you all.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *neurotix*
> 
> Hi guys, it's been a long time, I have some updates. What's up Devildog, Roboyto?
> 
> I sold my two R9 290 Vapor-X to get as much money as I can for them used, before Polaris comes out. I also sold the "Orange PC/Frankenputer" in my sig to my younger brother. I should have enough to get two $500 tier cards when they come out. So, I'm placing my bets on Polaris. If it really sucks compared to Nvidia I might just go Nvidia this time, but since I use 3 monitors I'd prefer to stay AMD because Eyefinity is better supported than Nvidia Surround.
> 
> So, in the meantime I need a card to use. My Sapphire R9 270X Vapor-X (golden card, 84% ASIC, does 1300mhz) is in Big Red now.
> 
> However, the only other card I have left is a 4670, the first modern GPU I bought, and that means the wife's computer is lacking a decent card (Blue in my sig).
> 
> So, I bought a Sapphire R9 380X Nitro new because I figure I need something a little more powerful for my Eyefinity, it's also newer and has twice the VRAM of my 270X. It should be here this week.
> 
> I'm wondering if this club includes the 380X/380/370 or if there is a different club for those cards?
> 
> I can take some benches and post them when I get the 380X if you guys want.
> 
> I'm not really playing any PC games that require the two 290s anyway. I have a ton of consoles and handhelds I've been playing. A backlog of probably 100 games I've bought. The last game I really got into was Dragon Age Inquisition and I'm done with it. Witcher 3 doesn't appeal to me. I don't play fps's. So, there aren't many computer games that appeal to me now since I mostly like RPGs.
> 
> Hope everything is going well for you all.


The 380X is not a rebrand of the 280X; it is a new architecture based on Tonga instead of Tahiti. I don't think it will be included, as it is GCN 2 and Tahiti is GCN 1. It would be cool to see how it stacks up though, so don't feel discouraged.

IIRC the only reason the 270 and 270X were included is that they were rebrands, and the 270X really wouldn't have had much of a group on its own. Not sure if there is a club for the 380X or not.


----------



## xutnubu

Quote:


> Originally Posted by *radier*
> 
> Use Unigine Valley to check if higher mem clock gives you higher score.
> 
> Taptaptap Mlais M52


Yeah, never mind those 1850 MHz; I get artifacts in actual games. This is why I barely use Valley/Heaven.

If you want to check stability fast, just launch Crysis 3; you'll know within 5 minutes. I can do 1600 MHz no problem.


----------



## neurotix

Is there a Tonga thread or 380X/380 owner's club?


----------



## radier

Is it so hard to find using Google?

AMD R9 285 / 380 / 380X Tonga/Tonga XT Owners Discussion Thread


----------



## neurotix

I ended up finding it anyway.


----------



## lostsurfer

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> poor card...


What does the DB stand for? I'm about to RMA my card and never looked for the DB on the S/N label.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *lostsurfer*
> 
> Quote:
> 
> > Originally Posted by *F3ERS 2 ASH3S*
> > 
> > poor card...
> 
> What does the DB represent? I'm about to RMA my card, never looked for the DB on the S/N label?

TDB:

Tahiti - Double Dissipation - Black Edition, IIRC


----------



## Devildog83

Hey everyone. I have been out for a while, as my 290X took a dump, but I am back. I have been short of cash lately, so I have been running off the on-board graphics of my 6600K; I know, that sucks. I did just buy a 270X Devil on the cheap until I can afford a better card. I only have a 1080p monitor, so this isn't a huge deal for now. If I have missed anyone needing to be added, please let me know and I can do that.


----------



## SabbathHB

Quote:


> Originally Posted by *Devildog83*
> 
> Hey everyone. I have been out for a while as my 290x took a dump but I am back. I have been short of cash lately so have have been running off of the on-board graphics on my 6600k, I know that sucks. I did just buy a 270x Devil on the cheap until I can afford a better card. I only have a 1080p monitor so this isn't a huge deal for now. If I have missed anyone needing to be added please let me know and I can do that.


Hey man welcome back. This was mine to be added, thanks. http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/9250#post_24790839


----------



## Devildog83

Quote:


> Originally Posted by *SabbathHB*
> 
> Hey man welcome back. This was mine to be added, thanks. http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/9250#post_24790839


Sorry about this, but I guess I can't. I no longer have the ability to edit the thread; someone must have pulled that from me because I was absent for so long.


----------



## SabbathHB

Quote:


> Originally Posted by *Devildog83*
> 
> Sorry about this but I guess I can't. I no longer have the ability to edit the thread. Someone must have pulled that from me because I was absent for so long.


No worries man.


----------



## N2Gaming

What card is better and faster for gaming: the R9 380 4GB 256-bit or the R9 280X 3GB 384-bit?


----------



## MiladEd

Quote:


> Originally Posted by *N2Gaming*
> 
> What card is better and faster for gaming the R9 380 4GB 256 bit or the R9 280X 3GB 384 bit?


The R9 280X is faster, but not by much. The 380 is newer, has more features, and is slightly more efficient, but again not by much. I'd suggest going for the R9 380X instead, which is slightly faster than the 280X, has all the features of the 380, and isn't that much more expensive than the 380.


----------



## Devildog83

Quote:


> Originally Posted by *N2Gaming*
> 
> What card is better and faster for gaming the R9 380 4GB 256 bit or the R9 280X 3GB 384 bit?


It depends on the resolution you are running. If you run 1080p I would take the faster card, but if you run at a higher res like 1440p I would go for more RAM, which helps run higher res at higher settings. Overall I would say the 380 anyhow; with a bit of an overclock it would most likely outperform the 280x anyway.


----------



## N2Gaming

Quote:


> Originally Posted by *MiladEd*
> 
> Quote:
> 
> 
> 
> Originally Posted by *N2Gaming*
> 
> What card is better and faster for gaming the R9 380 4GB 256 bit or the R9 280X 3GB 384 bit?
> 
> 
> 
> The R9 280X is faster, but not by much. The 380 is newer, has more features, and is slightly more efficient, but again not by much. I'd suggest going for the R9 380X instead, which is slightly faster than the 280X, has all the features of the 380, and isn't that much more expensive than the 380.
Click to expand...

I asked because I purchased the R9 380 before Christmas, and now I notice the 280X and the 380X. I have seen the 380X for just a little more than what I paid for the 380: $199.00 shipped IIRC, with a $20.00 MIR that I never submitted :/ I got the cheapest and fastest card I could at the time of purchase, not knowing about the 280X or 380X GPUs.

Quote:


> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *N2Gaming*
> 
> What card is better and faster for gaming the R9 380 4GB 256 bit or the R9 280X 3GB 384 bit?
> 
> 
> 
> It depends on the resolution you are running. If you run 1080p I would take the faster card, but if you run at a higher res like 1440p I would go for more RAM, which helps run higher res at higher settings. Overall I would say the 380 anyhow; with a bit of an overclock it would most likely outperform the 280x anyway.
Click to expand...

Res is set at 1080p for now. I tried to run the new TV at 4096 x 2160 and did not like the image at all, LOL. I guess I can try 1440p to see how I like it.







Just purchased a new LG 55UF6450 TV to replace my dying Samsung 40" LCD.

Edit: So far I like the image at 2560 x 1440 using the HDMI port from GPU to TV. At this point I don't have any sound yet. Can anyone tell me how to get sound to the GPU inside the PC so the TV will get the sound through the HDMI cable?


----------



## Devildog83

Just got my R9 270x Devil, which I stole for $130. The system looks so much better with a GPU inside again. Extra pics in the spoiler for those on a phone. I wish these guys would have stuck the logo on the other way; it's upside down.





Spoiler: Warning: Spoiler!


----------



## N2Gaming

I was able to work out the audio options in the Sound section of the Control Panel to get audio through HDMI. Loving it in Battlefield: Bad Company 2, the only game I've tried with the new resolutions so far. Other games quit working on me because Nvidia PhysX isn't installed anymore.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Just got my R9 270x Devil which I stole for $130. The system looks so much better with a GPU inside again. Extra pics in the spoiler for those on a phone. I wish these guys would have stuck the logo on the other way, it's upside down.
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


I always admire your work. I cannot match it!


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> I always admire your work. i cannot match it!


Thank you, that's nice of you to say. Much appreciated.


----------



## Devildog83

Well, the fun didn't last long. As posted, I installed my 270x Devil in my system yesterday and it seemed to work fine. Today I wanted to do some CPU testing with my loop because I am getting the new v2 H110i extreme performance for review and wanted to compare the two.

While looking around in CPU-Z I noticed only 8 GB of RAM were showing up when 16 GB is installed, so of course I said "oh crap" and went to RAM testing. After moving the RAM around to different slots I eventually got no post at all, just a black screen. I moved the HDMI to the mobo and got nothing, so I took out the 270x and presto, all seemed well. I tried a different slot for the card and same thing, no post. I put in my wife's GPU and it worked perfectly.

I guess the GPU is a brick. I am returning it to get something else. Such a bummer, I love the Devils. I may get a 280x instead for now.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Well, the fun didn't last long. As posted, I installed my 270x Devil in my system yesterday and it seemed to work fine. Today I wanted to do some CPU testing with my loop because I am getting the new v2 H110i extreme performance for review and wanted to compare the two. While looking around in CPU-Z I noticed only 8 GB of RAM were showing up when 16 GB is installed, so of course I said "oh crap" and went to RAM testing. After moving the RAM around to different slots I eventually got no post at all, just a black screen. I moved the HDMI to the mobo and got nothing, so I took out the 270x and presto, all seemed well. I tried a different slot for the card and same thing, no post. I put in my wife's GPU and it worked perfectly. I guess the GPU is a brick. I am returning it to get something else. Such a bummer, I love the Devils. I may get a 280x instead for now.


That sucks. Just found a reason to upgrade.


----------



## MiladEd

Quote:


> Originally Posted by *Devildog83*
> 
> Well, the fun didn't last long. As posted, I installed my 270x Devil in my system yesterday and it seemed to work fine. Today I wanted to do some CPU testing with my loop because I am getting the new v2 H110i extreme performance for review and wanted to compare the two. While looking around in CPU-Z I noticed only 8 GB of RAM were showing up when 16 GB is installed, so of course I said "oh crap" and went to RAM testing. After moving the RAM around to different slots I eventually got no post at all, just a black screen. I moved the HDMI to the mobo and got nothing, so I took out the 270x and presto, all seemed well. I tried a different slot for the card and same thing, no post. I put in my wife's GPU and it worked perfectly. I guess the GPU is a brick. I am returning it to get something else. Such a bummer, I love the Devils. I may get a 280x instead for now.


Ah that sucks man. That's such a good looking GPU too.

The 280X is great; you won't regret it. It refuses to die! Four years after release and it's still relevant!


----------



## Roboyto

Not sure if everyone saw this, but it is a call to arms Red Team!

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_20

*Originally Posted by Noxinite 

Yes as per the HWBot rules:

Allowed optimisations:
Disable subtests
Driver settings fine-tuning
Tessellation tweaking*


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> Sorry about this but I guess I can't. I no longer have the ability to edit the thread. Someone must have pulled that from me because I was absent for so long.


Devil, this happened to me with my Sega and Retrogaming fan clubs. *You need to PM Enterprise and give him a link to the OP of this thread and ask for edit rights to add members.* I did this and got my edit rights back. Just do it when you have the time to.

Your rig looks absolutely sick and I second rdr09. I don't know how you get it so clean looking. It looks fantastic.

Sorry to hear about your new Devil 270X. I hope you can get your money back.

As far as the 280X vs the 380/380X, I would recommend the 380X to you guys in a heartbeat, even if you have to get it new. For $220 (sometimes less depending on brand and rebate) you are getting quite a powerful card.

The 380X is actually significantly more powerful than a 7970 or 280X.

3DMark 11
7970: 11884
380X: 15570
290: 17186

Fire Strike
7970: 7940
380X: 9465
290: 12010

Those are my best scores among the 7970, 380X and 290, using the same CPU and overclock.

Overclocked, the 380X actually reaches stock reference 290 levels of performance.

The 280X is identical to the 7970, but the 380X is actually newer and faster. I attribute the increased performance to the 380X being GCN 2.0, and the driver improvements they added in the Omega and now the Crimson drivers.

I don't have my 7970 anymore so I can't throw it in to see if it's really the drivers or not.

But anyway, TL;DR the 380X comes highly recommended. It is much faster than the 280X in my testing.
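To put a number on "much faster": using my scores above, the relative uplift of the 380X over the 7970/280X works out like this (just arithmetic on my own results, nothing official):

```python
# My best scores from above: (7970, 380X) per benchmark
scores = {
    "3DMark 11": (11884, 15570),
    "Fire Strike": (7940, 9465),
}

for bench, (score_7970, score_380x) in scores.items():
    uplift = (score_380x / score_7970 - 1) * 100
    print(f"{bench}: 380X ahead of the 7970 by {uplift:.0f}%")
# 3DMark 11: 31% ahead; Fire Strike: 19% ahead
```

So roughly a 20-30% synthetic lead, which matches what I see in games.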


----------



## radier

Maybe it's faster in 3DMark, but not in games.

P.S. Tonga is GCN 1.3

Taptaptap Mlais M52


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Devil, this happened to me with my Sega and Retrogaming fan clubs. *You need to PM Enterprise and give him a link to the OP of this thread and ask for edit rights to add members.* I did this and got my edit rights back. Just do it when you have the time to.
> 
> Your rig looks absolutely sick and I second rdr09. I don't know how you get it so clean looking. It looks fantastic.
> 
> Sorry to hear about your new Devil 270X. I hope you can get your money back.
> 
> As far as the 280X vs the 380/380X, I would recommend the 380X to you guys in a heartbeat, even if you have to get it new. For $220 (sometimes less depending on brand and rebate) you are getting quite a powerful card.
> 
> Thanks neurotix, Over the years I have learned tricks that I use over and over to keep builds as clean as possible. I got the Asus DC II 280x because of cash flow for $145. It's a temp purchase until work picks up and I can get a 390 which is where I want to go. I needed something under $150 for now.


----------



## Boinz

Currently using the XFX TDFD
http://www.newegg.com/Product/Product.aspx?Item=N82E16814150678

http://www.techpowerup.com/gpuz/details.php?id=ue7rc

Can i join?


----------



## Catscratch

Darn, I still can't learn to live with the "280x won't wake the monitor" issue. I knew some 7900s and many 280x cards are known for the black screen, which was one of the reasons I waited until my 6850 gave out. I still went with the 280x Tri-X because the 290 was out of reach (almost half again the price).

I only let the monitor sleep; it's set to 10 minutes. I think it has become more frequent. I wonder if it could actually be tied to TeamViewer; I also have the notorious "Win8.1 start button and charms won't work" problem. When the monitor sleeps, half the time I have to hot-plug the DVI cable to wake it. At least the case is next to the monitor. I'll have to uninstall TeamViewer, restart a couple of times, and test :/


----------



## Catscratch

Quote:


> Originally Posted by *Boinz*
> 
> Currently using the XFX TDFD
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150678
> 
> http://www.techpowerup.com/gpuz/details.php?id=ue7rc
> 
> Can i join?


That's a stylish looking card. I likey.


----------



## Boinz

Quote:


> Originally Posted by *Catscratch*
> 
> That's a stylish looking card. I likey.


Me too, bought it used from eBay. The IO bracket was bent, but I bent it back, and the first fan buzzed; nothing a little oil on the bearing couldn't fix. This is probably the biggest card I've ever owned though:
Quote:


> 11.61" x 5.63"


----------



## Catscratch

Quote:


> Originally Posted by *Boinz*
> 
> Me too, bought it used from ebay. IO bracket was bent but i bent it back, and first fan buzzed, nothing a little oil on the bearing couldn't fix, this is probably the biggest card i've ever owned though.


Mine was making some noise too, but not the fans; it's the shroud. I was afraid it could bend the slot or itself when I first plugged the card in, since the Tri-X is a longer card, so I supported the other end with a pen between the case bottom and the card's shroud. I guess that instead bent the shroud somewhere, and a while ago it started to rattle at 60% fan speed.

Even though it's a Tri-X, I never liked stock shrouds. When my second 6850 started fan whining, I slapped a Thermalright HR-03 GT on it. Funny thing is, I can put that thing on the 280x too (same holes), but it's not rated for a card of this wattage :/ If I get too desperate I'll have to remove the shroud, put two 120mm fans on the sink, and wire them to the default fan header so the card can still throttle them.


----------



## Devildog83

SabbathHB has been added, welcome. I will be adding everyone else that has submitted lately later today since I now have control again.

Thanks,

DD


----------



## SabbathHB

Sweet man, thanks!


----------



## N2Gaming

I used to have problems waking my PC from sleep when I was using an Nvidia GTX 460, but with this new PowerColor R9 380 the PC wakes up every time, no problemo. Does anyone have an Nvidia GPU working for PhysX with a Radeon card?


----------



## Devildog83

Quote:


> Originally Posted by *Boinz*
> 
> Currently using the XFX TDFD
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150678
> 
> http://www.techpowerup.com/gpuz/details.php?id=ue7rc
> 
> Can i join?


You have been added, welcome !!!


----------



## Boinz

Quote:


> Originally Posted by *Devildog83*
> 
> You have been added, welcome !!!


Thank you DD.


----------



## Devildog83

I don't know what was wrong but I got the Devil to work, alas I have already listed it and should have my 280x tomorrow so upgrade it is.


----------



## bamaredwingsfan

Add me too if you will. I have a Sapphire R9 280X myself.


----------



## Devildog83

Quote:


> Originally Posted by *bamaredwingsfan*
> 
> Add me too if you will. I have a Sapphire R9 280X myself.


What clocks should I list and do you have a pic?


----------



## N2Gaming

Quote:


> Originally Posted by *N2Gaming*
> 
> Does anyone have an Nvidia GPU working for PhysX with a Radeon card?


Bump any one???


----------



## neurotix

Sorry, it's not 2007 anymore.

I don't really know anyone who still uses a dedicated PhysX card with an AMD GPU. It's pretty much pointless because most games use other physics engines that generally run on the CPU. If you have a strong modern CPU (e.g. Intel) you should have no trouble just installing the PhysX drivers and running PhysX on your CPU for games that need it.

I can't even recall the last game I played that used PhysX. It's pretty much dead.

Of course, if people want to correct me if I'm wrong, that's fine. But in my understanding PhysX isn't really a thing anymore and you certainly don't need a card dedicated to it wasting space and power, when it can run on the CPU with an AMD card and modern CPUs are plenty fast enough to run it brute-force.


----------



## Devildog83

I have now moved up to the 280x section. Just got this today and it works perfectly. Only $145.



Anyone know if you can get a backplate?


----------



## Devildog83

I get a bit of flicker when I am in a browser. Does anyone else, and does anyone know how to get rid of it?


----------



## N2Gaming

Quote:


> Originally Posted by *neurotix*
> 
> Sorry, it's not 2007 anymore.
> 
> I don't really know anyone who still uses a dedicated PhysX card with an AMD GPU. It's pretty much pointless because most games use other physics engines that generally run on the CPU. If you have a strong modern CPU (e.g. Intel) you should have no trouble just installing the PhysX drivers and running PhysX on your CPU for games that need it.
> 
> I can't even recall the last game I played that used PhysX. It's pretty much dead.
> 
> Of course, if people want to correct me if I'm wrong, that's fine. But in my understanding PhysX isn't really a thing anymore and you certainly don't need a card dedicated to it wasting space and power, when it can run on the CPU with an AMD card and modern CPUs are plenty fast enough to run it brute-force.


Well, I had games working with the Nvidia GPU and PhysX enabled, but after I uninstalled the drivers for the Nvidia cards some of my games quit working. I was not aware I could just install the PhysX drivers without using a PhysX GPU.

EDIT: Thank you neurotix, I am able to play the old games again.


----------



## neurotix

Oh, no problem.

I don't really know when it changed, but at some point you no longer needed an Nvidia GPU in your system to run PhysX. It can run on the CPU instead as long as you have the driver installed (i.e. PhysX shows up in your control panel). I'm pretty sure any CPU made in the last 5 years is easily as powerful as, say, an 8800GT for PhysX, if not more so.

Help me out if I helped you out


----------



## N2Gaming

My Phenom II X4 955 BE is doing me just fine for now. I'll upgrade eventually; when, IDK.


----------



## Skye12977

I'm wondering if anyone can help me with this, do AMD GPUs hate Intel CPUs, or are the AMD GPUs just a bottleneck?
Let me explain.
First: my 3570k with a 290x
Second: my 3570k with a 280x
Now for Nvidia
First my 3570k with a 660
Second my 3570k with my 780

Now outside of this, I think I may have something wrong with my software or hardware: My post here.
1. Lower CPU "score"
2. Blueish tint to my screen
3. Icons on my desktop seem to have lines going through them. (I have two identical screens next to each other; one is hooked up to my 3570k/280x computer, the other to my i7 660m laptop. The screen hooked up to the laptop looks sharp and bright, whereas the 280x looks blue and dull, with said lines going through icons and such.)


----------



## neurotix

Not really sure what to tell you, but those scores seem perfectly in line with what the cards are, to me.

The lower CPU score with AMD GPUs, even though the CPU is the same... Futuremark prefers Nvidia, obviously

I hope you get your hardware sorted out, but I can't tell you much other than to wipe your GPU drivers before reinstalling them. This is why I only use/bench AMD and don't touch Nvidia. (Driver issues when mixing.) Failing that, I would suggest picking a card and sticking with it, and perhaps reinstalling Windows.


----------



## Skye12977

Quote:


> Originally Posted by *neurotix*
> 
> Not really sure what to tell you, but those scores seem perfectly in line to what the cards are, to me.
> 
> The lower CPU score with AMD GPUs, even though the CPU is the same... Futuremark prefers Nvidia, obviously
> 
> I hope you get your hardware sorted out, but I can't tell you much other than to wipe your GPU drivers before reinstalling them. This is why I only use/bench AMD and don't touch Nvidia. (Driver issues when mixing.) Failing that, I would suggest picking a card and sticking with it, and perhaps reinstalling Windows.


I ended up fixing the problem with my CPU score, but not my DVI problem.
Original run of my 280x
Reinstalled drivers 280x run


----------



## Catscratch

Quote:


> Originally Posted by *Devildog83*
> 
> I have a bit of flicker when I am in a browser, does anyone else and does anyone know how to get rid of it?


I do get a single flicker once in a while in a browser. It never bothered me much; not waking up from sleep annoys me more. I ended up just NOT using Windows monitor sleep, so I turn the monitor off myself. Gonna wear the power button down quick.


----------



## Boinz

Quote:


> Originally Posted by *Catscratch*
> 
> I do get a single flicker once in a while in a browser. It never bothered me much; not waking up from sleep annoys me more. I ended up just NOT using Windows monitor sleep, so I turn the monitor off myself. Gonna wear the power button down quick.


Thank God for my monitor's capacitive button.


----------



## Devildog83

Quote:


> Originally Posted by *Catscratch*
> 
> I do get a single flicker once in a while in a browser. It never bothered me much; not waking up from sleep annoys me more. I ended up just NOT using Windows monitor sleep, so I turn the monitor off myself. Gonna wear the power button down quick.


I don't know if it's a driver issue or what but it can get annoying. It didn't do it with my 290x, 7870 devil or 270x devils but this one is bad. I think I will try an older driver since I still use windows 7. It's only in a web browser.


----------



## Devildog83

In Valley, is a 1905 score and a 45.5+ average FPS about right for a stock-clocked 280x?


----------



## Catscratch

Quote:


> Originally Posted by *Devildog83*
> 
> In Valley, is a 1905 score and a 45.5+ average FPS about right for a stock-clocked 280x?




Mine looks low now with a factory overclocked Sapphire.


----------



## neurotix

Devildog:

45fps is right in Valley for a stock 280x.

Around 7500-8000 would be good for Fire Strike (non-extreme).

My 380X actually does a lot more in Fire Strike but is about the same in Valley, 45 fps. Which is strange because I recall breaking 50fps in Valley with a 7970.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Devildog:
> 
> 45fps is right in Valley for a stock 280x.
> 
> Around 7500-8000 would be good for Fire Strike (non-extreme).
> 
> My 380X actually does a lot more in Fire Strike but is about the same in Valley, 45 fps. Which is strange because I recall breaking 50fps in Valley with a 7970.


I haven't overclocked it yet, but I am sure I could get over 50 fps if I did. I will try Fire Strike next.

Good news: I installed the new 16.1 driver and 99% of the flickering in browsers is gone.

I wonder if the 4 GB of memory makes a difference in Fire Strike as opposed to the 3 GB on the 280x.


----------



## Devildog83

1st: Fire Strike at stock GPU with CPU at 4.3

2nd: Fire Strike at a slight overclock, 1119/1650, CPU still at 4.3


I could go higher on both CPU and GPU, plus my memory is running under spec at 2133 instead of the rated 2800. I don't know if memory helps Fire Strike or not.

3rd: Valley with the same slight overclock.


What's cool is that volts are unlocked in Afterburner but were locked in GPU Tweak, so if I wanted I could go higher.


----------



## neurotix

System RAM speed doesn't really affect Fire Strike in my testing. I see no difference between 2133, 2400 and 2666.


Spoiler: Fire Strike







Fire Strike with my 380X for comparison.

I can't seem to break 45 fps with it in Valley though. All the Futuremark benches score higher than a 280X/7970 with the 380X.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> System RAM speed doesn't really affect Fire Strike in my testing. I see no difference between 2133, 2400 and 2666.
> 
> 
> Spoiler: Fire Strike
> 
> 
> 
> 
> 
> 
> 
> Fire Strike with my 380X for comparison.
> 
> I can't seem to break 45 fps with it in Valley though. All the Futuremark benches score higher than a 280X/7970 with the 380X.


That's weird, because Valley is almost completely GPU dependent. I got the same scores with a Pentium G3258 as I did with an FX 8350.

Why did AMD step down the memory bus and memory bandwidth on the 380 series?

mine: 316 GB/s bandwidth, 384-bit bus
your 380: 192 GB/s bandwidth, 256-bit bus
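For what it's worth, those bandwidth figures fall straight out of bus width times effective memory data rate. A quick sketch (assuming the 6.0 Gbps reference GDDR5 rate; factory-overclocked cards like mine will report a bit more):

```python
def mem_bandwidth_gbs(bus_width_bits: int, effective_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes moved per transfer times transfer rate."""
    return (bus_width_bits / 8) * effective_rate_gbps

# 280X: 384-bit bus, 6.0 Gbps effective GDDR5 (1500 MHz quad-pumped)
print(mem_bandwidth_gbs(384, 6.0))  # 288.0 GB/s

# 380: 256-bit bus at the same effective rate
print(mem_bandwidth_gbs(256, 6.0))  # 192.0 GB/s
```

So the 380's 192 GB/s is purely the narrower bus at the same memory clock.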


----------



## radier

Tonga has a better compression mechanism.

Taptaptap Mlais M52


----------



## Devildog83

I bumped up slightly. I set the RAM up to 2800 from 2133 and the Physics and Combined scores jumped nicely. The GPU is now at 1150/1660; just that small bump netted 500 graphics points.


----------



## neurotix

Agreed, Tonga has the better memory compression, so it doesn't need as wide a memory bus.

Devil, you should try backing the RAM on the card down to 1500 and getting the core clock up to 1200. Push that voltage.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> Agreed, Tonga has the better memory compression so it doesn't need as high of a memory bus.
> 
> Devil, you should try and back the ram on the card down to 1500 and get the core clock up to 1200. Push that voltage.


I had it up to 1200 with a bit of added voltage. The stock RAM speed is 1600, so you think I should go under that?


----------



## F3ERS 2 ASH3S

Anyone do the VR test on 280x yet?


----------



## neurotix

Quote:


> Originally Posted by *Devildog83*
> 
> I had it up to 1200 with a bit of added voltage. The stock RAM speed is 1600, so you think I should go under that?


No. I figured the stock RAM speed would be 1500, mainly because every AMD GPU I've bought, and many I've seen since the R9 name change, has come at 1500 tops.

Set the RAM speed to 1600, crank the volts and see if you can get it past 1200mhz on the core.

With my 380X, the RAM comes stock at 1500. I found out it can do 1600 no problem. However, it added a fraction of fps to Valley (like 0.3) and like 50 points to my total Fire Strike score. No point.

Valley is good for preliminary tests and for testing RAM speed. Usually if I go too far on the RAM I will black screen almost immediately in Valley, which requires a hard reset. However, Valley is pretty bad in general for testing overclocks as I've seen it be "stable" in Valley yet immediately crash in another game or in Fire Strike (e.g. this happened with my 270X at 1300mhz).

Fire Strike is a really good way to stress test, especially if you loop it. When I see artifacts in Fire Strike they are usually immediately apparent and they usually show up as blue flashes all over the screen.

With a 280X on air though, I'm willing to bet you won't do much past 1200mhz anyway for gaming. You never know though, you might get really lucky with the card and get one that can game at 1250 or something.


----------



## Echoa

Hey guys, looking to join the ranks with a 4GB Gigabyte 270x. Opinions on the Gigabyte ones, and what kind of clocks can I expect? There aren't many reviews of the Gigabyte version.


----------



## Devildog83

Quote:


> Originally Posted by *neurotix*
> 
> No. I figured the stock RAM speed would be 1500, mainly only because every AMD GPU I've bought and many I've seen post R9 name change has come at 1500 tops.
> 
> Set the RAM speed to 1600, crank the volts and see if you can get it past 1200mhz on the core.
> 
> With my 380X, the RAM comes stock at 1500. I found out it can do 1600 no problem. However, it added a fraction of fps to Valley (like 0.3) and like 50 points to my total Fire Strike score. No point.
> 
> Valley is good for preliminary tests and for testing RAM speed. Usually if I go too far on the RAM I will black screen almost immediately in Valley, which requires a hard reset. However, Valley is pretty bad in general for testing overclocks as I've seen it be "stable" in Valley yet immediately crash in another game or in Fire Strike (e.g. this happened with my 270X at 1300mhz).
> 
> Fire Strike is a really good way to stress test, especially if you loop it. When I see artifacts in Fire Strike they are usually immediately apparent and they usually show up as blue flashes all over the screen.
> 
> With a 280X on air though, I'm willing to bet you won't do much past 1200mhz anyway for gaming. You never know though, you might get really lucky with the card and get one that can game at 1250 or something.


This card must be factory overclocked. It is a DC II Top; heck, I don't know. I had it at 1200 with only 1.2v, and I think I can go to 1.3v if I dare. After bricking the 270x Devil and a 290x I am a bit apprehensive about going too far. I can't afford another card right now, so I keep it at stock except for a few benches.


----------



## Charles877

Hello guys, I'm new here.

I have a problem with my Gigabyte R9 270 OC.

The video card comes with a 975MHz clock. When I try to play some games the display driver stops working; the only solution I found was to underclock it to 925MHz (the stock speed for that chip).

But I get lower FPS in some games, so I want to know if there is some way to solve this.


----------



## dragneel

Hey, I have some noob questions. There seems to be a problem with Overdrive in the latest driver where overclocking the memory locks it at that speed and removes the ability to put it back down, so what program would I be better off using? XFX DD Black Edition 280 (non-X).

Also, roughly what would be considered a safe OC for it?


----------



## neurotix

Quote:


> Originally Posted by *dragneel*
> 
> Hey, I have some noob questions. There seems to be a problem with Overdrive in the latest driver where overclocking the memory locks it at that speed and removes the ability to put it back down, so what program would I be better off using? XFX DD Black Edition 280 (non-X).
> 
> Also, roughly what would be considered a safe OC for it?


Don't use Overdrive through the AMD drivers to overclock.

Use either MSI Afterburner or Sapphire Trixx to overclock.

Either of those should allow you to set memory clocks to whatever you want manually.

EDIT: Seems I misinterpreted your problem. If the memory isn't downclocking, make sure you have ULPS (Ultra Low Power State) enabled. Also, a reasonable overclock for a 280 would be 1100mhz on the core; 1200mhz would be excellent, and anything higher than that would be exceptional. At 1200mhz you should be as fast as a 280X.

To the guy above you with the 975mhz problem: the card probably isn't getting enough voltage through its stock BIOS to run that speed. Try using Afterburner or Trixx to bump the stock voltage up a bit. You could also be overheating, which could make you crash while gaming. Set the fan speed higher (even 100%) if the noise doesn't bother you.


----------



## Catscratch

Quote:


> Originally Posted by *Catscratch*
> 
> 2500k 4ghz - 280x trix 1020mhz stock speed and power.
> *1080p*
> 
> dx11 - standard - 36.8 fps
> 
> 
> dx12 - standard - 43 fps
> 
> 
> dx11 - crazy - 25.3 fps
> 
> 
> dx12 - crazy - 26.5 fps
> 
> 
> The beta2 bench has texture glitches. The 280x just doesn't have the cojones for the Crazy preset. However, I have to emphasize this: because of driver overhead (also visible in the screenshots), the DX11 frametimes were erratic and the gameplay jittery regardless of FPS and preset, while DX12 was smooth regardless of preset and FPS thanks to the low overhead and steady frametimes.


Does anyone have the beta2 of Ashes of the Singularity to bench? I'm wondering if my 280x is doing things properly.


----------



## hatchet_warrior

So I'm coming to the end of my 280x RMA saga. It was finally determined that it was unfixable, so a replacement is being offered. But they offered me a 380. I'm not too happy that they didn't offer at least a 380x, or even an option to upgrade to a 390. I know other people have received slight upgrades when an exact replacement was not available, but I have never heard of a downgrade. I'm attempting to get a different replacement, but communication is slow, and I'm trying to decide how much longer I want to wait.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hatchet_warrior*
> 
> So I'm coming to the end of my 280x RMA saga. It was finally determined that it was unfixable, so a replacement is being offered. But they offered me a 380. I'm not too happy that they didn't offer at least a 380x, or even an option to upgrade to a 390. I know other people have received slight upgrades when an exact replacement was not available, but I have never heard of a downgrade. I'm attempting to get a different replacement, but communication is slow, and I'm trying to decide how much longer I want to wait.


Well, the 380 being Tonga is actually an equivalent; the 380x is an upgrade.


----------



## GoLDii3

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> well, a 380 being Tonga is actually an equivalent, 380x is an upgrade


No.


----------



## hatchet_warrior

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> well, a 380 being Tonga is actually an equivalent, 380x is an upgrade


I would agree that if I were choosing between purchasing a new 380 or a used 280x, the 380 would be the better choice. From everything I have seen, the 280x and 380x are about as close as you can get. I understand that the 380 is also close, but in many cases it is a step down. Going forward with dx12 and 4k I'm sure the 380 can pull ahead, but that is not important today. If a 380x is not available, I would like to be given the option of a $0 380 or a $X 390.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *hatchet_warrior*
> 
> Quote:
> 
> 
> 
> Originally Posted by *F3ERS 2 ASH3S*
> 
> well, a 380 being Tonga is actually an equivalent, 380x is an upgrade
> 
> 
> 
> I would agree that if I was to purchase a new 380 or a used 280x, the 380 would be the better choice. From everything I have seen, the 280x and 380x are about as close as you can get. I understand the the 380 is also close, but in many cases it is a step down. Going forward with dx12 and 4k I'm sure the 380 can pull ahead, but that is not important today. If a 380x is not available, I would like to be given the option for a $0 380 or a $X 390.
Click to expand...

Tonga features improved tessellation performance, lossless delta color compression to reduce memory bandwidth usage, an updated and more efficient instruction set, a new high-quality video scaler, and a new multimedia engine (video encoder/decoder). Delta color compression is even supported in Mesa.

With the features it provides, the +/- 4% performance difference (which equates to just a few frames anyway) means that all in all it actually is equal. IIRC it also supports TrueAudio and a few other things I can't fully remember atm. And isn't there more VRAM, which helps future-proof it a little?

Now, if you don't use any of the added features, then correct, it is not equal.


----------



## neurotix

Quote:


> Originally Posted by *GoLDii3*
> 
> No.


No.

He's actually right.

The 380 and 380X are significantly faster than the 280X.

Here's my data taken from earlier in the thread:

As far as the 280X vs the 380/380X, I would recommend the 380X to you guys in a heartbeat, even if you have to get it new. For $220 (sometimes less depending on brand and rebate) you are getting quite a powerful card.

The 380X is actually significantly more powerful than a 7970 or 280X.

3dmark11
7970: 11884
380X: 15570
290: 17186


Fire Strike
7970: 7940
380X: 9465
290: 12010

Those are my best scores among the 7970, 380X and 290 using the same CPU and overclock.

The 380X overclocked is actually in stock reference 290 levels of performance.

The 280X is identical to the 7970, but the 380X is newer and faster. I attribute the increased performance to the 380X being GCN 1.2 (Tonga), and to the driver improvements added in the Omega and now the Crimson drivers.

I don't have my 7970 anymore so I can't throw it in to see if it's really the drivers or not.

*So there you go guys.*

In Futuremark benches, the 380X is at least 20% faster than the 280X at the same clock speed. This carries over to games and other benchmarks as well. An overclocked 380X comes close to or equals a stock reference 290.
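Since that claim is just arithmetic on the scores quoted above, it's easy to sanity-check. A quick sketch using only the numbers from this post:

```python
# Relative speedups computed from the benchmark scores posted above
# (same CPU and overclock across cards, per the original post).
scores = {
    "3DMark11":    {"7970/280X": 11884, "380X": 15570, "290": 17186},
    "Fire Strike": {"7970/280X": 7940,  "380X": 9465,  "290": 12010},
}

for bench, s in scores.items():
    speedup_380x = (s["380X"] / s["7970/280X"] - 1) * 100
    gap_to_290 = (s["290"] / s["380X"] - 1) * 100
    print(f"{bench}: 380X is {speedup_380x:.0f}% faster than the 7970/280X; "
          f"a stock 290 is another {gap_to_290:.0f}% ahead")
```

That works out to roughly +31% in 3DMark11 and +19% in Fire Strike, which is where the "at least 20%" ballpark comes from.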


----------



## GoLDii3

Quote:


> Originally Posted by *neurotix*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> No.
> 
> He's actually right.
> 
> The 380 and 380X are significantly faster than the 280X.
> 
> Here's my data taken from earlier in the thread:
> 
> As far as the 280X vs the 380/380X, I would recommend the 380X to you guys in a heartbeat, even if you have to get it new. For $220 (sometimes less depending on brand and rebate) you are getting quite a powerful card.
> 
> The 380X is actually significantly more powerful than a 7970 or 280X.
> 
> 3dmark11
> 7970: 11884
> 380X: 15570
> 290: 17186
> 
> 
> Fire Strike
> 7970: 7940
> 380X: 9465
> 290: 12010
> 
> Those are my best scores among the 7970, 380X and 290 using the same CPU and overclock.
> 
> The 380X overclocked is actually in stock reference 290 levels of performance.
> 
> The 280X is identical to the 7970, but the 380X is actually newer and faster. I attribute the increased performance to the 380X being GCN 1.3, and the driver improvements they added in the Omega and now the Crimson drivers.
> 
> I don't have my 7970 anymore so I can't throw it in to see if it's really the drivers or not.
> 
> *So there you go guys.*
> 
> In Futuremark benches, the 380X is at least 20% faster than the 280X at the same clock speed. This carries over to games and other benchmarks as well. An overclocked 380X comes close to or equals a stock reference 290.


The 380X is faster of course since it's the 280X updated to GCN 1.2,but the 380 is not.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *GoLDii3*
> 
> Quote:
> 
> 
> 
> Originally Posted by *neurotix*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> No.
> 
> He's actually right.
> 
> The 380 and 380X are significantly faster than the 280X.
> 
> Here's my data taken from earlier in the thread:
> 
> As far as the 280X vs the 380/380X, I would recommend the 380X to you guys in a heartbeat, even if you have to get it new. For $220 (sometimes less depending on brand and rebate) you are getting quite a powerful card.
> 
> The 380X is actually significantly more powerful than a 7970 or 280X.
> 
> 3dmark11
> 7970: 11884
> 380X: 15570
> 290: 17186
> 
> 
> Fire Strike
> 7970: 7940
> 380X: 9465
> 290: 12010
> 
> Those are my best scores among the 7970, 380X and 290 using the same CPU and overclock.
> 
> The 380X overclocked is actually in stock reference 290 levels of performance.
> 
> The 280X is identical to the 7970, but the 380X is actually newer and faster. I attribute the increased performance to the 380X being GCN 1.3, and the driver improvements they added in the Omega and now the Crimson drivers.
> 
> I don't have my 7970 anymore so I can't throw it in to see if it's really the drivers or not.
> 
> *So there you go guys.*
> 
> In Futuremark benches, the 380X is at least 20% faster than the 280X at the same clock speed. This carries over to games and other benchmarks as well. An overclocked 380X comes close to or equals a stock reference 290.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The 380X is faster of course since it's the 280X updated to GCN 1.2,but the 380 is not.
Click to expand...

The 380 is still within a good margin of being equal to a 280x, and any loss that is there is made up for in features and better tessellation, so you end up getting a better overall experience. I'm not saying it is an upgrade; I'm just saying that for them to replace a 280x with a 380 isn't actually all that bad of a replacement.


----------



## Catscratch

The 280x is still a better choice in dx11 titles. Dx12 will change that: the 380, 380x, and 285 have 8 ACEs, which will be useful in dx12 titles, while the 280x only has 2. The latest dx11 example is Far Cry Primal, where the 280x holds its ground against the 380x and is faster than the other two Tongas.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Catscratch*
> 
> 280x is still a better choice in dx11 titles. Dx12 will change it. 380 380x and 285 have 8 ACEs which will be useful in dx12 titles. 280x only has 2. Latest example for dx11 is Farcry Primal. 280x holds its ground against 380x and faster than the other two tongas.


^this is true, the only premise of the conversation was due to someone receiving a 380 from an RMA because 280x is out of stock


----------



## Echoa

My 4gb Gigabyte 270x arrived today! Stress testing and overclocking @ 1200 core / 1500 memory as we speak. It's miles cooler than my 6970, which even with an aftermarket cooler normally ran 65-70c; I haven't peaked past 55c yet on this 270x.


----------



## Echoa

So here she is, not exactly in her full glory because of my dark room (maybe a better pic later). Please add me to the club :]


----------



## Catscratch

Quote:


> Originally Posted by *Echoa*
> 
> So here she is, in not exactly glory because of my dark room (maybe a better pic for later) please add me to the club :]


Sweet. My triple fan cooled card salutes your triple fan cooled card








Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> ^this is true, the only premise of the conversation was due to someone receiving a 380 from an RMA because 280x is out of stock


Yes. Very bad timing for a 280x card to go out. The goons at RMA should offer a 380x as a replacement, both name-wise and performance-wise. The "it may/will perform better in DX12 titles" part should count as customer satisfaction and compensation for not being able to use the paid-for but defective product.

https://www.techpowerup.com/reviews/ASUS/R9_380X_Strix/23.html


----------



## hatchet_warrior

Well, I finally received word back about the RMA. They told me that a 380 was on par with a 280x, but for all my trouble they offered me a 380x. At the end of the day I would have taken the 380 if it was my only option; I was just pretty pissed on principle alone. Considering that when I first started the RMA there were a total of 0 DX12 games, I felt it was wrong to weigh potential performance against the actual performance I used to have. I think it's a slap in the face to people that buy into a product early.

Either way, my little rant is over. I can see how a 380 is "suitable", but as Catscratch said, they should err on the side of slightly better. The good news is that several Gigabyte employees told me they are rebuilding the RMA department and changing the process. Hopefully that means no one will have to go through 6-week RMAs like I just did.


----------



## Echoa

Quote:


> Originally Posted by *Catscratch*
> 
> Sweet. My triple fan cooled card salutes your triple fan cooled card



I return that salute sir

Had to drop my core clock down to 1190mhz to get stable in Crysis 3; kept having a driver crash at 1200 xP Oh well, it's not like 10mhz is a big deal.


----------



## Boinz

Quote:


> Originally Posted by *hatchet_warrior*
> 
> Well I finally received word back about the RMA. They told me that a 380 was on par with a 280x, but for all my trouble they offered me a 380x. At the end of the day I would have take the 380 if it was my only option. I was just pretty pissed on principle alone. Considering when I first started the RMA there were a total of 0 Dx12 games I felt it was wrong to leverage potential performance against the actual performance I used to have. I think it's a slap in the face to people that buy into a product early.
> 
> Either way my little rant is over. I can see how a 380 is "suitable" but as Catscratch said, they should err on the side of slightly better. Good news is that many Gigabyte employees told me they are rebuilding the RMA department and changing the process. Hopefully that means no one will have to do 6 week RMAs like I just did.


Nice, congrats on the 380x upgrade.


----------



## Catscratch

Quote:


> Originally Posted by *hatchet_warrior*
> 
> Well I finally received word back about the RMA. They told me that a 380 was on par with a 280x, but for all my trouble they offered me a 380x. At the end of the day I would have take the 380 if it was my only option. I was just pretty pissed on principle alone. Considering when I first started the RMA there were a total of 0 Dx12 games I felt it was wrong to leverage potential performance against the actual performance I used to have. I think it's a slap in the face to people that buy into a product early.
> 
> Either way my little rant is over. I can see how a 380 is "suitable" but as Catscratch said, they should err on the side of slightly better. Good news is that many Gigabyte employees told me they are rebuilding the RMA department and changing the process. Hopefully that means no one will have to do 6 week RMAs like I just did.


6 weeks, phew. I'm glad you're getting a 380x. You'll be in a different club now, but still, I hope this MS Store vs DX12 ruckus is resolved in favor of gamers and you get as much good gaming as you can.
Quote:


> Originally Posted by *Echoa*
> 
> 
> I return that salute sir
> 
> Had to drop my clock down to 1190mhz on core to get stable in crysis 3, kept having a driver crash at 1200 xP oh well its not like 10mhz is a big deal


Cool, keep it working.


----------



## hatchet_warrior

Quote:


> Originally Posted by *Boinz*
> 
> Nice, congrats on the 380x upgrade.


Thanks
Quote:


> Originally Posted by *Catscratch*
> 
> 6 Weeks, phew. I'm glad you get a 380x. You'll be in a different club now but still, I hope this MS store vs Dx12 ruckus is resolved in favor of gamers and you get good gaming as much as you can.


Yeah, 4 weeks for the first RMA that "found no problems." I'm looking forward to DX12, but first I need to decide on a new monitor. My current one is starting to show signs of dying.


----------



## rdr09

Quote:


> Originally Posted by *hatchet_warrior*
> 
> Thanks
> Yeah 4 weeks for the first RMA the "found no problems." I'm looking forward to DX12. But first I need to decide on a new monitor. My current one is starting to show signs of dying.


http://www.overclock.net/t/1592334/vx2457-mhd-freesync-performance-specs-are-very-vague


----------



## hatchet_warrior

Quote:


> Originally Posted by *rdr09*
> 
> http://www.overclock.net/t/1592334/vx2457-mhd-freesync-performance-specs-are-very-vague


I saw the NX-VUE24A last night and like the looks of it. Then I saw it's a flash deal at Newegg. My wife would probably kill me if I bought it before my current monitor goes black.


----------



## Stige

Is it possible to unlock voltage control on the 280X? How about VRM temp monitoring? I tried the AB profile cfg trick, but it didn't unlock for me at least.


----------



## Devildog83

Quote:


> Originally Posted by *Echoa*
> 
> So here she is, in not exactly glory because of my dark room (maybe a better pic for later) please add me to the club :]


I am going to guess and say that's a Gigabyte Windforce R9 270x, but if you could please tell me exactly what card it is, then I will add you.

Thanks,
DD


----------



## Echoa

Quote:


> Originally Posted by *Devildog83*
> 
> I am going to guess and say that's an Gigabyte Windforce R9 270x but if you could please tell me exactly what card it is and then I will add you.
> 
> Thank's,
> DD


Yea, it's a Gigabyte 4gb 270x.

Edit: I also said what it was in the post just above that one, with the GPU-Z shot







I didn't think I forgot to mention what it was. Also dropped the core to 1175mhz for stability, if clocks will be included.


----------



## Devildog83

Quote:


> Originally Posted by *Echoa*
> 
> Yea it's gigabyte 4gb 270x
> 
> Edit: also said it was in the post just above that one with the gpuz
> 
> 
> 
> 
> 
> 
> 
> i didn't think i forgot to mention what it was. Also dropped core to 1175mhz for stability if clocks will be included


You get more performance from core clocks, I would back the memory down to 1400 and try to get more from the core.


----------



## Echoa

Quote:


> Originally Posted by *Devildog83*
> 
> You get more performance from core clocks, I would back the memory down to 1400 and try to get more from the core.


I might give it a go when I get home (at work atm). Wish the voltage wasn't locked, but I don't want to fool around with the BIOS till I have $ to replace it.

Edit: Unfortunately, even dropping the ram down won't let me run 1200mhz stable :[


----------



## Devildog83

Quote:


> Originally Posted by *Echoa*
> 
> I might give it a go when I get home at work atm, wish the voltage wasn't locked but don't want to fool around with BIOS till I have $ to replace it


I have added you. WELCOME!!!


----------



## Agent Smith1984

What would be considered a "good" overclock on a pitcairn core? And then what would be considered more towards an "awesome" OC??

Wife is running a 7870, and I've got 1200/1450 locked in with no problem @ 1.25v, fully stable.

What's weird is, I can run this thing all the way up to 1275MHz @ 1.3v with no artifacts, but it eventually just black-screens. If I reduce the VRAM clock to 1400 it delays the problem, but it still blacks out around 45 minutes in. Mind you, this is all on air.

Is 1200/1450 pretty good for these? I've only used Tahiti and Hawaii cards and don't have much experience with Pitcairn at all...

Would it be worth reducing the memory further until I can get the core in the 1250-1275 range, or will the lower core and higher memory bandwidth help, since the card is somewhat bandwidth limited with its 256-bit bus? Mind you, this powers a 1080p TV.

I have been really impressed with how well the little card does. With the overclock in place, it's beating my son's stock XFX 7950!

Thanks


----------



## Echoa

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What would be considered a "good" overclock on a pitcairn core? And then what would be considered more towards an "awesome" OC??
> 
> Wife is running a 7870, and I've got 1200/1450 locked in with no problem @ 1.25v, fully stable.
> 
> What's weird is, I can run this thing all the way up to 1275MHz @ 1.3v with no artifacts, but it eventually just black screens. If I reduce the VRAM clock to 1400, it prolongs this happening, but still blacks out around 45 minutes in. Mind you this is all on air.
> 
> Is 1200/1450 pretty good for these? I've only used tahiti and hawaii cards, and don't have much experience with pitcairn at all...
> 
> Would it be worth trying to keep reducing memory until I can get the core n the 1250-1275 range, or will the lower core and higher memory bandwidth help since the cqrd is somewhat bandwidth limited with it's 256 bit bus? Mind you this powers a 1080P TV.
> 
> I have been really impressed with how well the little card does. With the overclock in place, t's beating my son's stock XFX 7950!
> 
> Thanks


If it's only driving a 1080p TV, dropping the RAM and going for more core will be better. It's my understanding that these cards aren't exactly bandwidth limited, especially at 1080p. As for whether that's a good OC: from what I've been able to find, 1200 core is decent; 1450 memory isn't anything special, but core matters more anyway. Give it a go and see if you can squeeze out 1250-ish MHz.
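On the bandwidth point, the usual back-of-the-envelope GDDR5 math backs this up: GDDR5 transfers four bits per pin per memory clock, so on a 256-bit bus the numbers look like this (nothing card-specific beyond the bus width):

```python
# Back-of-the-envelope GDDR5 bandwidth for a 256-bit bus (270X/7870 class).
# GDDR5 is quad-pumped: effective transfer rate = memory clock x 4.
BUS_WIDTH_BITS = 256

def bandwidth_gbs(mem_clock_mhz: float) -> float:
    effective_mts = mem_clock_mhz * 4                  # effective MT/s
    return effective_mts * BUS_WIDTH_BITS / 8 / 1000   # bits -> bytes -> GB/s

for clk in (1400, 1450, 1500):
    print(f"{clk} MHz -> {bandwidth_gbs(clk):.1f} GB/s")
```

1400 MHz is the stock 179.2 GB/s figure; pushing to 1450 only adds about 3.6%, which is why trading memory headroom for core clock usually wins at 1080p.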


----------



## Neocoolzero

So earlier this year, right at the beginning actually, my R9 280x from XFX started acting up and died on me. I never dabbled with OCing her, so I was kinda perplexed by what happened: the PC would start, but with no image whatsoever. After messing around a bit, I noticed that if I pushed the card slightly upwards when booting, I would get an image.
So I came to the conclusion that the weight of the card was the culprit, and since it was still under warranty I proceeded with an RMA. After waiting a month they refunded me the total cost of the card and I bought a new one, team green this time, so feel free to remove me from the list, guys.


----------



## neurotix

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What would be considered a "good" overclock on a pitcairn core? And then what would be considered more towards an "awesome" OC??
> 
> Wife is running a 7870, and I've got 1200/1450 locked in with no problem @ 1.25v, fully stable.
> 
> What's weird is, I can run this thing all the way up to 1275MHz @ 1.3v with no artifacts, but it eventually just black screens. If I reduce the VRAM clock to 1400, it prolongs this happening, but still blacks out around 45 minutes in. Mind you this is all on air.
> 
> Is 1200/1450 pretty good for these? I've only used tahiti and hawaii cards, and don't have much experience with pitcairn at all...
> 
> Would it be worth trying to keep reducing memory until I can get the core n the 1250-1275 range, or will the lower core and higher memory bandwidth help since the cqrd is somewhat bandwidth limited with it's 256 bit bus? Mind you this powers a 1080P TV.
> 
> I have been really impressed with how well the little card does. With the overclock in place, t's beating my son's stock XFX 7950!
> 
> Thanks


1200mhz is a great overclock for Pitcairn. I can recall people here only getting 1125-1175mhz max out of the 270 (especially) and 270X.

I have a golden 270X Vapor-X from Sapphire. 84% ASIC. It will do 1300mhz in some benches without artifacts (Valley and Heaven). However, it will crash in most games almost instantly at that speed.

I was able to game on it for hours at 1250mhz, with the RAM at 1500mhz. It could also pass Fire Strike at 1270mhz.

I don't really think the card is bandwidth limited either but I wouldn't reduce it below 1400mhz because there's simply no need (i.e. it is not going to make you more stable to reduce it further).

1200mhz is already a good OC. If you're lucky, you may be able to game on it at 1250mhz. This may require more than 1.3v and a bios mod however.


----------



## Catscratch

Quote:


> Originally Posted by *Neocoolzero*
> 
> So earlier this year, right in the beginning actually, my R9 280x from XFX started acting out and died on me.Never dabbled with ocing her, so was kinda perplexed with what hapenned, pc would start but no image whatsoever, and after messing around a bit, i noticed if i pushed the card slightly upwards when booting, i would get image.
> So came to the conclusion that the weight of the card was the culprit here and since it was still on warranty i procedeed to a RMA, and after waiting a month they refunded me the total cost of the card and i bought a new one, team green this time, so feel free to remove me from the list guys.


Bummer. I never overclocked my Tri-X 280x either (a pity, I know). From day one, I supported the far end of the card with a screwdriver-style voltage tester. However, this actually makes the fan shroud rattle (and looks ridiculous), and it eventually deforms the shroud so it always rattles. Still better than bending the card or the pci-e slot.

Enjoy your new card btw, red or green.


----------



## Agent Smith1984

Quote:


> Originally Posted by *neurotix*
> 
> 1200mhz is a great overclock for Pitcairn. I can recall people here only getting 1125-1175mhz max out of the 270 (especially) and 270X.
> 
> I have a golden 270X Vapor-X from Sapphire. 84% ASIC. It will do 1300mhz in some benches without artifacts (Valley and Heaven). However, it will crash in most games almost instantly at that speed.
> 
> I was able to game on it for hours at 1250mhz, with the RAM at 1500mhz. It could also pass Fire Strike at 1270mhz.
> 
> I don't really think the card is bandwidth limited either but I wouldn't reduce it below 1400mhz because there's simply no need (i.e. it is not going to make you more stable to reduce it further).
> 
> 1200mhz is already a good OC. If you're lucky, you may be able to game on it at 1250mhz. This may require more than 1.3v and a bios mod however.


Thanks for the info man! I'll keep testing! +1


----------



## Devildog83

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thanks for the info man! I'll keep testing! +1


I ran my 7870 at 1200/1450 for months and it worked great, but one day it just died. Out of the blue, up and died. The 270x I was running next to it was still going strong, but one of my Devils left me.


----------



## Roboyto

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thanks for the info man! I'll keep testing! +1


I had a 7870 DC2 a few years back with a full-cover block on it. Solid card that ran around 1200/1500, and I recall those being pretty solid speeds back then. I'd have to see if I can find some benchmark info somewhere for exactly how hard I was pushing it.

If you figure stock speed for the 7870 is 1.0 GHz, pulling 20% out of it isn't too shabby.

Figure you can run some benches, or take some FPS data from a game, at stock memory speed, then bump it up 50MHz at a time to see how much benefit you're really getting from it.
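That stepped-sweep idea can be sketched as a simple loop. `run_benchmark` below is a hypothetical placeholder, not a real API: it stands in for however you record an average FPS or a 3DMark score after setting the clocks in your overclocking tool:

```python
# Sketch of a memory-clock sweep to see how much each 50MHz bump buys you.
# run_benchmark() is a hypothetical placeholder: swap in whatever records
# an average FPS (or a benchmark score) at the clocks you've set by hand.

def run_benchmark(core_mhz: int, mem_mhz: int) -> float:
    raise NotImplementedError("set clocks in your OC tool, then record FPS here")

def sweep(core_mhz: int, start: int, stop: int, step: int = 50) -> None:
    baseline = None
    for mem in range(start, stop + 1, step):
        fps = run_benchmark(core_mhz, mem)
        if baseline is None:
            baseline = fps                      # first run is the reference
        gain = (fps / baseline - 1) * 100
        print(f"{core_mhz}/{mem}: {fps:.1f} fps ({gain:+.1f}% vs {start} MHz)")

# sweep(1200, 1400, 1600)  # would step 1400 -> 1600 MHz in 50 MHz increments
```

If the gains flatten out after the first step or two, the card isn't bandwidth limited and the headroom is better spent on core.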


----------



## bamaredwingsfan

Currently I have the Sapphire R9 280X, and I'd like to start looking for my next GPU. If I go from the 280x to, say, the 380x or 390x, is there a huge difference between these cards? Or should I just hang on to what I have and wait for something else?


----------



## Roboyto

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thanks for the info man! I'll keep testing! +1


Quote:


> Originally Posted by *Roboyto*
> 
> I had a 7870 DC2 a few years back that I had a full cover block on. Solid card that ran around 1200/1500 and I recall those being pretty solid speeds back then. Would have to see if I could find some benchmark info somewhere for exactly how hard I was pushing it.


The 3DMark11 result files won't link to the site anymore (I get an error); they're from March of 2013. But my naming scheme for the files indicates I was running 1225/1475 for 3DMark11 Extreme and 1250/1500 for the FFXIV benchmark.

Quote:


> Originally Posted by *bamaredwingsfan*
> 
> Currently I have the sapphire R9 280X, I'd like to start looking for my next GPU. If I go from the 280x to say the 380x or 390x is there a huge difference between these cards? Or should I just hang with what I have an wait for something else.


A 380X would be more of a 'sidegrade' and definitely not worth it.

A 290(X)/390(X) could give you a decent jump up, but honestly you are probably better off waiting for Polaris. Even if you don't buy one of those, the pricing on the 390(X) will come down as stock gets cleared out.

If you were to grab a 290(X) used, that would be great bang for your buck as a temporary upgrade. They do run hot, though, so be prepared to buy one with a kickass air cooler, or move to water to get the most out of Hawaii.


----------



## radier

Wait for Polaris.



----------



## Catscratch

The 380x is supposed to be faster in dx12 because of more ACEs. However, with the current state of DirectX 12 and the lack of real dx12 titles, it would be a waste. Wait for Polaris; it should shift some prices anyway.


----------



## Horsemama1956

Quote:


> Originally Posted by *bamaredwingsfan*
> 
> Currently I have the sapphire R9 280X, I'd like to start looking for my next GPU. If I go from the 280x to say the 380x or 390x is there a huge difference between these cards? Or should I just hang with what I have an wait for something else.


390X will be a very nice upgrade and will easily hold you until AMD/nVidia release refresh cards.


----------



## Echoa

Is there anyone familiar enough with hex editing an r9 270x BIOS to check if I did this right? I haven't attempted to flash this, but I did attempt to calculate all the proper hex values for voltage and VID for the 4gb Gigabyte 270x. If I did it right, this will set the boost voltage to 1248mv, or roughly 1.25v, but again I'm not sure. I'd use VBE, but Gigabyte used a newer VRM controller so VBE7 doesn't work with it; its 6+2 phase design should allow for good overclocking, though. If this is done properly, I'd be happy to do it for the 2gb models also.

Pitcairn1248.zip 128k .zip file


Edit: You can open it in VBE7, though, and it does read the 1248mv value.


----------



## GoLDii3

Quote:


> Originally Posted by *Echoa*
> 
> Is there anyone familiar enough with hex editing r9 270x bios to check if I did this right? Havent attempted to flash this but I did attempt to calculate all the proper hex values for Voltage and VID for the 4gb Gigabyte 270x. This if i did it right will set boost voltage to 1248mv or roughly 1.25v but again Im not sure. Id use VBE but Gigabyte used a newer version of the VRMs so VBE7 doesnt work with it and its 6+2 phase should allow for good overclocking though. If this is done properly then Id be happy to do it for the 2gb models also.
> 
> Pitcairn1248.zip 128k .zip file
> 
> 
> Edit: You can open it in VBE7 though and it does read the 1248mv value


Why do you want to do that? Is your card voltage locked?


----------



## Echoa

Quote:


> Originally Posted by *GoLDii3*
> 
> Why do you want to do that? Is your card voltage locked?


Yea, the Gigabyte cards are voltage locked and use the IR3567A VRM controller, which is the newer version of the CHL822x and isn't properly supported/is locked. This means the only way to do it is to hex edit the BIOS to change the voltage and VID. I've been kinda hesitant to do it, as I'm not entirely sure how they figured out the voltage and VID values, so I've been reading up on it as best I can on a Russian forum, but the Google translation is half nonsense.
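For what it's worth, the mechanical side of an edit like this can be sketched in a few lines. The voltage offset below is purely made up for illustration (the real location depends on the PowerPlay table in your specific ROM dump), and the checksum handling follows the usual ATOMBIOS convention of an image length in 512-byte blocks at offset 0x02 and a checksum byte at offset 0x21; verify both against your own dump before flashing anything:

```python
# Hypothetical sketch of patching a voltage value in a VBIOS dump.
# VOLTAGE_OFFSET is a made-up placeholder -- the real offset lives in the
# ROM's PowerPlay table and differs per card and BIOS revision.

VOLTAGE_OFFSET = 0x1234  # placeholder, NOT a real offset

def fix_checksum(rom: bytearray) -> None:
    # ATOMBIOS convention: image length in 512-byte blocks at 0x02,
    # checksum byte at 0x21; the image must byte-sum to 0 mod 256.
    size = rom[0x02] * 512
    rom[0x21] = 0                      # zero the old checksum before summing
    rom[0x21] = (-sum(rom[:size])) % 256

def patch_voltage(rom: bytearray, millivolts: int) -> None:
    # Voltages in these tables are 16-bit little-endian millivolt values,
    # e.g. 1248 mV -> bytes E0 04.
    rom[VOLTAGE_OFFSET:VOLTAGE_OFFSET + 2] = millivolts.to_bytes(2, "little")
    fix_checksum(rom)
```

You'd read the ROM into a `bytearray`, call `patch_voltage(rom, 1248)`, and write it back out, keeping a copy of the stock BIOS to flash back if anything goes wrong.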


----------



## GoLDii3

Quote:


> Originally Posted by *Echoa*
> 
> Yea the Gigabyte cards are voltage locked and use the IR3567A vrms which are the newer version of CHL822x and aren't properly supported/are locked. This means the only way to do it is to hex edit the BIOS to change the voltage and vid. I've been kinda hesitant to do it as I'm not entirely sure on how they figured voltage and VID values so I've been reading on it best I can in a Russian forum but Google translation is 1/2 nonsense.


I wouldn't risk it. The 270X does not have a dual BIOS.

I suggest you try a round-up of every overclocking tool, e.g. Sapphire TriXX, GPU Tweak, MSI Afterburner, and XFX's and HIS's tools as well.

I'm currently using GPU Tweak with my Sapphire 280X, because it also has an unknown voltage regulator which AB does not support, and TriXX has a bug.


----------



## Echoa

Quote:


> Originally Posted by *GoLDii3*
> 
> I wouldn't risk it. 270X does not have dual bios.
> 
> I suggest you try a round-up of every overclocking tool,eg. Sapphire TriXx,GPUTweak,MSI Afterburner;XFX's and HIS one's also.
> 
> Im currently using GPUTweak with my Sapphire 280X because it also has an unknown voltage regulator wich AB does not support,and TriXx has a bug.


I'll try that. I know the hex edit works, but I could only find undervolt guides for miners, not overvolt ones. I was just hoping somebody knew whether I did the mod right, because I can't find anything but undervolt info.


----------



## Echoa

Tried every tool I could, from Afterburner to GPU Tweak; no voltage control. :/


----------



## PAiNGiveRR

Hey guys, hope you're doing well.









I have an XFX R9 280x DD Black Edition (overclocked) and I think I have a power problem: the power consumption is not stable at stock speed, and it drops below the stock voltage (2225mv) whenever I run a game or a 3D application. I'm not sure if it's a PSU or GPU problem, or even some AMD power-saving feature.

So I'll post a pic of the sensors while running the GPU-Z render test (the black line shows the start of the test). Hope someone can help me.


----------



## Stige

Quote:


> Originally Posted by *PAiNGiveRR*
> 
> hey guys hoping you do well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i have a xfx r9 280x dd black edition overclocked and i think i have a problem with power stuff cuz the power consumption is not stable at stock speed and it get below the stock voltage (2225mv) whenever i run a game or a 3d application, but Im not sure if its a psu or gpu problem or even some amd energy saver things
> 
> So i will post a pic of the sensors while running gpu-z rendering test (the black line shows the start of the test), hope someone will help me


You should just measure it with a DMM; the software readings are usually off. Mine says something like 11.5V under load, but the 12V line measured with a DMM stays slightly above 12V at all times.


----------



## Echoa

Holy crap guys, my modded BIOS worked... omg, I was so scared lol. My 4gb Gigabyte 270x is now reading 1.248v.











gotta put it through some tests now

Edit: the first thing I did was launch Crysis 3 and go right to a level I knew would cause a crash @ 1200mhz within 5min, and I played 20min with no crash.









Gained 20pts in Heaven, now @ 786, and Valley @ 1589.

Will test more later; I work at 8am.


----------



## PAiNGiveRR

Guys, is this normal??
The GPU voltage is inconsistent and can reach 1.3v+ even at stock (1.175v)!!
And this pic isn't even from gaming; it's while watching YouTube lol


----------



## Echoa

Quote:


> Originally Posted by *PAiNGiveRR*
> 
> Guys, is this normal?
> The GPU voltage is inconsistent and can reach 1.3V+ even at stock settings (1.175V)!
> And this pic isn't even from gaming, it's from watching YouTube lol


I don't see a picture. And you're saying it reaches 1.3V on YouTube videos?


----------



## PAiNGiveRR

Quote:


> Originally Posted by *Echoa*
> 
> I don't see a picture. And you're saying it reaches 1.3V on YouTube videos?


Sorry, I edited the post... and yes, 1.3V+ even while watching YouTube!
But today it didn't go above 1.180V. The fluctuation is still there, though, and the 12V can drop as low as 11.5V when launching a game.


----------



## Echoa

Quote:


> Originally Posted by *PAiNGiveRR*
> 
> Sorry, I edited the post... and yes, 1.3V+ even while watching YouTube!
> But today it didn't go above 1.180V. The fluctuation is still there, though, and the 12V can drop as low as 11.5V when launching a game.


I don't think your 12V should really fluctuate at all, maybe .1-.2V up or down. Seems odd. What's your ASIC quality? Has it shown any instability? It could be the sensors reading wrong; that happens.

Edit: As was said before, you'll need to actually measure it yourself to confirm; get a meter and measure, since the sensor could be reading wrong.


----------



## PAiNGiveRR

Quote:


> Originally Posted by *Echoa*
> 
> I don't think your 12V should really fluctuate at all, maybe .1-.2V up or down. Seems odd. What's your ASIC quality? Has it shown any instability? It could be the sensors reading wrong; that happens.
> 
> Edit: As was said before, you'll need to actually measure it yourself to confirm; get a meter and measure, since the sensor could be reading wrong.


R9 280X XFX Black OC Edition, ASIC: 71.5%.
I can confirm there's a power problem without needing to measure, because I get the "display adapter has stopped working..." message pretty much every day, and sometimes it freezes and I need a hard power-off.
The funny part is that this can also happen on YouTube, not only in games, along with heavy flickering.


----------



## Echoa

Quote:


> Originally Posted by *PAiNGiveRR*
> 
> R9 280X XFX Black OC Edition, ASIC: 71.5%.
> I can confirm there's a power problem without needing to measure, because I get the "display adapter has stopped working..." message pretty much every day, and sometimes it freezes and I need a hard power-off.
> The funny part is that this can also happen on YouTube, not only in games, along with heavy flickering.


You still need to test whether it's your PSU or your GPU though; the easiest way is with a meter. You could replace the GPU, but if the PSU ends up being the problem then you'd have to replace that too. Test the PSU first; if it's fine, you can assume it's the GPU and go from there.


----------



## PAiNGiveRR

Quote:


> Originally Posted by *Echoa*
> 
> You still need to test whether it's your PSU or your GPU though; the easiest way is with a meter. You could replace the GPU, but if the PSU ends up being the problem then you'd have to replace that too. Test the PSU first; if it's fine, you can assume it's the GPU and go from there.


Actually, I tried to RMA the GPU, but they tested it and said it's fine... so I'm suspecting the PSU, especially since it's from Chieftec (I'm sure you've never heard of them). But can a PSU problem cause flickering?
And what kind of meter do you recommend for testing? I've never used one.
Thank you for your help.


----------



## Echoa

Quote:


> Originally Posted by *PAiNGiveRR*
> 
> Actually, I tried to RMA the GPU, but they tested it and said it's fine... so I'm suspecting the PSU, especially since it's from Chieftec (I'm sure you've never heard of them). But can a PSU problem cause flickering?
> And what kind of meter do you recommend for testing? I've never used one.
> Thank you for your help.


I've never personally had a PSU do that, but I've seen PSU problems cause any number of issues. You can get a voltmeter and test it, or if you have access to a better PSU you can test it that way too. Never go cheap on a PSU; it's a pretty important part of a system.


----------



## Echoa

Well, after further testing, the extra ~50mV allowed me a stable 1200MHz core, but pushing further causes errors. I could do 1.3V, but I don't imagine I'd get even 1250MHz out of that, maybe 1225MHz at best, as mine doesn't appear to be an A+ overclocker. I've settled on a nice 1200/1550 set of clocks (yes, I tried the memory at stock) and it's running well. While not amazing, I'm still happy with a good solid overclock.


----------



## kivikas14

An opportunity to save team red from losing to team green:

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd


----------



## lostsurfer

Guys, got a question. So I overclocked my XFX 280X (ASIC 60.5) to around 1200MHz and all was fine while benching, highest temp around 64C. But when I fire up The Division, under load the game times out. I've lowered it to 1150 with no issues; the game plays fine and I notice a definite gain. So would that tell me that 1150, or a little under 1200, is going to be my max core OC, or is it just that particular game?


----------



## Roboyto

Quote:


> Originally Posted by *lostsurfer*
> 
> Guys, got a question. So I overclocked my XFX 280X (ASIC 60.5) to around 1200MHz and all was fine while benching, highest temp around 64C. But when I fire up The Division, under load the game times out. I've lowered it to 1150 with no issues; the game plays fine and I notice a definite gain. *So would that tell me that 1150, or a little under 1200, is going to be my max core OC, or is it just that particular game?*


Most likely what you said, depending on the amount of voltage you are running to the GPU.

Stability in benchmarks is a decent test of your hardware's capabilities, but it is never a guarantee of stability for everything else. Usually the bench will get you in the ballpark, and you can fine-tune from there. Each game/benchmark loads the GPU in a different manner, so it's rarely a 'one size fits all' scenario.

One easy way to check stability with a benchmark is to make sure your scores/results actually go up with each increase in settings. Sometimes you may see a decrease in performance, and this is a telltale sign of an unstable OC.


----------



## norules

Hi all,

I need some assistance, please; I'm trying to help a friend, and I'm not sure if I can ask here. My friend flashed his Gigabyte R9 280X OC card with the wrong BIOS and now it's not posting, though I can see it in Windows when booting with another card. Worst of all, he didn't make a backup of the original BIOS, TechPowerUp doesn't have a working BIOS for it, and Gigabyte support keeps telling me to RMA the card, but there's no warranty on it.

Will someone PLEASE share the two original BIOS files, or ones currently working on their card?


----------



## GoLDii3

Quote:


> Originally Posted by *norules*
> 
> Hi all,
> 
> I need some assistance, please; I'm trying to help a friend, and I'm not sure if I can ask here. My friend flashed his Gigabyte R9 280X OC card with the wrong BIOS and now it's not posting, though I can see it in Windows when booting with another card. Worst of all, he didn't make a backup of the original BIOS, TechPowerUp doesn't have a working BIOS for it, and Gigabyte support keeps telling me to RMA the card, but there's no warranty on it.
> 
> Will someone PLEASE share the two original BIOS files, or ones currently working on their card?


Does he not have a BIOS switch? There should be two BIOSes. Or did he flash both?

Anyway, it's not that hard. Try one of those BIOSes.

Look up ATIFlash on Google. You should be able to flash a new BIOS onto the bad card by using a good card to boot into DOS and launching ATIFlash.

I'm not so sure though; I suggest you open a new thread for an issue like that.
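For what it's worth, a typical blind-flash from DOS with ATIFlash looks something like the commands below. This is a sketch only: the adapter index `0` and the filenames are assumptions, and `-f` forces the flash past the subsystem-ID check, which is usually needed when recovering from a wrong BIOS.

```
REM List adapters and their indices
atiflash -i

REM Back up whatever is currently on adapter 0
atiflash -s 0 backup.rom

REM Program adapter 0 with a known-good BIOS, forcing past the ID check
atiflash -p 0 good.rom -f
```

Double-check the adapter index from `-i` first so you don't flash the good card by mistake.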


----------



## norules

He flashed both. I've tried those and can't get it to work, hence my asking if someone can provide a BIOS they know is working.


----------



## GoLDii3

Quote:


> Originally Posted by *norules*
> 
> He flashed both. I've tried those and can't get it to work, hence my asking if someone can provide a BIOS they know is working.


You can't flash the BIOS at all, or you can but it still doesn't work?

Maybe you can ask Gigabyte for the BIOS.


----------



## norules

Quote:


> Originally Posted by *GoLDii3*
> 
> You can't flash the BIOS or you can but it still does not work?
> 
> Maybe you can ask Gigabyte for the BIOS.


I've asked them, and twice they told me the same story: that I need to RMA the card. They're not really helping me.

So I flashed the card and it posted, but the moment I install the display drivers, Windows just freezes, telling me the drivers are still not correct.


----------



## ronnin426850

Hello, guys! Can I join?

Sapphire 280X DualX OC - 1200Mhz core, 1500 Mhz vRAM.

Can those of you at around 1200MHz tell me how much voltage you pump through the core, whether you see frequency or usage throttling in Kombustor, and what FireStrike scores you get?

Here is my score and specs:
http://www.3dmark.com/fs/8163876

It gets around 65C and evens out after about half an hour of Kombustor, with ~25C ambient.

I've bumped the voltage to the absolute maximum allowed in VBIOS Editor. I've also done a lot of custom ghetto work on the card, are you interested in photos?

Thanks!


----------



## MiladEd

Quote:


> Originally Posted by *ronnin426850*
> 
> Hello, guys! Can I join?
> 
> Sapphire 280X DualX OC - 1200Mhz core, 1500 Mhz vRAM.
> 
> Can you people around 1200Mhz tell me how many volts do you pump through that core, do you see freq or usage throttle in Kombustor, and what FireStrike scores you get?
> 
> Here is my score and specs:
> http://www.3dmark.com/fs/8163876
> 
> It gets around 65C and evens out after about half an hour of Kombustor, with ~25C ambient.
> 
> I've bumped the voltage to the absolute maximum allowed in VBIOS Editor. I've also done a lot of custom ghetto work on the card, are you interested in photos?
> 
> Thanks!


Lucky you! What a great OC! My exact same card artifacts at anything higher than 1130 MHz!


----------



## ronnin426850

Quote:


> Originally Posted by *MiladEd*
> 
> Lucky you! What a great OC! My exact same card artifacts at anything higher than 1130 MHz!


What software do you use to OC? Have you raised the voltage? Are you using the stock cooling?


----------



## MiladEd

Quote:


> Originally Posted by *ronnin426850*
> 
> What software do you use to OC? Have you raised the voltage? Are you using the stock cooling?


I use TriXX, and increased the voltage to the highest possible in that program (1,300mV vcore). Yes, stock cooling. My temps stay below 70C running the most demanding game I have, Witcher 3.


----------



## ronnin426850

Quote:


> Originally Posted by *MiladEd*
> 
> I use TriXX, and increased the voltages to highest possible in that program (1300 vcore). Yes, stock cooling. My temps stay below 70 C running the most demanding game that I have, Witcher 3.


I will bet you a thousand dollars right now that your VRM is overheating. The stock VRM cooling on this GPU consists of a generic aluminum sink, low-grade thermal tape, and two spring pins to apply pressure. The overall thermal conductivity of the setup is so bad that after I glued some heatsinks to the BACK of the card, those heated up faster than the sink that is supposed to be cooling the VRM directly.
What I did was remove this entire nonsense and use Arctic Alumina thermal *epoxy* (not to be confused with the paste) to glue some mirror-polished aluminum sinks directly to the MOSFETs. If you go for something like this, be extremely careful with the contact surface, because the MOSFETs sit a few tenths of a millimeter lower than some of the surrounding components, and you might end up shorting something or not getting optimal contact.
Also, as I said, I have heatsinks glued to the back of the VRM, which to my own surprise helps a lot.

Once you take care of the VRM cooling, OC headroom should expand significantly.

Also look into making your own custom BIOS, because this card throttles in usage as well as in frequency.

EDIT: I just discovered that the latest BIOSes for Sapphire's 280X push for lower power and lower fan speeds/noise, not better performance, so using the stock up-to-date BIOS is not always best.


----------



## MiladEd

Quote:


> Originally Posted by *ronnin426850*
> 
> I will bet you a thousand dollars right now that your VRM is overheating. The stock VRM cooling on this GPU consists of a generic aluminum sink, low-grade thermal tape, and two spring pins to apply pressure. The overall thermal conductivity of the setup is so bad that after I glued some heatsinks to the BACK of the card, those heated up faster than the sink that is supposed to be cooling the VRM directly.
> What I did was remove this entire nonsense and use Arctic Alumina thermal *epoxy* (not to be confused with the paste) to glue some mirror-polished aluminum sinks directly to the MOSFETs. If you go for something like this, be extremely careful with the contact surface, because the MOSFETs sit a few tenths of a millimeter lower than some of the surrounding components, and you might end up shorting something or not getting optimal contact.
> Also, as I said, I have heatsinks glued to the back of the VRM, which to my own surprise helps a lot.
> 
> Once you take care of the VRM cooling, OC headroom should expand significantly.
> 
> Also look into making your own custom BIOS, because this card throttles in usage as well as in frequency.


Good to know! Thanks for the info! I'd love to see some pics of your setup, if it's not too much trouble.

A few months back, one of my GPU fans started malfunctioning (it ran wobbly and slowly). I opened up the heatsink, took out the fan and lubricated it, which fixed its problems, and I also changed the GPU's thermal paste (which was pretty much dried out). It decreased my temps quite a bit.

I did notice the thermal paste seemed to be of low quality, but I don't know what I can do about that.

Also, how did you glue the heatsinks to the back of the card? What heatsinks did you use?


----------



## ronnin426850

Quote:


> Originally Posted by *MiladEd*
> 
> Good to know! Thanks for the info! I'd love to see some pics of your setup, if it's not too much trouble.
> 
> A few months back, one of my GPU fans started malfunctioning (it ran wobbly and slowly). I opened up the heatsink, took out the fan and lubricated it, which fixed its problems, and I also changed the GPU's thermal paste (which was pretty much dried out). It decreased my temps quite a bit.
> 
> I did notice the thermal paste seemed to be of low quality, but I don't know what I can do about that.
> 
> Also, how did you glue the heatsinks to the back of the card? What heatsinks did you use?


Sure, I'll take some photos and post back with all the details


----------



## GoLDii3

Quote:


> Originally Posted by *ronnin426850*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I will bet you a thousand dollars right now that your VRM is overheating. The stock VRM cooling on this GPU consists of a generic aluminum sink, low-grade thermal tape, and two spring pins to apply pressure. The overall thermal conductivity of the setup is so bad that after I glued some heatsinks to the BACK of the card, those heated up faster than the sink that is supposed to be cooling the VRM directly.
> What I did was remove this entire nonsense and use Arctic Alumina thermal *epoxy* (not to be confused with the paste) to glue some mirror-polished aluminum sinks directly to the MOSFETs. If you go for something like this, be extremely careful with the contact surface, because the MOSFETs sit a few tenths of a millimeter lower than some of the surrounding components, and you might end up shorting something or not getting optimal contact.
> Also, as I said, I have heatsinks glued to the back of the VRM, which to my own surprise help a lot.
> 
> Once you take care of the VRM cooling, OCing headroom should be expanded significantly.
> 
> Also look into doing your own custom BIOS, because this card throttles in usage as well as in frequency.
> 
> EDIT: I just discovered that the latest BIOSes for Sapphire's 280X push for lower power and lower fan speeds/noise, not better performance, so using stock up-to-date BIOS is not always the best.


I don't honestly think it has to do with that... I think all three of us have the same Dual-X model?

In that case I think it has to do with Sapphire using crappy PCBs, like they always do on their cheaper models.

Compared to their top-dog 280X, the 280X Toxic, which boosts to 1150MHz out of the box, the Dual-X version sucks.

I had a 7870 before, an AMD reference PCB with a custom Sapphire cooler, and I've never been able to hit clocks as good as that with either this Sapphire 280X or a 7950.

Now, getting to the main thing: I think the problem on this card is the voltage regulator. The thing drops voltage like mad. I mean, if I set it at 1.30V it drops to something like 1.20-1.25V under load and peaks at around 1.28V. Can't keep a stable OC like that.

Even at stock, the standard voltage should be 1.20V, yet the card drops to 1.15V under load. I don't know if this is real or just GPU-Z not reading the voltage regulator data correctly.

At least the card has been working just fine for two years and something. But I'm definitely not going with Sapphire again unless they step up their PCB game.


----------



## ronnin426850

Quote:


> Originally Posted by *GoLDii3*
> 
> I don't honestly think it has to do with that... I think all three of us have the same Dual-X model?
> 
> In that case I think it has to do with Sapphire using crappy PCBs, like they always do on their cheaper models.
> 
> Compared to their top-dog 280X, the 280X Toxic, which boosts to 1150MHz out of the box, the Dual-X version sucks.
> 
> I had a 7870 before, an AMD reference PCB with a custom Sapphire cooler, and I've never been able to hit clocks as good as that with either this Sapphire 280X or a 7950.
> 
> Now, getting to the main thing: I think the problem on this card is the voltage regulator. The thing drops voltage like mad. I mean, if I set it at 1.30V it drops to something like 1.20-1.25V under load and peaks at around 1.28V. Can't keep a stable OC like that.
> 
> Even at stock, the standard voltage should be 1.20V, yet the card drops to 1.15V under load. I don't know if this is real or just GPU-Z not reading the voltage regulator data correctly.
> 
> At least the card has been working just fine for two years and something. But I'm definitely not going with Sapphire again unless they step up their PCB game.


You get what you paid for.

And the VRM is the voltage regulator, so we're somewhat on the same page on this.

The Dual-X is 6-phase, unlike the Toxic which is 12-phase, so of course your voltage delivery will be worse, and of course your OC capacity will be hindered. Also, voltage is supposed to drop under load; fluctuations aside, it's called vDrop and/or vDroop (a controversial topic on OCN). It is done to avoid voltage spikes during power-state changes and turbo boosting. It is by design.

The PCB itself is quite decent IMO. A lot of the hardware on that card is quite decent if you push it properly.


----------



## neurotix

This is why you should always pay more for a better cooler if you plan to overclock, or run the card on air with an aftermarket cooler.

Unless you got the card used or something... it's worth paying $20-30 more for the best cooler and a custom PCB.

I absolutely loved my 7970 Vapor-X that I used for a year and it ran fantastic when I had it. 1200mhz, 1.3v and right around 60C with 100% fan in Crysis 3.

I've had and benched many Sapphire cards including some Dual-X cards and they all ran great, were *voltage unlocked* and overclocked reasonably well on air.

I had a 7870 XT Dual-X and I still have (well, my brother does) an R7 265 Dual-X (aka 7850).

I don't really know about the 280X Dual-X, but if it's overclocking badly or running too hot, maybe it's because of higher ambient temps, or maybe it's something more like 'user error'.

Also, afaik for the Dual-X cards they usually use a reference PCB. Even the high end (well, formerly) Tri-X 290/X cards from Sapphire used a reference PCB. The reason for this is to keep them compatible with reference full coverage water blocks. (Even though Sapphire claims that water cooling violates the warranty, as long as you RMA your card with the cooler it came with on it, they will repair or replace it. I know because I've done it.)

Finally, the vdroop on AMD cards is totally normal and expected/acceptable. Afaik it happens on Nvidia too. They are basically engineered this way, and vdroop is desirable under load because it keeps temps down. I've used many AMD GCN cards and the majority had vdroop. There are ways around it (e.g. BIOS editing), but in general it's a normal trait. You probably don't want to push more than 1.3V through these cards on air anyway, unless you want to decrease their lifespan or possibly even blow the VRMs and fry them.


----------



## MiladEd

Quote:


> Originally Posted by *neurotix*
> 
> This is why you should always pay more for a better cooler if you plan to overclock, or run the card on air with an aftermarket cooler.
> 
> Unless you got the card used or something... it's worth paying $20-30 more for the best cooler and a custom PCB.
> 
> I absolutely loved my 7970 Vapor-X that I used for a year and it ran fantastic when I had it. 1200mhz, 1.3v and right around 60C with 100% fan in Crysis 3.
> 
> I've had and benched many Sapphire cards including some Dual-X cards and they all ran great, were *voltage unlocked* and overclocked reasonably well on air.
> 
> I had a 7870 XT Dual-X and I still have (well, my brother does) an R7 265 Dual-X (aka 7850).
> 
> I don't really know about the 280X Dual-X, but if it's overclocking badly or running too hot, maybe it's because of higher ambient temps, or maybe it's something more like 'user error'.
> 
> Also, afaik for the Dual-X cards they usually use a reference PCB. Even the high end (well, formerly) Tri-X 290/X cards from Sapphire used a reference PCB. The reason for this is to keep them compatible with reference full coverage water blocks. (Even though Sapphire claims that water cooling violates the warranty, as long as you RMA your card with the cooler it came with on it, they will repair or replace it. I know because I've done it.)
> 
> Finally, the vdroop on AMD cards is totally normal and expected/acceptable. Afaik it happens on Nvidia too. They are basically engineered this way and vdroop is desirable under load because it keeps temps down. I've used many AMD GCN cards and the majority had vdroop. There's ways around it (e.g. BIOS editing) but in general, it's a normal trait. You probably don't want to be pushing more than 1.3v through these cards anyway on air unless you want to decrease their lifespan or possibly even blow VRMs and fry them.


When I bought the card 2 years ago, I didn't know much about overclocking and didn't plan to overclock. Now if I upgrade, I'd definitely get "the best overclocker" card.

Yes, the 280X Dual-X is a reference PCB, but I don't plan to put it under water. It's just not worth it; I'd rather save the money for when Polaris drops.


----------



## neurotix

Indeed, I am also saving for Polaris, or more likely Vega if the rumors are true (i.e. if Polaris is a smaller die for the midrange, I will wait for big-die Vega in Q1 2017).

I will probably go for two of the best Vega cards available, most likely Sapphire Vapor-X.


----------



## ronnin426850

The 280X Dual-X is advertised as an 870MHz GPU on Sapphire's site, and is priced accordingly. And since most people got at least 1100MHz out of it, I think it's quite a decent overclocker. The cooling does the job up to a point, but it is VERY advisable to edit your fan curve for more aggressive cooling, and eventually replace the cooler with something more OC-oriented, like I did with the Accelero, if you plan on going higher.
The highest OC here on the forum is 1250MHz, if I'm not mistaken.

Can you guys post some FireStrike scores?


----------



## MiladEd

Quote:


> Originally Posted by *ronnin426850*
> 
> The 280X Dual-X is advertised as an 870MHz GPU on Sapphire's site, and is priced accordingly. And since most people got at least 1100MHz out of it, I think it's quite a decent overclocker. The cooling does the job up to a point, but it is VERY advisable to edit your fan curve for more aggressive cooling, and eventually replace the cooler with something more OC-oriented, like I did with the Accelero, if you plan on going higher.
> The highest OC here on the forum is 1250MHz, if I'm not mistaken.
> 
> Can you guys post some Firestrike?


Yeah, but the boost clock is 1020MHz, IIRC. I've edited the fan curve too, in TriXX; I've set it to hit 80% at 70C, but I rarely reach that temperature anyway.

I'd post FireStrike if I had it. Unfortunately, it's too rich for my blood lol.


----------



## ronnin426850

Quote:


> Originally Posted by *MiladEd*
> 
> Yeah, but the boost clock is at 1020 MHz, IIRC. I've edited the fan curve too, on TriXX, I've set to hit 80% at 70 C, but I rarely reach that temperature anyway.
> 
> I'd post Firestrike if I had it. Unfortunately, it's too rich for my blood lol.


What do you mean? It's free.

Also, it does have a boost clock of 1020MHz, but thanks to the 6-phase VRM and the TDP limit being lowered from 247W to 206W in the new BIOS, it almost never reaches it, except in some light games where you don't care about performance.
That's why I went with a custom BIOS, disabled boost altogether, and went for a steady 1200MHz.


----------



## MiladEd

Quote:


> Originally Posted by *ronnin426850*
> 
> What do you mean, it's free?
> 
> Also, it does have boost clock of 1020, but thanks to the 6 phase and the TDP limit being lowered from 247 to 206W in the new BIOS, it almost never reaches it, except in some light games where you don't care about performance.
> That's why I went with custom BIOS, disabled boost altogether and went for 1200Mhz steady.


3DMark isn't free? It's $25 on Steam, unless they have a trial or something I don't know about...

I've set the clock to 1125MHz in TriXX, and it stays at that clock permanently while I game; I check it with MSI Afterburner. Is a custom BIOS really necessary?

Also, could I expect some more overclockability from it? How safe is it? The card is dual-BIOS if I'm not mistaken, so you can revert and fix it if it gets corrupted or something, right?


----------



## ronnin426850

Quote:


> Originally Posted by *MiladEd*
> 
> 3D mark isn't free? It's 25$ on Steam, unless they've a trial or something I don't know about...
> 
> I've set the clock to 1125 MHz in TriXX, and it stays in that clock permanently while I game, I check it with MSI afterburner. Is a custom BIOS really necessary?
> 
> Also, could expect some more overclockability from it? How safe is it? The card is dual BIOS if I'm not mistaken, so you can revert and fix it if it gets corrupted or something right?


3DMark has a free version that lets you run the default tests with default settings. The paid version lets you customize settings, enable looped runs, disable demos, and other extras.


----------



## PAiNGiveRR

Hey guys, I have an R9 280X XFX Black OC Edition, and on the box it says *UNLOCKED VOLTAGE*, but the card doesn't seem affected by voltage changes even when I change them in the BIOS, because Afterburner shows only 1225mV while gaming and GPU-Z shows around 1150mV. Really confused!


----------



## ronnin426850

Quote:


> Originally Posted by *PAiNGiveRR*
> 
> Hey guys, I have an R9 280X XFX Black OC Edition, and on the box it says *UNLOCKED VOLTAGE*, but the card doesn't seem affected by voltage changes even when I change them in the BIOS, because Afterburner shows only 1225mV while gaming and GPU-Z shows around 1150mV. Really confused!


Please describe step by step how you change the voltage in the card's BIOS. Usually there's no reason for it not to work.


----------



## ronnin426850

Ok, so I took some photos of the GPU, only to find that my dad forgot to give back the USB cable for the camera, and since I don't have another one like it, you'll be waiting a bit longer for the photos.


----------



## ronnin426850

Ok, here it is:


Spoiler: Warning: Spoiler!


Also, I got the memory up to 1600MHz. I bet it can go higher, but I don't need any more performance right now, so I'll be leaving it at 1600.


----------



## MiladEd

Such cute tiny heatsinks! Where can you get those?


----------



## PAiNGiveRR

Quote:


> Originally Posted by *ronnin426850*
> 
> Please describe step by step how you change the voltage in the card's BIOS. Usually there is no reason not to work.


Nothing special: I used the VBE7 BIOS editor and changed the VDDC in the performance tab only (multiple times), but no luck.
And I've even got a new problem: the GPU is highly unstable in videos (the voltage flips from 0.8 to 1.3-1.5V every second), so sometimes I get "display adapter has stopped working...". The funny thing is that it's stable in games.


----------



## ronnin426850

Quote:


> Originally Posted by *PAiNGiveRR*
> 
> Nothing special: I used the VBE7 BIOS editor and changed the VDDC in the performance tab only (multiple times), but no luck.
> And I've even got a new problem: the GPU is highly unstable in videos (the voltage flips from 0.8 to 1.3-1.5V every second), so sometimes I get "display adapter has stopped working...". The funny thing is that it's stable in games.


I have no idea about your second problem, but about the first: for which power states did you change the VDDC? In the performance tab there are two load states, #6 (the standard load state) and #0 (the turbo state).

If you changed the VDDC for #0 only, you most probably hit the TDP ceiling and the card stays in the #6 state. Can you please post a screenshot of the VBE7 PowerPlay screen?
Quote:


> Originally Posted by *MiladEd*
> 
> Such cute tiny heat sinks!! Where can you get those?!


Bulgaria

They're coated with something, probably insulating paint; I had to polish the contact surface down to bare aluminum. Nothing special, but they do the job nicely.


----------



## PAiNGiveRR

Quote:


> Originally Posted by *ronnin426850*
> 
> I have no idea about your second problem, but about the first - for which power states did you change the VDDC? In the performance tab, there are two load states - #6 standard load state, and #0 turbo state.
> 
> If you changed VDDC for #0 only, you most probably hit the TDP ceiling and the card stays in #6 state. Can you please post a screenshot of the VBE7 power play screen?
> Bulgaria
> 
> They're coated with something, probably insulating paint; I had to polish the contact surface down to bare aluminum. Nothing special, but they do the job nicely.


Here you are.
P.S.: I also tried checking the "manual power limit adjustment", but no difference.


----------



## ronnin426850

Quote:


> Originally Posted by *PAiNGiveRR*
> 
> Here you are.
> P.S.: I also tried checking the "manual power limit adjustment", but no difference.
> 
> 
> Spoiler: Warning: Spoiler!


Ah, ok, so it's a little different on the Sapphire, but I believe we're on the right track.

This is the page we need, 1: Performance:


Here it is on the Sapphire:


The labels are a bit different, the turbo state is #0 for me and #4 for you, but that should not be a problem. You're saying "in afterburner its only 1225mv while gaming and in gpu-z its around 1150mv".
Afterburner has often been wrong about my GPU, so I don't trust it as much as I used to. What GPU-Z is showing sounds about right, given that your #3 state is 1144mV.

Your GPU's TDP is quite low - only 204W. That means that the turbo state will be reached very rarely, if at all. My TDP is 247 stock, and 260 the one I manually set.

This is what I did with the power states:


As you see, my default state is the same as the turbo state, essentially disabling turbo and setting a steady frequency for every load scenario. If I lower the TDP to your levels, 204W, my GPU will lower usage to about 60% to stay within limits.

So if you want to get that 1200mV, you should set it to the #3 state as well. If you see your GPU usage is lower than 99% in Kombustor or other usually 100% load scenarios, like Witcher, you can try to lower the frequency or risk raising the TDP a bit, so the card won't throttle.

Is this your card?


If so, I strongly advise *against* raising the TDP. First off, it is 6-phase, like the Sapphire Dual-X, but the VRM cooling is even worse than my stock was. This is really a sorry excuse for VRM cooling.
Probably that's why your stock TDP is so low.

If the card throttles at 1200mV, get some custom cooling in there, and attempt doing the OC again.

Also, try lower voltages first, it's quite a jump from 1144 to 1200, especially for XFX's VRM.
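As a rough sanity check, the throttling behavior described above can be approximated with a simple linear model: the card caps its usage at roughly (TDP limit) / (unconstrained power draw). This is only a sketch; the 340W "unconstrained draw" below is a hypothetical figure back-solved from the "~60% usage at 204W" observation in the post, not a measurement, and real PowerTune behavior is not perfectly linear.

```python
# Rough linear model of TDP throttling: if the card would exceed the TDP
# limit at 100% usage, it scales usage down to stay inside the limit.
# 340 W is a hypothetical unconstrained draw chosen to match the ~60%
# usage reported at a 204 W limit; treat all numbers as estimates.

def usage_cap(tdp_limit_w: float, unconstrained_draw_w: float) -> float:
    """Fraction of full GPU usage the TDP limit allows (0.0 - 1.0)."""
    return min(1.0, tdp_limit_w / unconstrained_draw_w)

for limit in (204, 247, 260):
    print(f"TDP {limit} W -> usage cap ~{usage_cap(limit, 340) * 100:.0f}%")
```

With those assumptions, the 204W limit caps usage around 60%, while the 247-260W limits sit in the mid-70s, which is why raising the TDP (when the VRM cooling allows it) reduces throttling.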


----------



## PAiNGiveRR

Quote:


> Originally Posted by *ronnin426850*
> 
> Ah, ok, so it's a little bit different on the Sapphire, but I believe we're on the right track.
> 
> This is the page we need, 1: Performance:
> 
> 
> Here it is on the Sapphire:
> 
> 
> The labels are a bit different, the turbo state is #0 for me and #4 for you, but that should not be a problem. You're saying "in afterburner its only 1225mv while gaming and in gpu-z its around 1150mv".
> Since Afterburner was often wrong about my GPU, I don't trust it as I did some years ago. What GPUZ is showing sounds about right, given that your #3 state is 1144mV.
> 
> Your GPU's TDP is quite low - only 204W. That means that the turbo state will be reached very rarely, if at all. My TDP is 247 stock, and 260 the one I manually set.
> 
> This is what I did with the power states:
> 
> 
> As you see, my default state is the same as the turbo state, essentially disabling turbo and setting a steady frequency for every load scenario. If I lower the TDP to your levels, 204W, my GPU will lower usage to about 60% to stay within limits.
> 
> So if you want to get that 1200mV, you should set it to the #3 state as well. If you see your GPU usage is lower than 99% in Kombustor or other usually 100% load scenarios, like Witcher, you can try to lower the frequency or risk raising the TDP a bit, so the card won't throttle.
> 
> Is this your card?
> 
> 
> If so, I strongly advise *against* raising the TDP. First off, it is 6-phase, like the Sapphire Dual-X, but the VRM cooling is even worse than my stock was. This is really a sorry excuse for VRM cooling.
> Probably that's why your stock TDP is so low.
> 
> If the card throttles at 1200mV, get some custom cooling in there, and attempt doing the OC again.
> 
> Also, try lower voltages first, it's quite a jump from 1144 to 1200, especially for XFX's VRM.


Thanks bro, I will try that, but the problem is that I can't find my flash card








And let's assume this will fix the VDDC problem; still, the 12V rail drops as soon as the GPU or CPU is stressed... so can we assume a bad PSU?
P.S.: The GPU is 68°C in most cases and rarely reaches 70°C after a few hours of non-stop gaming, with a max VRM temperature of 87°C. That's good, right?


----------



## Cruentius

Is it possible to CF a 280x with a 650Watt PSU?

Because atm I have a Corsair RM650x, a very good PSU, and found another cheap 280x.

My motherboard is a Gigabyte gaming 7 and CPU is an i5 6600k @4.5.

I think it will be very close.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *Cruentius*
> 
> Is it possible to CF a 280x with a 650Watt PSU?
> 
> Because atm I have a Corsair RM650x, a very good PSU, and found another cheap 280x.
> 
> My motherboard is a Gigabyte gaming 7 and CPU is an i5 6600k @4.5.
> 
> I think it will be very close.


Should be, if you haven't OC'd your CPU. I run 750W, so...


----------



## BulletSponge

Quote:


> Originally Posted by *Cruentius*
> 
> Is it possible to CF a 280x with a 650Watt PSU?
> 
> Because atm I have a Corsair RM650x, a very good PSU, and found another cheap 280x.
> 
> My motherboard is a Gigabyte gaming 7 and CPU is an i5 6600k @4.5.
> 
> I think it will be very close.


I had OC'd crossfire 280X's with a 3570K at 4.2, and running Furmark and IBT together (briefly, just to check max power draw), my Kill-A-Watt showed 665W being drawn at the wall. I'm not sure what that translates into coming from the PSU, but I suspect a quality 650 would be plenty.
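For what it's worth, a wall reading includes the PSU's conversion losses, so the DC load the PSU actually delivers is lower than the Kill-A-Watt number. The ~88% efficiency used below is an assumption (typical for a good unit near full load), not a measured value:

```python
# Convert AC wall draw to the approximate DC load the PSU delivers.
# The efficiency figure is an assumption; check the PSU's 80 Plus
# rating and load curve for a better number.
def dc_load_watts(wall_watts: float, efficiency: float = 0.88) -> float:
    """DC power delivered by the PSU for a given AC wall draw."""
    return wall_watts * efficiency

print(f"{dc_load_watts(665):.0f} W DC from 665 W at the wall")
```

Under that assumption, 665W at the wall works out to roughly 585W of actual DC load, comfortably inside a quality 650W unit's rating for a brief stress test.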


----------



## ronnin426850

Quote:


> Originally Posted by *Cruentius*
> 
> Is it possible to CF a 280x with a 650Watt PSU?
> 
> Because atm I have a Corsair RM650x, a very good PSU, and found another cheap 280x.
> 
> My motherboard is a Gigabyte gaming 7 and CPU is an i5 6600k @4.5.
> 
> I think it will be very close.


That is a 54A single-rail PSU; a single stock 280X requires about 18-20A under full load, so you'll probably be fine, with 14A to spare for the CPU and peripherals.

Also, note that you probably won't be able to push the 280X to 20A in realistic gaming scenarios, so you should be fine with that PSU.

Just don't OC the cards.
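That headroom figure boils down to quick rail arithmetic; the per-card draw is the worst-case estimate from above, not a measurement:

```python
# 12V rail budget for two stock 280X cards on a 54 A single-rail PSU.
# Both figures are the estimates quoted in the post, not measured values.
rail_amps = 54      # RM650x 12V single-rail rating
card_amps = 20      # worst-case 12V draw per stock 280X (upper estimate)

spare = rail_amps - 2 * card_amps   # amps left for CPU, board, fans, drives
print(f"{spare} A to spare on the 12V rail")
```

If either card is overclocked, the per-card draw climbs and that margin shrinks quickly, which is why the advice is to leave the cards at stock.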


----------



## Cruentius

Quote:


> Originally Posted by *ronnin426850*
> 
> That is a 54A single-rail PSU; a single stock 280X requires about 18-20A under full load, so you'll probably be fine, with 14A to spare for the CPU and peripherals.
> 
> Also, note that you probably won't be able to push the 280X to 20A in realistic gaming scenarios, so you should be fine with that PSU.
> 
> Just don't OC the cards.


Alright thank you, GPU isn't overclocked


----------



## MiladEd

Guys, I've got a very strange problem: GTA V sends my GPU into suicide mode.

I played some yesterday and it ran fine, nothing like today. Now it's acting very strangely. My 280X has been overclocked to 1125 MHz for a very long time; temps were always good, and in my most demanding game, Witcher 3, it peaked at about 69°C, barely touching 70°C. Now, GTA V, which used to get my temps up to about 65-66°C, reaches well over 90°C in a short amount of time! I got a screenshot of MSI Afterburner so you can see how quickly it climbs.



As you can see, fans correctly react and ramp up to 100%.

I need to add that it first happened on the 16.4.2 drivers. When this happened in GTA V, I exited and tested Witcher 3; it ran fine, same as before. I launched GTA V again; the temps were fine, but it stuttered and my framerate varied a lot, with drops all the way down to 30 and mostly sitting at 40-50, not the 60 FPS I used to get.

Then, I updated the drivers to 16.5.1, rebooted, tested GTA V again, same temperature fiasco happened, and now I'm writing this. I'll test it again, and see what happens.

Also, the session I played yesterday was on the 16.4.2 drivers as well, and it was fine. Dunno what's going on.


----------



## NeoReaper

Can someone here with an R9 270X with unlocked voltage and Hynix memory PM me? I want to use their BIOS, because all of the BIOS files on the internet (yes, including the databases) are either for Elpida memory (which my card won't boot with at all), or are the very rare Hynix-supported files that won't even let me install a driver.


----------



## ronnin426850

Quote:


> Originally Posted by *NeoReaper*
> 
> Can someone here with a R9 270x with unlocked voltage/Hynix-chipped memory PM me? I want to use their bios because all of the bios files on the internet (Yes, the databases) are either for Elphida-chipped memory (Which my card won't boot at all) and the very rare hynix supported files won't even let me install a driver.


Maybe this will do it for you?

Sapphire_270X_modded.zip 41k .zip file


----------



## Echoa

Quote:


> Originally Posted by *NeoReaper*
> 
> Can someone here with a R9 270x with unlocked voltage/Hynix-chipped memory PM me? I want to use their bios because all of the bios files on the internet (Yes, the databases) are either for Elphida-chipped memory (Which my card won't boot at all) and the very rare hynix supported files won't even let me install a driver.


Make sure the BIOS supports your VRMs; some cards use newer controllers with different signaling. E.g., Gigabyte cards use controllers that ID as IR but are newer, from after the company was bought (I forget who acquired them).


----------



## NeoReaper

Thanks for the BIOS! I will probably give it a try tomorrow, since I don't want to be up late tonight restoring the old BIOS if this one goofs like the other ones did! ^^
If it helps with my VRM information: the only thing I can say without removing the entire cooling shroud is that I can read the VRM temps and the VDDC (which is 1.2V, and which I believe is holding my core clock back, since the power limiter seems to lose all overclocking effectiveness past 1100MHz/1500MHz, even with the registry mod to increase it to 50% in Afterburner).
Also, a bit off topic, could you guys take a peek at another issue related to my card that popped up recently:
http://www.overclock.net/t/1599323/screen-glitching-and-artefacts/0_100
EDIT: That BIOS is for a 2GB model, right?


----------



## ronnin426850

Quote:


> Originally Posted by *NeoReaper*
> 
> Thanks for the bios! I will give it a try probably tomorrow since I don't wanna be up late tonight trying to restore the old bios if this one dun goofs like the other ones did! ^^
> Well if it helps in terms of my VRM information, the only thing I can say without removing the entire cooling shroud from my card is that I can read the VRM temps and VDDC (Which is 1.2v which I believe is holding my core clock from going any further since the power limiter seems to lose all overclocking effectiveness past 1100mhz/1500mhz, even with the registry mod to increase it to 50% in afterburner)
> Also, a bit off topic, could you guys take a peek here with another issue related to my card that popped up recently:
> http://www.overclock.net/t/1599323/screen-glitching-and-artefacts/0_100
> EDIT: That bios is for a 2 gig model right?


Yes.


----------



## GsNp

Has anyone experienced an issue with an Asus R9 270X not displaying through the HDMI port? I got this PC yesterday from my gf's brother.
The build is an i7-5820K, 1kW PSU, Asus R9 270X, MSI X99 SLI Plus, and Corsair 4x4GB 2400 RAM. It booted up fine after clearing the CMOS, and I updated drivers and everything, but I have been unable to get my TV, connected through the HDMI port, to display. I spent a good few hours searching forum boards trying to find a solution, but have been unlucky so far. All drivers are up to date. I uninstalled my Nvidia drivers thinking maybe that was causing a problem, and tried booting into safe mode with only the TV as the main display, multiple times with and without safe mode, still nothing. I tried connecting my BenQ XL2411Z monitor to the card through the HDMI port, with multiple HDMI cords, still nothing. I'm chalking this up to a bad HDMI port on the card, but wondering if there's possibly something I'm overlooking or unaware of, since this is my first AMD card.

Thanks!


----------



## Boinz

Quote:


> Originally Posted by *GsNp*
> 
> Anyone experience an issue with an Asus R9 270x, not displaying through the HDMI port? I got this PC yesterday from my gfs' brother,
> Build is an I7-5820K, 1k PSU, Asus R9 270x, MSI X99 SLI plus, Corsair 4x4gb 2400 Ram. Booted it up fine after clearing the CMOS, updated drivers and everything, but have been unable to get my TV connected through the HDMI port to display, I spent a good few hours, searching forum boards trying to find a solution but have been unlucky thus far, All drivers are up to date, I uninstalled my Nvidia Drivers thinking maybe that was causing a problem, tried booting into safe mode with only the TV as main display multiple instances and times without safemode, still nothing. Tried connecting my Benq XL2411z monitor to the card through the HDMI port, with multiple HDMI cords, still nothing. I'm chocking this up to a bad HDMI port on the card, but wondering if there's possibly something I'm overlooking or unaware of being this is my first AMD card.
> 
> Thanks!


Did some googling; apparently one guy found that the HDMI signal contacts sit deeper in the port than the detection contact. Try pushing the cord a bit further into both the port on the TV and the one on the card (no excessive force, don't damage it).

http://www.tomshardware.com/forum/id-2114530/xfx-270x-hdmi-video.html
Quote:


> Guess what guys? Apparently, detection of external monitor through HDMI happens at tip of port. The actual signal is deeper in the port itself. If you only plug in halfway, you will get recognition but no signal to and from. My card is a bit recessed in my aging case so the HDMI cable (which is probably bulky) was plugged in just a hair short. I fixed this problem by just forcing it in further. Thanks for your help guys. Really appreciate the responses.


worth a try.


----------



## GsNp

Quote:


> Originally Posted by *Boinz*
> 
> Did some googling, apparently one guy found out the hdmi display sensor is deeper into the cord, try pushing the cord a bit deeper if you can into the both the ports on the tv and the card (not excessive force, don't damage it)
> 
> http://www.tomshardware.com/forum/id-2114530/xfx-270x-hdmi-video.html
> worth a try.


You sir are awesome. I did notice the case he used seems a bit rigid and the card doesn't fully slot into the opening; not wanting to damage it, I plugged it in as I normally would with any other port. The TV display was recognized in Device Manager and the desktop display settings, so there's that. I'm presently at work, so I'll try this as soon as I get home.


----------



## NeoReaper

Quote:


> Originally Posted by *ronnin426850*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> Thanks for the bios! I will give it a try probably tomorrow since I don't wanna be up late tonight trying to restore the old bios if this one dun goofs like the other ones did! ^^
> Well if it helps in terms of my VRM information, the only thing I can say without removing the entire cooling shroud from my card is that I can read the VRM temps and VDDC (Which is 1.2v which I believe is holding my core clock from going any further since the power limiter seems to lose all overclocking effectiveness past 1100mhz/1500mhz, even with the registry mod to increase it to 50% in afterburner)
> Also, a bit off topic, could you guys take a peek here with another issue related to my card that popped up recently:
> http://www.overclock.net/t/1599323/screen-glitching-and-artefacts/0_100
> EDIT: That bios is for a 2 gig model right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes.
Click to expand...

Flashed it, used DDU to remove all drivers in safe mode (it booted into safe mode fine), installed the new drivers, and went to do anything on the desktop after booting with them; the card crashed and black-screened (there were also artefacts on the login screen after the driver install), so the Sapphire BIOSes are out of the question. Thanks for the try though ^^
(Re-flashed the original BIOS after)


----------



## ronnin426850

Quote:


> Originally Posted by *NeoReaper*
> 
> Flashed it, used DDU to remove all drivers in safe mode (It booted into safe mode fine), installed new drivers, went to do anything on the desktop after booting with the new drivers, card crashed and black-screened (There was also artefacts on the login screen after driver install) so the Sapphire bios' are out of question. Thanks for the try though ^^
> (Re-flashed original bios after)


Sorry it didn't work for you. Maybe it's due to VRM incompatibility, because it definitely runs on Hynix RAM. Also, did you check the voltages? The BIOS may actually work and just require some editing in VBE7. I know different editions of the 280X have *vastly* different voltages, performance states and PowerPlay settings, so maybe the same goes for the 270X.

I can't honestly recommend that though, if it crashed so bad on the first try, it's a risk to the card.

Good luck finding the one!


----------



## NeoReaper

My card's lowest 2D state: 300/150 @ 0.875V
Second state: 450/1400 @ 0.949V
Highest state (overclocked with the 20% power limiter only, since voltage is locked): 1115/1500 @ 1.199V
These are the only states my card goes into as far as I know, if this helps at all. And as said before, one VRM temp is readable; I know some 270Xs don't support that feature.


----------



## Echoa

Quote:


> Originally Posted by *NeoReaper*
> 
> My cards lowest 2D clock: 300/150 @ 0.875v
> Second state: 450/1400 @ 0.949v
> Highest state (Overclocked with 20% power limiter only since voltage is locked): 1115/1500 @ 1.199v
> These are the only states my card go into as far as I know if this helps at all and as said before, 1 VRM temp is readable as I know some 270x's don't support that feature.


Google your VRMs. I mixed it up a little before: the VRMs will show as something like CHL822x when they're actually the newer IR3567A or some variant (my bad xP). You may have to manually adjust your voltage in the BIOS using this method:

https://archive.litecointalk.org/index.php?topic=16907.0

On my Gigabyte 270x you have to do it this way; it's kind of a pain, but it works, and if you take your time you'll be fine. My 270x, even with +50mV, barely clocked 1200MHz, so I decided it wasn't worth the extra heat and clocked it back to 1175MHz at the stock voltage/BIOS. You might maybe gain 1-2 FPS, as the 270/270x really just doesn't get much from overclocking. My opinion: if it's voltage locked, get the highest core/memory you can at stock voltage and leave it; you aren't missing out on much anyway.


----------



## GsNp

Quote:


> Originally Posted by *Boinz*
> 
> Did some googling, apparently one guy found out the hdmi display sensor is deeper into the cord, try pushing the cord a bit deeper if you can into the both the ports on the tv and the card (not excessive force, don't damage it)
> 
> http://www.tomshardware.com/forum/id-2114530/xfx-270x-hdmi-video.html
> worth a try.


Yea this wasn't the case, I'm gonna head to microcenter and pick up an HDMI to DVI converter or cable.


----------



## NeoReaper

Quote:


> Originally Posted by *Echoa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> My cards lowest 2D clock: 300/150 @ 0.875v
> Second state: 450/1400 @ 0.949v
> Highest state (Overclocked with 20% power limiter only since voltage is locked): 1115/1500 @ 1.199v
> These are the only states my card go into as far as I know if this helps at all and as said before, 1 VRM temp is readable as I know some 270x's don't support that feature.
> 
> 
> 
> google your VRMs, i mixed it up a little before its that the VRMS will show as something like CHL822x when theyre actually the newer IR3567A or some variant (my bad xP). You may have to manually adjust your voltage in bios using this method
> 
> https://archive.litecointalk.org/index.php?topic=16907.0
> 
> my Gigabyte 270x you have to do it this way, its kinda a pain but it works and if you take your time youll be fine. My 270x even with +50mv barely clocked 1200mhz so I decided it wasnt worth the extra heat and clocked it back to 1175mhz and the stock voltage/bios. You might maybe get 1-2fps as the 270/270x really just doesnt get much from overclocking. My opinion is if its voltage locked get the highest core/memory you can at stock voltage and leave it, you arent missing out on much anyway.
Click to expand...

I tried finding a PCB picture of my card, but could not find one with high enough resolution to read the VRM's ID. I would just check mine, but I am out of thermal paste, so I can't remove the cooler atm.
One of the reasons I wanted to unlock the voltage is that my card doesn't even pass 68°C after hours of BF4, yet it is sometimes randomly unstable at the current clock. I used to be able to hit that golden 1200/1500 mark, but my core clock is slowly dropping back, and the power limiter (even with the registry edit to extend it) doesn't make any difference anymore.


----------



## Echoa

Quote:


> Originally Posted by *NeoReaper*
> 
> I tried finding a PCB picture of my card, could not find one with high enough res to see the VRM's id. I would just check mine but I am out of thermal paste so I can't remove the cooler atm.
> One of the reasons why I wanted to see if I could unlock the voltage is that my card does not even pass 68*C after hours on bf4 and it sometimes is even randomly unstable on the current clock. I used to be able to hit that golden 1200/1500 mark but my core is slowly dropping back the the power limiter (Even with the registry edit to extend it) does not make any difference anymore.


Just open your BIOS in VBE7; if it says CHL822x, it's probably the newer IR VRMs. Someone has most likely posted which ones yours has; just google it like I did ("gigabyte 270x vrms", but with your model). Most of the voltage-locked 270Xs use IR VRMs from what I've seen, so that's probably your issue.


----------



## mori170

Quote:


> Originally Posted by *ronnin426850*
> 
> Hello, guys! Can I join?
> 
> Sapphire 280X DualX OC - 1200Mhz core, 1500 Mhz vRAM.
> 
> Can you people around 1200Mhz tell me how many volts do you pump through that core, do you see freq or usage throttle in Kombustor, and what FireStrike scores you get?
> 
> Here is my score and specs:
> http://www.3dmark.com/fs/8163876
> 
> It gets around 65C and evens out after about half an hour of Kombustor, with ~25C ambient.
> 
> I've bumped the voltage to the absolute maximum allowed in VBIOS Editor. I've also done a lot of custom ghetto work on the card, are you interested in photos?
> 
> Thanks!


I have a Sapphire R9 280X Tri-X, clocked at 1150MHz core with 1.2V and 1600MHz memory at stock memory voltage (1.5V) for 24/7 use. It does 1200MHz core at 1.25V and 1900MHz memory at 1.7V if I keep the temps under 65°C for benchmarks, but it's way too noisy for everyday use, so I just stick with 1150MHz







I run a custom BIOS with voltage up to 1.4V, but at clocks over 1200MHz I can't keep the card under 60°C (past 60°C it starts to artifact), so the max I ran was 1.3V.


----------



## NeoReaper

Quote:


> Originally Posted by *Echoa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> I tried finding a PCB picture of my card, could not find one with high enough res to see the VRM's id. I would just check mine but I am out of thermal paste so I can't remove the cooler atm.
> One of the reasons why I wanted to see if I could unlock the voltage is that my card does not even pass 68*C after hours on bf4 and it sometimes is even randomly unstable on the current clock. I used to be able to hit that golden 1200/1500 mark but my core is slowly dropping back the the power limiter (Even with the registry edit to extend it) does not make any difference anymore.
> 
> 
> 
> Just open your bios in VBE7, if it says chl822x it's probably the newer IR vrms, someone posted which ones yours have most likely just google like I did "gigabyte 270x vrms" except with your model. Most of the locked voltage 270x use IR vrms from what I've seen so that's probably your issue
Click to expand...

Quote:


> XFX uses a CHiL CHL8225G, which is a very common voltage controller that offers software voltage control and monitoring.


And I already know of VBE, but it's not applicable here since I have fast boot, UEFI and secure boot enabled, and VBE removes the UEFI section from the vBIOS.
But if this quote is true, then it's literally only the BIOS stopping me from gaining control.


----------



## Echoa

Quote:


> Originally Posted by *NeoReaper*
> 
> And I already know of VBE, but its not applicable here since I have fast boot, uefi as well as secure boot enabled. VBE removes the UEFI section from the vBios.
> But if this quote is true, then its literally the bios only stopping me from gaining control.


No man, just open the BIOS you saved from your card in VBE, don't edit it lol

It'll tell you which VRMs you have









I can't remember, but if you mod it using the link I gave before, you might be OK to continue using UEFI and secure boot; you'll have to double check. Most likely it is the BIOS holding you back, yes; if no other BIOSes work, your only option is to edit one yourself in a hex editor.


----------



## TheLazyCow

Hi, I built a new system about 3 weeks ago. I was waiting for the next-gen cards to come out, but it seems like it's going to take at least a month and a half before I could get one, so I pounced on a cheap 2nd-hand R9 280X, and I've got to say that I really like it!

My model is the ASUS DirectCU II TOP. I was pretty surprised my card was able to hit 1200MHz on the core clock on air without changing the voltage (stock, around 1.15V), and I bumped the memory to 1630MHz (I'm using a 1080p screen, so I'd rather OC the core clock more).

Checked the ASIC quality and it was 58.8%, so maybe I was lucky. Ran 2 full rounds of Unigine Heaven just as a short test.

Will go back home later to try and push it to 1250 while keeping it below 1.25V and 75°C!

At what point do improvements to the core clock diminish and the memory clock give more performance? Thanks!

Will post pics of the card when I get back!


----------



## ronnin426850

Quote:


> Originally Posted by *mori170*
> 
> I have a Sapphire R9 280x Tri-x, clocked at 1150mhz core with 1.2v and 1600mhz memory with stock memory voltage (1.5v) for 24/7 use. It does 1200mhz core with 1.25v and 1900mhz memory at 1.7v if I keep the temps under 65°C for benchmarks, but it's way too noisy for everyday use so I just stick with 1150mhz
> 
> 
> 
> 
> 
> 
> 
> . I run a custom bios with voltage up to 1.4v, but I can't keep the card under 60°C (going over 60°C it starts to artifact at clocks over 1200) at 1.4v with clocks over 1200mhz, so the max i ran was 1.3v.


Good clocks there







The Tri-X has a much better VRM than the Dual-X, and that gives you the base for stable clocks. The Arctic Accelero is barely audible at 100% fans, by the way.
Also, I sold my card and no longer have it; I'm running a dusty old 8800 GTX now until the new AMD line shows up in stores here


----------



## NeoReaper

Quote:


> Originally Posted by *Echoa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> And I already know of VBE, but its not applicable here since I have fast boot, uefi as well as secure boot enabled. VBE removes the UEFI section from the vBios.
> But if this quote is true, then its literally the bios only stopping me from gaining control.
> 
> 
> 
> no man just open your saved bios from your card in VBE, not edit it lol
> 
> itll tell you which VRMs you have
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I cant remember but if you mod it using the link i gave before you might be ok to continue using uefi and secure boot but youll have to double check. Most likely it is the bios holding you back yes, if no other bios work your only option is to edit them yourself in a hex editor
Click to expand...

Is VBE telling me I have 2 different Vram chips?


----------



## Echoa

Quote:


> Originally Posted by *NeoReaper*
> 
> Is VBE telling me I have 2 different Vram chips?


no just that your bios supports both hynix and elpida as your model could possibly have either one.


----------



## NeoReaper

Well I do know that the Vram identifier program I used a while back identified my card as using Hynix as the memory.


----------



## GoLDii3

Welp. I sold my R9 280X. Time to upgrade to Polaris/Pascal.

Great card,has served me greatly for 3 years.


----------



## TheLazyCow

Super dirty, but I'll send it for RMA come November when the warranty ends, and then I'll give it to my friend.



Didnt have time to play with the voltage yesterday as I went out the whole night, will see if I have time later tonight!


----------



## mori170

Why RMA?


----------



## TheLazyCow

Don't people do that? When the card is going EOL soon, get it as close to brand new as possible. I see a lot of people selling older cards with expired/expiring warranties saying it's just been RMA'd.


----------



## F3ERS 2 ASH3S

Quote:


> Originally Posted by *TheLazyCow*
> 
> Don't people do that? When the card is going EOL soon, get it as close to brand new as possible. See a lot of people selling older cards with expired / expiring warranty saying its just been RMAed


That's a bit on the fraudulent side, honestly.... let me go wreck my car real quick so that I can get half of it replaced... 'cause the engine is BRAND NEW... etc etc.


----------



## Echoa

Quote:


> Originally Posted by *TheLazyCow*
> 
> Don't people do that? When the card is going EOL soon, get it as close to brand new as possible. See a lot of people selling older cards with expired / expiring warranty saying its just been RMAed


Yeah, I wasn't aware of it being a thing; might be just you, man. It's kinda fraudulent, like was said, and unless it has some issue I'd just clean it, give it to your friend, and save the time. Assuming you haven't put it through hell, there's not much point.


----------



## TheLazyCow

Quote:


> Originally Posted by *F3ERS 2 ASH3S*
> 
> Thats a bit on the fraudulent side honestly.... let me go wreck my car real quick so that I can get half of it replaced.. cause engine is BRAND NEW... etc etc..


Quote:


> Originally Posted by *Echoa*
> 
> yea i wasnt aware of it being a thing, mght be just you man. Its kinda fraudulent like was said and unless it has some issue id just clean it and give it away to your friend and save the time. Assuming you havent put it through hell not much point.


I just got it 2nd hand from a guy who'd been using it for about 2½ years, so I'm not sure if he was aggressively overclocking it. I just have it sitting at 1170 core and 1670 memory, 100MHz above stock without any additional voltage, so it's fine. But is it really that bad to think that way? I see warranty and RMA as part of the price package, so having one last service stop seems reasonable... hmm, maybe I'm wrong there. Anyway, I'll clean it up and apply some MX-4 before giving it to my friend. The warranty runs to EOY, so I'll most likely hand it over before then; he can RMA it if he so chooses.


----------



## JackCY

RMA is for defects, so if there is none, you are just wasting money on shipping and you get back the same card; the seller will not like you, and will remember you as a douche if you do this all the time. Sometimes the people checking returned products are benevolent and trust the description of the issue if it's not easily replicable, but if it is testable and no such issue is found, they will toss it back to you and you pay the shipping. Maybe in the US they don't care, but that's how it goes in the EU.


----------



## Bartouille




----------



## alexze89

Anyone here know if I can crossfire an R9 280 with an R9 380X ???
I have an old R9 280 and want better GPU performance while staying close to $200. I could switch to a GTX 970, but I'm sure I could get more performance out of a crossfire setup, IF I can crossfire the R9 280 with an R9 380X.

What is the consensus?

EDIT:

Never mind, I just realized my mobo doesn't support CrossFire, only SLI T_T
Used PCs FTW, right? Now I'm stuck with this 1500W modular PSU and no reasonable need for that much power.


----------



## Echoa

Quote:


> Originally Posted by *alexze89*
> 
> Anyone here know if I can crossfire an R9 280 with an R9 380X ???
> I have an old R9 280 and want better GPU performance while staying close to $200, I could switch to a GTX 970, but I'm sure I could get more performance out of a crossfire setup, IF I can crossfire the R9 280 with an R9 380X
> 
> What is the consensus?
> 
> EDIT:
> 
> nevermind, I just realized my MoBo doesnt support X-fire, only SLI T_T
> used PCs FTW right? now i'm stuck with this 1500W modular PSU and no reasonable need for that much power.


Don't believe so, no: different GCN cores (GCN 1.0 vs 1.2).

Well, look at it this way: you'll have longer before it degrades below your needed output, lol. PSUs lose maximum output over time, so it's not so bad.


----------



## alexze89

Quote:


> Originally Posted by *Echoa*
> 
> Don't believe so no, different gcn cores (gcn1 vs 1.2)
> 
> Well look at it as you'll have longer before it degrades below your needed output lol psu lose maximum output over time so it's not so bad


haha, at least there's that. I may end up swapping this for a more modest 80+ Platinum rated 750 W unit. This one started making funny buzzing noises last night, so I suspect it's nearing EOL. Looks like the GPU upgrade will need to be put off again anyhow.


----------



## JackCY

RMA, here we go: finally "pulling the plug" and hopefully putting a stop to the artifacts. Not sure they'll be able to replicate it, but I've got plenty of videos and screenshots to prove the card is bad when 2.5-3GB of VRAM is used and no FPS limit is turned on in games. It's the typical ASUS 280X DC2T issue: flickering artifacts or outright black noise spots on the final 2D output. Man, I've seen it all by now in various games, but I didn't play enough to force me to RMA it earlier. Now there should be fewer replacement cards in stock, so either they upgrade me or I get my money back. We'll see.


----------



## aaronsta1

So a couple of years ago I got an MSI R9 270X Gaming 2G.

It says on the box it should run at 1120/1500, and I've already had it replaced twice: the first time the fan locked up and I got a white-box refurbished card back, and when that stopped working after a few days they sent me a retail replacement.

Since then the card has NOT reliably run at 1120; the default clock is 1080. It might hit 1100 in some games, but most of the time I get screen corruption and the display driver stops responding after a few minutes.

I've heard a few times that they don't fix 270Xs anymore and send out 280Xs instead.

Has anyone gotten a replacement card better than the card they sent in?

This would be the 3rd time I send it in.


----------



## radier

Send it again and find out.

Taptaptap Mlais M52


----------



## JackCY

Man, I would be happy if it were my 3rd time; it would mean money back.
But I'm sending my 280X in for the first time because I want to sell it, so first I want to see what they give me under warranty as a replacement (money back would be awesome too). If nothing good, then at least I get a 280X that doesn't artifact randomly once a week, and I can sell that if that's what I decide to do. I got it 2nd hand, and the seller either didn't know or didn't bother to RMA it when his CF setup was having issues. No wonder, when the card is not 100% stable but only 99%+: load 2.5GB of VRAM or more with the core at full blast and it is bound to artifact.

I have no idea what they give back under warranty at this time for the 270 or 280 series; it really depends on which service center it ends up at, I think.


----------



## DuranXL

Hi all,

Got a quick question.

I'm messing around with voltage control in afterburner on my R9 280X.
Basically, it only works when I tick "force constant voltage"...

Never mind, I'm an idiot. Voltage control works fine without constant voltage; you just have to put load on the GPU first.

Anyway, might as well add some info...

Got a Club3D R9 280X RoyalQueen. Its stock clocks are a static 1000/1500 MHz, unlike pretty much every other R9 280X, which boosts to 1 GHz.
I thought it was going to be a bad OC'er because of this, but I got it Fire Strike stable at 1170/1700 MHz @ 1.275 V. Not bad, I'd say.
It doesn't really get above 65°C at 100% fan, and the VRMs don't go above 80°C.


----------



## PowerSlide

Well, my Sapphire R9 280X Vapor-X Tri-X decided to show some artifacts while I was browsing this forum.

Luckily there's still a month to go before the warranty expires; time to send it in and wait ages for it to return.

Living in another part of the world, RMA takes very, very long.


----------



## Catscratch

Quote:


> Originally Posted by *JackCY*
> 
> RMA here we go, finally "pulling the plug" and putting a stop to the artifacts hopefully. Not sure they will be able to replicate it but I've got plenty of videos and screenshots to prove the card is bad when 2.5-3GB VRAM is used and there is no FPS limit turned on in games. It's the typical ASUS 280x DC2T issue, flickering artifacts or straight out black noise spots on the final 2D output. Man I've seen it all by now in various games but didn't play as much to force me to RMA it, now there should be less replacements so either they upgrade me or money back. Will see.


Quote:


> Originally Posted by *JackCY*
> 
> Man I would be happy if it was 3rd time it would mean money back.
> But I'm sending my 280x first time because I want to sell it so first I want to see what they give me under warranty as replacement or money back would be awesome too, if nothing good then at least a 280x that doesn't artifact once a week randomly and I can sell it if that's what I decide to do. I got it 2nd hand and the seller didn't know or didn't bother to RMA it when his CF was having issues, well no wonder when the card is not 100% stable but only 99%+, load 2.5GB VRAM or more with full blast on the core and it is bound to artifact.
> 
> I have no idea what they give back under warranty at this time for 270 nor 280 series, really depends on the service center it gets to I think.


Newegg's rating for that card is really low. Part of the reason why I went with Sapphire.
Quote:


> Originally Posted by *PowerSlide*
> 
> well my sapphire r9 280x vapor-x tri-x decide to show some artifacts while i was browsing this forum
> 
> luckily still a month to go before warranty expires, time to send in and wait for ages for it to return
> 
> living other part of the world RMA is very very long


Wow, that card is supposed to be the best 280X out there: Tri-X with Vapor-X. I wanted that card so badly, but by the time my 6850 gave out, it wasn't available here.








I had 2 choices: the regular 280X Tri-X with boost, and the non-regular 280X Tri-X with no boost. I never liked boost or changing states on GPUs; now they have more than 2 states: idle, playing video, gaming, etc. I wonder if VRMs wear out quicker when they change voltage all the time than with just 1 or 2 states.

Best of luck to both of you on RMA.


----------



## JackCY

VRMs don't care, as long as you don't overheat them above 90-100°C.
The ASUS 280X I sent back because of the artifacts is not a bad card: it was a price/performance killer and still is, until cards like the 970 drop in 2nd-hand price. It never throttles; the only way to make it lower clocks via the power limit was to lower the limit to 85-90%, and at the default 100% limit it always ran at 100% clock speed. Unfortunately, ASUS or AMD botched something. I think it was ASUS: the VRAM is factory-OC'd to 6400 MHz, which by itself is fine (I could push it to 7200 MHz), but they also overvolted it to around 1.6 V, and that caused the artifacts, possibly from too much heat on the VRAM. Lowering the VRAM clock helped very little, and it's impossible to lower the VRAM voltage except on the Matrix edition, I believe. The cooler is fine for the chip but has no cooling for the overvolted, OC'd VRAM. They later made a Version 2 with non-OC'd, non-overvolted VRAM.

All GPUs have multiple states that they switch between based on the performance needed; you can play video at minimum clock or maximum clock depending on how much GPU power you need. It's not fixed to a specific application; it adjusts based on demand. I find Nvidia more ridiculous here: their GPUs bounce the clock around like crazy and never stay at the clock they start at. The 280X runs max clock all the time, no bouncing, no being limited by power demand or temperature.

280X, 1080p, high settings: plays fine at 60-120 fps depending on the game, but it's a power hog.


----------



## Echoa

After spending a couple of months with this 4GB 270X, I gotta say it's the best $115 I've ever spent. It's absolutely great at 1080p, and for $115 I don't think I could beat it (except maybe a 280 if it were priced the same, but then I'd lose VRAM). I'm keeping this card as a backup in case Polaris ends up being amazing.


----------



## NeoReaper

Quote:


> Originally Posted by *Echoa*
> 
> After spending a couple months with this 4gb 270x I gotta say it's the best 115$ I've ever spent. It's absolutely great at 1080p and for 115$ I don't think I could beat it (except maybe a 280 if it was priced the same but I lose vram) I'm saving this card as a backup if Polaris ends up being amazing


I would love to say the same, but since XFX seems to have thrown some random buggy BIOS onto my card, I ain't getting anything near that feel from it. I did run that GPU-Z ASIC thing, and here are the results for my card, if it means anything:


----------



## JackCY

ASIC irrelevant.


----------



## Echoa

Quote:


> Originally Posted by *NeoReaper*
> 
> I would love to say the same but since XFX seems to have thrown some random buggy bios on to my card, I ain't getting anything near that feel from this card. I did run that GPU-Z ASIC thing and here is the results for my card if it means anything:


Mostly it doesn't matter; if you're trying to hit certain clocks under certain conditions it might be a hint, but it's still a lottery.


----------



## NeoReaper

Quote:


> Originally Posted by *JackCY*
> 
> ASIC irrelevant.


ASIC might be irrelevant, but XFX's excuses for why they won't fix my card's buggy BIOS are irrelevant too.


----------



## Catscratch

It seems I have a weird card, according to the TechPowerUp database. My card is, in appearance, a Sapphire 280X Tri-X OC non-boost version: 1020 MHz / 6000 MHz. The SKU on the back, 11221-22, is correct according to sapphiretech.com.

Sapphiretech used to have the exact page for this card; now the specs listed for that SKU seem different. They did quite the shuffling around.

The BIOS version I have, 015.046.000.016.000000, is listed for the Sapphire Dual-X 280X on TechPowerUp.

I guess this is a low-bin chip that could only sustain these specs, so they put a static-speed BIOS on it.

I could do with 6400 memory though :/


----------



## JackCY

The TPU VGA BIOS database is complete chaos. I wouldn't download and use one from there; it's not a trusted, accurate source for VGA BIOSes, IMHO.
I had the same or a similar code on my ASUS 280X DC2T BIOS, definitely the 15.46 kind. I don't think those numbers really represent a unique VGA BIOS version; more like "oh hey, it's a Tahiti BIOS" XD


----------



## MasterBillyQuizBoy

I was wondering what some of you folks with the 270X are getting in the demo version of 3DMark Fire Strike.

I get around 5500 with a single card and 8300 with Crossfire. This is with an FX-6300 clocked to 4.4 GHz.


----------



## bichael

Hey. I used to get up to around 5600 in Firestrike (graphics a fraction under 7000) with a 270x and G3258 at around 4.7GHz, so pretty similar. Now using a 390 which gets around 8700 (graphics close to 13000).


----------



## MasterBillyQuizBoy

Quote:


> Originally Posted by *NeoReaper*
> 
> ASIC might be irrelevant but XFX's excuses for why they won't fix my cards buggy bios is irrelevant.


Strange. I bought a used card off eBay and sent it in for RMA; they couldn't find anything wrong with it but replaced it anyway.

What was their explanation? I'm thinking of sticking with them, since I was pretty impressed with their customer service.


----------



## MasterBillyQuizBoy

Quote:


> Originally Posted by *bichael*
> 
> Hey. I used to get up to around 5600 in Firestrike (graphics a fraction under 7000) with a 270x and G3258 at around 4.7GHz, so pretty similar. Now using a 390 which gets around 8700 (graphics close to 13000).


Cool, seems to be in line. Waiting out the next few months: X-Plane eats VRAM for breakfast, and I may pick up an 8GB 390X once I see the prices move around.


----------



## NeoReaper

Quote:


> Originally Posted by *MasterBillyQuizBoy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeoReaper*
> 
> ASIC might be irrelevant but XFX's excuses for why they won't fix my cards buggy bios is irrelevant.
> 
> 
> 
> Strange. I bought a used card of ebay, sent it in for RMA, they couldn't find anything wrong with it but replaced it anyway.
> 
> What was their explanation? I'm thinking of sticking with them since I was pretty impressed with their customer service.

Well, firstly, the card is about to hit 2 years old since I got it, and secondly, my card was shipped to me without its warranty sticker attached to the PCB and I never noticed.


----------



## Retrorockit

I have some specific needs in a video card for an older computer. I purchased a couple of cards that were too new and didn't support the BIOS in my older computers. I'm building another old computer that has Crossfire support; for now I'll be using a pair of HD 6970s I already have. I'm looking for the most powerful video cards that are compatible with my old architecture. The HD 7970 and the related R9 280X look like they might be what I need.
I've made a hobby of overclocking old Dell BTX computers, specifically LGA775 stuff.
That means a locked BIOS, no aftermarket cooling parts available, usually a legacy BIOS, PCIe 16x 2.x, and SATA 3Gb speed.

The problem I've had in the past is with R9 285 ITX cards.
The first problem is they don't support VESA video mode 103, so I can't open my BIOS setup screen.
The second problem is I can't Crossfire them: the ITX cards are a little thicker than normal 2-slot cards, and there's no room for airflow into the second card.
The Sapphire cards I used have a dual-BIOS switch, so they're not only compatible but future-proof too (except for the VESA 103 issue).

The VESA mode 103 issue seems to have started with the R-series cards. I know of one person who Crossfired a pair of HD 7850s in the same computer I'm working on, so I think the HD 7970 should work. I've seen reports that the HD 7970 can Crossfire with the R9 280X too.

Here's what I think I need:
1- Legacy BIOS support, preferably dual BIOS.
2- VESA mode 103 to display the Dell BIOS screen. (AMD says GCN cards don't support this.) HD 7xxx seems good for this.
3- A Crossfire bridge connection, due to the PCIe 2.0 speed.
4- More than 2 GB of memory would be nice.

I've tried AMD, Sapphire, and the usual BIOS-mod suspects (MB and GPU) and haven't found a solution.

My last project was a Dell Dimension E520. This Pentium 4-era beast ran a QX6800 at speeds up to 4 GHz (validation only), and at 3.72 GHz with an R9 285 it scored 7000 in Fire Strike (58% rating). See the link in my sig.

My current project is a one-year-newer Dell Precision T3400 workstation. It supports the QX9650 and has two full-speed PCIe 16x slots. VESA mode 103 still applies. I will probably start out with a QX6800 and the HD 6970s; that will tell me a lot about the MB VRM and cooling system mods.

Any advice on which video cards would be suitable for this would be appreciated, especially which brands don't support my requirements.


----------



## Catscratch

Quote:


> Originally Posted by *Retrorockit*
> 
> I have some specific needs in a video card for an older computer. I purchased a couple of cards that were too new and didn't support support the BIOS in my older computers. I'm building another old computer that has Crossfire support. For now Ill be using a pair of HD6970s I already have. I'm looking for the most powerful video cards that are compatible with my old architecture. The HD7970, and related R9-280X look like they might be what I need.
> I've made a hobby of overclocking old Dell BTX computers. Specifically LGA775 stuff.
> This means locked BIOS, no aftermarket cooling parts available, Legacy BIOS usually. PCIE 16x 2.x ,and SATA 3GB speed.
> 
> The problem I've had in the past is with R9-285 ITX cards.
> First problem is they don't support VESA video mode 103. So can't open my BIOS Setup screen.
> The second problem is I can't Crossfire them because the ITX cards are a little thicker than normal 2 slot cards and there's no room for airflow into the second card.
> The Sapphire cards I used have a dual BIOS switch, so they're not only compatible but future proof also. (Except for VESA 103 issue.)
> 
> The VESA mode 103 issue seems to have started with the R series cards. I know of 1 person who Crossfired a pair of HD7850s in the same computer I'm working on. So I think HD7970 should work. I've seem reports that HD7970 can Crossfire with R9-280X also.
> 
> Here's what I think I need.
> 1- Legacy BIOS support, preferably Dual BIOS.
> 2- Vesa mode 103 to display Dell BIOS screen. (AMD says GCN cards don't support this.) HD7xxx seems good for this.
> 3- Crossfire bridge connection due to PCIe v.2.0 speed.
> 4- More than 2 GB memory would be nice
> 
> I've tried AMD, Sapphire, and the usual BIOS mod suspects ( MB, and GPU ) and haven't found a solution.
> 
> My last project was a Dell Dimension E520. This Pentium 4 era beast ran a QX6800 at speeds up to 4 GHZ (validation only) and at 3.72GHz, and with an R9-285 scored 7000 in Firestrike (58% rating). See link in my sig.
> 
> My current project is a 1 year newer Dell Precision T3400 workstation. It supports QX9650, and 2- PCIe 16x full speed slots. VESA mode 103 still applies. I will probably start out with a QX6800, and HD6970s. this will tell me a lot about the MB VRM and cooling system mods.
> 
> Any advice on which video cards would be suitable for this would be appreciated. Especially any Brands that don't support my requirements.


Cool hobby.
Quote:


> I use a specific GTX 960 and it works just fine in my Precision T3400. I do have problems with ATI R Series like the R7 260X where they wont go into cmos setup complaining that they cannot do DOS VESA Mode 103.
> 
> EVGA 02G-P4-2966-KR GTX 960 2GB 128-Bit GDDR5 SuperSC ACX 2.0+ Video Card


from http://en.community.dell.com/support-forums/desktop/f/3514/t/19642933

Everything suggests R7 and R9 won't work for VESA 103. I wasn't aware of that either.


----------



## JackCY

Time to upgrade, maybe? Do similar Nvidia cards support your needed VESA 103 prehistoric mode? 800x600 at 256 colors, for real?
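For reference, mode 103h is one of the standard VESA BIOS Extension (VBE) mode numbers. This small table (a partial list of the standard VBE mode assignments, hand-copied here for illustration) shows why the mode reads as prehistoric:

```python
# Partial table of standard VBE mode numbers: (width, height, bits per pixel).
# 8 bpp = 256-color palette modes, which is what mode 103h provides.

VBE_MODES = {
    0x100: (640, 400, 8),
    0x101: (640, 480, 8),
    0x103: (800, 600, 8),
    0x105: (1024, 768, 8),
    0x107: (1280, 1024, 8),
}

w, h, bpp = VBE_MODES[0x103]
print(f"mode 103h: {w}x{h}, {2 ** bpp} colors")  # 800x600, 256 colors
```

So a card that drops mode 103h can still render a modern desktop fine; it just can't draw a firmware setup screen that asks the video BIOS for that specific legacy mode, which is exactly the Dell symptom described above.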


----------



## Retrorockit

Yes, Nvidia cards don't have the VESA 103 issue, but then I would have to figure out the SLI vs. Crossfire question. I'm hoping the HD 7xxx DNA of the R9 280 specifically might work in my favor. Just to be clear, this is a video card BIOS issue, not drivers. I would love to SLI a pair of GTX 960s or 970s, but I already have the HD 6970s to play with, and an 850 W PSU too.

I post at Tom's Hardware, and one of the most common questions is what GPU can go in the old refurbished Dell someone just bought. There are literally millions of these computers out there. I was very happy with the price/performance of the R9 285 ITX card and the fact that it almost fits in a Micro BTX. The R9 280X is slightly faster than the 285, and the 285 doesn't have the Crossfire bridge I would need.

The Dell forums are moderated by Dell employees, and they will take down posts that help upgrade their older computers. They took down a page that listed Q9650 CPU compatibility for the Optiplex 780, which they only sold with the Q6700. So you can post problems with old computers, but don't look for solutions there. The last time I tried to log on, my account was marked "read only". Dell also will not help support the use of non-Dell parts.

This is their reply to my request for help with the VESA 103 issue:

"That $300 video card is for a high powered, much faster and much more recent computer. The E520 is an obsolete model now and cannot run that card. You should return the card and put that money towards a new computer. You can buy a new computer now that cost less than the video card."

I paid $170 for the ITX 2 GB R9 285, but it was half a step too far, I think.

The T3400 has an X38 chipset, which is not listed for the HD6, HD7, and R-series cards, yet Justajohn at Tom's got 4.15 GHz and 7500 in Fire Strike with a QX9650 and two HD 7850s. Sometimes you just have to find out for yourself what works and what doesn't.

The T3400 IS my upgrade!


----------



## JackCY

You don't really want to touch those Dells. They're a closed, proprietary system, much like Apple and their proprietary hardware, which is quite comparable to Dell these days: both use modern PC parts but custom mobos and other bits to lock the platform down and prevent easy upgrading.
You can build a cheap recent PC, no problem. Those Dells are for business, say government, schools and such, and when you have to touch them at those places, you often wanna take a hammer to that slow piece of crap


----------



## Catscratch

Quote:


> Originally Posted by *Retrorockit*
> 
> Yes NVidia cards don't have the VESA 103 issue. But then I would have to figure out the SLI vs Crossfire issue. I'm hoping the HD7xxx DNA of the R9-280 specifically might work in my favor. Just to be clear this is a video card BIOS issue, and not drivers. I would love to SLI a pair of GTX 960s, or 970s. But I already have the HD6970s to play with, and a 850W PSU also.
> 
> I post at Tomshardware and one of the most common questions is what GPU can I put in the old refurbished Dell I just bought. There are literally millions of these computers out there. I was very happy with the price/performance of the R9-285 ITX card and the fact that it almost fits in a Micro BTX. The R9-280X is slightly faster than the 285 and the 285 doesn't have the Crossfire bridge I would need.
> 
> Dell forums is moderated by Dell employees and they will take down posts that help upgrade their older computers. They took down one page that listed Q9650 CPU compatibility for the Optiplex 780 when they only sold them with Q6700. So you can post problems with old computers, but don't look for solutions there. Last time I tried to log on there my account was marked "read only". Dell also will not help support the use of non Dell parts.
> 
> This is their reply to my request for help with the VESA 103 issue.
> 
> "That $300 video card is for a high powered, much faster and much more recent computer. The E520 is an obsolete model now and cannot run that card. You should return the card and put that money towards a new computer. You can buy a new computer now that cost less than the video card."
> 
> I paid $170 for the ITX 2 GB R9-285. But it was 1/2 step too far I think.
> 
> The T3400 has a X38 chipset which is not listed for the HD6,HD7, and R series cards, yet Justajohn at Toms got 4.15GHz and 7500 in Firestrike with a QX9650, and 2- HD7850s. Sometimes you just have to find out for yourself what works and what doesn't.
> 
> The T3400 IS my upgrade!


You might get away with it: the 280X is basically a 7970, but you never know if they restrict things in the BIOS. Some 7970s accept a 280X BIOS; the reverse could also be true, but get a cheap one so it's justifiable if it goes south. The R9 285 is a good card despite the 2GB of memory. It's GCN 1.2 (the 280X is GCN 1.0) and has more ACEs, which matter in DX12, so it's likely to edge out the 280X in the future. The missing 1GB doesn't matter much because it has better compression and thus needs less space for textures.


----------



## Retrorockit

My Dell belongs to ME. It's out of warranty, and Dell is not responsible for anything that happens with it now. It started out as a home computer with a 2.3 GHz Core 2 Duo, 2GB of RAM, and a 256MB GPU. I'm retired, and I could build a new computer in about an hour or so, but it would be just like everyone else's computer. As far as I can tell, I have the only Dell Dimension E520 that's gone 4 GHz. Open the link in my sig, then click the image there; you'll see where I'm at relative to other QX6800s. I have to discover or invent my cooling systems, and I don't care if it takes weeks or even months to figure something out. The LGA775s are obsolete in the business environment and are being scrapped by the thousands. A lot of people get them for little or nothing and want to upgrade them; when computers are out of warranty or off a service contract, businesses don't want them anymore.

I mostly post in the LGA775 club, where my E520 isn't OT. I'd really like to hear some replies about the R9 280 and HD 7970 features that interest me, and less about how old my computer is. I also post in the Delta Fanatics forum.


----------



## Retrorockit

Quote:


> Originally Posted by *Catscratch*
> 
> You might get away with 280x is basically being 7970 but you never know if they restrict things in bios. Some 7970s accept 280x bios. Vice versa could also be true but you need to get a cheap one to justify if it goes south. r9 285 is a good card despite the 2gb memory. It's gcn 1.2 (280x is gcn 1.0) It has more ACEs which matter in dx12 and it'll likely to edge out 280x in the future. The lack of 1gb doesn't matter because it has better compression thus needs less space for textures.


Yes, I think it might work. The 285 was the right card for the Micro BTX cases, and very few old Dells are going to be running monitors above 1080p anyway. It was perfect except for the VESA 103 issue, and a nice step up from the GTX 750, which is kind of marginal really.
But now the thickness of the card keeps it from being good in my Crossfire setup (I own two of them and tried them for fit), and no Crossfire bridge means it needs PCIe 3.0, so I'm looking for something equivalent that won't have the BIOS issue. It's actually kind of funny to be locked out of a BIOS that's locked in the first place. I can probably set everything up with the 6970s, then use a card that blocks the BIOS anyway, but I couldn't recommend that to others. I just ordered three 128GB SSDs today ($40 each); I can run 3-way RAID 0 and add one HDD for backup in RAID 1. But if I go Crossfire, I should probably break the 1080p barrier at the same time. Maybe an HD 7970 with a 280X in Crossfire; then I could find out what's up with VESA 103 by switching them.

Maybe I should get the 280X first and see if VESA 103 is there, then choose the second card based on that. I could contact tech support and see what they have to say about it first. I'm not the first one to have this problem, but I might be the only one who's going to bother to solve it.


----------



## neurotix

Quote:


> Originally Posted by *Retrorockit*
> 
> Yes I think it might work. The 285 was the right card for the Micro BTX cases, and very few old Dells are going to be running with over 1080p monitors anyway. It was perfect except for the VESA 103 issue. It was a nice step up from theGTX750 that's kind of marginal really.
> But now The thickness of the card keeps it from being good in my Crossfire setup ( I own 2 of them and tried it for fit), *and no Crossfire bridge means it needs PCIe 3.0* so I'm looking for something equivalent that won't have the BIOS issue. It actually kind of funny to be locked out of a BIOS that's locked in the first place. I can probably set everything up with the 6970s than use a card that blocks the BIOS anyway. But I couldn't recomend it to others. I just order 3 128GB SSDs today.($40 each). I can run 3 way RAID 0 and add 1 HDD for backup in RAID1. But if I go Crossfire I should probably break the 1080p barrier at the same time. Maybe an HD7970 with a 280X in Crossfire. Then I could find out what's up with VESA 103 by switching them.


I don't think XDMA Crossfire requires any specific PCI-E version.

I know because I've personally done XDMA Crossfire with 2x 290s on a Crosshair V Formula, which only supports PCI-E 2.0.

I don't know if it would have problems on PCI-E 1.0 or something, which you might have.


----------



## Retrorockit

Thanks for the info. I do have two 16x PCIe 2.0 slots. I've read somewhere that PCIe 2.0 doesn't have the bandwidth for bridgeless Crossfire, but then again my whole hobby consists of ignoring most of the "information" that's out there. The VESA 103 issue seems to begin right where bridged Crossfire ends; it also seems to begin right where the ITX cards start to appear. So for now it looks like the HD 7970 for Crossfire, and Nvidia ITX cards for single GPU in the BTX mini towers. This is my first Crossfire setup, so I don't want to get into software (bridgeless) Crossfire if I don't have to. The R9 280X is still a question; if my R9 285s weren't ITX, I could test it myself.


----------



## Echoa

Quote:


> Originally Posted by *Retrorockit*
> 
> Thanks for the info. I do have 2- 16X PCIe 2.0. I've read somewhere that PCIe 2.0 doesn't have the bandwidth for non linked Crossfire. But then again my whole hobby consists of ignoring most of the "Information" that's out there. The VESA 103 issue seems to begin right where the linked Crossfire ends. It also seems to begin right where ITX cards begin to appear. So for now it looks like HD7970 for Crossfire, and Nvidia ITX cards for single GPU in the BTX mini towers.. This is my first Crossfire setup so I don't want to get into software SLI if I don't have to. R9-280X is still a question. If my R9-285s weren't ITX I could test it myself.


It depends on the cards, but even with something like a Fury or 390X, PCIe 2.0 might cut a couple of fps at most, and even that is unlikely. The performance loss at 2.0 x8 is at most about 3%, or effectively nothing.


----------



## Retrorockit

I actually have two PCIe 2.0 16x slots, plus an 8x/4x/1x slot. I believe only if I use the 8x slot will my second 16x drop down to 8x. The only issue I've heard of with PCIe 2.0 is putting Crossfire data on the bus in addition to the normal video data; 3.0 has the extra bandwidth. I could run a normal R9 285 2GB behind one of the ITX cards and see what happens. My maximum CPU is going to be about a 4.2 GHz QX9650; I wonder at what point it will bottleneck.


----------



## Echoa

Quote:


> Originally Posted by *Retrorockit*
> 
> I actually have 2 PCIe 2.0 16x slots, and an 8,4,1x. slot. I believe that only if I use the 8x slot will my second 16x drop down to 8x. The only issue I've heard of with PCIe 2.0 is putting Crossfire data on the bus in addition to the normal video data. 3.0 has the extra bandwidth. I could run a normal R9-285 2GB behind one of the ITX cards and see what happens. My maximum CPU is going to be about 4.2GHz QX9650. I wonder at what point it will bottleneck?


The CPU is your slowest point there; the rest is all fine. Don't worry, PCIe 2.0 x16 is the same bandwidth as 3.0 x8, so it won't be a bottleneck. You'd hit PCIe bottlenecks at 2.0 x4 or on PCIe 1.1; your PCIe bus is the least of your concerns.


----------



## Retrorockit

I just remembered seeing PCIe 16x extension cables and 90° adapters. I looked at some of them, and I think I could Crossfire my ITX cards that way. This would also uncover my PCIe 8x slot, so 16x/8x/8x might be possible: 3-way R9 285 2GB? The 90° adapters all turned the wrong way, though. I also saw some 16x spacers that would help with a few BTXs where large MB caps get in the way of 2-slot GPUs, and there was an 8x-to-16x cable listed. Maybe I'll run this by the guys in the Tonga forum.


----------



## Bubblewhale

Has anyone had any luck with their ASUS 270X TOP? I think I may have a good 270X, as I can reach 1300 MHz core; 1310 MHz would blue screen.
1205 MHz was the max I could do in Fire Strike, but I run 1255 MHz daily.
http://www.3dmark.com/fs/8761440


----------



## neurotix

What's the ASIC?

My 270X can do 1300 MHz as well, but only in Valley; in most games it freezes pretty quickly.

I can game on it at 1250 MHz though.


----------



## Bubblewhale

Quote:


> Originally Posted by *neurotix*
> 
> What's the ASIC?
> 
> My 270X can do 1300mhz as well but only in Valley, in most games it will freeze pretty quickly.
> 
> I can game on it at 1250mhz though.


70.7%; I didn't expect it to OC that high though.
I can do 1300 MHz stable in games, but I just run 1255 MHz daily.


----------



## Agiel

PCIe 3.0 vs 2.0 was a 1% difference back when the 7990 was the best GPU, so I don't think you'll notice any FPS loss now.


----------



## Agiel

Can I be part of this club? I have a Sapphire R9 280 3GB OC at 940/1250. I haven't tried to OC it any further because my CX600M might not be able to handle more power with the CPU and GPU both OC'd.


----------



## Bubblewhale

Quote:


> Originally Posted by *Agiel*
> 
> can i be part of this club ?? i have a Sapphire r9 280 3GB OC 940/1250 i havent tried to Oc it any further cause my CX600M maybe cant handle more power from CPU and GPU OC'd


I have a CX750M, and I'm planning to get an EVGA P2 1000W; I wonder if that's gonna help with "stability".


----------



## Agiel

Well, someone told me I'm fine as long as I don't get crazy OCing anything. I'll try to get an EVGA SuperNOVA as soon as I can afford it.


----------



## JackCY

And the ordeal begins: 32 days since the RMA started, trying to get my money back from the seller.
Lucky as I am, the person processing it is on holiday this week. Awesome, isn't it...

Goodbye ASUS R9 280X DC2T, I don't want to see you anymore


----------



## Agiel

Quote:


> Originally Posted by *JackCY*
> 
> And the ordeal begins, 32 days since RMA started, trying to get my money back from the seller.
> Lucky as I am the person processing it is on holiday this week, awesome isn't it...
> 
> Goodbye ASUS R9 280x DC2T, don't want to see you anymore


Why did the card die? I heard that the ASUS Matrix has problems; I never heard that the DC2T has problems too.


----------



## JackCY

The 280X DC2T v1 has VRAM overvolted by ASUS, and most of them artifact due to VRAM issues; search for it and it has come up #1 on Google for years with queries like "ASUS 280x RMA". It happens when running without any in-game or external FPS limiter: when the card sits at 100% usage it's much more likely to hit the VRAM artifact issue, especially in modern games that use around or over 2.5GB of VRAM. It can also be caused by the core chip itself, but with these ASUS cards it was IMHO caused by ASUS themselves most of the time. They later did a v2 version with non-OC VRAM voltage, reduced VRAM clocks, and a bigger cooler occupying more slots.

They simply pushed the 7970 chips in the 280X hard in terms of clocks, and in ASUS's case the VRAM too.
Some were lucky and it worked, but many weren't. I've had the issue since I got the card, but it was somewhat random, and I often used an FPS limiter. Later, with newer games and no limiter, more artifacts... I had enough videos and screenshots of the artifacts, so I sent it in for RMA. The problem is I'm the second owner, so the shop won't pay me back after 30 days of the issue not being resolved, and I have to wait for ASUS RMA to process the card. I'm pretty much at the mercy of ASUS right now, as to how fast they are and what they give me back. 32 days and counting.


----------



## MiladEd

Any fellow R9 280X owners planning an upgrade to the RX 480? I sure am. After seeing the benches, I think I'll get the 8 GB reference version, as the cooler looks freakishly sexy, and I've got a smaller case with a window (my own mod), so I'd get to see that sexy cooler all the time. I also have a very overpowered PSU (850 W 80+ Platinum), so I can CF it later if I decide to.


----------



## JackCY

I might upgrade depending on how long and how the RMA goes.
8GB for sure, but an AIB version, the reference cooler is small which is fine for the stock reference cards but not for OC. Plus you might want to wait for AIB small versions for ITX cases. The reference PCB is small but the cooler is much larger than the PCB due to the blower style. You could mod it with an axial fan for sure and have a tiny DIY card.


----------



## Catscratch

Quote:


> Originally Posted by *MiladEd*
> 
> Any fellow R9 280X owners planning an upgrade to RX480? I sure am. I think after I saw the benches, I'll get the 8 GB ref. version, as I think the cooler looks freakishly sexy, and I've got a smaller case with a window (my own mod) so I'd like to see the sexy cooler all the time. I also have a very overpowered PSU (850 Watt 80+ Platinum) so I can CF it later if I decided.


I'm gonna wait and see the entire lineup until the end of the year and the beginning of the next. I might go for a possible RX 485 if they ever release one.


----------



## Bubblewhale

Quote:


> Originally Posted by *MiladEd*
> 
> Any fellow R9 280X owners planning an upgrade to RX480? I sure am. I think after I saw the benches, I'll get the 8 GB ref. version, as I think the cooler looks freakishly sexy, and I've got a smaller case with a window (my own mod) so I'd like to see the sexy cooler all the time. I also have a very overpowered PSU (850 Watt 80+ Platinum) so I can CF it later if I decided.


I'm planning an upgrade to an RX 480 if there is an MSI Lightning model; otherwise I'm waiting for Vega.


----------



## MiladEd

Quote:


> Originally Posted by *Bubblewhale*
> 
> I'm planning a upgrade to a RX 480 if there is a MSI Lightning model, otherwise it's waiting for Vega.


I saw some pictures of an alleged "Lightning" RX480, with 8+6 pin power connectors. So keep your fingers crossed!


----------



## Agiel

MSI and ASUS were "caught" cheating recently; it turns out they've been sending reviewers cards with higher clocks to make a better impression... lol, shame on them!


----------



## Bubblewhale

Quote:


> Originally Posted by *MiladEd*
> 
> I saw some pictures of an alleged "Lightning" RX480, with 8+6 pin power connectors. So keep your fingers crossed!


That looked kinda weird, as it was a 290X cooler slapped on a 290/390(X) PCB or so.


----------



## JackCY

So ASUS is sending something back from RMA. I dunno what it is, but I bet they dug out a different refurbished 280X from somewhere, or just changed the serial number sticker. We'll see in a couple of days.
Expected total time for the RMA: 17.5. - 30.6. ... 45 freakin' days.


----------



## Bubblewhale

Quote:


> Originally Posted by *JackCY*
> 
> So ASUS is sending something back from RMA, I dunno what it is but I bet they dug out somewhere a different refurbished 280x or just changed the serial number sticker. Will see in a couple days.
> Expected total time for RMA... 17.5. - 30.6. ... 45 freakin' days.


It's Asus!
At least you got through RMA...


----------



## Echoa

So, with the RX 480 out now, I decided to push my 270X as far as I could. I pushed the voltage to 1296 mV and have managed to get 1300/1500 stable and error-free (so far) in Crysis 3, Heaven, Valley, and the OCCT GPU test. I'll test Witcher 3, Tomb Raider, and BioShock on my day off to further confirm, but it should hold me over till I can get a 480.


----------



## aaronsta1

Quote:


> Originally Posted by *Echoa*
> 
> So with the rx480 out now I decided to push my 270x as far as I could. I pushed the voltage to 1296mv and have managed to get 1300/1500 crysis3 , heaven, valley, and Occt gpu test stable and error free (so far). Will test witcher 3, tomb raider, and bio shock on my day off to further confirm, but should hold me over till I can get a 480


Wow, that's a good OC.

My MSI R9 270X OC Gaming 2G won't even do the specified 1120 MHz. I was thinking about RMAing it.


----------



## PowerSlide

RMA... I'm going through a bad one.

Born and live in a ****ty country, you get ****ty service.

If anybody's interested in reading it, it's below, posted on the local forum where I live:

https://forum.lowyat.net/topic/3990244


----------



## MiladEd

So, the RX 480 benches are out, and it's pretty much what I expected: 390-390X performance for $240, which is pretty good. Power usage is still high compared to Nvidia, but I don't really care about that. Most disappointing for me was the reference cooler; I half expected it to suck, but I hoped it would be good, since I really like the looks of it. Anyway, I'm gonna wait for the Sapphire Nitro RX 480 to be available, with an 8-pin connector (or so I've heard) and some decent overclocking headroom. Hopefully, if I get a decent chip, I can OC it up to ~1450 MHz and maybe slightly surpass 390X levels. That's plenty of graphics processing power for me; I only game at 1080p, and I don't even own any new AAA titles of 2016. My newest games are MGSV, Witcher 3, Mad Max and GTA V.


----------



## JackCY

*ASUS never again.* This is what ASUS RMA sent back after 45 days: a card I can't even put into the PCIe slot because it's so bent up and misfit all around from different used parts:




I didn't even mention the shoddy soldering job around the VRMs on the back side; I wouldn't let that through QA out of the factory, because I'm not sure all the caps are even soldered on and that the traces aren't broken somewhere around the pins.
The card looks cleaned up, whereas before it was IMHO a huge ball of dust and gunk; I can still see some hiding around the CF connector and other tiny hidden places.

So RMA #2...

Does anyone know what channels ASUS monitors? Some companies have Reddit or other social media, and when you complain about bad service there, they actually try to resolve it so their public image doesn't get trashed too much.


----------



## PowerSlide

WOW, that is bad! It looks like someone sat on the card.


----------



## JackCY

Yeah man, either someone dropped it or they messed up pretty bad. I think my previous one was slightly bent too, but it was minimal; this one though... I can't get it into the PCIe slot without forcing it in and the metal I/O bracket scraping the edge of my mobo. No, thank you. Even the bottom DVI is bent.


----------



## PowerSlide

Quote:


> Originally Posted by *JackCY*
> 
> Yeah man, either someone dropped it or they messed up pretty bad. I think my previous one was a little bent too but it was minimal, this one though... I can't get it into the PCIe not without forcing it in and the metal I/O scraping the edge of my mobo, no thank you. Even the bottom DVI is bent.


The card looks like it was dumped in a bin somewhere with stacks of cards on top of each other; now that you need one, they just dust it off and off it goes to you.

Good luck with the next RMA.


----------



## JackCY

Well, I looked at the card even more today, and the whole DVI housing is warped. It was IMHO definitely dropped at an angle, right on the bottom corner of the I/O ports; the kind of drop where you hold a cube by one corner and it falls on the opposite corner. And by dropped I mean a pretty big impact, probably the full card with the heavy cooler on it; it takes some damn force to bend the metal I/O bracket and warp the DVI housing while it's all screwed and soldered to a PCB. It's not easy to capture all the warping in pictures, but in hand and up close it's damn obvious something is wrong. The DVI housing might have also hit the top cooler cover, and that possibly moved or rotated the fan mounting strips, which would explain the tilted fans.

I'm giving the shop a couple of days to reply with something sensible (such as negotiating an upgrade or money back from ASUS); they were informed ASAP. After that I'm sending it back, and they can slap ASUS with it themselves, because it's ASUS's fault.
I bet ASUS doesn't have these cards anymore besides some complete junk: the local ASUS RMA/service center has to send a card back to the factory (halfway across the globe), and they get some trash in return. I mean, come on, ASUS, in the end it will cost them more in shipping than what the card is worth.


----------



## LeoKesler

Can someone provide a BIOS for my R9 270X Vapor-X (UEFI)?

I checked with VBE7, and with my original BIOS I get the warning about "UEFI image detected", but with the TechPowerUp BIOS I don't get any warning.

Thanks.

P/N: 299-1E270-000SA, 2GB GDDR5 Hynix


----------



## Catscratch

Quote:


> Originally Posted by *PowerSlide*
> 
> RMA....i'm going thru a bad one
> 
> born and live in a ****ty country you get ****ty service
> 
> if anybody interested to read it's below, posted on local forum where i live
> 
> https://forum.lowyat.net/topic/3990244


What do they mean by "out of manufacturer warranty"? Is it the first year with the manufacturer and the second with the distributor? You shouldn't wait to go to consumer court. I know it can be a bit more hassle, but their influence is larger and they know a trick or two to push the company to at least pay back what you spent.


----------



## JackCY

Well, the product's warranty starts with the manufacturer when it ships out from them down the supply chain. But by the time you buy it, it has usually been resold at least twice between distributors, shipped halfway across the globe, and stuck in a shop's storage. Say 6 months have passed since the manufacturer last saw the card, but for you the clock only starts today, and the shop or distributor offers you, say, a 3-year warranty. The problem is that whoever claims warranty with the manufacturer has their clock running 6 months ahead of yours. Of course, some companies have their own distribution and service centers, so there are fewer middleman resellers and the gap isn't always as big.
You can also run into this when you return a card near the end of its warranty: the company that bought it from the manufacturer no longer has manufacturer warranty on it, and they try to sweep it under the carpet so they don't have to eat the cost of the card as a loss. But by law in most developed countries you do have warranty; that they don't is their problem, not yours, though they may still try to make you eat the loss instead.

Years ago it was quite normal that when a product was still in warranty but no longer made, they would upgrade you to a more recent product they do have. Say you RMA a 280X now: you'd get a decent refurbished or new 380X back. Sure, I wouldn't expect a new RX 480 since it just launched, and they probably have leftover 380Xs. Still, this is how it should go: if they cannot replace it with the same working product, they should either upgrade you free of charge or give your money back.

It's always a big hassle to return anything, as it's not up to the shop what you get, but up to whoever they bought it from.

Unfortunately, finding the relevant laws to know precisely how it works in your country and what your rights are, to the letter, is almost impossible. Unless you know a lawyer, or own a shop and the shop has a lawyer, you most likely won't find out.

I had warranty on my 280X until April 2017; now what they've sent me back says 24 months, so until the end of June 2018, and it's pretty much DOA. Too bad I don't live next to the shop so I could RMA it for free; I have to ship it to the shop, which costs me.







Otherwise I would happily RMA every damn piece of crap they send me until they finally get it right.


----------



## PowerSlide

Quote:


> Originally Posted by *JackCY*
> 
> Well the product's warranty starts with the manufacturer when it ships out from them down the supply chain. The warranty starts going on the product. BUT when you buy it it could have been, usually is resold 2 times at least between distributors, shipped half across the globe and stuck in storage of a shop, say 6 months have passed since the manufacturer last saw the card when they sold it but for you you only see it today and for you the shop, distributor, I don't honestly care who, offer say 3 year warranty. Problem is who ever is going to claim warranty with the manufacturer has it running 6 months earlier than you. Of course some companies have their own distribution and service centers etc. so the middleman resellers are fewer and this warranty time not as big always.
> But you can still run into this when you return in warranty at the end, the company that bought it from the manufacturer doesn't have the warranty anymore on it with the manufacturer and they try to sweep it all under the carpet so they don't have to eat that cost of the card as a loss. But by law in most developed countries you do have warranty, they don't, it's their problem, not yours, but they may still try to put it on you to eat the loss instead.
> 
> It used to be quite normal years ago that when product was not made anymore but still in warranty they would upgrade you to a recent or at least more recent product that they do have. Say RMA 280x now, get a decent refurbished or new 380x back, sure I wouldn't expect RX 480 new since it just launched and they probably have some left over 380xs. Still this is how it should go, if they cannot replace with the same working product they should either upgrade you free of charge or give you money back.
> 
> It's always a big hassle to return anything as it's not up to the shop what you get but up to who ever they bought it from.
> 
> Unfortunately finding the necessary laws and such to know how it is precisely in your country and what your rights are to the letter is almost impossible. Unless you know a lawyer or own a shop and the shop has a lawyer then you most likely won't find out.
> 
> I've had warranty on my 280x until April 2017, now what they've sent me back it says 24 months, so until end of June 2018 and it's pretty much DOA. Too bad I don't live next to the shop so I can RMA it for free, but I gotta send it to the shop which costs me
> 
> 
> 
> 
> 
> 
> 
> Otherwise I would happily RMA every damn crap they send me until they finally get it right.


good explanation







and a rep for you

I have checked the law in my country, and nothing is said about products that are still in warranty but EOL.

But some kind-hearted guys helped me contact Sapphire directly, from marketing to customer service, though I don't hold much hope that Sapphire will help me in any way.

So, all in all, I'm screwed.


----------



## JackCY

You claim warranty with whoever sold the card to the end user, often a computer parts shop. It's their problem to sort things out and resolve the issue: get you a replacement or your money back. They contact their supplier and RMA it with them, which can be the manufacturer's official distributor or service center, or some unofficial 3rd-party distributor. The point being, the food chain should have you on top, with the poop (bad card, issues, ...) falling down to the manufacturer, not the other way around. You've got your contract with the shop that states your warranty period, and they have to respect it; otherwise they're probably breaking some laws and trying to force the loss onto you instead of eating it.

Here, IMHO, the biggest shops are the most expensive because they build these warranty losses into the prices of their products. That makes it easier for them to handle returned products and care less about eating the loss, because the loss has already been priced in. Say 1 card in 100 is faulty on average: raise the price of the product by 1% and the possible losses from returned cards are covered. It's often the small shops, trying to compete on price with the big ones, that do care about the losses and don't want to eat them, and as a result can be a hassle to deal with in an RMA.

In the EU it's OK, but you have to be the end user, a consumer who bought the card as a person.
In my RMA I'm not, because I don't have a contract with the shop where the card was bought; the person I bought the card from does. As a result they treat me as a "company" rather than a "person", and those nice EU consumer protections don't apply to me, or so they say; the warranty rules and protections for people and companies differ a bit. So they won't pay me back when the RMA takes longer than 30 days, because I don't have a contract with them showing my name and the price I paid. They're still obliged to process the RMA since they sold the card, but I'm kind of at the mercy of ASUS, and they haven't been very merciful so far, instead sending me back a DOA card from RMA.


----------



## Echoa

Well, after testing today and over the past few days, I've settled on 1350/1450 MHz clocks and am pretty happy. If I was under water I'm sure I could push further, but 1.3V on air at these clocks seems good, and I'm pretty happy with the results. Looking around, I seem to be scoring in 280 territory, though I could be wrong on that; still a nice jump from the original clocks, and Heaven gets an 812 on Ultra. If the OP wants to update my clocks that'd be nice, but no big deal.


----------



## Bubblewhale

Quote:


> Originally Posted by *Echoa*
> 
> Well after testing today and the past few days ive settled on 1350/1450mhz clocks and am pretty happy. If i was under water Im sure i could push farther but 1.3v on air and these clocks seem good and im pretty happy with the results. Looking around i seem to be scoring in 280 territory but i could be wrong on that, still a nice jump from original clocks and Heaven gets an 812 on ultra. If OP wants to update my clocks thatd be nice but no big deal


1350 core....
The max I ever got was 1305 stable on my ASUS 270X....


----------



## Echoa

Quote:


> Originally Posted by *Bubblewhale*
> 
> 1350 core....
> The max i ever gotten was 1305 stable on my Asus 270X....


I didn't expect it, considering I barely got 1225 MHz with 1.25V, but going to 1.3V really opened it up. Been gaming all day, not a single issue, and temps at 65C.


----------



## Bubblewhale

Quote:


> Originally Posted by *Echoa*
> 
> I didn't expect it considering I barely got 1225mhz with 1.25v but going to 1.3v really opened it up. Been gaming all day not a single issue and temps at 65c.


Yeah, I was able to get 1305 with 1.3V.
If it wasn't for the 2GB of VRAM, I'd still keep my 270X. I'm just waiting for the aftermarket 480s with some good OC potential; if any of them can hit ~1300 MHz, I'd get one.


----------



## Agiel

Is anyone here having a pain in the a*** with every Crimson driver? I have a Sapphire R9 280, and almost every game I play crashes at some point, and this is the DLL to blame:

Fault Module Name: atidxx64.dll. I really don't know what to do now. I've tried W7/W8/W8.1/W10 and drivers from 15.** to 16.****, same problem with all games except WoW and Dota. These are the games I was playing that keep crashing:

Assassin's Creed Rogue
Assassin's Creed Syndicate
Far Cry 4
The Witcher 3: Wild Hunt


----------



## Echoa

Quote:


> Originally Posted by *Agiel*
> 
> is any one here having a pain in the A*** with every Crimson driver ?? i have a sapphire r9 280, and atmost every game i play crash at some pointand this is the DLL to blame
> 
> Fault Module Name: atidxx64.dll i really dont know what to do now, i have tried w7/w8/w8.1/w10 from 15.** to 16.**** same problem with all games, except WOW and Dota .... this are the games i was playing and keep crashing
> 
> Assassin's Creed Rogue
> Assassins Creed Syndicate
> Far Cry 4
> The Witcher 3 Wild Hunt


Can't say I have.

Played:

Dishonored
Witcher 3
Tomb Raider
Crysis 3
Warframe

Not a single issue; running perfectly with the new Crimson drivers.


----------



## Agiel

This is really pissing me off. Well, I did a fresh W10 install, let's see now. I'll tell ya later, folks. Thanks for answering, bro.


----------



## Catscratch

Quote:


> Originally Posted by *Echoa*
> 
> Well after testing today and the past few days ive settled on 1350/1450mhz clocks and am pretty happy. If i was under water Im sure i could push farther but 1.3v on air and these clocks seem good and im pretty happy with the results. Looking around i seem to be scoring in 280 territory but i could be wrong on that, still a nice jump from original clocks and Heaven gets an 812 on ultra. If OP wants to update my clocks thatd be nice but no big deal


Gasp, almost caught my 7176







So you chew on 280s


----------



## JackCY

Quote:


> Originally Posted by *Agiel*
> 
> is any one here having a pain in the A*** with every Crimson driver ?? i have a sapphire r9 280, and atmost every game i play crash at some pointand this is the DLL to blame
> 
> Fault Module Name: atidxx64.dll i really dont know what to do now, i have tried w7/w8/w8.1/w10 from 15.** to 16.**** same problem with all games, except WOW and Dota .... this are the games i was playing and keep crashing
> 
> Assassin's Creed Rogue
> Assassins Creed Syndicate
> Far Cry 4
> The Witcher 3 Wild Hunt


No problems here when I was using Crimson 15. I've heard of similar issues before, but I don't remember the solution anymore. Try searching the DLL name along with "windows driver crash". But if it's just the games, maybe the card isn't stable. It really depends on what error you get.


----------



## Echoa

Quote:


> Originally Posted by *Catscratch*
> 
> Gasp, almost caught my 7176
> 
> 
> 
> 
> 
> 
> 
> So you chew on 280s


Are you on a 280X or a 280?







If you have a 280X I'd have to OC a lot more; I don't think I'd be able to catch you.


----------



## cusideabelincoln

Quote:


> Originally Posted by *Agiel*
> 
> is any one here having a pain in the A*** with every Crimson driver ?? i have a sapphire r9 280, and atmost every game i play crash at some pointand this is the DLL to blame
> 
> Fault Module Name: atidxx64.dll i really dont know what to do now, i have tried w7/w8/w8.1/w10 from 15.** to 16.**** same problem with all games, except WOW and Dota .... this are the games i was playing and keep crashing
> 
> Assassin's Creed Rogue
> Assassins Creed Syndicate
> Far Cry 4
> The Witcher 3 Wild Hunt


No crashes in-game, but when I'm on the desktop or watching a movie the driver resets itself randomly. I cleaned out the old drivers and installed the latest betas, and it's been good for 2 days now, but we'll see.


----------



## Agiel

I'm back to the WHQL AMD Catalyst Omega 14.12 driver; the Crimson bugs just don't let me play. I'm this close to trading this AMD card for any matching Nvidia card.


----------



## JackCY

You're doing something wrong. Corrupted system or unstable card, who knows.


----------



## johnfrancis

I have the same problem with driver crashes using Windows 10, an R9 270X and the Crimson driver. It can happen playing pretty much any game; it's even crashed while I've been playing Age of Empires 3, which is hardly graphics intensive.


----------



## Agiel

I don't think it's faulty. I bought it from a friend, pulled from a gaming PC I used to use with him. The system can't be corrupted, since I've been doing fresh installs, and how come it only happens with Crimson? I tried 12.9, 13.12 and 14.9, and with those drivers games never crash. I also put my card under stress for 1 hour and nothing happened... I'm pretty sure it's not hardware related.


----------



## Agiel

Quote:


> Originally Posted by *johnfrancis*
> 
> I have the same problem with driver crashes using windows 10, R9 270x and the crimson driver. It can happen playing pretty much any game. It's even crashed while I've been playing age of empires 3, which is hardly graphics intensive.


It never happened to me on WoW Cata or WoW Draenor; Dota 2 may crash even on Intel graphics, as I've read on many forums. It only happens in Assassin's Creed at high or max settings, and in Witcher it happens less often. Temps are fine, volts too. I've made it my hobby to find out what it is, and I'm about to give up.


----------



## johnfrancis

Quote:


> Originally Posted by *Agiel*
> 
> never happened tyo me on wow cata, or wow draenor, dota2 may crash even on intel graphics as i have read in many frums, it only happens on Assassins Creed at hight or max settings, and in Witcher happens less often, temps are fine, volts, i have been my hobby to find out what it is and im about to give up


Actually, thinking about it, I've never had a crash while playing Skyrim, and that's a pretty graphics-intensive game.
Anyway, it's not a hardware problem. I dual boot with Linux, and I've never had a driver crash playing a game on Linux, even with graphics-intensive games like Metro Last Light. So it must be a Windows driver bug.
Googling a bit, it seems like this Windows setting may be the issue:
https://community.amd.com/thread/180166


----------



## JackCY

What about browser hardware acceleration and similar stuff? I've seen that linked to Windows strangely reporting a driver crash. Unfortunately, from what I know and have seen, it's more often a combination of things than a single cause.


----------



## MiladEd

After some thought, I decided not to upgrade to the RX 480. 30-40% more performance isn't worth spending that much money on; instead, I'll upgrade to something around the same price but at least 70-80% faster. Besides, my GPU runs all my games decently enough!

So, in light of postponing the upgrade, is there anything I can do to extract more performance out of this card? My core is OCed to 1125 MHz, and anything higher results in artifacts in at least some games. My temps are okay, around 65-70C depending on ambient. I previously tried to OC the VRAM to 1600 MHz (from the stock 1500 MHz), but it got too hot and there were texture glitches in GTA V. Any tips for better OCing?


----------



## comagnum

Quote:


> Originally Posted by *MiladEd*
> 
> After some thought, I decided to not upgrade to the RX 480. I thought 30-40% more performance isn't worth spending that much money for, instead, I'll upgrade to anything that's around the same price, but at least 70-80% faster. Besides, my GPU runs all my games decent enough!
> 
> So, in light of me postponing the upgrade, anything I can do to extract more performance out this card? My core is OC to 1125 MHz, and anything higher will result in artifacts at least in some games. My temps are okay around 65-70 C depending on ambient. I previously tried to OC the VRAM to 1600 (from stock 1500 MHz) but it got too hot and there were some texture glitches in GTA V. Any tips for better OCing?


I upgraded to the 480 from a 280X, and I'm very glad I did. Wait until the AIB models come out, and reconsider.


----------



## Echoa

Quote:


> Originally Posted by *MiladEd*
> 
> After some thought, I decided to not upgrade to the RX 480. I thought 30-40% more performance isn't worth spending that much money for, instead, I'll upgrade to anything that's around the same price, but at least 70-80% faster. Besides, my GPU runs all my games decent enough!
> 
> So, in light of me postponing the upgrade, anything I can do to extract more performance out this card? My core is OC to 1125 MHz, and anything higher will result in artifacts at least in some games. My temps are okay around 65-70 C depending on ambient. I previously tried to OC the VRAM to 1600 (from stock 1500 MHz) but it got too hot and there were some texture glitches in GTA V. Any tips for better OCing?


What's your card and voltage? (On mobile, can't see it.) Things you can do:

Aftermarket cooler
Raise the voltage
Drop memory speed and focus on the core first

Your main concern is the core; with good VRM cooling you can run 1.3V on air daily just fine. First keep your memory at stock and start bumping the core until you get artifacts, then back off until you don't. I bump in 25 MHz steps, then back off in 10 MHz steps until stable. You might consider a better cooler even if you have an AIB cooler, and make sure to get the best heatsink you can on the VRMs.

This would be pretty good, but you'll need extra heatsinks for the VRMs and memory:

http://m.newegg.com/Product/index?itemnumber=N82E16835186154

For the best fit, have a Dremel handy so you can trim the heatsinks that don't come with the kit. Get the biggest heatsink you can reasonably fit on the VRMs, and don't be afraid to push 1.3V; avoid going higher unless you go full water block.
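The bump-then-back-off routine above can be sketched as a tiny search loop. `is_stable` here is a hypothetical stand-in for whatever manual stress run (Heaven, Valley, a long game session) you use to judge artifacts or crashes:

```python
# Sketch of the tuning routine described above: bump the core in 25 MHz
# steps until the stress run fails, then creep back down in 10 MHz steps.
# `is_stable` is a hypothetical callback standing in for a manual stress
# test that returns False on artifacts or a crash.

def tune_core(start_mhz: int, limit_mhz: int, is_stable) -> int:
    """Return the highest clock that passed is_stable (start_mhz assumed good)."""
    clock = start_mhz
    # Coarse pass: +25 MHz at a time while the card stays artifact-free.
    while clock + 25 <= limit_mhz and is_stable(clock + 25):
        clock += 25
    # Fine pass: back off from the first failing point in 10 MHz steps.
    trial = clock + 25
    while trial > clock:
        if trial <= limit_mhz and is_stable(trial):
            return trial
        trial -= 10
    return clock

# Example with a synthetic card that artifacts above 1337 MHz:
print(tune_core(1050, 1400, lambda mhz: mhz <= 1337))  # -> 1330
```

In practice each `is_stable` check is hours of stress testing, so the coarse/fine split is just a way to keep the number of full runs small.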


----------



## JackCY

Forget changing your AIB cooler; it's worthless to do so unless you have a truly crappy one. Save the cash and get a 14/16nm GPU.


----------



## Echoa

Quote:


> Originally Posted by *JackCY*
> 
> Forget changing your AIB cooler, worthless to do so unless you have some truly crappy AIB cooler. Save the cash and get 14/16nm GPU.


This would be the better idea, but if he's set on just overclocking that's what he can do


----------



## Agiel

Oh hey, why does my card stay at 43% fan speed when I leave it on auto? I don't want to install third-party software just to control the fan speed... what can I do?

I try to avoid MSI Afterburner or any other software. TriXX is a piece of junk: hard to configure and an odd piece of software.


----------



## MiladEd

Quote:


> Originally Posted by *Echoa*
> 
> What's your card and voltage? (On mobile, I can't see it.) Things you can do:
> 
> Aftermarket cooler
> Raise voltage
> Drop memory speed and focus on core first
> 
> Your main concern is the core; with good VRM cooling you can do 1.3 V on air daily just fine. First keep your memory at stock and start bumping the core till you get artifacts, then back off till you don't. I bump in 25 MHz steps, then back off in 10 MHz steps till stable. You might consider a better cooler even if you have an AIB cooler. Make sure to get the best heatsink you can on the VRMs.
> 
> This would be pretty good but you will need to get extra sink for vrms and memory
> 
> http://m.newegg.com/Product/index?itemnumber=N82E16835186154
> 
> For the best fit, make sure you have a Dremel so you can trim the heatsinks that don't come with the kit. Get the biggest heatsink you can reasonably fit on the VRMs, and don't be afraid to push 1.3 V; avoid higher unless you go full-block water.


I have the Sapphire Dual-X version. My memory was at stock, my voltage is set to 1.3 V, and the core clock is set to 1125 MHz. I really can't get any more out of the core without artifacting. I have, however, OC'd the memory to 1700 MHz, and it gave me about 2 extra average FPS in the SoM benchmark. Not much difference in other games.


----------



## JackCY

Stupid holidays: everyone's chilling somewhere and RMAs aren't getting processed. Card sent anyway for a 2nd RMA. Well, is it really an RMA? They sent me a DOA card back from the last RMA, so I don't know what to call it. Not gonna wait forever for them to sort this out and maybe, maybe let me know a month later.


----------



## xLPGx

Hello. I got a second 280X from Gigabyte, and I have an issue with the top card running very toasty; I saw it top out at 88C. I've increased the fan speed in my curve, but it seems to always want to get that hot unless I go full 747 and run it at 80+%.
The bottom card runs a lot cooler, at max 74C.

Do you think swapping the cards would help, or will the bottom one run just as hot? The top one is not that dusty, but it does have some burnt-in dust at the top that I couldn't get out with my compressor. The bottom one is very dust-free.

Or could I ease the strain on the top one and put my 3 monitors on the bottom one, maybe? The top card is kind of stuck between the bottom card and my D14, with its fans running very slow. I'm not sure how I can optimize things to make sure it doesn't run that hot.

Suggestions?


----------



## Echoa

Quote:


> Originally Posted by *xLPGx*
> 
> Hello. I got a second 280X from Gigabyte, and I have an issue with the top card running very toasty; I saw it top out at 88C. I've increased the fan speed in my curve, but it seems to always want to get that hot unless I go full 747 and run it at 80+%.
> The bottom card runs a lot cooler, at max 74C.
> 
> Do you think swapping the cards would help, or will the bottom one run just as hot? The top one is not that dusty, but it does have some burnt-in dust at the top that I couldn't get out with my compressor. The bottom one is very dust-free.
> 
> Or could I ease the strain on the top one and put my 3 monitors on the bottom one, maybe? The top card is kind of stuck between the bottom card and my D14, with its fans running very slow. I'm not sure how I can optimize things to make sure it doesn't run that hot.
> 
> Suggestions?


The top card will always run hot, as it's getting heat from the bottom card. An aftermarket cooler or water is the only way to improve it, but it'll still run hotter.


----------



## Echoa

Quote:


> Originally Posted by *MiladEd*
> 
> I have the Sapphire Dual-X version. My memory was at stock, my voltage is set to 1.3 V, and the core clock is set to 1125 MHz. I really can't get any more out of the core without artifacting. I have, however, OC'd the memory to 1700 MHz, and it gave me about 2 extra average FPS in the SoM benchmark. Not much difference in other games.


Tbh, it sounds like you're pretty much at your peak GPU-wise; not much more to do.


----------



## xLPGx

Quote:


> Originally Posted by *Echoa*
> 
> Top will always run hot as it's getting heat from the bottom card. Aftermarket or water is the only way to improve it but it'll still run hotter


Yes, of course I expected it to run hotter, but I didn't expect this much hotter: much higher fan speed and higher temperature at the same time.

I don't want water in my rig again. I'm gonna see what I can do with the fans I have at the moment.


----------



## JackCY

If your airflow to GPUs is bad what do you expect?


----------



## Catscratch

I guess we all have to switch GPUs to the latest gen sooner rather than later to actually enjoy DX12 games in the future. Warhammer ignores GCN 1.0, and now RotTR's async compute patch does too. I guess it's more work to optimize their code for the slowest GCN architecture, the 79xx / 280(X) (only 2 ACEs, meaning only 16 compute commands).


----------



## JackCY

Yeah, but what to switch to? Still waiting for any decent cards to hit the shops.
RX 480 AIBs not available yet.
Reference RX 480 4GB only available on selected markets, like NA.
1060 not even paper launched yet.
1070 overpriced and not available due to shortages in supply chain.
1080 if someone donates it why not.

Prices still hiked up due to shortages, newness, stocks of old cards in shops, ...


----------



## Agiel

Quote:


> Originally Posted by *MiladEd*
> 
> I've the Sapphire Dual-X version. My memory was at stock, and my voltages are set to 1.3 V, core clock set to 1125 MHz. I really can't get any more out of the core without artifacting. I've, however, OC the memory to 1700 MHz and it gave me about 2 extra avg FPS in SoM benchmark. Not much difference in other games.


It's due to the 384-bit bus width; you won't get too much more performance from memory. The core clock is the sweet spot for these cards.
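For context on why the memory OC gains so little, here's the back-of-the-envelope bandwidth math for Tahiti's 384-bit GDDR5 bus (the function name is mine; GDDR5 transfers 4 bits per pin per clock):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits=384):
    """Theoretical GDDR5 bandwidth in GB/s for a given memory clock."""
    # GDDR5 is quad data rate: effective rate = 4 x memory clock.
    effective_mtps = mem_clock_mhz * 4
    # bytes/s = transfers/s * bus width in bytes
    return effective_mtps * 1e6 * (bus_width_bits / 8) / 1e9

stock = gddr5_bandwidth_gbs(1500)   # 288.0 GB/s at stock 1500 MHz
oc    = gddr5_bandwidth_gbs(1700)   # 326.4 GB/s at a 1700 MHz OC
```

Even at stock the card has 288 GB/s, so a ~13% memory OC only helps in the rare case a game is actually bandwidth-bound, which matches the ~2 FPS gain reported above.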


----------



## Agiel

Yo, anyone know if I must wait a specific amount of time after applying thermal paste to my video card? I see the card is now hotter than before.


----------



## Catscratch

Quote:


> Originally Posted by *Agiel*
> 
> Yo, anyone know if I must wait a specific amount of time after applying thermal paste to my video card? I see the card is now hotter than before.


What did you use ?


----------



## JackCY

You don't need to wait. But you need to use a quality paste and mount it back properly. Losing warranty along the way.


----------



## Agiel

Quote:


> Originally Posted by *JackCY*
> 
> You don't need to wait. But you need to use a quality paste and mount it back properly. Losing warranty along the way.


I'm cursed. I've applied paste 3 times and remounted it, and every time it's the same. Any tips? It's an R9 280 3GB Sapphire Dual-X, factory OC, 940/1250. It topped out at 67C before I cleaned it and applied new paste. Dunno, it has some pieces of a sort of silicone pad on every memory chip on the board; could that be causing the overheating?


----------



## JackCY

When everything fits nice and tight there shouldn't be any issues. Those pads on the memory chips are wannabe thermal pads. What they're made of I don't know, but they're bad at conducting heat; still better, I guess, than having no pads and no contact.


----------



## Agiel

Quote:


> Originally Posted by *JackCY*
> 
> When everything fits nice and tight there shouldn't be any issues. Those pads on the memory chips are wannabe thermal pads. What they're made of I don't know, but they're bad at conducting heat; still better, I guess, than having no pads and no contact.


Then what should I do, man? I'm really frustrated; I've been in this hell for 2 days, and I don't want to keep unmounting and remounting it.


----------



## neurotix

What's wrong with 67C? That's a totally normal temp for that card. If it's not artifacting or crashing due to heat it should be fine.

Are you running your fans at 100% while it's at 67C? I always game with the fans on full speed and use full cover headphones with the volume up and I can't even hear the cards (And I have 2!)

When I repasted my 7970 Vapor-X, I had issues with the temps being bad until I used an excessive amount of paste. You might be using too little. Try using more, that's my tip.

And no you won't lose your warranty by repasting a card, I've sent numerous cards to Sapphire that had been repasted and they always replaced the card as long as you send it in with the stock cooler it came with attached.


----------



## Agiel

Quote:


> Originally Posted by *neurotix*
> 
> What's wrong with 67C? That's a totally normal temp for that card. If it's not artifacting or crashing due to heat it should be fine.
> 
> Are you running your fans at 100% while it's at 67C? I always game with the fans on full speed and use full cover headphones with the volume up and I can't even hear the cards (And I have 2!)
> 
> When I repasted my 7970 Vapor-X, I had issues with the temps being bad until I used an excessive amount of paste. You might be using too little. Try using more, that's my tip.
> 
> And no you won't lose your warranty by repasting a card, I've sent numerous cards to Sapphire that had been repasted and they always replaced the card as long as you send it in with the stock cooler it came with attached.


Bro, 67C was before I cleaned it. Now it's 80C and it keeps climbing if I don't stop gaming right away.


----------



## Agiel

@neurotix:

I'll try with more paste and tell you later; I'll do it in the morning.


----------



## neurotix

Well, try what I said. When I did it I needed a lot of thermal paste or temps would be higher.


----------



## Agiel

Quote:


> Originally Posted by *neurotix*
> 
> Well, try what I said. When I did it I needed a lot of thermal paste or temps would be higher.


Well, my ZP-360 is used up, so I had to use a low-grade paste, and I used more than normal. Temps are worse now. I won't touch it again till I get some decent paste, and I'll have to start digging around, because here in Cuba that's impossible to buy.







Third-world countries suck.


----------



## PowerSlide

After 2 months and going through the worst RMA ever, I finally got a replacement 380X.

Sapphire and the distributor didn't want to do anything; none of them even cared, just because my 280X's manufacturer warranty was finished. Then, after some heat from people on the forum and Facebook, only a guy from AMD stepped up and solved it for me.

A long warranty is great, but once a card is EOL they can treat us like trash. No more Sapphire cards for me after this.


----------



## JackCY

Quote:


> Originally Posted by *PowerSlide*
> 
> After 2 months and going through the worst RMA ever, I finally got a replacement 380X.
> 
> Sapphire and the distributor didn't want to do anything; none of them even cared, just because my 280X's manufacturer warranty was finished. Then, after some heat from people on the forum and Facebook, only a guy from AMD stepped up and solved it for me.
> 
> A long warranty is great, but once a card is EOL they can treat us like trash. No more Sapphire cards for me after this.


Similar story with my 280X and ASUS; care to PM any connections to someone who could help step on it?
Unfortunately, I can't even RMA with ASUS directly, so getting the RMA case info is almost impossible via the shop where I have to file the RMA.








Otherwise, I have one guy from ASUS Taiwan who could elevate the priority of the RMA case, if only I could get the case number and info from the shop, assuming they even have it themselves from the ASUS service center.

First RMA total time 45 days, they sent me a physically damaged practically DOA card back. Now in 2nd RMA since then. About 2 months total without GPU and nowhere near any resolution.

They really don't expect RMAs when the product is EOL, and they don't want to automatically give you an upgrade as a replacement or your money back. They take forever trying to find a replacement card in some trash bin somewhere, which means you're most likely going to send it back for RMA again, and again, and again...


----------



## PowerSlide

Quote:


> Originally Posted by *JackCY*
> 
> Similar story with my 280X and ASUS; care to PM any connections to someone who could help step on it?
> Unfortunately, I can't even RMA with ASUS directly, so getting the RMA case info is almost impossible via the shop where I have to file the RMA.
> 
> 
> 
> 
> 
> 
> 
> 
> Otherwise I have one guy from ASUS Taiwan that could elevate the priority of the RMA case if only I could get the case number and info from the shop if they even have it themselves from ASUS service center.
> 
> First RMA total time 45 days, they sent me a physically damaged practically DOA card back. Now in 2nd RMA since then. About 2 months total without GPU and nowhere near any resolution.
> 
> They really don't expect RMAs when product is EOL and do not want to give you an upgrade as a replacement nor money back automatically. They try to take forever and find a replacement card somewhere in a trash bin, which means you're most likely gonna send it back to RMA again, and again, and again, ...


A bad RMA is like a freaking disease spreading everywhere; it's a waste of time dealing with such crap.

Not only that, the guy even showed me a photo that my card was damaged: a small cap (I think) near the PCIe connector got torn off. I sent in an artifacting card, not a physically damaged card, as I never dropped the card or knocked it into anything. At least the guy didn't blame me for it, or maybe he couldn't pin it on me because of the bad situation the useless distributor and Sapphire put me in. But I learned my lesson: take photos before sending it in, as evidence. You never know what will come back.

I'm from Malaysia though, so there's no use in me giving you the contact. Very sorry, and I hope ASUS sorts it out for you properly. When I get the card, it's time for me to sell it off; I don't want any Sapphire in my PC anymore.


----------



## JackCY

Yeah, I wanted to RMA and resell a working card with a new, longer warranty, or get my money back. I would have gotten money back from the shop if only I had a document showing I bought the card from that shop for a specified sum, but I only have a document with the card's warranty info and not the price. So the shop refused to pay me out when the RMA took longer than 30 days, which is what would have happened with the right document. It's just silly, right? It's not like they can't look it up in their system, check the price paid, and pay me that back. So I'm stuck relying on ASUS to resolve the RMA.

I've just sent as much info as I can to the admin on asus forum so he can put ASUS Taiwan to check the RMA... and hopefully slap someone along the RMA process line.

Similar here: I sent in an artifacting card, and they sent me back a physically damaged one. I don't know what these companies are doing with the cards, but they must have some truly cheap workers who throw the cards around like bowling balls.

I always take pictures when I report the RMA and when I receive a product from RMA, even the box it arrives in.
I've asked the ASUS person what options there are if I can request via the shop and upgrade or money back etc. but no reply to that.

Maybe they will try communicating with me directly now that I gave them as much info as I can so the 280x EOL RMA can be resolved.

As always, they only notice when you start reporting issues on forums and other social media and get the attention of their personnel that way; at least they check their own forum. It's a shame, as RMAs should be treated equally, and one shouldn't need to bring in media to get a satisfactory outcome from an RMA.


----------



## choLOL

Hey guys, anyone know why my R9 280X never reaches boost clocks? PowerColor says the core clock is 880 MHz, boosting to 1030 MHz. I ran Unigine Valley, and the core clock never went past 880 MHz.
I OC'ed my processor a couple of years ago, and now I'm trying to OC my GPU. I haven't touched my GPU clocks yet; I just wanted to see stock performance in Unigine Valley.

I use Afterburner and HWiNFO64 to monitor my GPU stats. I used Unigine Valley in Extreme HD setting to stress test.



Or is that really how it's supposed to be? Thanks. +rep

Edit: Fixed it.


----------



## radier

Set Power limit +20%.

Taptaptap Mlais M52


----------



## Bubblewhale

Quote:


> Originally Posted by *JackCY*
> 
> Yeah but what to switch to, still waiting for any decent cards to hit the shops.
> RX 480 AIBs not available yet.
> Reference RX 480 4GB only available on selected markets, like NA.
> 1060 not even paper launched yet.
> 1070 overpriced and not available due to shortages in supply chain.
> 1080 if someone donates it why not.
> 
> Prices still hiked up due to shortages, newness, stocks of old cards in shops, ...


There are sources saying the Strix 480 has 2 PCIe connectors, but 6+6 wouldn't make sense, as that's 150 W total from the connectors where it'd be 225 W with 8+6. I'm sold on the Strix if it's 8+6 with great cooling... looking at you, 290/X and 390/X. I'm just waiting for the perfect 480 to replace my 270X.


----------



## JackCY

No way am I buying another ASUS AMD GPU. A single 8-pin is more than enough for an RX 480; that's 225 W max capacity officially (150 W from the connector plus 75 W from the slot).

The only way I'm gonna keep an ASUS AMD GPU is if they give me a decent one from RMA that I won't need to sell.
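To make the connector math in this exchange explicit: the PCIe spec budgets 75 W for the slot, 75 W per 6-pin, and 150 W per 8-pin auxiliary connector. A quick sketch (the helper name is mine):

```python
# Official PCIe power budgets in watts: slot, 6-pin aux, 8-pin aux.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

def board_power_budget(six_pins=0, eight_pins=0):
    """Max board power per spec: slot plus auxiliary connectors."""
    return SLOT + six_pins * SIX_PIN + eight_pins * EIGHT_PIN

board_power_budget(six_pins=1)                 # 150 W: reference RX 480
board_power_budget(eight_pins=1)               # 225 W: a single 8-pin design
board_power_budget(six_pins=1, eight_pins=1)   # 300 W: an 8+6 design
```

The 150 W connector-only figures quoted earlier in the thread just omit the slot's 75 W contribution.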


----------



## Roboyto

Quote:


> Originally Posted by *Bubblewhale*
> 
> There's sources that say the Strix 480 has 2 PCIE connectors, but 6+6 wouldn't make sense as it's 150w total where it'd be 225w with 8+6. I'm sold on the Strix if it's 8+6 with great cooling ...looking at you 290/x and 390/x. I'm just waiting for the perfect 480 to replace my 270X.


Quote:


> Originally Posted by *JackCY*
> 
> No way am I buying another ASUS AMD GPU. A single 8pin is more than enough for RX 480, that's 225W max capacity officially.
> 
> The only way I'm gonna keep an ASUS AMD GPU is if they give me a decent one from RMA that I won't need to sell.


@JackCY is absolutely correct. You won't see any RX 480 with 8+6; the chip is designed around a single 6-pin to begin with. There will be some with 6+6, but you'll more likely see a single 8-pin, as that takes up less real estate on the PCB.

I also agree about the ASUS card. Hopefully they actually have a cooler designed specifically for the 480, and not something designed for the competition's GPU that they just slap on there. I'm speaking of the DC2 290 cards in particular: those coolers were simply taken from the 780 and stuck on, which resulted in only 3 of 5 heatpipes making contact.

Add ASUS' questionable warranty/RMA service to the mix and I strongly suggest avoiding them as well. My terrible RMA experience with them was enough to make me sell or destroy every ASUS product I owned at the time except one: laptops, monitors, motherboards, a router, a Bluetooth adapter, an external DVD-RW, etc. They lost an extremely loyal customer over an *RMA that I wanted to pay for*, as I was openly admitting fault for damaging CPU pins.

My suggestion would be XFX, from personal experience with their customer support and the fact that they don't crucify your warranty if you want to take the card apart to change the thermal paste/pads or watercool it. XFX may have tamper stickers on their cards, but those don't apply to the US market, the last I checked with an XFX representative.

For everyone's information, I just bought a reference Sapphire RX 480. It has a tamper sticker on the back. I opened a ticket asking whether it voids the warranty, and they replied:

*YOU WILL VOID THE WARRANTY IF THE STICKER IS TAMPERED WITH OR REMOVED*



I will be purchasing Micro Center's 2-year replacement warranty for $30 so I can use the card as I intended, with a waterblock.

Sad...no more Sapphire cards for me after this RX480 with that warranty policy in place.


----------



## radier

Buying reference card for water knowing it is so power limited? With AiB models and GTX1060 around the corner? At least you could wait a week or two.


----------



## Mr.N00bLaR

I haven't been active here much, but I have an R9 280X Toxic. It seems like I cannot increase the voltage past 1.256 V. I remember being able to go to 1.3 V on my other 7970s. I think with 0.05 V more I might be able to push a bit more out of the core. Does anyone know if this is a BIOS limitation, and if so, what BIOS I can flash to the card?


----------



## Roboyto

Quote:


> Originally Posted by *radier*
> 
> Buying reference card for water knowing it is so power limited? With AiB models and GTX1060 around the corner? At least you could wait a week or two.


You are entitled to your opinion, even if it's wrong for not knowing the whole story or looking at the big picture.

Leaked information these days is proving more and more accurate where video cards are concerned, and the current rumor is that all of the AIB 480s are custom PCBs. It's a miracle that a card in this price range even has a waterblock, so if none of the AIB cards use the reference PCB, I wouldn't have bought one anyhow. My HTPC is already set up for a watercooled GPU; all I need to do is get the block for the card.

I didn't overpay for the card, and it already looks like a good overclocker, hitting 1350/2250 at stock voltage. Watercooling the card will drop power consumption drastically and give me more headroom for overclocking. There's already a modded BIOS out to increase the power limit if I'm not happy with the results and want to really push things. I have a good motherboard/PSU that will hold up to abuse if need be.

With the I2C mod and cooling upgrades, der8auer got a reference card to 1500. That's likely a good indicator of where AIB cards will top out: https://www.youtube.com/watch?v=Jq47qmwcus8

Unless he simply had a terrible sample GPU, my card isn't doing too bad all things considered.

Additionally, I have 30 days to return the card to Micro Center if the AIB cards are released and one does have a 6+6/8-pin configuration with the reference PCB.

Lastly, I will never purchase an Ngreedia GPU as long as AMD is still around.


----------



## JackCY

I would say AMD is in the same situation with GPUs vs. NV as with CPUs vs. Intel. They have decent products, and even offer nice features the competitors don't, but such features go largely unused by software due to AMD's low market share. And they are chronically late to market, unable to innovate as fast as the competition due to a limited budget. On top of that, their GPUs are often power hogs, in part due to their higher flexibility, which unfortunately gets barely any use.
Just look at how much money NV said they invested in Pascal; it was compared to a mission to Mars, lol. NV dumps a crazy amount of money into R&D year after year, although I find their solutions and overall development less exciting and revolutionary than AMD's...

I have other ASUS products, and while they may not show exceptional quality, they work so far. Of course, these days, when 99% of products are made in third-world countries with price/profit as the main objective, product quality is long gone.


----------



## Mr.N00bLaR

Seems like I'm going to need to go into 1.33-1.36 V territory for a 1200 MHz core on my 280X Toxic. I don't see it being worth it, so stock it will remain =\


----------



## Bartouille

Quote:


> Originally Posted by *Mr.N00bLaR*
> 
> Seems like I'm going to need to go into the 1.33-1.36v territory for 1200MHz core on my 280x Toxic. I don't see it being worth it so stock it will remain =\


I think that's fine. You probably have a very low-ASIC card. My cards have 50% and 52% ASIC, and they both need 1337 mV for 1200 and 1387 mV (the max voltage I can get) for 1250.


----------



## xLPGx

My cards have 61% and 63% ASIC. The 61% one won't stay stable over 1130 on the stock 1.2 V (?). I'm gonna test my other card sometime when I can, but it's stable at 1130 too.
These are Gigabyte rev 1 280Xs with 1100 stock.
Is that relatively good or not? I don't know what people generally get out of these.

Locked voltage sucks


----------



## Mr.N00bLaR

Quote:


> Originally Posted by *Bartouille*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr.N00bLaR*
> 
> Seems like I'm going to need to go into the 1.33-1.36v territory for 1200MHz core on my 280x Toxic. I don't see it being worth it so stock it will remain =\
> 
> 
> 
> I think that's fine. You probably have a very low asic card. My cards have 50% and 52% asic and they both need 1337mv for 1200 and 1387 (max voltage i can get) for 1250.

Ah, I wasn't sure if that voltage was dangerous or not. I will have to reassess my case airflow to maximize cooling, as the card was ~87-89C in my case with an ambient of 75F.

EDIT: Checked the ASIC quality, GPU-Z says it is 68.5% on my card


----------



## Bartouille

Quote:


> Originally Posted by *Mr.N00bLaR*
> 
> Ah, I wasn't sure if that voltage was dangerous or not. I will have to try to re-asses my case airflow to maximize cooling as the card was ~87-89C in my case with an ambient of 75..
> 
> EDIT: Checked the ASIC quality, GPU-Z says it is 68.5% on my card


https://www.techpowerup.com/reviews/Sapphire/HD_7970_Toxic_6_GB/32.html

This card had 1.35 V from the factory for its 1200 MHz OC BIOS. I think it's fine if you can keep temps under control (under 75C).


----------



## JackCY

ASUS service centers are absolute jokers: they sent the GPU back to the retailer without the box and all the accessories, because they'd put them somewhere and couldn't find them.
Screw ASUS RMA; a completely useless 3-year warranty.

Update:
From ASUS Taiwan (an ASUS ROG forum admin who monitors the AMD GPU threads and the 280X artifacts thread) and the shop, I learned that the service center (SC) didn't tell the retailer not to send the original packaging, and so the SC disposed of it, no matter how stupid that is. On top of that, ASUS admitted the card was damaged during their transport. No wonder, when they don't even use the original packaging and instead ship the cards packed in who-knows-what halfway across the globe. Not once did they offer an upgrade, free or paid, or money back; they seem hell-bent on handing back refurbished 280Xs no matter what sorry state those cards are in.
ASUS Taiwan has apologized and that's it.
Shop should receive the missing CF bridge from SC this week.

Maybe I will get some card back by end of the week, maybe.

In case it artifacts again, I'm not even sure I want to do a 3rd RMA; their handling of RMAs is certainly deterring people from doing so.
Should have sold the original card and not RMA'd it... stupid ASUS RMA, what a joke.

And the service center called after ASUS Taiwan finally got to them. They pretty much repeated the above, apologizing and saying that the retailer RMA'd with them directly instead of via a distributor like the first time, so they weren't informed not to send the box, and asked which accessories I need that were lost in the process. Personally, I'm not surprised the retailer RMA'd directly with the service center, since the first RMA via the distributor took over a month and resulted in a refurbished yet damaged card; I bet they weren't happy about it either.

RMA in the EU is such a lengthy PITA process, especially when you don't have the document saying how much you paid the retailer for the product and only have a warranty document. One can't RMA with ASUS directly; as far as I can tell, neither should the retailer, and only the distributor can. So the card has to go from customer to retailer to distributor to the ASUS service center. You can see how stupidly long that is and how much money is wasted shipping the card around.

The card is en route, finally. Having ASUS Taiwan talk to them does seem to speed things up a little and improve their service. All I can say is: if someone has an issue with an ASUS RMA, try the ASUS forum or ROG forum and get hold of someone from ASUS there to help out. It still takes time before they look at the RMA and give the service center a call, though. You kind of want to get above the service center in the "food chain".


----------



## JackCY

Anyone know if the 280X on the 16.7.3 driver has issues running madVR, or random lock-ups in 3D when FRTC is enabled at 200 fps?
I smell a 3rd RMA in a row.









---

14.12 - madVR runs fine; it's the oldest driver I have, but I still saw that weird, kind of greenish screen-blink glitch.
82-85C under 100% game load, 2400-2500 rpm fan, insane heat coming out of the card. The cooling is maybe not mounted the best, but I'd rather say the chip consumes much, much more power at 60% ASIC than it should; it's nuts.

On my original card I would have had 70-75C and 1500-1600rpm with less heat coming out of the card at 100% game load.

15.12 - same story as 14.12 driver...

Even the newer 16.7.3 driver launched today, this time without WHQL in its name: same story as before, madVR crashes. I didn't try games and FRTC; as I was starting to test, I read the latest message from ASUS, and they've decided to refund me, no idea how much. A short time later, the retailer called with the same info and asked me to send the card back to them, again with no refund amount. Card sent back. We'll see.

The card was a mess: it consumed an insane amount of power, which made both the chip and VRMs run +10C over what they should, and the fans spun like a jet liner. The newest drivers, which are the only ones that include Vulkan support, don't work well at all. The card was also damaged again in transport, this time between ASUS' local service center and their Asian service center; it was minor and didn't affect installation, but they definitely need to fix their internal shipping packaging and handling.


----------



## aaronsta1

Quote:


> Originally Posted by *Roboyto*
> 
> For everyone's information I just bought a reference Sapphire RX480. It has a tamper sticker on the back. Opened a ticket asking about warranty void, and they replied:
> 
> *YOU WILL VOID THE WARRANTY IF THE STICKER IS TAMPERED WITH OR REMOVED*
> 
> 
> 
> 
> I will be purchasing Micro Center's 2yr replacement warranty for $30 so I can use the card as I intended with a waterblock*.*
> 
> Sad...no more Sapphire cards for me after this RX480 with that warranty policy in place.


If you live in the USA, there's a law passed in the '70s (the Magnuson-Moss Warranty Act) that says you are free to take those stickers off and it will not void your warranty.

I got two XFX RX 480s, and that's the first thing I did. On one of them, the paste wasn't even applied over the whole die; it was off to the side.

Here is what XFX told me:

7/6/2016 12:03:28 AM By [email protected] Type:2
Hi,

Thank you for contacting XFX Technical Support. If you have purchased the card in the North America, removing or damaging the "Warranty void if removed" stickers wont void your warranty as long as you don't incur any physical damage to the card and you keep your stock cooler on hand in case you need to send the item to us (you need to return it to its original form). It means that replacing the thermal paste of the GPU will not void your warranty. Please don't hesitate to ask if you have any other questions.

Yours,

Krell

Technical Support


----------



## Roboyto

Quote:


> Originally Posted by *aaronsta1*
> 
> If you live in the USA, there's a law passed in the '70s (the Magnuson-Moss Warranty Act) that says you are free to take those stickers off and it will not void your warranty.
> 
> I got two XFX RX 480s, and that's the first thing I did. On one of them, the paste wasn't even applied over the whole die; it was off to the side.
> 
> Here is what XFX told me:
> 
> 7/6/2016 12:03:28 AM By [email protected] Type:2
> Hi,
> 
> Thank you for contacting XFX Technical Support. If you have purchased the card in the North America, removing or damaging the "Warranty void if removed" stickers wont void your warranty as long as you don't incur any physical damage to the card and you keep your stock cooler on hand in case you need to send the item to us (you need to return it to its original form). It means that replacing the thermal paste of the GPU will not void your warranty. Please don't hesitate to ask if you have any other questions.
> 
> Yours,
> 
> Krell
> 
> Technical Support


I know what XFX's policy is, but I contacted Sapphire support and they stated warranty would be void. Notice how XFX specifically said North America, because their warranty varies depending upon region.


----------



## aaronsta1

Quote:


> Originally Posted by *Roboyto*
> 
> I know what XFX's policy is, but I contacted Sapphire support and they stated warranty would be void. Notice how XFX specifically said North America, because their warranty varies depending upon region.


Yeah, but their policy is illegal per this law:
https://www.law.cornell.edu/uscode/text/15/chapter-50


----------



## JackCY

Quote:


> Originally Posted by *aaronsta1*
> 
> Yeah, but their policy is illegal per this law:
> https://www.law.cornell.edu/uscode/text/15/chapter-50


Who? Sapphire?

They are always saying that any OC or any fiddling with the card from its stock state will void the warranty. Sure, if you have a law that allows you to remove stickers, then just hammer them with that when they complain about missing stickers on an RMAed card.

XFX honestly seems much better than Sapphire to me.


----------



## xLPGx

Could someone confirm whether there are issues with the 16.7.3 driver on Tahiti?

I want to know if this driver fixes DiRT Rally crashing on high/ultra settings, but people have been reporting other issues, so I'm holding off for now.


----------



## Boinz

Quote:


> Originally Posted by *xLPGx*
> 
> Could someone confirm whether there are issues with the 16.7.3 driver on Tahiti?
> 
> I want to know if this driver fixes DiRT Rally crashing on high/ultra settings, but people have been reporting other issues, so I'm holding off for now.


My understanding was that it only affected the RX 480 and only affected you if you were overclocking the VRAM.


----------



## slick2500

Quote:


> Originally Posted by *JackCY*
> 
> Who? Sapphire?
> 
> They are always saying that any OC or any fiddling with the card from its stock state will void the warranty. Sure, if you have a law that allows you to remove stickers, then just hammer them with that when they complain about missing stickers on an RMAed card.
> 
> XFX honestly seems much better than Sapphire to me.


Yup, Sapphire's warranty is about worthless anyway; they wouldn't honor mine because I bought it BNIB on eBay.


----------



## JackCY

The warranty process changes depending on where you live/want to RMA. IMHO they should accept an RMA of any of their products at an authorized service center as long as the card is in stock form. But they might have found some law loopholes that allow them not to.


----------



## slick2500

Quote:


> Originally Posted by *JackCY*
> 
> The warranty process changes depending on where you live/want to RMA. IMHO they should accept an RMA of any of their products at an authorized service center as long as the card is in stock form. But they might have found some law loopholes that allow them not to.


The dumb thing is that the card only had a 2-year warranty on it. My Sapphire 4890 had a lifetime warranty, and I didn't have to jump through so many hoops getting an RMA for that card.


----------



## JackCY

Quote:


> Originally Posted by *slick2500*
> 
> The dumb thing is that the card only had a 2-year warranty on it. My Sapphire 4890 had a lifetime warranty, and I didn't have to jump through so many hoops getting an RMA for that card.


Yeah, RMAs were easier a long time ago. My friend had some 3xxx or 4xxx card too, and when he RMAed it he got a newer card as a replacement, so he was upgraded automatically. Nowadays everyone is trying to save money, and lifetime warranties are expensive for complex PC components, especially GPUs. Put a lifetime warranty on a GPU and miners will buy 100 of them, then RMA all 100 when they don't want them anymore or some of them go bad. The service center can have fun dealing with a 100-card RMA at once, lol. And if that happens 4+ years in, when the card is EOL, the manufacturer has to upgrade 100 cards or refund 100 cards. So no more lifetime warranties.

Outside the EU, the warranty is often just 1 year; anything more is a courtesy of the manufacturer.


----------



## neurotix

I really don't know what you guys are talking about. The eBay thing I can believe, but I've sent no fewer than 5 cards back to Sapphire that had had the thermal paste changed, a non-stock cooler attached, and so forth. As long as you send it back with the original cooler on it, not physically damaged, and with a legit serial, they should honor the warranty and send you a replacement or repair your card.

You know, I've always been very nice to Althon Micro in my communication, and they've always been really nice to me in return. Perhaps that's part of the problem?

And since when did this thread turn into nothing but "complain about RMAs" anyway?


----------



## javamocha

Greetings,

I'm new here, and I want to ask something regarding my Gigabyte Windforce R9 280X. My GPU idles at 300 MHz core with 1100 MHz memory clock. But a few days ago the GPU started idling at 501 MHz core and 1500 MHz memory in GPU-Z. Is there something wrong with the settings? Because I didn't OC it at all.

Sorry for bad English. Thanks


----------



## rdr09

Quote:


> Originally Posted by *javamocha*
> 
> Greetings,
> 
> I'm new here, and I want to ask something regarding my Gigabyte Windforce R9 280X. My GPU idles at 300 MHz core with 1100 MHz memory clock. But a few days ago the GPU started idling at 501 MHz core and 1500 MHz memory in GPU-Z. Is there something wrong with the settings? Because I didn't OC it at all.
> 
> Sorry for bad English. Thanks


I think your card should idle at 300 core / 150 memory. Things that might make those speeds stay higher are a 144Hz monitor, multi-monitor setups, or just a borked driver install.
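That sanity check can be sketched in a few lines. A minimal Python illustration, with the 300/150 idle clocks reported in this thread hard-coded as the "normal" values (an assumption from the posts above, not an official spec):

```python
# Sanity-check GPU-Z idle readings for a Tahiti card (R9 280X).
# 300/150 "normal" idle clocks are what this thread reports; treat
# them as an assumption, not an official spec.

NORMAL_IDLE_CORE_MHZ = 300
NORMAL_IDLE_MEM_MHZ = 150

def idle_state(core_mhz: int, mem_mhz: int) -> str:
    """Classify reported idle clocks as normal or elevated."""
    if core_mhz <= NORMAL_IDLE_CORE_MHZ and mem_mhz <= NORMAL_IDLE_MEM_MHZ:
        return "normal"
    # Pinned-high idle clocks usually point to 144Hz, multi-monitor,
    # or a borked driver install.
    return "elevated"

print(idle_state(300, 150))   # normal
print(idle_state(501, 1500))  # elevated
```

So the 501/1500 readings discussed below would classify as "elevated" and warrant the monitor/driver checks above.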


----------



## javamocha

Quote:


> Originally Posted by *rdr09*
> 
> I think your card should idle at 300 core / 150 memory. Things that might make those speeds stay higher are a 144Hz monitor, multi-monitor setups, or just a borked driver install.


Thanks for your reply.
What should I do then? If I'm not mistaken, it started after I installed the 16.7.3 driver.

I'm a total noob at tuning GPUs.


----------



## rdr09

Quote:


> Originally Posted by *javamocha*
> 
> Thanks for your reply.
> What should I do then? If I'm not mistaken, it started after I installed the 16.7.3 driver.
> 
> I'm a total noob at tuning GPUs.


If you are not running a 144Hz monitor or using more than one monitor, then it should go down to 300/150. Are you sure this is happening while just sitting idle on the desktop? 'Cause watching YouTube or using Flash can load the memory.

Anyway, you can revert back to 16.7.2, which is pretty much the same as 16.7.3, and see if that helps.

I normally just use the uninstall feature in Crimson before finally installing it. I don't use DDU; have you?


----------



## javamocha

Quote:


> Originally Posted by *rdr09*
> 
> If you are not running a 144Hz monitor or using more than one monitor, then it should go down to 300/150. Are you sure this is happening while just sitting idle on the desktop? 'Cause watching YouTube or using Flash can load the memory.
> 
> Anyway, you can revert back to 16.7.2, which is pretty much the same as 16.7.3, and see if that helps.
> 
> I normally just use the uninstall feature in Crimson before finally installing it. I don't use DDU; have you?


Yes, it's happening... I don't know why... it sits at 501/1500 at idle, no browser or YouTube open. And unfortunately I used DDU, because updating from Crimson failed a couple of times when I tried to go from 16.7.2 to 16.7.3.

Did you mean the AMD Clean Uninstall Utility? Because I can't find the uninstall feature in Crimson (noob).









Update:
I just rolled back the driver to 16.7.2,
and the clocks are back to normal, 300/150. The problem is in the new driver, I guess.


----------



## rdr09

Quote:


> Originally Posted by *javamocha*
> 
> Yes, it's happening... I don't know why... it sits at 501/1500 at idle, no browser or YouTube open. And unfortunately I used DDU, because updating from Crimson failed a couple of times when I tried to go from 16.7.2 to 16.7.3.
> 
> Did you mean the AMD Clean Uninstall Utility? Because I can't find the uninstall feature in Crimson (noob).


I see others use DDU with success. I just don't use it.

This is just a suggestion.

1. Download 16.7.2 from here . . .

http://support.amd.com/en-us/download/desktop/previous?os=Windows%2010%20-%2064

2. After download, run it like you are going to install it and you'll see this . . .



Choose Uninstall>Express.

3. Run it again but this time Install it. I normally uncheck Evolved 'cause I don't use it. Reboot.

4. Check your clocks.

EDIT: I also uncheck - Automatically download latest drivers. I always download manually.


----------



## javamocha

Quote:


> Originally Posted by *rdr09*
> 
> I see other use DDU with success. I just don't use it.
> 
> This is just a suggestion.
> 
> 1. Download 16.7.2 manually from here . . .
> 
> http://support.amd.com/en-us/download
> 
> make sure you pick the right Windows OS.
> 
> 2. After download, run it like you are going to install it and you'll see this . . .
> 
> 
> 
> Choose Uninstall>Express.
> 
> 3. Run it again but this time Install it. I normally uncheck Evolved 'cause I don't use it. Reboot.
> 
> 4. Check your clocks.
> 
> EDIT: I also uncheck - Automatically download latest drivers. I always download manually.


The clocks are normal again after I rolled back to the 16.7.2 driver: 300/150. I guess the problem was the newest driver... I'll skip it. Thanks for the reply and for taking the time to explain it to me... I appreciate it.


----------



## Bubblewhale

Quote:


> Originally Posted by *Roboyto*
> 
> @JackCY
> is absolutely correct. You won't see any RX480 with 8+6 pin. The chip is designed around a single 6-pin to begin with...there will be some with 6+6, but you'll likely see 8-pin as that takes up less real-estate on the PCB.
> 
> I also agree about ASUS card. Hopefully they actually have a cooler designed specifically for the 480 and not something they're just going to slap on there that was designed for the competitions GPU; I'm speaking of DC2 290 cards in particular here...the coolers were simply taken from the 780 and stuck on there which resulted in 3/5 heatpipes being utilized.


Well, there are reviews for the Strix 480 now and I'd say it's fine; having 8+6 on a 480 is probably a dream anyway.


----------



## Roboyto

Quote:


> Originally Posted by *Bubblewhale*
> 
> Well, there are reviews for the Strix 480 now and I'd say it's fine; having 8+6 on a 480 is probably a dream anyway.


Except for the fact that they did *exactly* what I said they could do...they took the cooler from the 1080 and slapped it on the RX 480.

I quote from TPU:

"ASUS's thermal solution uses five heatpipes; *four* of these directly touch the GPU's surface. You can also see a gray thermal pad that cools the voltage regulation circuitry. This is the *exact same heatsink* as on the ASUS GTX 1080 STRIX Gaming."

https://www.techpowerup.com/reviews/ASUS/RX_480_STRIX_OC/4.html



See how they say that '*four* of these (heatpipes) directly touch the GPU's surface'? Maybe I'm wrong, but it looks to me like the thermal paste is technically touching four of the heatpipes, but the GPU isn't. The GPU is only effectively utilizing 2 of the heatpipes while partially contacting the 3rd. There may be a minuscule fraction of the corner of the GPU die contacting the 4th pipe... but from the looks of the picture, you're only reaping the *cooling benefits of ~2.5 of the heatpipes*.

Just like the 290/780 scenario I mentioned. If they had altered it to better suit the chip, performance likely could have been better... especially in the noise department, when the MSI Gaming is *7dB* quieter while running only 5C hotter, with only 2 fans! https://www.techpowerup.com/reviews/MSI/RX_480_Gaming_X/22.html

I looked at another review that altered the temperature target to 75C and it still doesn't best the MSI: http://www.kitguru.net/components/graphic-cards/zardon/asus-rx-480-strix-gaming-oc-aura-rgb-8192mb/27/

Also louder than MSI here: http://hexus.net/tech/reviews/graphics/94945-asus-rog-strix-radeon-rx-480-oc/?page=13

Their custom PCB doesn't appear to have given additional OC headroom, at least with WattMan. Maybe we will see more out of the cards once end users have them in their hands and can use better OC software... TPU could have gotten a poor sample.

The MSI looks like the better card to me currently...but we will see once there is more information from end users.

I highly doubt we will see an 8+6. I can't envision a tangible benefit to doing this.


----------



## JackCY

Technically it touches 4, but cooling-wise only 2 have any significant contact. Obviously the dies are so small that getting more pipes to touch is impossible. This goes for any small die being cooled with many heatpipes.
ASUS, MSI, most of them reuse coolers from previously released cards, especially NV cards. MSI reuses the 1060 cooler, but at least it works, it seems.
ASUS is not good at cooling GPUs, nor with bloody RMAs. I'm still waiting for the refund they said they would give. I'll have to start emailing again, maybe even writing to ASUS Taiwan again to kick their bum, because it has been a week since the card arrived back to them.


----------



## Roboyto

Quote:


> Originally Posted by *JackCY*
> 
> Technically it touches 4, but cooling-wise only 2 have any significant contact. Obviously the dies are so small that getting more pipes to touch is impossible. This goes for any small die being cooled with many heatpipes.
> ASUS, MSI, most of them reuse coolers from previously released cards, especially NV cards. MSI reuses the 1060 cooler, but at least it works, it seems.
> ASUS is not good at cooling GPUs, nor with bloody RMAs. I'm still waiting for the refund they said they would give. I'll have to start emailing again, maybe even writing to ASUS Taiwan again to kick their bum, because it has been a week since the card arrived back to them.


The GTX 1080 has better thermals/acoustics with similar heatpipe contact. That's why it's pure rubbish that they couldn't do something to customize the cooler for this card.

A large copper shim/plate to help transfer heat to the pipes that aren't in direct contact would be a pretty simple fix, like MSI has done in the past.



I wish you the best of luck with the ASUS RMA, but from personal experience you're more likely to be insulted than helped.


----------



## MrPerforations

Hello 280X owners,
I've been testing full-screen apps for a while now, but I can't get my pair in CrossFire to clock up to the stated 1070MHz that they are supposed to do.








This is very annoying, as I bought the pair to replace a single card because they had a higher stock clock.







Yes, I have three 280Xs, and one is doing the right thing of running at 950MHz.
*Has anyone actually got the card to the stated 1070MHz clock on the box, please?*
Using a sensor like GPU-Z or something, as I've found that benchmarking software lies about the clock.
*Has anyone got a pair in CrossFire to the stated 1070MHz clock on the box, please?*
Or is it due to CrossFire?


----------



## neurotix

Quote:


> Originally Posted by *MrPerforations*
> 
> Hello 280X owners,
> I've been testing full-screen apps for a while now, but I can't get my pair in CrossFire to clock up to the stated 1070MHz that they are supposed to do.
> 
> 
> 
> 
> 
> 
> 
> 
> This is very annoying, as I bought the pair to replace a single card because they had a higher stock clock.
> 
> 
> 
> 
> 
> 
> 
> Yes, I have three 280Xs, and one is doing the right thing of running at 950MHz.
> *Has anyone actually got the card to the stated 1070MHz clock on the box, please?*
> Using a sensor like GPU-Z or something, as I've found that benchmarking software lies about the clock.
> *Has anyone got a pair in CrossFire to the stated 1070MHz clock on the box, please?*
> Or is it due to CrossFire?


This usually happens with all AMD cards, depending on the game. It can also be that whatever software you are using to monitor clocks is reporting them incorrectly.

Make sure you are using a DirectX 11 game, the newer the better. Also let us know what monitoring software you are using.

Basically, in older (DX9) or less demanding games, your cards might not hit the rated clock speed because they don't need to in order to hit 60 fps (or whatever the refresh rate of your monitor is).


----------



## aaronsta1

Quote:


> Originally Posted by *MrPerforations*
> 
> Hello 280X owners,
> I've been testing full-screen apps for a while now, but I can't get my pair in CrossFire to clock up to the stated 1070MHz that they are supposed to do.
> 
> 
> 
> 
> 
> 
> 
> 
> This is very annoying, as I bought the pair to replace a single card because they had a higher stock clock.
> 
> 
> 
> 
> 
> 
> 
> Yes, I have three 280Xs, and one is doing the right thing of running at 950MHz.
> *Has anyone actually got the card to the stated 1070MHz clock on the box, please?*
> Using a sensor like GPU-Z or something, as I've found that benchmarking software lies about the clock.
> *Has anyone got a pair in CrossFire to the stated 1070MHz clock on the box, please?*
> Or is it due to CrossFire?


If you have CrossFire, they will all set themselves to the lowest clock speed.

You will probably want to set that one card at 950 to 1070 in the control panel.
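The sync-to-slowest behavior described above can be sketched as a one-liner. A minimal Python illustration, assuming the driver simply matches every linked card to the slowest one (a simplification of what CrossFire actually does per frame):

```python
# Simplified model of CrossFire clock syncing: linked cards run at
# the slowest card's clock, so one 950 MHz card drags the pair down.

def effective_clock(card_clocks_mhz):
    """Effective core clock across CrossFire-linked cards."""
    return min(card_clocks_mhz)

# A 1070 MHz card paired with a 950 MHz card runs at 950:
print(effective_clock([1070, 950]))  # 950
```

This is why bumping the 950 card up to 1070 in the control panel lets the whole pair run at the advertised clock.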


----------



## JackCY

Quote:


> Originally Posted by *Roboyto*
> 
> The GTX 1080 has better thermals/acoustics with similar heatpipe contact. That's why it's pure rubbish that they couldn't do something to customize the cooler for this card.
> 
> A large copper shim/plate to help transfer heat to the pipes that aren't in direct contact would be a pretty simple fix, like MSI has done in the past.
> 
> 
> 
> I wish you the best of luck with the ASUS RMA, but from personal experience you're more likely to be insulted than helped.


The plates don't help much, if at all, I think; the heatpipes have contact with each other, so there shouldn't be a need for a plate. But it shouldn't hurt much if one is used.

ASUS actually gets better at communication when you step on them via ASUS Taiwan. Sure, the stuff they send back from RMA is unusable, but otherwise they do try to communicate and speed things up.
Right now the stupid retailer is being the slow one, and while they complicate things by doing everything by the book when it comes to me, the customer (down the supply chain), they do not do things by the book when it comes to their distributor and ASUS (up the supply chain).

If ASUS could pay me back directly, I would already have my money back, but it's been a week since the retailer got the card and they still haven't gotten it to ASUS, so ASUS can't start the lengthy refund process because they don't have the gorram card back yet.

In some ways the EU warranty process is OK, but overall it is unnecessarily complicated: if there are 10 distributors/resellers, then everything has to go through all 10 of them if done by the book. I have 3 in my case that I know about, and I already skipped 1 to make the whole thing shorter and faster; the shop tried the same, but now they shouldn't with the refund, and they were still trying to, from what ASUS told me...

While ASUS can't properly deal with warranties of EOL products, or ship cards internally without damaging them, when you step on their toes via ASUS customer service Taiwan HQ they do try to resolve it, call, apologize for this and that, etc.
Unfortunately I don't have any such trick for the retailer. I knew the owner of one of the big computer shops here decades ago when the shop was small, but that's a different shop.


----------



## Roboyto

Quote:


> Originally Posted by *JackCY*
> 
> The plates don't help much, if at all, I think; the heatpipes have contact with each other, so there shouldn't be a need for a plate. But it shouldn't hurt much if one is used.


I swear sometimes you just type things for the sake of typing things.

I think the plate actually does make a difference, especially with the RX 480.

MSI is using a smaller cooler with fewer fans and manages a card that is *8dB quieter* (that's a lot) while only sacrificing 5C in thermals.
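For a sense of scale on that 8dB figure: sound power scales as 10^(dB/10), and a common rule of thumb is that perceived loudness roughly doubles every 10dB. A quick Python sketch (illustrative rules of thumb, not a psychoacoustic model):

```python
# What an 8 dB noise gap means, using two common rules of thumb.

def power_ratio(db: float) -> float:
    """Sound-power ratio implied by a dB difference."""
    return 10 ** (db / 10)

def loudness_ratio(db: float) -> float:
    """Approximate perceived-loudness ratio (2x per 10 dB rule of thumb)."""
    return 2 ** (db / 10)

diff_db = 8.0  # MSI vs ASUS gap cited above
print(f"{power_ratio(diff_db):.2f}x sound power")
print(f"{loudness_ratio(diff_db):.2f}x perceived loudness")
```

So 8dB is roughly 6x the sound power, and to the ear the louder card sounds somewhere around 1.7x as loud: a very audible gap, as the post says.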


----------



## JackCY

Quote:


> Originally Posted by *Roboyto*
> 
> I swear sometimes you just type things for the sake of typing things.
> 
> I think the plate actually does make a difference, especially with the RX 480.
> 
> MSI is using a smaller cooler with fewer fans and manages a card that is *8dB quieter* (that's a lot) while only sacrificing 5C in thermals.


The ASUS was, IMHO, tested with the broken temp target of 65C instead of 75C; many 480 reviews are messed up by that. The MSI 480 and 1060 coolers are the same.


----------



## Agiel

I think the problem is that my PSU can't handle my system. In the specs I read 45A at 552W. Sometimes my games crash, sometimes my PC shuts off. I may need to look for another PSU, damn it!
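As a rough sanity check on those numbers: a GPU and CPU draw almost entirely from the 12V rail, so the rail's amp rating matters more than the label wattage. A back-of-envelope Python sketch, where the ~250W GPU and ~150W rest-of-system figures are assumptions for illustration, not measurements:

```python
# Back-of-envelope 12 V rail check. The 45 A rating is from the post
# above; the component draws below are assumed illustrative figures.

RAIL_VOLTS = 12.0

def rail_watts(amps: float) -> float:
    """Maximum power the 12 V rail can deliver."""
    return amps * RAIL_VOLTS

def headroom_watts(amps: float, estimated_draw_w: float) -> float:
    """12 V capacity left after the estimated system draw."""
    return rail_watts(amps) - estimated_draw_w

gpu_w = 250.0   # rough figure for a stressed R9 280X (assumed)
rest_w = 150.0  # CPU, board, drives (assumed)
print(rail_watts(45.0))                      # 540.0
print(headroom_watts(45.0, gpu_w + rest_w))  # 140.0
```

Note that 45A x 12V is only 540W, less than the 552W on the label, and transient spikes can eat that remaining headroom, which is consistent with the crashes and shutdowns described.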


----------



## Roboyto

Quote:


> Originally Posted by *JackCY*
> 
> The ASUS was, IMHO, tested with the broken temp target of 65C instead of 75C; many 480 reviews are messed up by that. The MSI 480 and 1060 coolers are the same.


If that is the case, then it couldn't meet that target while being nearly as loud as the reference cooler. And if the MSI's target was 75, why did it peak at 73? It could have been even quieter running at 75.

Even if the target temp was 75, there's no way it would eliminate the 8dB gap. MSI is just better in the GPU market if you ask me... and most likely better with RMAs. I can't speak from experience, but they can't be worse than ASUS.

Yes, MSI may be using the same cooler, but at least it does a good job on BOTH cards.


----------



## JackCY

The targets aren't meant to be met 100% of the time.







Doing so makes the fans spin up and down too much, which is the most noticeable thing. The fans ramp up well before the target.
The MSI cooler is OK, but it has gotten MSI cards way too hyped, which skyrockets their pricing. Plus all the MSI cards look like one another.


----------



## ColdDeckEd

Quote:


> Originally Posted by *xLPGx*
> 
> My cards have 61% and 63% ASIC. The 61% one won't get over 1130 being stable on stock 1.2V (?). I'm gonna test my other card sometime whenever I can but It's stable at 1130 too.
> Gigabyte rev 1 280xs with 1100 being stock.
> Is it relatively good or not? I don't know what people get out of these generally.
> 
> Locked voltage sucks


Dude, search for the Gigabyte 280X voltage BIOS unlock thread; you can flash a BIOS and add voltage. I just did it on my GB 280X WF OC rev 2. At stock (locked) voltage it was at 1.09 and I was able to OC to 1140. With the new BIOS I'm at 1240 core, 1650 on memory. I think I finally won the silicon lottery with this card, though my ASIC is 71ish. I can get through benchmarks at 1250, so I think a little more voltage can get me to 1260. Just be sure to follow all the directions: set your BIOS switch to 1, back up your BIOS, and then follow the thread.

http://www.overclock.net/t/1450456/gigabyte-r9-280x-v2-voltage-lock-bypass-through-bios-mod

FYI, voltage control is still locked in Trixx, and AB hasn't been able to work around it yet, but I can see in GPU-Z that my VDDC is indeed 1.3; I just have to change voltage through the BIOS ATM.


----------



## oddcam

I've got the *Gigabyte R9 280x Rev 1.0* and I would like to update to the newest BIOS, but I'm unsure of which one to pick; the Gigabyte site (link below) has 3 different versions available - F71, F4, and F13.

http://www.gigabyte.us/products/product-page.aspx?pid=4845#bios

Windows lists my BIOS information as 113-C3865000-X01

GPU-Z says my BIOS version is 015.038.000.003.003435

Secondly, has anyone here had any success raising the voltage on this thing? I saw a thread where many users were able to do it on Rev 2.0 and 3.0, but none for 1.0. Is it possible? Is it worth it? My card runs about 40C idle, 76C max, and I haven't overclocked it so far because it looks like there is very little stability headroom (+40 core clock, +140 memory clock), and I'm also a total noob at overclocking and don't want to ruin anything.

Does anyone have knowledge of this? Thanks in advance for any help!


----------



## ColdDeckEd

Check out the thread I linked; I'm pretty sure there's a rev 1 BIOS in there. Download the Gigabyte BIOS flash program to find out what BIOS version you have. Use the Gigabyte BIOS flasher to back up your switch-1 BIOS only; when you go to save the original, it should say what BIOS version you have. Do not use that program to flash, though. Better to use the command-line version of atiflash, IMO; it just works better.

No offense, but did you even bother to read the post just one above yours?

Don't sweat being a noob; everyone was at one point. One very nice feature of these cards is the BIOS switch, so if you mess up there's still a way to use the card. Still, you will want to back up your original. Just read, read, and read some more, and follow all the directions.


----------



## MrPerforations

Hello again,
I did manage to get it to go to 1070 in World of Tanks, of all things, which I don't even want to play.
So it can hit the target clock.


----------



## Hequaqua

OK....I'm not late to the party...but just late.









Back Story:
I purchased this card on 2/1/2014. I used it in an old Dell 4700 computer. I played BF4 with it and it was the cat's meow. I didn't do any overclocking at all on it; I really didn't know much about overclocking in general.

Fast Forward:
I purchased the nVidia GTX 970 on 9-24-2014 to replace this card. I then set about learning how to overclock GPUs. I learned how to overclock and mod the BIOS. I then built another rig (Transformer in sig), so my son has been on it ever since.

Present Day:
I decided after nVidia/AMD released their new cards that it was time for an upgrade. I'm not partial to either brand, so I decided that for the perf/$$ I would upgrade his rig to the PowerColor Devil RX480. It will be here in a few days. Today I swapped out one of my 970s for the R9 270X (oh, it's a Sapphire 4GB model). I set about doing some baseline benchmarks. I left it at stock so I can compare the two cards later. As painful as it was to watch... I set about running Valley/Heaven/Firestrike/Time Spy. I believe the base clocks are 1070/1400; it has no problems maintaining those. I ran each test twice, and was actually surprised at the scores (not that they are great, but very consistent). I only used the latest driver. I may go back a few sets and see if there is much difference.

In my short time... I was able to get to 1150/[email protected] (Valley only). I didn't try any of the others. I did the benchmarks pretty much how I do them with my 970s (they have modded BIOS, so no software for core/memory/voltage). I use Afterburner, set the power limit all the way up, and set the fans to about 75%. When I do my other cards, I normally set them to 100% because of being in SLI. I tried 1170/1500. It didn't crash, but did hiccup a few times and dropped the score below what I got at 1150/1500.

Do you think this is the limit?

Is there anyway to add more voltage?

Is there a guide on modding the bios, if it can be done?

Again...sorry I'm late...and honestly, I didn't gain but about 2-3fps in Valley. I'm also not sure what I will do with the card once I get the RX470.

Thanks for the patience...hope I got everything explained. I am sure the answers to my questions are in this thread(I tried the search function), but there are just way too many pages.









EDIT: The final clocks that I was able to reach and be stable were 1125/6000.


----------



## neurotix

Sapphire cards should have unlocked voltage. I've had probably 10 Sapphire cards since 2008, and every single one had unlocked voltage.

Use Sapphire Trixx to add more voltage.

There should be no real reason to modify the bios. You can probably get higher clocks just by adding more voltage. Make sure you raise the power limit all the way, too.

The "2-3" fps in Valley is basically normal for that card. If you use 3dmark11 or Fire Strike you should see a good difference in score overclocked vs not overclocked. In Valley you probably got around 35 fps or close to it, and OCing probably bumped it up closer to 40. Well, if you're not on LN2, then 40 fps in Valley Extreme HD is basically the best you can hope for with a 270X. Even my "golden" 270X that could pass Valley at 1300mhz only got like 42 fps. (And it was totally unstable in everything else and would crash immediately.)


----------



## Alexium

Guys, how's 72° C typical / 76° peak for VRM temp on a reference 280X PCB?


----------



## neurotix

That's actually excellent.

I would guess your core is around 60C.

Most VRMs are actually safe up to 120C so you have nothing to worry about.


----------



## Alexium

Thank you! I was worried it approaches 80. Aren't capacitors typically rated at 105? I mean, the MOSFETs themselves might hold up to 120, but not necessarily their surroundings.

My core is topping at 51° at 1050 MHz / 1.125V in games (didn't dare nor care to run Furmark or Kombustor). I use a universal GPU waterblock, and cooling VRM separately is always a pain in the rear end.


----------



## JackCY

Caps differ; not all are rated for over 100C. Also, FETs have many characteristics, and reading them properly is not as easy as "oh yeah, it can do 50A at 125C continuously" (it most likely can't). But they are usually good enough as long as you cool them.


----------



## Retrorockit

I have a couple of questions that relate to multi-GPU setups. Since the R9 280 is a rebadged HD 7970, two of which went into the dual-GPU HD 7990, I thought I'd ask here.

Does a dual-GPU single-card setup perform "better" (not faster) than a CrossFire setup? Less micro-stutter?

I know that memory doesn't add up in multi-GPU setups: 2x3GB = 3GB. But I also know video cards use system memory from the top down to store textures. Do multiple GPUs need more system memory for this purpose?
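The 2x3GB = 3GB point can be sketched as a simplified model: with alternate-frame rendering, each card holds a full copy of the frame data, so the usable pool is the smallest card's VRAM rather than the sum. A minimal Python illustration of that model (a simplification; real drivers complicate this):

```python
# Simplified AFR multi-GPU memory model: every card mirrors the full
# frame data, so usable VRAM is the smallest card's pool, not the sum.

def usable_vram_gb(cards_gb):
    """Usable VRAM across mirrored multi-GPU cards."""
    return min(cards_gb)

print(usable_vram_gb([3, 3]))  # 3, not 6 (the 2x3GB = 3GB case above)
```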


----------



## AyyMD

My most recent benchmark



It's a Sapphire R9 280x Tri-X TOXIC, and it's clocked at 1200/1600.


----------



## MiladEd

1200 MHz core! That's a good OC!


----------



## AyyMD

Quote:


> Originally Posted by *MiladEd*
> 
> 1200 MHz core! That's a good OC!


It was my friend's before (and everything he buys is like silicon platinum), and he managed to reach that clock speed with a Corsair H55 zip-tied to it. When I got it from him, it was clocked like that thanks to the custom BIOS.

Hell, he got 1.8GHz on his ASUS R9 390's memory clock.


----------



## JackCY

Fingers crossed, I may finally get my money back tomorrow from the endless R9 280X RMA that started May 17th...
ASUS already agreed to a refund a month ago, but you know, with the distributors, retailer, etc., it takes time to hunt down the money and actually get it.


----------



## Echoa

Quote:


> Originally Posted by *Retrorockit*
> 
> I have a couple of questions that relate to multi-GPU setups. Since the R9 280 is a rebadged HD 7970, two of which went into the dual-GPU HD 7990, I thought I'd ask here.
> 
> Does a dual-GPU single-card setup perform "better" (not faster) than a CrossFire setup? Less micro-stutter?
> 
> I know that memory doesn't add up in multi-GPU setups: 2x3GB = 3GB. But I also know video cards use system memory from the top down to store textures. Do multiple GPUs need more system memory for this purpose?


No, it's the exact same thing as CrossFire; they just do it on the board instead of over the PCIe bus.


----------



## aaronsta1

I have 2 R9 270X cards, one MSI and one Sapphire. They have been pretty happy in my living room gaming PC for a couple of years, hooked up to my 55-inch 1080p LED TV.
The MSI card started giving me red screens and white dots all over the screen, so I sent it back. (I tested this by taking the Sapphire card out, and it was still doing it.)

i just got the new one today.. well its not new.. it looks like they removed the heatsink as the screws are all stripped out, and the card smells like it was burnt.. like maybe they reflowed it?

its not the card i sent them, as the serial numbers dont match and it has hynx 4bfr ram instead of the 4afr ram my old one had.. it also shows the vrm temps in gpu-z and my old one did not.

the MSI card works great by itself.. the Sapphire card works great by itself..
but, if i plug the crossfire cable into the 2 cards, its not working. as soon as i fire up a game, black screen.. a couple times windows 10 says it rebooted to keep my computer from being damaged or something and once i got a system thread not handled message, but i think it was from really old drivers i tried and it didnt like the game i tried to play with them.

when the crossfire cable is on the 2nd cards fans never turn on.. also the clocks on the primary card never go above 300/150.. right clicking the desktop or opening a window takes FOREVER.. its like my pc is going 1 or 2 fps.

i do have a couple crossfire cables and i tried them both.

i was thinking its a driver issue, but ive uninstalled and reinstalled several driver versions.. i also put my 7870XTs in, and they work with no issues.. as well as the rx 480s that i just bought to replace these ones...

im almost certain its the MSI card.. before i send it back i was seeing if anyone else had this issue..

i plan on selling these 270xs and i dont want to give someone a bad one.


----------



## lanofsong

Hey R9 270/270X and 280/280X owners,

Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.

September Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) - you'll need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name)
4. Enter your passkey
5. Enter the Team OCN number - 37726

later
lanofsong


----------



## aaronsta1

Quote:


> Originally Posted by *aaronsta1*
> 
> I have 2 R9 270X cards, one MSI and one Sapphire. They've been pretty happy in my living room gaming PC for a couple of years, hooked up to my 55-inch 1080p LED TV.
> The MSI card started giving me red screens and white dots all over the screen, so I sent it back. (I tested this by taking the Sapphire card out, and it was still doing it.)
> 
> I just got the new one today. Well, it's not new: it looks like they removed the heatsink, as the screws are all stripped, and the card smells like it was burnt, like maybe they reflowed it?
> 
> It's not the card I sent them, as the serial numbers don't match and it has Hynix 4BFR RAM instead of the 4AFR RAM my old one had. It also shows the VRM temps in GPU-Z, and my old one did not.
> 
> The MSI card works great by itself. The Sapphire card works great by itself.
> But if I plug the CrossFire cable into the 2 cards, it doesn't work. As soon as I fire up a game, black screen. A couple of times Windows 10 said it rebooted to keep my computer from being damaged or something, and once I got a "system thread not handled" message, but I think that was from some really old drivers I tried that didn't like the game I played with them.
> 
> When the CrossFire cable is on, the 2nd card's fans never turn on, and the clocks on the primary card never go above 300/150. Right-clicking the desktop or opening a window takes FOREVER; it's like my PC is running at 1 or 2 fps.
> 
> I do have a couple of CrossFire cables, and I tried them both.
> 
> I was thinking it's a driver issue, but I've uninstalled and reinstalled several driver versions. I also put my 7870 XTs in, and they work with no issues, as do the RX 480s that I just bought to replace these.
> 
> I'm almost certain it's the MSI card. Before I send it back, I wanted to see if anyone else has had this issue.
> 
> I plan on selling these 270Xs, and I don't want to give someone a bad one.


Well, after all this time, they finally sent me an email saying they don't have any more R9 270Xs and they will replace the card with an R9 380 Gaming 4G.


----------



## xLPGx

I'm seriously considering swapping out my 280Xs, or just removing one of them, for the sake of noise. I got myself a pair of HD 558s, which are open-back, so I can hear the fans now compared to my G430s.

I do really like knowing I have the extra GPU horsepower, though, even if I've found I rarely use it. It's a difficult decision. My wallet won't allow an upgrade that can give me both the quietness of a single card and the performance of two 280Xs.


----------



## Haunebu

Hello, I'm new here. Is there software to check a video card's VRM temperature? I have a Gigabyte R9 280X WindForce rev 2.0, and in some games I see artifacts.


----------



## bardacuda

Try using GPU-Z. If your card has a sensor for it, you should see it under the "Sensors" tab. Not all cards have a sensor for this, however, so in that case you might need to use an infrared thermometer and take a reading from the back of the PCB where the VRMs are. If your card has a backplate, you may need to remove it to do that.


----------



## MiladEd

Guys, does anybody know the exact dimensions of a reference R9 280X PCB, along with the screw holes and core/MOSFET locations? In an attempt to improve the aesthetics of my build, I'm planning to design a backplate for my card and have it laser-cut. FYI, I've got a Sapphire R9 280X Dual-X, which uses the reference HD 7970 PCB design.

Also, I did find some compatible EKWB backplates, but I couldn't find their dimensions anywhere.

Thanks a lot in advance.


----------



## bardacuda

Measuring it won't work?


----------



## MiladEd

Sorry, forgot to mention. I don't have the card with me right now. :|


----------



## Haunebu

Quote:


> Originally Posted by *bardacuda*
> 
> Try using gpu-z. If your card has a sensor for it you should see it under the "Sensors" tab. Not all cards have a sensor for this, however, so in that case you might need to use an infrared thermometer and take a reading from the back of the PCB where the VRMs are. If your card has a backplate you may need to remove it to do that.


Alright, thank you!


----------



## bardacuda

Quote:


> Originally Posted by *MiladEd*
> 
> Guys, does anybody know the exact dimensions of a reference R9 280X PCB, along with the screw holes and core/MOSFET locations? In an attempt to improve the aesthetics of my build, I'm planning to design a backplate for my card and have it laser-cut. FYI, I've got a Sapphire R9 280X Dual-X, which uses the reference HD 7970 PCB design.
> 
> Also, I did find some compatible EKWB backplates, but I couldn't find their dimensions anywhere.
> 
> Thanks a lot in advance.


Well, I tried, but I can't seem to find the dimensions anywhere either. All I could find is that the PCB number is 109-C38637-00, but no schematic, datasheet, or anything like that. However, I did find one thread where a guy made his own backplates, and he mentioned he got some dimensions by sizing a picture in Photoshop, though I'm not sure exactly how. His cards were in fact 7970s, so you could try PMing him; if you're lucky, he still has the dimensions and they're compatible. Here is the thread:

http://www.overclock.net/t/1424387/gallery-build-log-ultimate-wall-mount-rig-maxxplanck-v2-completed/


----------



## MiladEd

Quote:


> Originally Posted by *bardacuda*
> 
> Well, I tried, but I can't seem to find the dimensions anywhere either. All I could find is that the PCB number is 109-C38637-00, but no schematic, datasheet, or anything like that. However, I did find one thread where a guy made his own backplates, and he mentioned he got some dimensions by sizing a picture in Photoshop, though I'm not sure exactly how. His cards were in fact 7970s, so you could try PMing him; if you're lucky, he still has the dimensions and they're compatible. Here is the thread:
> 
> http://www.overclock.net/t/1424387/gallery-build-log-ultimate-wall-mount-rig-maxxplanck-v2-completed/


Thanks for your effort.

I finally managed to do what that guy did. I found a PCB shot, measured the HDMI port on my laptop, and resized the picture so the HDMI port on the PCB shot would be the same size as what I measured.

Anyway, I designed this backplate in AutoCAD.



There are no screw holes because I decided I was not comfortable putting screws through my GPU (I know it already has the holes, but still). So I'll mount it with small pieces of double-sided tape. Any suggestions on the backplate design?
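The photo-scaling trick above boils down to a simple ratio: measure one known-size feature (the HDMI port) both in the photo and in reality, then use that scale to convert any other pixel distance. A minimal sketch — the 14.0 mm port width is an assumption (roughly an HDMI Type-A receptacle) and the pixel counts are made-up examples; measure your own card for accurate results:

```python
# Derive a mm-per-pixel scale from one feature of known real-world size,
# then convert any pixel measurement in the same photo to millimetres.

def mm_per_pixel(real_width_mm: float, pixel_width: float) -> float:
    """Scale factor from a known-size feature measured in the photo."""
    return real_width_mm / pixel_width

def px_to_mm(pixels: float, scale: float) -> float:
    """Convert a pixel distance in the photo to real-world millimetres."""
    return pixels * scale

# Example: the HDMI port measures 112 px in the PCB shot and ~14.0 mm in reality.
scale = mm_per_pixel(14.0, 112)   # 0.125 mm per pixel
print(px_to_mm(2144, scale))      # a 2144 px PCB length -> 268.0 mm
```

The same scale applies to every distance in the photo (screw-hole spacing, core location), as long as the shot is taken square-on so there's no perspective distortion.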


----------



## bardacuda

Quote:


> Originally Posted by *MiladEd*
> 
> There are no screw holes because I decided I was not comfortable putting screws through my GPU (I know it already has the holes, but still). So I'll mount it with small pieces of double-sided tape. Any suggestions on the backplate design?


I don't have any design suggestions, but as far as the mounting goes: I'm assuming the backplate is going to be made of conductive metal, in which case I would probably go for non-conductive adhesive thermal pads, to make sure that any components that need it can benefit from the passive cooling, and so it stands far enough off the board and solder points that you don't short anything out. Then again, depending on how heavy it is and how good the adhesive is, it could droop and fall off given enough time, so a few screws with standoffs would probably be a good idea too.


----------



## MiladEd

Quote:


> Originally Posted by *bardacuda*
> 
> I don't have any design suggestions, but as far as the mounting, I'm assuming the backplate is going to be made of conductive metal, in which case I would probably go for non-conductive adhesive thermal pads, to make sure that any components that need to can benefit from the passive cooling, and also it stands far enough off the board and solder points so you don't short anything out. Then again depending on how heavy it is and how good the adhesive is, it could potentially droop and fall off given enough time so a few screws with standoffs would probably be a good idea too.


Actually, it's not made of metal. I'm having it made out of dark (or white, haven't decided yet) fogged-out plexiglass. I don't think cooling would be any different from now, since the top of the core and the VRM are exposed, and I don't think the other components create enough heat to benefit from heat transfer through the back of the PCB.

Anyway, I'm personally quite pleased with how the design turned out. I also made a small revision, shrinking the square in the middle of the AMD logo on top of the core. I'm also thinking of possibly doing some kind of design for the hole above the VRMs, but I'm not sure yet.


----------



## NameUnknown

Do you guys think it's worth upgrading at this point, or waiting until something better comes along than what we have available now? Or should I just get a second 280X and be done with it?


----------



## Haunebu

Guys, I've got a Gigabyte WindForce R9 280X and I can't change the voltage in MSI Afterburner. Do I need to flash the BIOS, or is there any other way to unlock voltage control?


----------



## bardacuda

Quote:


> Originally Posted by *Haunebu*
> 
> Guys, I've got Gigabyte Windforce R9 280X, I can't change voltage in MSI Afterburner. I need to flash the bios or is there any other way to unlock voltage control?


Voltage control is locked by default. You have to unlock it in Settings > General. If you've done that and it's still not working, I would try a different program first. If that doesn't work, you might be able to do it with a BIOS edit, or your card just might not have the circuitry for it... but I'd be very surprised if it didn't. I know on my 270 I can't change the voltage even with a BIOS edit, but it's a non-X variant in a lower tier. I've never heard of a 280X that didn't allow voltage control.
Quote:


> Originally Posted by *NameUnknown*
> 
> Do you guys think its worth upgrading at this point or waiting until something better comes along than what we have available now? Or should I just get a second 280X and be done with it.


Depends what you're using your system for and whether it's struggling with anything. It's probably not worth adding a second card to these old Phenom systems, but it would definitely benefit you in GPU-limited scenarios.


----------



## NameUnknown

Quote:


> Originally Posted by *bardacuda*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NameUnknown*
> 
> Do you guys think its worth upgrading at this point or waiting until something better comes along than what we have available now? Or should I just get a second 280X and be done with it.
> 
> 
> 
> Depends what you're using your system for and if it is struggling with anything. It's probably not worth it adding a second card to these old Phenom systems but it would definitely benefit you in GPU-limited scenarios.
Click to expand...

Well, the issue I have is that ever since Windows 10, my performance is hampered in most games and crippled in others. For example, WoW went from 60 FPS to 30, GTA IV went from 50 to 12-ish, ME3 took a performance hit, and even Red Faction: Guerrilla took a major hit in heavy dust and destruction areas, from butter smooth to choppy as hell unless it's DX9. I have tried new drivers, old drivers, beta drivers, updates, etc., and nothing works. I am at the point of either building a whole new system (which would only be an insignificant overall gain), a new GPU, or a second 280X.

With my limited time to game these days, when I have the chance I want the best experience possible.


----------



## MiladEd

Quote:


> Originally Posted by *NameUnknown*
> 
> Do you guys think its worth upgrading at this point or waiting until something better comes along than what we have available now? Or should I just get a second 280X and be done with it.


Depends! I was pretty determined to upgrade to an RX 480, but eventually I decided not to, since I thought 50% more performance wasn't worth that much money. When the RX 580 or GTX 2060 or whatever it's called is out, I'll reconsider how much performance it provides against how much it costs.


----------



## bardacuda

Quote:


> Originally Posted by *NameUnknown*
> 
> Well, the issue I have is that ever since Windows 10, my performance is hampered in most games and crippled in others. For example, WoW went from 60 FPS to 30, GTA IV went from 50 to 12-ish, ME3 took a performance hit, and even Red Faction: Guerrilla took a major hit in heavy dust and destruction areas, from butter smooth to choppy as hell unless it's DX9. I have tried new drivers, old drivers, beta drivers, updates, etc., and nothing works. I am at the point of either building a whole new system (which would only be an insignificant overall gain), a new GPU, or a second 280X.
> 
> With my limited time to game these days, when I have the chance I want the best experience possible.


Ouch! That is definitely weird behaviour. Personally, I would just go back to Windows 7... even though backing up, reformatting, and restoring everything is a big headache... but that's me. Aside from that, I think the second 280X is probably the best budget option. You'd probably have to get at least a 1070 to get better performance than CrossFired 280Xs, and that much GPU would probably be bottlenecked by the CPU anyway.

EDIT: Just checked, and on Fire Strike benchmarks at least, a 2x280X/Thuban setup is roughly equal to a 1x1070/Thuban setup... so I guess you would need a 1080 to outperform dual 280Xs.

http://www.3dmark.com/compare/fs/9295820/fs/9539937


----------



## Haunebu

Sometimes when I'm browsing with Google Chrome, some weird-ass squares get drawn for a second and then disappear, kind of like artifacts. I'm also getting artifacts in some games. I don't have the card overclocked; is there any way I can fix this?


----------



## bardacuda

Quote:


> Originally Posted by *Haunebu*
> 
> Sometimes when I'm browsing with Google Chrome, some weird-ass squares get drawn for a second and then disappear, kind of like artifacts. I'm also getting artifacts in some games. I don't have the card overclocked; is there any way I can fix this?


Sounds like a dying card or an unstable overclock. You might be able to fix it by downclocking or overvolting, but overvolting might accelerate the card's death if it is in fact dying.


----------



## Haunebu

Quote:


> Originally Posted by *bardacuda*
> 
> Sounds like a dying card or an unstable overclock. You might be able to fix it by downclocking or overvolting, but overvolting might accelerate the card's death if it is in fact dying.


It's not overclocked, and I've owned this card for like a year and a half, if not more, and it's been doing this the whole time, so I guess it won't die. Also, I've been thinking about downvolting, but voltage control is locked.


----------



## bardacuda

Quote:


> Originally Posted by *Haunebu*
> 
> It's not overclocked, and I've owned this card for like a year and a half, if not more, and it's been doing this the whole time, so I guess it won't die. Also, I've been thinking about downvolting, but voltage control is locked.


Downvolting would make the problem worse. You need to downclock or overvolt or some combination of the two.


----------



## Haunebu

Quote:


> Originally Posted by *bardacuda*
> 
> Downvolting would make the problem worse. You need to downclock or overvolt or some combination of the two.


I've tried downclocking it, but it doesn't fix it and I can't overvolt it, because it's locked.


----------



## bardacuda

Quote:


> Originally Posted by *Haunebu*
> 
> I've tried downclocking it, but it doesn't fix it and I can't overvolt it, because it's locked.


Sounds like your chip is just on the fritz then, and there's no way to fix that. I actually have the same problem with my Gigabyte 270 (it has also been going on for over a year, but the card hasn't died or even seemed to get worse). I also have a Gigabyte 290; however, I've only ever used it for mining or folding, so I don't know if it has any artifacting issues. But out of all my cards, those two have the lowest ASIC Quality scores. You can check this number if you open GPU-Z and click the button in the top-right corner next to the camera button, then click on the ASIC Quality tab.

My 270, which has the artifacting issues, has a score of 67.8% (in the lowest 12.5% of similar scores), and my 290 has a score of 71.8% (in the lowest 30% of scores). When I look at either of my two XFX cards or my old ASUS card, they all have ASIC Quality scores between 75 and 90%, and they're all in the top 45-95% of similar scores. I thought maybe I just got unlucky, but I'm starting to think that Gigabyte tends to use lower-binned chips to save on production costs.


----------



## Haunebu

I'm gonna try to find a way to overvolt this card; I hope it'll fix this issue. My card has a score of 70.4%, which is higher than 69.6% of similar GPUs.


----------



## bardacuda

Quote:


> Originally Posted by *Haunebu*
> 
> I'm gonna try to find a way to overvolt this card, I hope it'll fix this issue. My card has score of 70.4%, which is higher than 69.6% of similar GPUs.


Have you tried modding the BIOS yet? It's pretty easy although it has the potential to brick your card. It's better to have a 2nd GPU around that you can boot with as your primary, in case you have to reflash a corrupted BIOS.


----------



## Haunebu

Quote:


> Originally Posted by *bardacuda*
> 
> Have you tried modding the BIOS yet? It's pretty easy although it has the potential to brick your card. It's better to have a 2nd GPU around that you can boot with as your primary, in case you have to reflash a corrupted BIOS.


No, I haven't modded the BIOS, but I've reflashed this card many times. Is there a custom BIOS that's already modded? This card has Hynix memory.


----------



## bardacuda

Quote:


> Originally Posted by *Haunebu*
> 
> No, I haven't modded BIOS, but I've reflashed this card many times. Is there any custom BIOS which is already modded? This card has Hynix memory.


You can download BIOSes from techpowerup but to avoid potential compatibility issues it's probably better to mod your own.

BIOS Editor:
https://www.techpowerup.com/forums/threads/vbe7-vbios-editor-for-radeon-hd-7000-series-cards.189089/

ATI Flashing Utility:
https://www.techpowerup.com/downloads/Utilities/BIOS_Flashing/ATI/

Couple of flashing guides:
http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards
https://www.techinferno.com/index.php?/forums/topic/1331-guide-amd-vbios-flashing/

BIOS Database:
https://www.techpowerup.com/vgabios/
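Putting those guides together, the basic atiflash sequence usually looks something like the following. A rough sketch only — flag names are as commonly shown in the linked guides, so double-check them against the version you download, run from an elevated prompt, and keep the backup somewhere safe:

```shell
atiflash -i                  # list adapters and their indices
atiflash -s 0 backup.rom     # save the current BIOS of adapter 0 to backup.rom
atiflash -p 0 modded.rom     # program the modded BIOS to adapter 0
atiflash -p 0 backup.rom -f  # recovery: force-flash the backup if something goes wrong
```

On dual-BIOS cards like many 280Xs, flipping the BIOS switch to the untouched copy is the other recovery route if a flash goes bad.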


----------



## Haunebu

Quote:


> Originally Posted by *bardacuda*
> 
> You can download BIOSes from techpowerup but to avoid potential compatibility issues it's probably better to mod your own.
> 
> BIOS Editor:
> https://www.techpowerup.com/forums/threads/vbe7-vbios-editor-for-radeon-hd-7000-series-cards.189089/
> 
> ATI Flashing Utility:
> https://www.techpowerup.com/downloads/Utilities/BIOS_Flashing/ATI/
> 
> Couple of flashing guides:
> http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cards
> https://www.techinferno.com/index.php?/forums/topic/1331-guide-amd-vbios-flashing/
> 
> BIOS Database:
> https://www.techpowerup.com/vgabios/


Thanks! I currently have the latest BIOS from the TechPowerUp site for my card.
I've never edited a BIOS before; what should I do here? Is there any way I can unlock voltage control, so I'll be able to adjust the voltage from MSI Afterburner?


----------



## bardacuda

Quote:


> Originally Posted by *Haunebu*
> 
> Thanks! I currently have the latest BIOS from the TechPowerUp site for my card.
> I've never edited a BIOS before; what should I do here? Is there any way I can unlock voltage control, so I'll be able to adjust the voltage from MSI Afterburner?


Personally, I just increase the VDDC in the last state (#0), and since I usually downclock the memory to 1375 or 1250, I change that value in the last 3 states (#5, #6, and #0 in your case). Then I just increase the Overdrive limits so I have all the play I want in the sliders in Afterburner, and do the actual overclocking there.

Example of my ASUS card's BIOS before and after modding: (ASUSorig being my original factory BIOS and ASUSgame being my modded BIOS)


I also increased the TDP, but that's just because this card would throttle if I left it at default. It has the same effect as leaving the TDP at default and increasing the power limit slider to +20% in Afterburner, but this way I can leave the slider at 0% and still have more headroom if it's needed. Also, my clock slider was capped at 1050 by default, so I had to change that, but you might not have to in your case, since you won't get up to 1500 MHz anyway. Basically, you can probably leave the Overdrive & PowerTune tab untouched and just edit the voltages under the PowerPlay tab.

As far as I know, there's no way to unlock voltage control through the BIOS, so you have to set the value in the BIOS itself when you edit it. This makes fine-tuning a PITA, since you have to re-edit, reflash, and reboot any time you want to change the voltage.


----------



## Antonius

@Haunebu I had a similar problem with my old 5870M in my laptop; a fresh install of Windows sorted it out. Maybe you can give that a try.


----------



## Haunebu

Quote:


> Originally Posted by *bardacuda*
> 
> Personally I just increase the VDDC in the last state (#0) and since I usually downclock the memory to 1375 or 1250, I change that value in the last 3 states (#5, #6, and #0 in your case). Then I just increase the overdrive limits so I have all the play I want in the sliders in afterburner, and do the actual overclocking there.
> 
> Example of my ASUS card's BIOS before and after modding: (ASUSorig being my original factory BIOS and ASUSgame being my modded BIOS)
> 
> 
> I also increased the TDP but that's just because this card would throttle if I left it at default. It has the same effect if I had left TDP at default and just increased the power limit slider to +20% in afterburner, but in this way I could leave the slider at 0% and still have more headroom if it needed it. Also my clock slider was capped at 1050 by default so I had to change that, but you might not have to in your case, since you won't get up to 1500MHz anyway. Basically you can probably leave the Overdrive & PowerTune tab untouched and just edit the voltages under the PowerPlay tab.
> 
> As far as I know there's no way to unlock voltage control through the BIOS, so you have to set the value in the BIOS itself when you edit it. This makes fine tuning a PITA since you have to re-edit, reflash, and reboot any time you want to change the voltage.


Thank you very much! I'll give it a try.
Quote:


> Originally Posted by *Antonius*
> 
> @Haunebu I had a similar problem like this, With my old 5870m in my laptop, a fresh install of windows sorted it out, maybe you can give that a try.


That won't change anything; I've reinstalled Windows like 200 times since I've had this card.


----------



## Haunebu

@bardacuda I just checked, the max VDDC is 1200, so I can't overvolt it...


----------



## bardacuda

@Haunebu That's weird that you only get a dropdown list. I get a box where I can just type in any number in 5 mV increments. It must be different for every BIOS. I was using VBE 7.0.0.7b, if that makes a difference.


----------



## Haunebu

Quote:


> Originally Posted by *bardacuda*
> 
> @Haunebu That's weird that you only get a dropdown list. I get a box where I can just type in any number in 5mv increments. Must be different for every BIOS. I was using vbe 7.0.0.7b if that makes a difference.


I was using the same version. What if I reflash some other BIOS and modify it? Will it be a problem, for example, if I flash an XFX BIOS, as long as it's made for cards with Hynix memory? I have the Gigabyte version of the card.


----------



## bardacuda

Quote:


> Originally Posted by *Haunebu*
> 
> I was using the same version. What if I reflash some other BIOS and modify it? Will it be a problem for example if I flash XFX's BIOS if it's made for cards, which have Hynix memory? I have Gigabyte version of the card.


I have no idea. I have only ever edited my original BIOS.


----------



## neurotix

Quote:


> Originally Posted by *NameUnknown*
> 
> Do you guys think its worth upgrading at this point or waiting until something better comes along than what we have available now? Or should I just get a second 280X and be done with it.


Quote:


> Originally Posted by *NameUnknown*
> 
> Well, the issue I have is that ever since WIndows 10 my performance is hampered in most games and crippled in others. For example WoW went from 60FPS to 30, GTA4 went from 50 to 12ish, ME3 had a performance hit, even Red Faction 2: Guerilla took a major hit in high dust and destruction areas from butter smooth to choppy as hell unless its DX9. I have tried new drivers, old drivers, beta drivers, updates, etc and nothing works. I am to the point of either building a whole new system which is only of insignificant overall gain, a new GPU, or a second 280X.
> 
> With my time limited to game these days, when I have the chance I want the best experience possible.


You posted this a while ago, so sorry for being late, but I figured I'd give some advice.

The problem is basically your CPU. The Phenom II series is from 2009; it's going to be 2017 soon, and that makes them 8 years old. Even my Haswell i7 will soon be 4 years old. I had Phenom IIs up until around 2012 myself.

My younger brother has a rig I built him out of old parts; he had a Phenom II X4 B50. With a 270X he was getting like 15 fps in Rise of the Tomb Raider with everything on low at 1080p. Well, his motherboard fried, and instead of replacing it, he got an i3-6100, a Z170 motherboard, and DDR4 RAM. Now he plays Rise of the Tomb Raider at 60 fps with everything maxed out, using the same graphics card. So yeah, it's your old Phenom processor that's the problem. He only spent like $250 for the setup too, around the price of an i5.

I would personally say that if all you do is game, you don't need the extra threads of the Phenom II X6 anyway, considering a Skylake i3 is probably 3x as fast as a Phenom II in single-core performance.

Your R9 280X is still a pretty capable card, assuming you don't go higher than 1080p. It should at least be able to run the latest games on high settings and older ones on ultra. I disagree that you need a GTX 1070 for the upgrade to be worth it; the 1070 is too expensive for what it is ($400???). You can still find R9 Furys for $300 or less right now, no one wants them, and that's what I paid for mine (I got two). In fact, Newegg has a sale on the Sapphire R9 Fury Nitro right now for $269, though it ends soon. Couldn't be happier with my Furys. They're as good as or better than the 1070 in DirectX 12, and they have nearly double the specs of an R9 280X.

Seriously, I would consider a CPU upgrade before anything else. Not trying to be mean or anything, but that's my opinion.

Hope this helps.


----------



## SabbathHB

Good info on the BIOS stuff, thanks. My last driver update a few weeks ago corrupted my card's BIOS after uninstalling the previous one. I flipped the switch and installed the most recent update; I still need to reflash the corrupted one.

With that said, has anyone been playing Battlefield 1 with the newest drivers? Since I installed the newest drivers, everything has been just fine. But since day 1 of playing BF1, my fans have developed a mind of their own. While playing BF1 everything is fine: the fans adjust to my temps, which aren't hitting 60°C IIRC. Using both HWiNFO64 and MSI AB, my temps are great. But about 5-10 minutes AFTER playing BF1, the fans just spin up to 100% and stay there, and at that point my card's temps are at idle. Rebooting or a quick power cycle doesn't fix it; after sitting, say, overnight and powering up the system, the fans go back to normal. Here's the kicker that's really got me baffled: if I start BF1 back up with them at 100%, they drop back to normal once the game loads..?? I've reinstalled the newest drivers with no change. I haven't tried playing a different game; just thought of that, honestly. Thoughts? (HIS R9 280X IceQ X2 Turbo Boost)

Side note: if anyone is interested, my 1100T @ 4 GHz with stock clocks on the card plays BF1 really well on high/ultra custom settings. CPU usage averages around 80-90% (multiplayer) and the GPU runs happily at 100%. Nice to know the ole 1100T isn't a bottleneck for me yet!


----------



## radier

Only this one game.


----------



## SabbathHB

Quote:


> Originally Posted by *radier*
> 
> Only this one game.


Not too shabby for a 6-year-old CPU and a 5-year-old GPU, huh? Plays Black Ops 3 quite nicely too. (That's 2 games, BTW.)


----------



## MiladEd

So I finalized the backplate design for my R9 280X Dual-X, and I'll have it in a couple of days. Here's the AutoCAD design so far.


----------



## jclafi

He said "...the issue I have is that ever since Windows 10..."

So the performance issue started when he updated the OS to Windows 10. Before that, everything was normal.

Looks like a software issue, not a hardware issue.

Also, great Owners Club, this one. I have one Gigabyte R9 280X WindForce @ 1100/1500. Such a great GPU at 1080p.

See ya.
Quote:


> Originally Posted by *neurotix*
> 
> You posted this a while ago so sorry for being late. But I figured I would give advice.
> 
> The problem is basically your CPU, the Phenom II series is from 2009, it's going to be 2017 soon and that makes them 8 years old. Even my Haswell i7 will soon be 4 years old. I had Phenom IIs up till around 2012 myself.
> 
> My younger brother has a rig I built him out of old parts, he had a Phenom II X4 B50. With a 270X he was getting like 15 fps in Rise of the Tomb Raider with everything on low at 1080p. Well, his motherboard fried and instead of replacing it, he got a i3-6100, Z170 motherboard and DDR4 RAM. Now he plays Rise of the Tomb Raider at 60fps with everything maxed out using the same graphics card. So yeah it's your old Phenom processor that's the problem. He only spent like $250 for the setup too. Around the price of an i5.
> 
> I would personally say that if all you do is game, you don't need the extra threads on the Phenom II x6 anyway. Considering a Skylake i3 is probably 3x as fast as a Phenom II in single core performance.
> 
> Your R9 280X is still a pretty capable card assuming you don't go higher than 1080p. It should at least be able to run the latest games on high settings and older ones on Ultra. I would disagree in that you need a GTX 1070 for the upgrade to be worth it, the 1070 is too expensive for what it is ($400???) You can still find R9 Fury's for $300 or less right now, no one wants them, and that's what I paid for mine (I got two). In fact, newegg has a sale on the Sapphire R9 Fury Nitro right now for $269, it ends soon though. Couldn't be happier with my Fury's. And they're as good or better than the 1070 in DirectX12. They have nearly double the specs of a R9 280X.
> 
> Seriously I would consider a CPU upgrade before anything else. Not trying to be mean or anything, but that's my opinion.
> 
> Hope this helps.


----------



## MiladEd

So I got the backplate made and mounted. You can see the results here.


----------



## TheLAWNOOB

Has anybody tried playing old games like L4D2 or Payday 2 on the 280X at 4K?

I'm going to downgrade my GPU and wait for Vega.


----------



## MiladEd

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Has anybody tried playing old games like L4D2 or Payday 2 on the 280X at 4K?
> 
> I'm going to downgrade my GPU and wait for Vega.


Don't know about L4D2 and PD2, but while playing Portal 2 and The Stanley Parable, I get at most like 20-35% GPU utilization. And I've tried GTA V at 4K (through in-game frame scaling) and it ran great; on high I was getting about 40-45 FPS. I think you'll be fine.


----------



## TheLAWNOOB

Thanks


----------



## lanofsong

Hey AMD R9 280X / 280 / 270X / 270 owners,

We are having our monthly Foldathon from Monday the 21st to the 23rd, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

November Foldathon

To get started:

1. Get a passkey (allows for a speed bonus); you'll need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726
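For reference, those three values map directly onto the v7 Folding@home client's config.xml; here's a minimal sketch (the name and passkey below are placeholders, not real credentials):

```xml
<!-- Minimal FAHClient config.xml sketch: identity settings only,
     default folding slots. Replace name/passkey with your own;
     team 37726 is Team OCN. -->
<config>
  <user value="YourFoldingName"/>
  <passkey value="your32charpasskeyhere"/>
  <team value="37726"/>
</config>
```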

later
lanofsong


----------



## BruceB

Quote:


> Originally Posted by *MiladEd*
> 
> So I got the backplate made and mounted. You can see the results here.


That looks good! I also like the idea for the window on your case, good work!


----------



## MiladEd

Quote:


> Originally Posted by *BruceB*
> 
> That looks good! I also like the idea for the window on your case, good work!


Thanks a lot, fellow PCMR brother.


----------



## MiladEd

I had a bit of an issue with my R9 280X last night. I woke up to a blackout; before I went to sleep, my PC had been running, downloading Windows updates. After the power came back, I turned the PC on; it powered up, but I had no display signal. Thinking it might be the monitor, I hooked it up to my laptop, and it worked completely fine. Hooked it back up to my 280X and turned on the PC: nothing. Turned the PC off, thinking my GPU might've finally given out after 2 years. I waited about 15 or so minutes, then turned the computer back on and it worked fine. Played some Rise of the Tomb Raider, and it went on without any issues. I even decided to give Radeon ReLive a spin, which worked great.

What do you think might've caused the initial lack of display output? During that time I checked the GPU fans and they were spinning, so it wasn't overheating or anything. Should I be worried? Might it die soon?


----------



## neurotix

Quote:


> Originally Posted by *MiladEd*
> 
> What do you think might've caused the initial lack of display output? During that time I checked the GPU fans and they were spinning, so it wasn't overheating or anything. Should I be worried? Might it die soon?


This could be related to OCP (over-current protection) in your PSU.

Next time, if there's a power loss or something, try hitting the switch on your power supply, let the system sit for a few minutes, then hit the switch on the PSU again and finally the power button on the front to turn it on.

My machine has done similar things before with unexpected power loss. If you were playing ROTTR and all was well, and you even installed ReLive, then I'm guessing your card and your system are fine. If the card were dead, you'd simply get no video signal no matter what you did, and if you have a reputable motherboard, you would probably get no POST and hear BIOS beep codes instead.


----------



## MiladEd

Thanks for the reply. Yeah, haven't had any issues so far. No artifacts or overheating.

My motherboard is an ASUS, and I think the CMOS battery is dead, since every time power to the PSU is cut I have to reapply my OC. Anyway, good to know I'm fine for now. I don't wanna upgrade until probably next year, most likely to an RX 580.


----------



## neurotix

Get a new CMOS battery. If you had told me that earlier, I would have said it's part of the problem too.

Most of your hardware, and yes even your graphics card (especially if UEFI), depend on the real time clock to keep the time and function normally.

My board came from the factory with a dead/messed up battery and it caused me all kinds of headaches, problems with Windows (even with Win7!) and so forth until I replaced it.

The battery is most probably a CR2032 lithium cell. These can be had in a 5-pack very cheap on eBay, Amazon and so on, though you'll probably want a name brand (Energizer, Duracell); the ones on eBay usually aren't name brand and aren't as good, but they'll work.

Replace your battery and do the trick I mentioned with the PSU and you probably won't have any more problems.

Hope this helps.

neuro


----------



## MiladEd

Sure thing, I'll do that. Thanks for your help.


----------



## neurotix

No problem, you're welcome.


----------



## lanofsong

Hey AMD R9 270/270X/280/280X owners,

We are having our monthly Foldathon from Monday the 19th to the 21st, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

December Foldathon

To get started:

1. Get a passkey (allows for a speed bonus); you'll need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## grasudan

In MSI Afterburner, at the bottom of the General settings window, you need to select the GPU type: choose GHz Edition and tick "unlock voltage". I don't know if it works with all 280Xs.


----------



## 1216

Memory speed matters, according to this. Does anyone know whether it applies to games too?

I ran Fire Strike at different clocks (270X, 1.3 V, 2 GB). Highest score first:
1. 1250/1450
2. 1235/1475
3. 1260/1400

Edit: bought a new power supply; now doing 1250/1500 stable. Clean power is good power.


----------



## Lucifer 1972

Hi, I am new to this forum.
My GPU is an ASUS Radeon R9 280X DirectCU, overclocked with MSI Afterburner:
Core clock 1130 MHz
Memory clock 1600 MHz
Core voltage 1188 mV


----------



## mAs81

Hey guys, it's been a while since I posted here. I migrated my MSI 280X to my brother's rig, but he's going to build a new Ryzen rig, and he'll give it back to me.

So my plan is to use it in a new HTPC for the gf, and, to cut a long story short, my bro is one of those guys who doesn't take care of his hardware... at all. My card is deep in dust and God knows what else, so I'll need to gut it for cleaning.

Does anyone else have experience with this card? It's the MSI Twin Frozr Gaming 3G edition. What thermal paste would you recommend?

Cheers


----------



## Dhoulmagus

It's a glorious card, mAs81. Still tears everything apart at 1080p today. I have the ASUS DCII TOP version with Shin-Etsu G751 on it; it always ran nice and cool, but it's getting hot now and the paste is due for replacement (same TIM for 4 years or so now).

*QUESTION*: I can't seem to get the voltage beyond 1200 mV on my ASUS 280X DirectCU II TOP. If I unlock core voltage in Afterburner, I still can't reach my old stable clocks, and GPU-Z clearly reads 1200 mV or less. I tried installing the ASUS Tweak tool, enabling overclocking range enhancement and rebooting, but 1200 mV is still the limit. I'm certain I used to run this card at 1300 mV through Afterburner... what gives?

Should I try Sapphire Trixx? Or is there a voltage lock switch I somehow flicked on this card last time I took it apart? This is driving me nuts at this point.

EDIT: I forgot I had to set it to GHz edition in afterburner, all seems good now.


----------



## MrPerforations

Please add me to the club. I have three R9 280s: one Sapphire Dual-X (still sitting in its box) and the two Gigabyte R9 280 WindForces that I use in CrossFire.
I first bought the Sapphire for £180 and wanted a second to CrossFire, but I found the WindForces at £140 each and they had a much better clock (1070 MHz), so I bought a pair.
Great cards, the WindForces: 65°C when playing BF1.


----------



## MiladEd

Welcome to the club! I'm leaving soon, though. I have a Gigabyte RX 480 G1 Gaming 8 GB in the mail, and I'll be selling my Sapphire R9 280X Dual-X after it arrives. However, I'll be doing a benchmark comparison between the 280X and the 480 after I get it.


----------



## BruceB

Quote:


> Originally Posted by *MiladEd*
> 
> Welcome to the club! I'm leaving soon, though. I have a Gigabyte RX 480 G1 Gaming 8 GB in the mail, and I'll be selling my Sapphire R9 280X Dual-X after it arrives. However, I'll be doing a benchmark comparison between the 280X and the 480 after I get it.


That'd be sweet! I'm still sitting on the fence about getting an RX 480 myself; it'd be nice to have some benchmarks I can trust.


----------



## bardacuda

It's about the same as a 290 or 390 but uses much less power. If you can wait, there are supposed to be Polaris rebrands with a bit higher clocks coming in April. At the very least the 400 series should see a price drop or have lots of used ones go up on eBay. Vega is supposed to be out in May, although that might only be their high-end card to compete with the 1080/1080 Ti, with the more mainstream cards coming months further down the line.


----------



## MasterBillyQuizBoy

I was wondering how my 3DMark Fire Strike scores stack up here. I use the free version off Steam.

Dual XFX R9 270Xs with the latest (as of 3/21/17) optional AMD drivers
MSI 990FXA with an FX-6300 at 4.5 GHz
2x8 GB Kingston 1600 MHz at 1644 MHz

Before I added the second R9 270X I'd get about 5500 points. These are both improved scores over the previous version of the drivers.

8569 points with the cards in CrossFire, so the GPU CrossFire scaling seems appropriate. As expected, the real-world gaming benefits of the CrossFire setup are a mixed bag. I wouldn't attempt it again.
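A quick back-of-the-envelope check of that scaling, using the approximate scores quoted above:

```python
# Rough CrossFire scaling check on the Fire Strike scores quoted above
# (~5500 for one card, 8569 for two; the single-card score is
# approximate, so treat the result as a ballpark figure).
single_card = 5500
crossfire = 8569

scaling = crossfire / single_card
print(f"overall score scaling: {scaling:.2f}x")  # ~1.56x

# Note: the overall Fire Strike score blends in the CPU-bound physics
# test, which doesn't scale with a second GPU, so graphics-only scaling
# is higher than this overall figure suggests.
```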

I run a 34" 2560x1080 monitor, so things are generally OK. Honestly, I'm waiting on Vega to see how it affects the prices of other cards. I'll stick with an AMD card so I can utilize the monitor's FreeSync; I'm still amazed that my current cards don't support it. I had to dig around after the fact to find that out.

Thanks!


----------



## bardacuda

Almost identical to my score with CrossFired 270s @ 1150 MHz. This was on a Phenom II X6 @ 4 GHz with 2x4 GB DDR3 @ 1600 MHz.

http://www.3dmark.com/3dm/8009453

Do you have the link for your run? I'm interested to know how the physics scores compare between Thuban and Vishera.


----------



## MasterBillyQuizBoy

Quote:


> Originally Posted by *bardacuda*
> 
> Almost identical to my score with X-Fired 270s @ 1150MHz. This was on a PhenomIIX6 @ 4GHz w/ 2x4GB DDR3 @ 1600MHz.
> 
> http://www.3dmark.com/3dm/8009453
> 
> Do you have the link for your run? I'm interested to know how the physics scores compare between Thuban and Vishera.


http://www.3dmark.com/fs/12074074

If I remember right, that Phenom II X6 of yours is very close to the FX-6300.

I haven't bothered trying to overclock these video cards. They can hit 90°C at times if I'm not careful with my fan profile; they're a bit sandwiched. The fan profile helps with noise: cranking the fans up to 90 percent drops the temperature maybe 5°C at most, which isn't worth the noise. They usually hover at 85°C, and 88°C is when the profile kicks in to crank the fans up.
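A fan profile like that (quiet until the mid-80s, ramping hard near 88°C) boils down to linear interpolation between a few temperature/speed points; this is an illustrative sketch with assumed points, not the exact Afterburner curve:

```python
# Illustrative fan-curve sketch: keep noise low until the card nears
# its comfort limit, then ramp hard. The (temp_C, speed_%) points below
# are assumptions for illustration, not the poster's exact settings.
def fan_speed_pct(temp_c, curve=((0, 30), (60, 40), (85, 60), (88, 90))):
    """Linearly interpolate fan % between (temp, speed) points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # past the last point: max defined speed

print(fan_speed_pct(85))  # -> 60.0 (the "hover" point)
print(fan_speed_pct(90))  # -> 90 (past the ramp, fans cranked)
```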

By the way, XFX customer service is outstanding. I am sticking with them on my next video card.


----------



## bardacuda

Oh wow. This CPU is ahead by a good margin on the physics test despite a 500 MHz clock deficit. I knew IPC actually went down with Bulldozer, but I didn't know by how much. Your cards are also ahead by a good margin despite a clock deficit. I'm surprised how much driver improvements have helped in the last two years! AMD is awesome for driver improvements on old cards. Keep in mind these are basically just 7870s from early 2012.

Here's a link to the two side by side for easier comparison:

http://www.3dmark.com/compare/fs/12074074/fs/5584428


----------



## MasterBillyQuizBoy

Quote:


> Originally Posted by *bardacuda*
> 
> Oh wow. This CPU is ahead by a good margin on the physics test with a 500MHz clock deficit. I knew IPC actually went down with bulldozer but didn't know by how much. Also your cards are ahead by a good margin with a clock deficit as well. I'm surprised how much driver improvements have helped in the last two years! AMD is awesome for driver improvements on old cards. Keep in mind these are basically just 7870s from early 2012.
> 
> Here's the link of the two side by side for easier comparison
> 
> http://www.3dmark.com/compare/fs/12074074/fs/5584428


I have limited access to the link right now. When you say "this CPU", which two are you comparing?

Edit: Never mind, I got to the comparison link. Wow. I have a Phenom II X4 lying around somewhere, maybe the 955 or 945. I wonder if it'd be better to run that instead. Sheesh!

AMD has made the driver update process very easy. I won't bother posting all my runs from the last few years, but performance keeps getting better.


----------



## bardacuda

I do mining and a lot of different algos work better on either 15.12 or 16.9.3 so I'm running 16.9.3 atm and haven't tried anything newer. That said I just bought a Ryzen setup and I'm going to want to compare this rig to that one, so I'll probably upgrade to the newest drivers and do a few benches before I format this thing. Unfortunately I sold one of my 270s so I'll only be able to get single card results.


----------



## MasterBillyQuizBoy

Cool, please do post up the results with the new CPU.

Just noticed you had the standard 270; I've got the X version. Also, while 20 percent seems like a lot, the FPS difference isn't much. Still interesting to compare. Those Phenom chips were aptly named!


----------



## bardacuda

Yeah, the standard version requires a BIOS edit in order to clock higher than 1050, but they can clock as high as 1270 after doing so (assuming you could get the voltage high enough to be stable, which you couldn't really), and on this Gigabyte the voltage is actually locked.
My other one was an ASUS DCU II, though, and I was able to reconfigure the voltage in the BIOS on that, so it was basically no different from a 270X, except that changing the voltage took slightly more effort than moving a slider in Afterburner.

I was able to get that one through a Valley run at 1250, I believe, although that clock wasn't stable, and the chip's ASIC quality was over 90% (92% IIRC). There was some artifacting, though, and I had to make a few attempts to get through without crashing. It could run pretty stable at 1200-1225 with enough voltage, but for normal use 1150 was about the max on that card.
In comparison, the Gigabyte has 67.8% ASIC quality and could barely get through benchmarks at 1150. 1050 was about the max stable clock for normal 24/7 usage, and nowadays it will even crash at stock speed sometimes because it has degraded a bit from heavy mining and folding use.

I'll be sure to do a run with it on the Ryzen system and post those results too. I'll do the runs at 1050 to match your card.

BTW, the memory timings on these cards loosen at 125 MHz increments, so it is actually better to run a 1375 mem clock rather than 1400 due to the tighter timings. As soon as you go to 1376 you lose performance, and unless you can get nearly all the way up to 1500 MHz it's not worth going past 1375. Don't even bother going above 1500, as you won't reach 1625; and again, as soon as you go to 1501 you lose performance.
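To make that strap rule concrete, here's a small sketch; the 125 MHz strap spacing and the 1250 MHz base are taken from the post above and should be treated as assumptions for these specific cards:

```python
# Illustrative sketch of the memory-strap rule described above.
# Assumption (from the post): timings loosen at 125 MHz boundaries
# (1250, 1375, 1500, ...), so the best clock is the highest strap
# boundary you can hold, not the highest raw clock you can reach.
def best_mem_clock(max_stable_mhz, strap_mhz=125, base_mhz=1250):
    """Highest clock <= max_stable_mhz that doesn't cross into a
    looser-timing strap it can't fully exploit."""
    steps = (max_stable_mhz - base_mhz) // strap_mhz
    return base_mhz + steps * strap_mhz

print(best_mem_clock(1400))  # -> 1375: 1400 crosses the strap, so 1375 wins
print(best_mem_clock(1499))  # -> 1375: not worth going past 1375
print(best_mem_clock(1500))  # -> 1500: made it all the way to the next strap
```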


----------



## MasterBillyQuizBoy

Quote:


> BTW, the memory timings on these cards loosen at 125MHz increments, so it is actually better to run at 1375 mem clock rather than 1400 due to tighter timings. As soon as you go to 1376 you lose performance, and unless you can get nearly all the way up to 1500MHz it's not worth going past 1375. Don't even bother going above 1500 as you won't reach 1625...and again as soon as you go to 1501 you lose performance.


I'm confused; isn't the memory clock 5600 MHz?

I have the "Boost Ready" version: the stock GPU clock is 1050 MHz and the memory is listed at 5600 MHz.

Well, I actually have two different part numbers; I think the memory or MOSFET brand is different, along with the size. But essentially they are the same card in CrossFire.


----------



## bardacuda

5600 is the effective data rate, but the actual frequency is 1/4 of the effective GDDR5 rate, so 1400 in that case. 1375 (or 5500, if you prefer) actually performs better.
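In code form, the conversion is just a factor of four (a quick illustrative sketch):

```python
# Quick sketch of the GDDR5 clock math from the reply above: the
# advertised "effective" MHz is 4x the actual memory clock frequency.
def effective_to_actual(effective_mhz):
    return effective_mhz // 4

def actual_to_effective(actual_mhz):
    return actual_mhz * 4

print(effective_to_actual(5600))  # -> 1400 MHz actual
print(actual_to_effective(1375))  # -> 5500 MHz effective (the sweet spot above)
```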


----------



## MasterBillyQuizBoy

Quote:


> Originally Posted by *bardacuda*
> 
> 5600 is the effective data rate but the actual frequency is 1/4 of GDDR5 rate...so 1400 in that case. 1375 (or 5500 if you prefer) actually performs better.


Wow. I am having slight heat issues. So... I turn down the memory, get a faster card, and possibly help with some heat issues. Sweet.

I've got two of these things and rarely use CrossFire mode. With a custom cooling solution, anywhere from something as simple as reapplying thermal paste all the way to liquid, could I see some modest FPS gains with a moderate to high OC?


----------



## bardacuda

It will only give the memory lower latency; it won't necessarily give you a noticeable performance increase (though it should be an increase nonetheless, with better stability too), and it won't lower your heat either (it's just a slight mem downclock, after all; no voltage tweaks). The low-hanging fruit for better thermals is improving the airflow and lowering the _core_ voltage. If the two cards are sandwiched up against each other, the one on top usually has a rough time.

If those aren't feasible (the motherboard only has one option for placement, or downvolting forces you to downclock more than you want), or that fruit's already been picked, then you can try reapplying the thermal paste, but whether it will help, and by how much, depends on the card.

My ASUS card had garbage cooling when I got it. Even at stock it would run above 90°C and keep climbing until I shut it down; that lasted until I redid the TIM. After that it was about the same as, or maybe a little better than, the Gigabyte: it would stay in the mid to low 60s at full load. The Gigabyte, on the other hand, would hover around the mid to high 60s at full load, and when I saw the gains from redoing the TIM on the ASUS I tried it on the Gigabyte too, but it didn't actually make it any better. They already did a really good job with it at the factory.

I have a couple of XFX Double Dissipation 290s and they seem to be about on par with the WindForce 3 290, so they're probably pretty good as is, but maybe you can improve them... hard to say. Just take a look and see whether it uses any thermal pads on the RAM or VRMs before you pull it apart, cuz you'll need to have pads on hand to replace those too if you do.

EDIT: Update

So I took apart one of my XFX cards which had a failing fan. Cleaned the heatsink, the failing fan's bearing, greased the fan bearing, and reapplied the TIM. It's now running at about 10°C cooler than the other XFX and the Gigabyte, with slightly better VRM temps as well (better than the other XFX but still about 10°C higher than the Gigabyte). I took a screenshot after a few minutes of ETH mining for comparison. The one I fixed up is referred to as "XFX Black Ed." in the screenshot and the one I left alone is "XFX Double D.".

I'm surprised it improved that much over the other XFX...but how much of that was due to the TIM reapplication alone, and how much was due to other maintenance I'm not sure. Keep in mind these are 290 cards so the cooler is probably a bit different than yours @MasterBillyQuizBoy but it should give you some idea of what you could expect for results from doing this.

I think I'll take the other two cards apart and do some maintenance on them as well to see how they improve and update this again.

Also I snapped some pics of the card while I had it apart. You can see that the factory TIM application was not the greatest but it wasn't terrible either. Also I didn't have to replace any thermal pads as that whole section of the cooler remained attached to the card and the heatsink/fan assembly was a separate part. BTW those silver and black pieces that stay attached to the card and touch the thermal pads are actually plastic, not metal, which I thought was weird. I would think the VRAM and mosfets would be better off just being uncovered and getting direct airflow from the fans. The Gigabyte's VRMs stay much cooler than the two XFXs so I think XFX definitely could have done better here.

Sorry for the crappy image quality. My smartphone is not that smart (because that's what desktops are for... phones are for phone calls and texts IMO). It's a cheap LG Android.






EDIT 2: Update 2

So I did the same maintenance on the other XFX. It wasn't that dusty and the fans were already working pretty normally, so I pretty much just redid the thermal paste on that one. Currently, after 20 mins of mining, it is at 70°C core and 82°C VRM; a little worse, with smaller gains than the Black Edition (which is at 66°C core and 79°C VRM after 20 min), but still very much worthwhile.

For reference the Black Edition's VRMs would get up to 90°C and core was ~80°C or so before maintenance IIRC, but like I say it had a bad fan...so those gains weren't just from TIM.

I added in the GPU-Z for the Gigabyte card (which I haven't done any maintenance on yet) and it is at 76°C core and 71°C VRM (so the VRM is already doing way better than the cleaned-up XFXs), and it has a fan on the fritz also, so I expect that after maintenance it will outperform both XFX cards on core _and_ VRM temps. We'll see...



TL;DR: Re-doing the TIM on the XFX Double Dissipation cards is definitely worth doing.


----------



## MiladEd

Hey guys, it was fun being here. But I gotta go. I got myself a RX 480 G1 Gaming 8 GB, and I've sold my R9 280X.

Anyway, this is the old dog versus the new comer:


----------



## mAs81

Hadn't used my MSI 280X in a long time; I had loaned it to my brother until he went Ryzen.

Got it back, cleaned it thoroughly, changed the stock TIM to Arctic MX-4, and put it in a new build for the gf.

Holy crap, that card is still a beast... It can pretty much max out new AAA games @ 1080p, no problem. AMD Fine Wine FTW, lol.

The TIM change also worked wonders: after gaming for more than 3 hours (Witcher 3 in a mix of Ultra and Very High settings), the card was well under 80°C, with the stock boost and no OC.

I'm very happy with it, and I might give it a slight OC in the future; I remember getting a 1150-1200 core OC no problem. But since it's in the gf's PC now, if she's happy, I'm happy.


----------



## MiladEd

It certainly is a beast. I honestly didn't really need an upgrade; my old 280X suited most of my needs. But I don't regret selling the R9 280X and getting the RX 480. The 480 runs many of my games much better: Witcher 3 gives me nearly double the framerate at even higher settings; GTA V doesn't really get an FPS boost (it's mostly CPU-bound) but gives me similar FPS at higher settings; DOOM's FPS nearly tripled in open areas; and Overwatch can also run at higher settings with similar FPS.


----------



## mAs81

I might be getting my brother's XFX 480 if he gets a Vega in the near future... That card is a beast!

But until then, my 290 and 280X are suiting my and my gf's gaming needs just fine.

I really love how well these cards perform.


----------



## MiladEd

Yeah, that's part of the reason I only buy AMD GPUs for desktop. My laptop has an NVIDIA GPU (which is pretty good for its price: a 940MX with 1 TFLOPS of compute and 2 GB of GDDR5 that can run GTA V and Overwatch just fine while sipping very little power). Unfortunately, AMD hasn't been competitive in the laptop scene... or I might have gotten a laptop with a Radeon as well.


----------



## mAs81

That might change with the new Ryzen APUs..


----------



## MiladEd

Hopefully. If I've got the money, I'll jump ship when they're available.


----------



## bardacuda

Quote:


> Originally Posted by *MasterBillyQuizBoy*
> 
> Cool, please do post up the results with the new CPU.
> 
> Just noticed you had the standard 270, I've got the X version. Also while 20 percent seems like a lot, the FPS isn't much. Still interesting to compare. Those Phenom chips were aptly named!


Hey, I finally got a replacement fan for the 270 (it took two months to get here!), so I'll finally be able to do those benchmark comparisons if you're still interested. Then again, I'd like to get a new BIOS for the Ryzen system first and hopefully get my RAM up to 2933 or 3200, because RAM speed makes a big difference on this platform. Then I'll probably do the comparisons using the same drivers and CPU clock speed on both systems.


----------



## SaShaDelas

Hello everybody!!!
Help me with this, please: I have an R9 280X Matrix video card and it does not regulate the speed of one of the fans. Switching between the BIOSes did not help, and flashing another BIOS gave the same result.


----------



## jclafi

Perhaps a faulty cooler ?
Quote:


> Originally Posted by *SaShaDelas*
> 
> Hello everybody!!!
> Help me with this, please: I have an R9 280X Matrix video card and it does not regulate the speed of one of the fans. Switching between the BIOSes did not help, and flashing another BIOS gave the same result.


----------



## 2002dunx

I recently got a new ASUS R9 280X DirectCU II TOP, and after a few weeks one of the fans failed... so I had to order one from the USofA at a stupid price.

dunx


----------



## SaShaDelas

Quote:


> Originally Posted by *jclafi*
> 
> Perhaps a faulty cooler ?


The cooler is fine!!!
When I press the button, the fan spins up to 100%; it's just that this one fan is not controlled by the software.


----------



## Matt-Matt

Hey guys,

Purchased a very good condition used R9 270 recently for my spare LAN/Media PC.

I know that Pitcairn can overclock fairly well as I have done other systems with it before.

I have an R9 290 in my main PC, and I understand that the new AMD drivers basically block / do not work with Afterburner and the like.

My question is: how does one overclock past the maximum values in AMD WattMan? I'm running 1050/1500 MHz easily and want more if possible.

Wattman on my R9 290 lets me get to the clocks that I want, but that's likely because it's a reference card.


----------



## masteratarms

@Matt-Matt

This is a short two-minute video which is very informative. Personally, I have not encountered any problems, possibly because I have edited my BIOS to suit my card and cooling solution:
https://www.youtube.com/watch?v=QGKr926VAJo


----------



## bardacuda

@Matt-Matt

You can increase the overclocking limits and TDP with VBE 7.

Hmm, weird that it's not in the OP. Anyway, you can get it here:

https://www.techpowerup.com/forums/threads/vbe7-vbios-editor-for-radeon-hd-7000-series-cards.189089/


----------



## Matt-Matt

Quote:


> Originally Posted by *bardacuda*
> 
> @Matt-Matt
> 
> You can increase the overclocking limits and TDP with VBE 7.
> 
> Hmm, weird that it's not in the OP. Anyway, you can get it here:
> 
> https://www.techpowerup.com/forums/threads/vbe7-vbios-editor-for-radeon-hd-7000-series-cards.189089/


I figured I can't be bothered, as it's used for my secondary PC.

1050/1500MHz will be fine I think, unless there is a competition or something at some point.


----------



## Pibbz

Does anyone know where to find drivers for the PowerColor R9 280? They don't seem to be on the official PowerColor website.


----------



## diggiddi

Try downloading them from AMD's support site.


----------



## wallawallaman

Fan-modded Sapphire Tri-X 280X here.

Stock voltage of 1.225, but I superglued some fat 92mm Deltas to the fan shroud.

Planning on boosting the voltage to 1.275 and trying to eke out 1250 MHz. A 25% core clock improvement would be excellent.


----------



## ppn7

Hi

I'm going to buy a Sapphire 280X Tri-X, not the Vapor-X.
Is the voltage locked or unlocked? And if it's locked, can I flash it to unlock the voltage?

Will watercooling help me reach a higher overclock?


----------



## Retrorockit

Just a quick question: I need a 120mm/12cm CrossFire cable. I know they exist, but I can't find one for sale. It's for a TriFire setup.
It's a locked-BIOS Dell X58 OC project, so don't flame me for using HD 6990/6970 GPUs. There are pics of this beast here:
https://www.techpowerup.com/forums/threads/throttlestop-overclocking-desktop-pcs.235975/page-9
Does anyone have one they're not using anymore that they'd be willing to sell?


----------



## Kaltenbrunner

Just running an R9 280X for now. What games will it have a hard time with @ 1440p? Unmodded Skyrim seems to run well. SC2 is almost fine. Haven't tried anything else as "new" as those yet.


----------



## SEKnox

I'm using this card. It's a very good card.


----------



## trihy

Guys, is there a BIOS editor for these cards?

I have a 270X and the idle fan RPM is way too high (around 1500 RPM). I'd like to lower it to something like 1000 RPM; since the card idles at 30°C, there's no point in that idle speed.


Thanks.


----------



## Josh154

Back from an 8-year break, and I just built a cheap FX system with a PowerColor R9 270X Devil edition. The card has played some of my old games well and does great in Fortnite.


----------



## ppn7

Hi,

I was able to reach 1240 MHz at 1.3 V on the core with my watercooled R9 280X in the Valley 1.0 benchmark.
It crashes at 1250 MHz... Memory is at the stock 1500 MHz.

Do I need to overclock the memory to stabilize the OC, or can I keep the RAM at 1500 MHz?

Also, for everyday use I keep 1100 MHz at 1.2 V.
1240 MHz gives me +13.7% FPS in Valley 1.0.

Not bad, but the sad thing is there's no FreeSync support!

Edit:

Temp under load : GPU 47°C / Vrm1 59°C / Vrm2 stuck at 26°C / Ambient 25°C / Water 35°C
240mm + 360mm CPU / GPU loop custom
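One quick way to judge whether a core OC is paying off is to compare the FPS gain against the clock gain. Assuming the +13.7% figure was measured against a 1000 MHz stock clock (my assumption; the post doesn't say which baseline was used), the sketch below works out the scaling efficiency:

```python
# How much of the clock bump actually showed up as FPS. The 1000 MHz
# stock baseline for the +13.7% figure is an assumption, not stated above.
def scaling_efficiency(clock_gain_pct: float, fps_gain_pct: float) -> float:
    """Fraction of the core-clock increase that translated into FPS."""
    return fps_gain_pct / clock_gain_pct

clock_gain = (1240 / 1000 - 1) * 100  # 24.0% over an assumed 1000 MHz stock
print(round(scaling_efficiency(clock_gain, 13.7), 2))  # 0.57
```

A ratio well below 1 usually points at a memory-bandwidth or CPU bottleneck somewhere, which is exactly why testing a bump to the stock 1500 MHz memory clock is worth a try.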


----------



## LukaTCE

Jay1ty0 said:


> Nope, update to 015.039.xxxx if you can. 015.041. makes the graphics card throttle over 72ºC VRM temperatures.


I have a GV-R928XOC-3GD, currently on BIOS 015.041.000.003 (F71).
Would downgrading to 015.039 fix the artifacts? The artifacts only happen when I run a test tool together with a CPU burner; with no CPU burner there are no artifacts. But they also show up while browsing or watching YouTube with a lot of other tabs open, so maybe it has something to do with a Windows overlay? I can't disable Aero on Windows 10. I've probably had this problem since I bought the card, and I had the same problem with my previous card (both Gigabyte).
The CPU Tjmax doesn't go over 75°C at stock; I have more problems when overclocking, but the CPU temp isn't high. At stock the GPU goes to 75°C, and with the CPU and GPU overclocked it shows more, I think even 90°C.

015.039 F12 says it's for Elpida memory, and I have Hynix memory. Does that matter?
On the Gigabyte site all revisions have the same BIOS: https://www.gigabyte.com/Graphics-Card/GV-R928XOC-3GD-rev-10#support-dl-bios
This is the 015.039 download: https://www.techpowerup.com/vgabios/148082/gigabyte-r9280x-3072-131015-4
@ppn7 also try the Heaven benchmark, and DX12 as well, because Heaven stresses the CPU more and games stress everything, both CPU and GPU.


----------



## NameUnknown

I've got 2 280Xs in CrossFire. Are they still really able to hold their own? I've been away from hardware for far too long.


----------



## FlawleZ

NameUnknown said:


> I've got 2 280Xs in CrossFire. Are they still really able to hold their own? I've been away from hardware for far too long.


A single 280X can still hold its own in many games at 1080p. CrossFire is good too, when it's working.


----------



## NameUnknown

Good to hear, except the CrossFire part. I've heard whispers that it's been all but abandoned?


----------



## Ahnt

Just picked up a 280 Dual-X. Sitting at 1160/1250 right now with 1.26 V. Raising the memory at ALL causes several games to lock up.

How safe is 1.3 V for daily use? I'm tempted to throw an H70 on it and see how it clocks, but I wasn't sure how quickly Tahiti degrades with voltage.


----------



## inlandchris

*R9 280x Matrix P*

I have given up on my PC, so I want to sell my two R9 280X Matrix P cards with EK full-cover water blocks. What would be a good price on each?
Since the stock fan is removed, I took off the "Matrix" LED and mounted it on a single 5" bay spacer so you can see the color change on the front of the PC while the card is loading; that would be included with only one GPU.


----------



## MrPerforations

hello,
I have two Gigabyte 280s. At CEX they sell for £60 and buy for £30; the price is a joke and I refuse to sell them. CEX is the price guide in the used market. £30-40 each??


----------



## inlandchris

Thanks for the info, maybe I will frame them on the wall.


----------



## Marek270409

I'm looking for an R9 280X BIOS with Hynix H5GC2H24BFR memory support, but with the raised 2D idle clock fix to stop the flickering. I've only found one for H5GC2H24AFR memory. It can't be from XFX.


----------



## boot318

Marek270409 said:


> I'm looking for an R9 280X BIOS with Hynix H5GC2H24BFR memory support, but with the raised 2D idle clock fix to stop the flickering. I've only found one for H5GC2H24AFR memory. It can't be from XFX.


https://www.techpowerup.com/vgabios...l=R9+280X&interface=&memType=&memSize=&since=

There you can find all the R9 280X BIOSes you want! There's also a tool that lets you edit clocks and voltages:

https://www.techpowerup.com/forums/threads/vbe7-vbios-editor-for-radeon-hd-7000-series-cards.189089/

I'm assuming you know how to flash a BIOS to a GPU.


----------



## Marek270409

It doesn't allow editing clocks higher, only lower.

It should be something like this: https://www.techpowerup.com/vgabios/170551/170551 but with all memory clocks set to 1500 MHz, and from a vendor other than XFX, as those aren't supported under macOS.


----------



## boot318

Do you know the model of the card? I don't know if you're using a reference or custom PCB.

I'm not even sure you can flash a GPU BIOS in macOS.


----------



## Marek270409

XFX R9 280X DD TDF V8.1


I'm using Windows or DOS for flashing.


This one also worked on other XFX R9 280X cards, but with AFR or Elpida memory: https://www.techpowerup.com/vgabios/170694/170694


----------



## Bride

R9 270 (essentially a rebranded HD 7870), 1250 MHz core, 1550 MHz memory... I love this little baby


----------

